AWS Certified Solutions Architect - Professional

A company operates an application that processes and stores image data on-premises. The application handles millions of new image files daily, each averaging 1 MB in size. It processes these files in 1 GB batches, zipping them together and archiving each batch as a single file on an on-premises NFS server for long-term storage. The company has a Microsoft Hyper-V environment with available compute resources but lacks sufficient storage capacity. It wants to archive these images on AWS and must be able to retrieve archived data within one week of a request. The company maintains a 10 Gbps AWS Direct Connect link between its on-premises data center and AWS, and it needs to limit transfer bandwidth and schedule data transfers to AWS during off-peak hours. What is the most cost-effective solution that meets these requirements?

Explanation:

Option C is the most cost-effective solution that meets all of the company's requirements. Deploying an AWS DataSync agent on a new general-purpose Amazon EC2 instance enables efficient transfer of the zipped batch files from the on-premises NFS server to Amazon S3 Standard, which immediately frees up on-premises storage. An S3 Lifecycle rule that transitions objects to S3 Glacier Deep Archive after one day then minimizes long-term storage costs: Deep Archive is the lowest-cost S3 storage class, and its retrieval times (within 12 hours for standard retrievals, about 48 hours for bulk) fall well inside the one-week window. Finally, DataSync tasks support bandwidth throttling and scheduled executions, so transfers can be capped and run during off-peak hours, as the company requires. The sketches below illustrate each piece.
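As a rough illustration of how the DataSync side could be wired up with boto3, the sketch below creates the NFS source location, the S3 Standard destination, and a scheduled, bandwidth-limited task. Every hostname, ARN, bucket name, and numeric limit here is a placeholder assumption, not a value from the question:

```python
import boto3

datasync = boto3.client("datasync")

# Source: the on-premises NFS share, reached through the DataSync agent
# (the agent ARN below is a placeholder).
nfs_location = datasync.create_location_nfs(
    ServerHostname="nfs.example.internal",
    Subdirectory="/archives",
    OnPremConfig={
        "AgentArns": ["arn:aws:datasync:us-east-1:123456789012:agent/agent-0example"]
    },
)

# Destination: an S3 bucket, with objects written to S3 Standard first.
s3_location = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-image-archive",
    S3StorageClass="STANDARD",
    S3Config={"BucketAccessRoleArn": "arn:aws:iam::123456789012:role/DataSyncS3Role"},
)

# The task caps throughput and runs daily at 01:00 UTC, keeping transfers
# in off-peak hours. The ~1 Gbps cap is an illustrative value that leaves
# headroom on the shared 10 Gbps Direct Connect link.
task = datasync.create_task(
    SourceLocationArn=nfs_location["LocationArn"],
    DestinationLocationArn=s3_location["LocationArn"],
    Name="nightly-image-archive",
    Options={"BytesPerSecond": 125 * 1000 * 1000},
    Schedule={"ScheduleExpression": "cron(0 1 * * ? *)"},
)
```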
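The lifecycle transition is a one-time bucket configuration. A minimal sketch, again with a placeholder bucket name and rule ID:

```python
import boto3

s3 = boto3.client("s3")

# Transition every object to S3 Glacier Deep Archive one day after upload.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-image-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "deep-archive-after-1-day",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Transitions": [{"Days": 1, "StorageClass": "DEEP_ARCHIVE"}],
            }
        ]
    },
)
```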
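When archived data is later requested, Deep Archive objects must be restored before they can be read. A hypothetical restore request using the Bulk tier (the cheapest option, completing within about 48 hours, comfortably inside the one-week requirement) could look like this; the bucket and key are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Ask S3 to restore an archived batch; the restored copy stays readable
# for 7 days once the Bulk retrieval job completes.
s3.restore_object(
    Bucket="example-image-archive",
    Key="batches/2025-01-15/batch-0001.zip",
    RestoreRequest={
        "Days": 7,
        "GlacierJobParameters": {"Tier": "Bulk"},
    },
)
```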
