
Answer-first summary for fast verification
Answer: Deploy an AWS DataSync agent on a new general-purpose Amazon EC2 instance. Configure the DataSync agent to transfer the batch files from the on-premises NFS server to Amazon S3 Standard. After the transfer, delete the data from the on-premises storage. Implement an S3 Lifecycle rule to transition objects from S3 Standard to S3 Glacier Deep Archive after one day.
Option C is the most cost-effective solution that meets the company's requirements. Deploying an AWS DataSync agent on a new general-purpose Amazon EC2 instance enables efficient transfer of the batch files from the on-premises NFS server to Amazon S3 Standard; because the company lacks sufficient on-premises storage capacity, landing the data in S3 Standard frees local space immediately. An S3 Lifecycle rule that transitions objects to S3 Glacier Deep Archive after one day then minimizes long-term storage cost, since Deep Archive is the cheapest S3 storage class and its retrieval times (within 12 hours for standard retrieval, 48 hours for bulk) comfortably satisfy the requirement to retrieve archived data within one week. Finally, DataSync lets the company cap bandwidth and schedule transfers outside business hours, matching the stated operational constraints; a sketch of this configuration follows below.
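For reference, here is a minimal boto3 sketch of option C's setup: a DataSync task with a bandwidth cap and a nightly schedule, plus the one-day lifecycle transition to Glacier Deep Archive. It assumes the DataSync agent is already deployed and activated; every hostname, ARN, bucket name, and rate shown is a hypothetical placeholder, not a value from the question.

```python
import boto3

datasync = boto3.client("datasync")
s3 = boto3.client("s3")

# Source: the on-premises NFS server, reached through the (already activated)
# DataSync agent. Hostname, path, and agent ARN are placeholders.
nfs_location = datasync.create_location_nfs(
    ServerHostname="nfs.example.internal",
    Subdirectory="/archives",
    OnPremConfig={
        "AgentArns": ["arn:aws:datasync:us-east-1:123456789012:agent/agent-0example"]
    },
)

# Destination: S3 Standard; the lifecycle rule below handles the move to
# Deep Archive. Bucket and role ARNs are placeholders.
s3_location = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-image-archive",
    S3StorageClass="STANDARD",
    S3Config={"BucketAccessRoleArn": "arn:aws:iam::123456789012:role/DataSyncS3Role"},
)

# Task: cap throughput (BytesPerSecond; ~500 Mbps here) and run nightly at
# 22:00 UTC so transfers stay outside business hours.
datasync.create_task(
    SourceLocationArn=nfs_location["LocationArn"],
    DestinationLocationArn=s3_location["LocationArn"],
    Name="nightly-image-archive",
    Options={"BytesPerSecond": 500 * 1024 * 1024 // 8},
    Schedule={"ScheduleExpression": "cron(0 22 * * ? *)"},
)

# Lifecycle rule: transition every object to Glacier Deep Archive one day
# after it lands in S3 Standard.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-image-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "to-deep-archive-after-1-day",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Transitions": [{"Days": 1, "StorageClass": "DEEP_ARCHIVE"}],
            }
        ]
    },
)
```

DataSync verifies transferred data by default, which is worth confirming in the task execution results before deleting the on-premises copies.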
Author: LeetQuiz Editorial Team
A company operates an application that processes and stores image data on-premises. This application handles millions of new image files daily, each averaging 1 MB in size. The application processes these files in 1 GB batches, zipping them together before archiving them as a single file on an on-premises NFS server for long-term storage. The company has a Microsoft Hyper-V environment with available compute resources but lacks sufficient storage capacity. They aim to archive these images on AWS and require the capability to retrieve archived data within one week of a request. The company maintains a 10 Gbps AWS Direct Connect link between their on-premises data center and AWS, and they need to manage bandwidth limits and schedule data transfers to AWS during off-peak business hours. What is the most cost-effective solution to meet these requirements?
A
Deploy an AWS DataSync agent on a new GPU-based Amazon EC2 instance. Configure the DataSync agent to transfer the batch files from the on-premises NFS server to Amazon S3 Glacier Instant Retrieval. Subsequently, delete the data from the on-premises storage.
B
Deploy an AWS DataSync agent as a Hyper-V VM on premises. Configure the DataSync agent to transfer the batch files from the on-premises NFS server to Amazon S3 Glacier Deep Archive. After the transfer, delete the data from the on-premises storage.
C
Deploy an AWS DataSync agent on a new general-purpose Amazon EC2 instance. Configure the DataSync agent to transfer the batch files from the on-premises NFS server to Amazon S3 Standard. After the transfer, delete the data from the on-premises storage. Implement an S3 Lifecycle rule to transition objects from S3 Standard to S3 Glacier Deep Archive after one day.
D
Deploy an AWS Storage Gateway Tape Gateway on premises in the Hyper-V environment. Connect the Tape Gateway to AWS and use automatic tape creation, specifying an Amazon S3 Glacier Deep Archive pool. Eject the tape once the batch of images is copied.