
You have 2 petabytes of historical data stored on-premises that must be migrated to Google Cloud Storage within six months. Your outbound network capacity is limited to 20 Mb/sec. What is the most efficient method to achieve this migration?
A
Utilize gsutil cp with compression to reduce the data size before uploading to Cloud Storage
B
Generate a private URL for the data and employ Storage Transfer Service for the migration
C
Leverage Transfer Appliance to physically ship the data to Google for ingestion into Cloud Storage
D
Apply trickle or ionice alongside gsutil cp to restrict bandwidth usage below 20 Mb/sec, avoiding impact on production traffic
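The feasibility of a network transfer can be checked with back-of-the-envelope arithmetic. The sketch below (assuming "20 Mb/sec" means 20 megabits per second and 2 PB means 2 × 10^15 bytes) estimates how long the transfer would take over the wire:

```python
# Rough transfer-time estimate for 2 PB at 20 Mb/sec.
# Assumption: Mb = megabits, PB = 10^15 bytes (decimal units).
data_bits = 2 * 10**15 * 8       # 2 petabytes expressed in bits
bandwidth_bps = 20 * 10**6       # 20 megabits per second

seconds = data_bits / bandwidth_bps
days = seconds / 86400           # seconds per day
years = days / 365

print(f"~{days:.0f} days (~{years:.1f} years)")  # → ~9259 days (~25.4 years)
```

At roughly 25 years, a network transfer misses the six-month deadline by two orders of magnitude even before compression or bandwidth throttling is considered, which is why physically shipping the data via Transfer Appliance is the only viable option.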