
You have a locally hosted data analytics toolkit that runs nightly, processing data files entirely in memory for about 50 minutes; the files range from 1 GB to 20 GB in size. What is the most efficient and cost-effective way to migrate this toolkit to Google Cloud?
A
Package the executables in a container and use Cloud Scheduler to trigger a Cloud Run job for the container.
B
Bundle the binary files in a container, deploy it on Google Kubernetes Engine (GKE), and use the Kubernetes scheduler to start the application.
C
Move the code to Cloud Functions and use Cloud Scheduler to invoke the function.
D
Migrate the entire setup to a Compute Engine virtual machine (VM) and use an instance schedule to start and stop the VM as needed.
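For context on option D: Compute Engine instance schedules are resource policies that start and stop a VM on a cron-style timetable. A minimal sketch, assuming a VM named `analytics-vm` in `us-central1-a` and a nightly 01:00–02:00 UTC window (all names, regions, and times below are placeholders, not from the question):

```shell
# Create an instance schedule that starts the VM at 01:00 UTC
# and stops it at 02:00 UTC every day.
gcloud compute resource-policies create instance-schedule nightly-analytics \
    --region=us-central1 \
    --vm-start-schedule="0 1 * * *" \
    --vm-stop-schedule="0 2 * * *" \
    --timezone=UTC

# Attach the schedule to the existing VM.
gcloud compute instances add-resource-policies analytics-vm \
    --zone=us-central1-a \
    --resource-policies=nightly-analytics
```

With this in place, the VM only accrues compute charges during the scheduled window; a startup script (or a systemd unit on the VM) would launch the toolkit at boot.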