
Answer-first summary for fast verification
Answer: Use managed export, and store the data in a Cloud Storage bucket using Nearline or Coldline class; Use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export.
The correct answers are A and B. Option A uses managed export to store the data in a Cloud Storage bucket with the Nearline or Coldline storage class, a cost-effective choice for long-term data growth and archival. Option B uses managed export followed by an import into Cloud Datastore in a separate project, under a unique namespace reserved for that export; this produces snapshots that support point-in-time (PIT) recovery and lets you clone the data into a different Cloud Datastore environment, which matches the requirements. Options C, D, and E are less suitable because of their cost implications, limitations with entity filters, and inappropriate storage choices, as noted in the highly voted answers and the referenced documentation. (A bucket-setup sketch for option A appears after the option list below.)
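For concreteness, the sketch below shows the managed export/import flow with the Datastore Admin client. The project IDs and bucket name are placeholder assumptions, not part of the original question.

```python
# Minimal sketch of the managed export/import flow (requires the
# google-cloud-datastore package; project IDs and bucket are placeholders).
from google.cloud import datastore_admin_v1

admin_client = datastore_admin_v1.DatastoreAdminClient()

# Option A: export all entities to a Cloud Storage bucket. Using a bucket
# with the Nearline or Coldline storage class keeps archival costs low.
export_op = admin_client.export_entities(
    request={
        "project_id": "telemetry-source-project",       # placeholder
        "output_url_prefix": "gs://telemetry-archive",   # placeholder bucket
    }
)
export_result = export_op.result()  # wait for the long-running operation

# Option B: import the snapshot into a separate project to clone the data
# or restore it for point-in-time recovery.
import_op = admin_client.import_entities(
    request={
        "project_id": "telemetry-clone-project",         # placeholder
        "input_url": export_result.output_url,           # *.overall_export_metadata
    }
)
import_op.result()
```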
Author: LeetQuiz Editorial Team
You have chosen Cloud Datastore to ingest vehicle telemetry data in real time. Your goal is to create a storage system that supports long-term data growth while minimizing costs. Additionally, you require the ability to periodically create snapshots of this data to enable point-in-time (PIT) recovery or to clone the data for use in a different Cloud Datastore environment. You aim to archive these snapshots for long-term storage. Which two methods can achieve these objectives? (Choose two.)
A
Use managed export, and store the data in a Cloud Storage bucket using Nearline or Coldline class.
B
Use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export.
C
Use managed export, and then import the data into a BigQuery table created just for that export, and delete temporary export files.
D
Write an application that uses Cloud Datastore client libraries to read all the entities. Treat each entity as a BigQuery table row via BigQuery streaming insert. Assign an export timestamp for each export, and attach it as an extra column for each row. Make sure that the BigQuery table is partitioned using the export timestamp column.
E
Write an application that uses Cloud Datastore client libraries to read all the entities. Format the exported data into a JSON file. Apply compression before storing the data in Cloud Source Repositories.
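Option A assumes an archival bucket already exists. A minimal sketch of creating one with the Nearline storage class is shown below; the project, bucket name, and location are placeholder assumptions.

```python
# Sketch: create a Nearline-class bucket to hold Datastore export archives
# (requires the google-cloud-storage package; names are placeholders).
from google.cloud import storage

client = storage.Client(project="telemetry-source-project")   # placeholder
bucket = storage.Bucket(client, name="telemetry-archive")     # placeholder
bucket.storage_class = "NEARLINE"   # or "COLDLINE" for colder, cheaper archives
client.create_bucket(bucket, location="us-central1")          # placeholder region
```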