
Answer-first summary for fast verification
Answer: (B) Use managed export, and store the data in a Cloud Storage bucket using the Nearline or Coldline class; and (C) use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export.
**Option B** (Use managed export, and store the data in a Cloud Storage bucket using the Nearline or Coldline class) is correct because managed export writes Cloud Datastore snapshots to a Cloud Storage bucket, and the Nearline or Coldline storage classes keep long-term archival cost-effective. **Option C** (Use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export) is also correct: importing each export into a separate project under its own namespace yields point-in-time copies that can be restored and cloned across environments.

The other options are incorrect:
- **Option A** stores the data in Cloud Source Repositories, which is a Git hosting service, not a long-term data archival system.
- **Option D** and **Option E** add unnecessary complexity and misuse BigQuery for snapshot archival, which does not align with the requirements for long-term data retention and point-in-time recovery.
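The two correct strategies can be sketched with the gcloud CLI. This is a minimal sketch: the bucket, project, timestamp folder, and metadata file names below are placeholders, not values from the question.

```shell
# Option B: create a Nearline bucket and run a managed export into it.
# Bucket and project names are placeholders.
gsutil mb -c nearline -l us-central1 gs://example-datastore-archive

gcloud datastore export gs://example-datastore-archive/2024-06-01 \
    --project=prod-project --async

# Option C: import the same export into a separate project.
# The .overall_export_metadata path is illustrative; use the file
# produced by your export. Imports restore entities into the
# namespaces they were exported from, so export per-namespace if
# each snapshot should live under its own reserved namespace.
gcloud datastore import \
    gs://example-datastore-archive/2024-06-01/2024-06-01.overall_export_metadata \
    --project=archive-project
```

Both commands run as managed operations on the service side, so no application code is needed, which is what makes Options B and C cheaper and simpler than the client-library approaches in Options A and D.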
Author: LeetQuiz Editorial Team
You're planning to use Cloud Datastore for handling real-time vehicle telemetry data and need a storage solution that efficiently manages continuous data growth without high costs. Additionally, you want to create periodic snapshots for point-in-time recovery and data cloning across different environments, archiving these snapshots for a long duration. Which two strategies effectively meet these requirements for managing Cloud Datastore data snapshots? (Choose Two)
A
Write an application that uses Cloud Datastore client libraries to read all the entities. Format the exported data into a JSON file. Apply compression before storing the data in Cloud Source Repositories.
B
Use managed export, and store the data in a Cloud Storage bucket using Nearline or Coldline class.
C
Use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export.
D
Write an application that uses Cloud Datastore client libraries to read all the entities. Treat each entity as a BigQuery table row via BigQuery streaming insert. Assign an export timestamp for each export, and attach it as an extra column for each row. Make sure that the BigQuery table is partitioned using the export timestamp column.
E
Use managed export, and then import the data into a BigQuery table created just for that export, and delete temporary export files.