
Your teammate has asked you to review the code below. Its purpose is to efficiently add a large number of small rows to a BigQuery table.
BigQuery service = BigQueryOptions.newBuilder().build().getService();

public void writeToBigQuery(Collection<Map<String, String>> rows) {
    for (Map<String, String> row : rows) {
        InsertAllRequest insertRequest = InsertAllRequest.newBuilder(
                "datasetId", "tableId",
                InsertAllRequest.RowToInsert.of(row)).build();
        service.insertAll(insertRequest);
    }
}
Which improvement should you suggest your teammate make?
A. Include multiple rows with each request.
B. Perform the inserts in parallel by creating multiple threads.
C. Write each row to a Cloud Storage object, then load into BigQuery.
D. Write each row to a Cloud Storage object in parallel, then load into BigQuery.
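For reference, a minimal sketch of the batched approach described in option A, using the google-cloud-bigquery Java client. The class name, the placeholder "datasetId"/"tableId" values, and the error-logging style are illustrative assumptions; very large collections would still need to be split into chunks to stay within BigQuery's per-request streaming-insert limits.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.InsertAllRequest;
import com.google.cloud.bigquery.InsertAllResponse;
import java.util.Collection;
import java.util.Map;

public class BatchedBigQueryWriter {
    private final BigQuery service = BigQueryOptions.newBuilder().build().getService();

    public void writeToBigQuery(Collection<Map<String, String>> rows) {
        // Accumulate every row into a single request instead of issuing one request per row.
        InsertAllRequest.Builder builder = InsertAllRequest.newBuilder("datasetId", "tableId");
        for (Map<String, String> row : rows) {
            builder.addRow(InsertAllRequest.RowToInsert.of(row));
        }
        InsertAllResponse response = service.insertAll(builder.build());
        if (response.hasErrors()) {
            // Per-row insert errors are keyed by the row's index within the request.
            response.getInsertErrors().forEach((index, errors) ->
                    System.err.println("Row " + index + " failed: " + errors));
        }
    }
}

Batching rows this way reduces the number of streaming-insert API calls from one per row to one per batch, which is the inefficiency the original loop introduces.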