Multiple Choice
An external customer provides you with a daily dump of data from their database. The data flows into Google Cloud Storage (GCS) as comma-separated values (CSV) files. You want to analyze this data in Google BigQuery, but the data could have rows that are formatted incorrectly or corrupted. How should you build this pipeline?
A) Use federated data sources, and check data in the SQL query.
B) Enable BigQuery monitoring in Google Stackdriver and create an alert.
C) Import the data into BigQuery using the gcloud CLI and set max_bad_records to 0.
D) Run a Google Cloud Dataflow batch pipeline to import the data into BigQuery, and push errors to another dead-letter table for analysis.
Correct Answer: D
Verified
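
Why D: federated queries (A) still error out when they hit malformed CSV rows, Stackdriver monitoring (B) only alerts you after a load has already failed, and setting max_bad_records to 0 (C) makes the load job fail on the first bad row. A Dataflow batch pipeline can validate every row and route the failures to a separate dead-letter table for later analysis while the clean rows load normally. Below is a minimal sketch of that dead-letter pattern using the Apache Beam Python SDK; the bucket path, project, table names, and three-column schema are illustrative assumptions, not details from the question.

import csv

import apache_beam as beam
from apache_beam import pvalue


class ParseCsvRow(beam.DoFn):
    """Parses one CSV line into a dict; tags malformed rows as dead letters."""

    DEAD_LETTER = 'dead_letter'

    def process(self, line):
        try:
            fields = next(csv.reader([line]))
            if len(fields) != 3:  # assumed three-column layout
                raise ValueError('unexpected column count: %d' % len(fields))
            yield {'id': int(fields[0]), 'name': fields[1], 'amount': float(fields[2])}
        except Exception as err:
            # Route the raw line plus the error message to the dead-letter output.
            yield pvalue.TaggedOutput(self.DEAD_LETTER,
                                      {'raw_line': line, 'error': str(err)})


def run():
    with beam.Pipeline() as pipeline:
        results = (
            pipeline
            | 'ReadCsv' >> beam.io.ReadFromText('gs://example-bucket/daily/*.csv')
            | 'Parse' >> beam.ParDo(ParseCsvRow()).with_outputs(
                ParseCsvRow.DEAD_LETTER, main='valid'))

        # Well-formed rows land in the main analytics table.
        results.valid | 'WriteValid' >> beam.io.WriteToBigQuery(
            'example-project:example_dataset.daily_data',
            schema='id:INTEGER,name:STRING,amount:FLOAT',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)

        # Malformed rows land in a dead-letter table for analysis.
        results[ParseCsvRow.DEAD_LETTER] | 'WriteDeadLetter' >> beam.io.WriteToBigQuery(
            'example-project:example_dataset.daily_data_errors',
            schema='raw_line:STRING,error:STRING',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)


if __name__ == '__main__':
    run()

The key design choice is the tagged side output: one ParDo splits the collection into a 'valid' stream and a dead-letter stream, so a single pass over the daily dump both loads the good rows and preserves every bad row with its error for debugging.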