Multiple Choice
Your company is in the process of migrating its on-premises data warehousing solutions to BigQuery. The existing data warehouse uses trigger-based change data capture (CDC) to apply updates from multiple transactional database sources on a daily basis. With BigQuery, your company hopes to improve its handling of CDC so that changes to the source systems are available to query in BigQuery in near-real time using log-based CDC streams, while also optimizing for the performance of applying changes to the data warehouse. Which two steps should they take to ensure that changes are available in the BigQuery reporting table with minimal latency while reducing compute overhead? (Choose two.)
A) Perform a DML INSERT, UPDATE, or DELETE to replicate each individual CDC record in real time directly on the reporting table.
B) Insert each new CDC record and corresponding operation type to a staging table in real time.
C) Periodically DELETE outdated records from the reporting table.
D) Periodically use a DML MERGE to perform several DML INSERT, UPDATE, and DELETE operations at the same time on the reporting table.
E) Insert each new CDC record and corresponding operation type in real time to the reporting table, and use a materialized view to expose only the newest version of each unique record.
Correct Answer: B, D
Verified

Option B makes each change available with minimal latency by streaming every CDC record and its operation type into a staging table, and option D then applies the staged changes to the reporting table in a single periodic MERGE, batching the INSERT, UPDATE, and DELETE operations and so reducing compute overhead compared with issuing one DML statement per change (option A).
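The staging-plus-periodic-MERGE pattern from the correct options can be sketched with a minimal in-memory simulation. This is an illustrative sketch only: the staging list, reporting dict, and record shapes are hypothetical stand-ins for the real BigQuery tables, and the merge logic mirrors what a BigQuery MERGE statement would do, not an actual API call.

```python
# Minimal sketch of answers B + D, using plain Python objects as
# hypothetical stand-ins for BigQuery tables.

def apply_merge(reporting: dict, staging: list) -> dict:
    """Periodically fold staged CDC records into the reporting table.

    Keeps only the latest staged operation per key, then applies all
    changes in one pass -- the batching that a single MERGE provides,
    versus one DML statement per change.
    """
    latest = {}
    for op, key, value in staging:   # staged in arrival order
        latest[key] = (op, value)    # a later operation supersedes earlier ones
    for key, (op, value) in latest.items():
        if op == "DELETE":
            reporting.pop(key, None)
        else:                        # INSERT or UPDATE both upsert the row
            reporting[key] = value
    staging.clear()                  # staging is truncated after the merge
    return reporting

# Answer B: stream each CDC record and its operation type to staging
# in real time.
staging = [
    ("INSERT", 1, {"name": "alice"}),
    ("UPDATE", 1, {"name": "alicia"}),
    ("INSERT", 2, {"name": "bob"}),
    ("DELETE", 2, None),
]
reporting = {}

# Answer D: periodically merge the staged changes into the reporting table.
apply_merge(reporting, staging)
# reporting now holds only key 1, with its most recent value
```

Because the merge deduplicates to the newest operation per key before touching the reporting table, deleted-then-never-reinserted rows (key 2 above) never reach it, and superseded updates are applied only once.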