Multiple Choice
Your company is setting up data pipelines for a campaign. For all the Google Cloud Pub/Sub streaming data, one important business requirement is the ability to periodically identify the inputs and their timings during the campaign. Engineers have decided to use windowing and transformation in Google Cloud Dataflow for this purpose. However, when testing this feature, they find that the Cloud Dataflow job fails for all streaming inserts. What is the most likely cause of this problem?
A) They have not assigned the timestamp, which causes the job to fail
B) They have not set the triggers to accommodate the data coming in late, which causes the job to fail
C) They have not applied a global windowing function, which causes the job to fail when the pipeline is created
D) They have not applied a non-global windowing function, which causes the job to fail when the pipeline is created
Correct Answer: D

By default, every element in a pipeline is assigned to a single global window. When an aggregation such as GroupByKey is applied to an unbounded PCollection (for example, a Pub/Sub stream) that still uses the default global windowing and default trigger, Dataflow rejects the job when the pipeline is created. Applying a non-global windowing function (or a non-default trigger) resolves the error, so the most likely cause is that the engineers did not apply one.
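The fix is easiest to see in code. Below is a minimal sketch using the Apache Beam Python SDK; the project and topic names are hypothetical placeholders. It shows the windowing step that must precede a GroupByKey on an unbounded Pub/Sub source; without it, the default global window applies and pipeline construction fails.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

# Streaming mode is required for an unbounded Pub/Sub source.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical topic path; replace with a real project/topic.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/my-topic")
        | "ParseKV" >> beam.Map(lambda msg: (msg.decode("utf-8"), 1))
        # Without this step, elements stay in the default global window,
        # and the GroupByKey below makes Dataflow reject the job at
        # pipeline-construction time for an unbounded source (option D).
        | "FixedWindow" >> beam.WindowInto(FixedWindows(60))  # 60 s windows
        | "GroupByKey" >> beam.GroupByKey()
        | "CountPerKey" >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
    )
```

The fixed 60-second window also satisfies the stated requirement of periodically identifying inputs and their timings: each aggregation result is scoped to one window interval.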