Multiple Choice
Your company is migrating its 30-node Apache Hadoop cluster to the cloud. It wants to reuse the Hadoop jobs it has already created, minimize management of the cluster as much as possible, and persist data beyond the life of the cluster. What should you do?
A) Create a Google Cloud Dataflow job to process the data.
B) Create a Google Cloud Dataproc cluster that uses persistent disks for HDFS.
C) Create a Hadoop cluster on Google Compute Engine that uses persistent disks.
D) Create a Cloud Dataproc cluster that uses the Google Cloud Storage connector.
E) Create a Hadoop cluster on Google Compute Engine that uses Local SSD disks.
Correct Answer: D

Verified

Explanation: Cloud Dataproc runs existing Hadoop jobs as a managed service, and the Cloud Storage connector lets those jobs read and write data in Cloud Storage (gs:// paths) instead of HDFS, so the data outlives the cluster. HDFS on persistent disks (B) is deleted along with the cluster; self-managed Hadoop on Compute Engine (C, E) maximizes rather than minimizes management, and Local SSDs (E) are ephemeral; Cloud Dataflow (A) would require rewriting the Hadoop jobs.
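As a rough sketch of the recommended approach, the following gcloud commands create a Dataproc cluster, submit an existing Hadoop job against Cloud Storage, and then delete the cluster. The project, bucket, jar, and region names here are hypothetical placeholders; the key point is that gs:// paths work out of the box on Dataproc because the Cloud Storage connector is preinstalled.

```shell
# Create a managed Dataproc cluster sized like the on-premises one.
# (Cluster name, region, and worker count are assumptions.)
gcloud dataproc clusters create hadoop-migration \
    --region=us-central1 \
    --num-workers=30

# Submit an existing Hadoop MapReduce jar unchanged; input and output
# live in Cloud Storage rather than HDFS, so they survive the cluster.
gcloud dataproc jobs submit hadoop \
    --cluster=hadoop-migration \
    --region=us-central1 \
    --jar=gs://my-bucket/jars/wordcount.jar \
    -- gs://my-bucket/input/ gs://my-bucket/output/

# Deleting the cluster does not touch the data in gs://my-bucket.
gcloud dataproc clusters delete hadoop-migration --region=us-central1
```

Because the job reads from and writes to Cloud Storage, the cluster can be created on demand and deleted when idle, which is what makes option D both low-management and durable.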