Google Certified Professional - Cloud Architect (GCP) Professional-Cloud-Architect Question # 7 Topic 1 Discussion

Question #: 7
Topic #: 1

For this question, refer to the TerramEarth case study.

TerramEarth's 20 million vehicles are scattered around the world. Based on the vehicle's location, its telemetry data is stored in a Google Cloud Storage (GCS) regional bucket (US, Europe, or Asia). The CTO has asked you to run a report on the raw telemetry data to determine why vehicles are breaking down after 100K miles. You want to run this job on all the data. What is the most cost-effective way to run this job?


A. Move all the data into 1 zone, then launch a Cloud Dataproc cluster to run the job.

B. Move all the data into 1 region, then launch a Google Cloud Dataproc cluster to run the job.

C. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a multi-region bucket and use a Dataproc cluster to finish the job.

D. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a regional bucket and use a Cloud Dataproc cluster to finish the job.
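
Options B and D both involve consolidating the telemetry objects into a single regional bucket before running the Dataproc job. As a rough illustration of that consolidation step, here is a minimal sketch using the google-cloud-storage Python client; the project ID and bucket names are hypothetical placeholders, and at TerramEarth's scale the Storage Transfer Service or a parallel `gsutil -m cp` would normally be preferred over copying object by object.

```python
# Minimal sketch: copy raw telemetry objects from per-location regional buckets
# into one consolidated regional bucket. All names below are hypothetical.
from google.cloud import storage

SOURCE_BUCKETS = [
    "terramearth-telemetry-us",
    "terramearth-telemetry-eu",
    "terramearth-telemetry-asia",
]
DEST_BUCKET = "terramearth-telemetry-consolidated"

client = storage.Client(project="terramearth-analytics")  # hypothetical project
dest_bucket = client.bucket(DEST_BUCKET)

for source_name in SOURCE_BUCKETS:
    source_bucket = client.bucket(source_name)
    for blob in source_bucket.list_blobs(prefix="raw/"):
        # Prefix each copied object with its source bucket so keys from
        # different regions cannot collide in the destination bucket.
        source_bucket.copy_blob(blob, dest_bucket, new_name=f"{source_name}/{blob.name}")
```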



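Every option ends with launching a Cloud Dataproc cluster to run the report. For reference, below is a minimal sketch of creating such a cluster programmatically with the google-cloud-dataproc Python client; the project ID, region, cluster name, and machine shapes are illustrative assumptions, not values from the case study.

```python
# Minimal sketch: create a small Dataproc cluster in the region that holds the
# consolidated telemetry bucket. All identifiers below are hypothetical.
from google.cloud import dataproc_v1

PROJECT_ID = "terramearth-analytics"  # hypothetical project
REGION = "us-central1"                # region co-located with the data

cluster_client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": PROJECT_ID,
    "cluster_name": "telemetry-report",
    "config": {
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
        "worker_config": {"num_instances": 4, "machine_type_uri": "n1-standard-4"},
    },
}

# create_cluster returns a long-running operation; result() blocks until done.
operation = cluster_client.create_cluster(
    request={"project_id": PROJECT_ID, "region": REGION, "cluster": cluster}
)
print(f"Cluster created: {operation.result().cluster_name}")
```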