
Google Professional Data Engineer Exam (Professional-Data-Engineer): Topic 6, Question #57 Discussion

Question #: 57
Topic #: 6

Your company is migrating its 30-node Apache Hadoop cluster to the cloud. It wants to reuse the Hadoop jobs it has already created and to minimize management of the cluster as much as possible. It also wants to persist data beyond the life of the cluster. What should you do?


A. Create a Google Cloud Dataflow job to process the data.

B. Create a Google Cloud Dataproc cluster that uses persistent disks for HDFS.

C. Create a Hadoop cluster on Google Compute Engine that uses persistent disks.

D. Create a Cloud Dataproc cluster that uses the Google Cloud Storage connector.

E. Create a Hadoop cluster on Google Compute Engine that uses Local SSD disks.


Chosen Answer: D

Cloud Dataproc is a managed service that runs existing Hadoop jobs with minimal cluster administration, and its preinstalled Cloud Storage connector lets those jobs read and write gs:// paths directly, so the data persists after the cluster is deleted. Options B, C, and E tie the data to disks whose lifecycle is bound to the cluster, and option A (Dataflow) would require rewriting the Hadoop jobs rather than reusing them.
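To make option D concrete, here is a minimal sketch using the google-cloud-dataproc Python client: it creates a managed cluster, submits an existing Hadoop jar unchanged with gs:// URIs in place of hdfs:// paths, then deletes the cluster. The project ID, region, cluster name, bucket, and jar path are hypothetical placeholders, not values from the question.

```python
# Sketch of option D: managed Dataproc cluster + Cloud Storage connector.
# All project/bucket/jar names below are hypothetical placeholders.
from google.cloud import dataproc_v1

project_id = "my-project"          # hypothetical
region = "us-central1"             # hypothetical
cluster_name = "hadoop-migration"  # hypothetical

endpoint = {"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
cluster_client = dataproc_v1.ClusterControllerClient(client_options=endpoint)
job_client = dataproc_v1.JobControllerClient(client_options=endpoint)

# A managed cluster: Dataproc installs and operates Hadoop for you,
# which covers the "minimize management" requirement.
cluster = {
    "project_id": project_id,
    "cluster_name": cluster_name,
    "config": {
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
        "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-4"},
    },
}
cluster_client.create_cluster(
    request={"project_id": project_id, "region": region, "cluster": cluster}
).result()  # block until the cluster is running

# Submit the existing Hadoop jar unchanged; the preinstalled Cloud Storage
# connector resolves gs:// URIs, so the output outlives the cluster.
job = {
    "placement": {"cluster_name": cluster_name},
    "hadoop_job": {
        "main_jar_file_uri": "gs://my-bucket/jars/wordcount.jar",  # hypothetical
        "args": ["gs://my-bucket/input/", "gs://my-bucket/output/"],
    },
}
job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
).result()

# Deleting the cluster does not touch the job output in Cloud Storage,
# which covers the "persist data beyond the life of the cluster" requirement.
cluster_client.delete_cluster(
    request={"project_id": project_id, "region": region, "cluster_name": cluster_name}
).result()
```

Note the design point the question is probing: with the Cloud Storage connector, persistence is decoupled from the cluster entirely, whereas HDFS on persistent disks (option B) still ties the data's lifecycle to the cluster's disks.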