Google Professional Data Engineer Exam Professional-Data-Engineer Question # 46 Topic 5 Discussion
Professional-Data-Engineer Exam Topic 5 Question 46 Discussion:
Question #: 46
Topic #: 5

Your company’s on-premises Apache Hadoop servers are approaching end-of-life, and IT has decided to migrate the cluster to Google Cloud Dataproc. A like-for-like migration of the cluster would require 50 TB of Google Persistent Disk per node. The CIO is concerned about the cost of using that much block storage. You want to minimize the storage cost of the migration. What should you do?


A. Put the data into Google Cloud Storage.

B. Use preemptible virtual machines (VMs) for the Cloud Dataproc cluster.

C. Tune the Cloud Dataproc cluster so that there is just enough disk for all data.

D. Migrate some of the cold data into Google Cloud Storage, and keep only the hot data in Persistent Disk.
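The choices hinge on the cost gap between block storage (Persistent Disk attached to each node) and object storage (Cloud Storage, which Dataproc jobs can read directly through `gs://` paths, so the data need not sit on cluster disks at all). A back-of-envelope comparison sketches the reasoning; the per-GB prices and the 10-node cluster size below are illustrative assumptions only, not current Google Cloud pricing:

```python
# Rough monthly storage-cost comparison for the scenario above.
# Prices are ASSUMED for illustration -- check current Google Cloud
# pricing, which changes over time and varies by region and class.
PD_STANDARD_PER_GB_MONTH = 0.04   # assumed Persistent Disk (standard) $/GB-month
GCS_STANDARD_PER_GB_MONTH = 0.02  # assumed Cloud Storage (standard) $/GB-month

def monthly_cost(tb: float, price_per_gb_month: float) -> float:
    """Monthly storage cost in dollars for `tb` terabytes."""
    return tb * 1024 * price_per_gb_month

tb_per_node = 50   # from the question: 50 TB per node
nodes = 10         # hypothetical cluster size, for illustration only

pd_cost = monthly_cost(tb_per_node * nodes, PD_STANDARD_PER_GB_MONTH)
gcs_cost = monthly_cost(tb_per_node * nodes, GCS_STANDARD_PER_GB_MONTH)

print(f"Persistent Disk: ${pd_cost:,.0f}/month")
print(f"Cloud Storage:   ${gcs_cost:,.0f}/month")
```

Under these assumed rates, moving all 500 TB to Cloud Storage halves the monthly storage bill relative to keeping it on Persistent Disk, before even considering cheaper Nearline or Coldline classes for infrequently accessed data.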

