
Google Cloud Associate Data Practitioner (ADP Exam) Associate-Data-Practitioner Question # 26 Topic 3 Discussion


Associate-Data-Practitioner Exam Topic 3 Question 26 Discussion:
Question #: 26
Topic #: 3

You work for an online retail company. Your company collects customer purchase data in CSV files and pushes them to Cloud Storage every 10 minutes. The data needs to be transformed and loaded into BigQuery for analysis. The transformation involves cleaning the data, removing duplicates, and enriching it with product information from a separate table in BigQuery. You need to implement a low-overhead solution that initiates data processing as soon as the files are loaded into Cloud Storage. What should you do?


A.

Use Cloud Composer sensors to detect files being loaded into Cloud Storage. Create a Dataproc cluster, and use a Composer task to execute a job on the cluster to process and load the data into BigQuery.


B.

Schedule a directed acyclic graph (DAG) in Cloud Composer to run hourly to batch load the data from Cloud Storage to BigQuery, and process the data in BigQuery using SQL.


C.

Use Dataflow to implement a streaming pipeline using an OBJECT_FINALIZE notification from Pub/Sub to read the data from Cloud Storage, perform the transformations, and write the data to BigQuery.


D.

Create a Cloud Data Fusion job to process and load the data from Cloud Storage into BigQuery. Create an OBJECT_FINALIZE notification in Pub/Sub, and trigger a Cloud Run function to start the Cloud Data Fusion job as soon as new files are loaded.
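Options C and D both hinge on Cloud Storage OBJECT_FINALIZE notifications delivered through Pub/Sub, which fire as soon as an object upload completes. A minimal sketch of how a subscriber might inspect such a notification to find the newly uploaded CSV file (the message shape follows the documented Cloud Storage notification format, where the event type and object location arrive as Pub/Sub message attributes; the helper function and example values are hypothetical):

```python
import json


def extract_new_object(pubsub_message: dict):
    """Return (bucket, object name) for an OBJECT_FINALIZE event, else None.

    Cloud Storage notifications put the event type and object location in
    Pub/Sub message attributes; the message data is the object metadata as JSON.
    """
    attrs = pubsub_message.get("attributes", {})
    if attrs.get("eventType") != "OBJECT_FINALIZE":
        return None  # ignore deletes, metadata updates, archive events, etc.
    return attrs["bucketId"], attrs["objectId"]


# Example message shaped like a Cloud Storage -> Pub/Sub notification
message = {
    "attributes": {
        "eventType": "OBJECT_FINALIZE",
        "bucketId": "retail-purchases",
        "objectId": "purchases/2024-05-01T10-10-00.csv",
    },
    "data": json.dumps({"size": "2048"}),
}
print(extract_new_object(message))
# -> ('retail-purchases', 'purchases/2024-05-01T10-10-00.csv')
```

In a Dataflow streaming pipeline (option C), this filtering happens inside the pipeline after reading from the Pub/Sub subscription, so files are picked up within seconds of landing rather than on an hourly schedule.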


