Google Professional Data Engineer Exam Professional-Data-Engineer Question # 17 Topic 2 Discussion

Question #: 17
Topic #: 2

An aerospace company uses a proprietary data format to store its flight data. You need to connect this new data source to BigQuery and stream the data into BigQuery. You want to efficiently import the data into BigQuery while consuming as few resources as possible. What should you do?


A.

Use a standard Dataflow pipeline to store the raw data in BigQuery and then transform the format later when the data is used.


B.

Write a shell script that triggers a Cloud Function that performs periodic ETL batch jobs on the new data source.


C.

Use Apache Hive to write a Dataproc job that streams the data into BigQuery in CSV format.


D.

Use an Apache Beam custom connector to write a Dataflow pipeline that streams the data into BigQuery in Avro format.
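To make option D concrete, here is a minimal local sketch of the parsing step such a custom connector would perform. The binary record layout, field names, and `parse_records` helper are all hypothetical, invented for illustration; in an actual Dataflow pipeline this decoding logic would sit inside a custom Apache Beam source or DoFn, with the resulting row dicts handed to `beam.io.WriteToBigQuery`.

```python
import struct

# Hypothetical record layout for the proprietary flight-data format
# (invented for this sketch), big-endian:
#   8-byte unsigned int  event_ts_ms   (ms since epoch)
#   4-byte float         altitude_m
#   4-byte float         airspeed_mps
RECORD_STRUCT = struct.Struct(">Qff")

def parse_records(raw: bytes):
    """Decode fixed-width binary records into BigQuery-ready row dicts.

    In a real pipeline this function would be the body of a custom Beam
    DoFn; the dicts it emits map directly onto a BigQuery table schema.
    """
    rows = []
    for offset in range(0, len(raw), RECORD_STRUCT.size):
        ts, alt, spd = RECORD_STRUCT.unpack_from(raw, offset)
        rows.append({
            "event_ts_ms": ts,
            "altitude_m": round(alt, 2),
            "airspeed_mps": round(spd, 2),
        })
    return rows

# Two sample records packed in the hypothetical format
sample = (RECORD_STRUCT.pack(1700000000000, 10500.0, 240.5)
          + RECORD_STRUCT.pack(1700000001000, 10520.0, 241.0))
print(parse_records(sample))
```

The point of option D is that the format-specific decoding happens once, inside the pipeline, and the data lands in BigQuery in an efficient binary representation (Avro) rather than being stored raw and re-parsed on every query, or round-tripped through CSV.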

