
Databricks Certified Associate Developer for Apache Spark 3.5 – Python (Databricks-Certified-Associate-Developer-for-Apache-Spark-3.5)
Topic 1, Question 5 Discussion


A developer is building a Spark application to monitor task performance across a cluster.

One requirement is to track the maximum processing time for tasks on each worker node and consolidate this information on the driver for further analysis.

Which technique should the developer use?


A. Broadcast a variable to share the maximum time among workers.

B. Configure the Spark UI to automatically collect maximum times.

C. Use an RDD action like reduce() to compute the maximum time.

D. Use an accumulator to record the maximum time on the driver.
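
For context on option D: accumulators are Spark's built-in mechanism for aggregating values from tasks back to the driver. Spark's stock accumulators are additive, so tracking a maximum in PySpark requires a custom AccumulatorParam whose merge function takes max instead of sum. Below is a minimal sketch of that technique; the job, the timing logic, and names such as MaxAccumulatorParam are illustrative assumptions, not part of the exam question.

```python
import time

from pyspark import AccumulatorParam
from pyspark.sql import SparkSession


class MaxAccumulatorParam(AccumulatorParam):
    """Custom accumulator that keeps the maximum value seen across tasks."""

    def zero(self, initial_value):
        # Per-task starting value; tasks that add nothing contribute this.
        return initial_value

    def addInPlace(self, v1, v2):
        # Merge two partial results by keeping the larger one.
        return max(v1, v2)


spark = SparkSession.builder.appName("max-task-time-demo").getOrCreate()
sc = spark.sparkContext

# Driver-side accumulator; workers can only add to it, the driver reads it.
max_task_time = sc.accumulator(0.0, MaxAccumulatorParam())


def process(record):
    start = time.perf_counter()
    _ = sum(i * i for i in range(1000))  # stand-in for real per-record work
    max_task_time.add(time.perf_counter() - start)
    return record


# An action must run for task-side updates to reach the driver.
sc.parallelize(range(10_000), numSlices=8).map(process).count()

print(f"Max observed task-side processing time: {max_task_time.value:.6f}s")
```

Two caveats worth knowing for this pattern: task-side updates only become visible after an action runs, and the .value property is readable on the driver only. Also, updates made inside transformations may be applied more than once if a task is retried; that is harmless for a max but would skew an additive accumulator.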


