
Databricks Certified Associate Developer for Apache Spark 3.5 – Python Databricks-Certified-Associate-Developer-for-Apache-Spark-3.5 Question # 21 Topic 3 Discussion

Question #: 21
Topic #: 3

A Spark application suffers from too many small tasks due to excessive partitioning. How can this be fixed without a full shuffle?

Options:

A. Use the distinct() transformation to combine similar partitions
B. Use the coalesce() transformation with a lower number of partitions
C. Use the sortBy() transformation to reorganize the data
D. Use the repartition() transformation with a lower number of partitions



Contribute your Thoughts:


Chosen Answer:
This is a voting comment (?). It is better to Upvote an existing comment if you don't have anything to add.