Databricks Certified Associate Developer for Apache Spark 3.5 – Python | Question #32, Topic 4 Discussion

Question #: 32
Topic #: 4

Which Spark configuration controls the number of tasks that can run in parallel on the executor?

Options:

A. spark.executor.cores
B. spark.task.maxFailures
C. spark.driver.cores
D. spark.executor.memory



Chosen Answer: A (spark.executor.cores)

spark.executor.cores sets the number of CPU cores available to each executor. Since every task occupies spark.task.cpus cores (1 by default), an executor can run spark.executor.cores / spark.task.cpus tasks in parallel. The other options do not affect task parallelism: spark.task.maxFailures controls how many times a failed task is retried before the job fails, spark.driver.cores allocates cores to the driver process, and spark.executor.memory sets the executor heap size.
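As an illustration only (not part of the original question), below is a minimal PySpark sketch showing where spark.executor.cores is typically set. The application name and core counts are arbitrary example values; the setting only has a practical effect when the application runs against a cluster manager that launches executors.

```python
from pyspark.sql import SparkSession

# Each executor gets 4 cores; with the default spark.task.cpus = 1,
# up to 4 tasks can run concurrently on every executor.
spark = (
    SparkSession.builder
    .appName("executor-cores-demo")        # arbitrary example name
    .config("spark.executor.cores", "4")   # cores (task slots) per executor
    .config("spark.task.cpus", "1")        # cores required by each task (default)
    .getOrCreate()
)

# Confirm the value that was set on this session.
print(spark.conf.get("spark.executor.cores"))

# The same setting can also be passed at launch time, for example:
#   spark-submit --conf spark.executor.cores=4 my_app.py
```

Note that spark.executor.cores is an executor-level property: it must be supplied when the application is submitted or the session is created, and it cannot be changed for a running application.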