The Spark configuration spark.executor.cores defines how many concurrent tasks can be executed within a single executor process.
Each executor is assigned a fixed number of CPU cores.
Each core runs one task at a time (with the default spark.task.cpus=1).
Therefore, increasing spark.executor.cores lets a single executor run more tasks concurrently.
Example:
--conf spark.executor.cores=4
→ Each executor can run 4 parallel tasks.
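As a concrete sketch, here is a minimal PySpark session configuration (the application name and resource values are illustrative assumptions, not values from the question):

    from pyspark.sql import SparkSession

    # Illustrative values only: 3 executors with 4 task slots each.
    spark = (
        SparkSession.builder
        .appName("executor-cores-demo")           # hypothetical app name
        .config("spark.executor.instances", "3")  # number of executors (example value)
        .config("spark.executor.cores", "4")      # 4 concurrent tasks per executor
        .config("spark.executor.memory", "4g")    # memory per executor; unrelated to slot count
        .getOrCreate()
    )

    # Upper bound on simultaneously running tasks:
    # 3 executors x 4 cores per executor = 12 tasks in parallel
    # (assuming the default spark.task.cpus = 1).

Note that executor-level settings such as spark.executor.cores generally must be fixed before the application launches its executors (e.g., via spark-submit --conf), so setting them in the builder only takes effect if the session has not been created yet.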
Why the other options are incorrect:
B (spark.task.maxFailures): Sets how many times a failed task is retried before the job is aborted; it controls fault tolerance, not concurrency.
C (spark.executor.memory): Sets executor memory, not concurrency.
D (spark.sql.shuffle.partitions): Defines number of shuffle partitions, not executor concurrency.
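For contrast, a hedged one-liner showing what option D actually controls (200 is simply Spark's default, used here as an example value):

    spark.conf.set("spark.sql.shuffle.partitions", "200")  # how many tasks a shuffle stage is split into
    # Those 200 tasks are still executed at most spark.executor.cores at a time on each executor.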
References:
Spark Configuration Guide — Executor cores, tasks, and parallelism.
Databricks Exam Guide (June 2025): Section “Apache Spark Architecture and Components” — executor configuration, CPU cores, and parallel task execution.