Comprehensive and Detailed Explanation (from the exact extract):
The Spark UI is the standard and most effective way to inspect executor logs, task time, input size, and shuffles.
From the Databricks documentation:
“You can monitor job execution via the Spark Web UI. It includes detailed logs and metrics, including task-level execution time, shuffle reads/writes, and executor memory usage.”
(Source: Databricks Spark Monitoring Guide)
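The metrics quoted above only appear once a job has actually executed something. The following is a minimal PySpark sketch (the app name "ui-demo" is an arbitrary choice) that locates the live Spark UI and triggers a shuffle so the Stages tab has task time, input size, and shuffle read/write figures to show:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Build (or reuse) a SparkSession; on Databricks a session named `spark`
# already exists, so getOrCreate() simply returns it.
spark = SparkSession.builder.appName("ui-demo").getOrCreate()

# uiWebUrl reports the address of the live Spark UI served by the driver
# (it is None if the UI has been disabled via spark.ui.enabled=false).
print(spark.sparkContext.uiWebUrl)

# Trigger a shuffle so the Stages tab has metrics to display:
# task-level execution time, input size, and shuffle reads/writes.
df = spark.range(1_000_000).withColumn("bucket", F.col("id") % 10)
df.groupBy("bucket").count().collect()
```

After the action completes, the corresponding stage in the UI lists per-task duration, input size, and shuffle read/write, and the Executors tab links to each executor's stdout/stderr logs.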
Option A is incorrect: logs are not guaranteed to be in /tmp, especially in cloud environments.
Option B is incorrect: --verbose helps during job submission but does not give detailed executor logs.
Option D is incorrect: spark-sql is a CLI tool for running queries, not for inspecting logs.
Hence, the correct method is to use the Spark UI → Stages tab → Executor logs.
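If the cluster UI is no longer reachable after the application finishes, the same stage and executor metrics can be rebuilt by the Spark History Server, provided event logging was enabled when the session started. A hedged sketch follows; the event log directory is a placeholder, and on Databricks these settings are normally supplied as cluster Spark configs rather than in code:

```python
from pyspark.sql import SparkSession

# Persist event logs so the History Server can reconstruct the same UI
# (Jobs, Stages, Executors tabs) after the application ends.
# The directory below is a placeholder; point it at a real shared
# location (DBFS, HDFS, S3, ...) in your environment.
spark = (
    SparkSession.builder
    .appName("ui-demo-history")
    .config("spark.eventLog.enabled", "true")
    .config("spark.eventLog.dir", "/path/to/spark-events")  # placeholder
    .getOrCreate()
)
```

These configs take effect only when the SparkContext is created, so they must be set before the first session starts (or at cluster level), not changed mid-application.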