
Answer-first summary for fast verification
Answer: Task
The Spark execution hierarchy consists of Jobs, Stages, and Tasks. A Job is created whenever an action is called (e.g., collect, save). Each Job is divided into Stages at wide-transformation (shuffle) boundaries. Each Stage is further divided into Tasks, the smallest units of execution, with one Task per partition of data, executed on executors. Executors (B) are JVM processes running on cluster Nodes (C), and Slots (E) are the parallel task-execution capacity within an executor; a slot runs one task at a time. The most granular level in this hierarchy is therefore the Task (A).
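To make the hierarchy concrete, here is a minimal plain-Python sketch (not Spark's actual API; the Job/Stage/Task classes are hypothetical) that models how an action produces a job, shuffles split the job into stages, and each stage fans out into one task per partition:

```python
class Task:
    """Smallest unit of execution: processes exactly one partition."""
    def __init__(self, partition_id):
        self.partition_id = partition_id

class Stage:
    """A set of tasks, one per partition, bounded by shuffle boundaries."""
    def __init__(self, num_partitions):
        self.tasks = [Task(p) for p in range(num_partitions)]

class Job:
    """Triggered by an action; split into stages at wide transformations.
    A job with N wide transformations has N + 1 stages."""
    def __init__(self, num_wide_transformations, num_partitions):
        self.stages = [Stage(num_partitions)
                       for _ in range(num_wide_transformations + 1)]

# Example: df.groupBy(...).count().collect() on 4 partitions.
# One shuffle (the groupBy) -> 2 stages, each with 4 tasks.
job = Job(num_wide_transformations=1, num_partitions=4)
print(len(job.stages))           # 2 stages
print(len(job.stages[0].tasks))  # 4 tasks, one per partition
```

This mirrors what the Spark UI shows: the job count tracks actions, the stage count tracks shuffles, and the task count tracks partitions.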
Author: LeetQuiz Editorial Team