
Answer-first summary for fast verification
Answer: Spark configuration properties set for an interactive cluster with the Clusters UI will impact all notebooks attached to that cluster.
Option D is correct because Spark configuration properties set at the cluster level (via the Clusters UI) apply to every notebook attached to that cluster; they act as defaults unless overridden at the notebook or job level. Option B is incorrect because session-scoped configurations set within a notebook (e.g., via `spark.conf.set`) affect only that notebook's SparkSession, not other sessions on the cluster. Option A is false because changes to an interactive cluster's Spark configuration take effect only after the cluster is restarted, which interrupts any jobs currently running on it. Option C is incorrect because precedence runs the other way: a notebook-level setting overrides the cluster-level value for that session.
Author: LeetQuiz Editorial Team
Which statement about Spark configuration in Databricks is correct?
A
The Databricks REST API can be used to modify the Spark configuration properties for an interactive cluster without interrupting jobs currently running on the cluster.
B
Spark configurations set within a notebook will affect all SparkSessions attached to the same interactive cluster.
C
When the same Spark configuration property is set for an interactive cluster and a notebook attached to that cluster, the notebook setting will always be ignored.
D
Spark configuration properties set for an interactive cluster with the Clusters UI will impact all notebooks attached to that cluster.