
Answer-first summary for fast verification
Answer: Three new jobs named "Ingest new data" will be defined in the workspace, but no jobs will be executed.
The 2.0/jobs/create endpoint in Databricks creates a new job definition on every call; it does not deduplicate by name. Posting the same JSON configuration three times therefore produces three separate job definitions in the workspace, all named "Ingest new data". Creating a job does not trigger a run: the jobs execute only when triggered manually, via a schedule, or through the 2.0/jobs/run-now endpoint. The configuration specifies an existing_cluster_id, so if a job is later triggered, it will run on that existing all-purpose cluster rather than on a new job cluster. The correct outcome is that three new jobs are defined in the workspace, but none are executed by the create operations themselves.
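The create-versus-run distinction can be illustrated with a minimal sketch. The in-memory "workspace" and the `jobs_create` helper below are stand-ins invented for illustration, not the Databricks SDK; they mimic only the relevant semantics of POST 2.0/jobs/create (each call registers a new definition and returns a fresh job_id, and nothing runs):

```python
# Hypothetical in-memory stand-in for a Databricks workspace, for illustration only.
workspace_jobs = {}
_next_id = [0]

def jobs_create(payload: dict) -> dict:
    """Mimic POST 2.0/jobs/create: store a new job definition, return its job_id.

    No run is started; creation and execution are separate operations.
    """
    _next_id[0] += 1
    workspace_jobs[_next_id[0]] = dict(payload)
    return {"job_id": _next_id[0]}

job_spec = {
    "name": "Ingest new data",
    "existing_cluster_id": "6015-954420-peace720",
    "notebook_task": {"notebook_path": "/Prod/ingest.py"},
}

# Submitting the same JSON three times yields three distinct job definitions.
ids = [jobs_create(job_spec)["job_id"] for _ in range(3)]
print(ids)                  # three distinct job IDs
print(len(workspace_jobs))  # three definitions exist; none have been executed
```

In the real API, each response would carry a distinct `job_id`, and a separate call (for example to 2.0/jobs/run-now with one of those IDs) would be required to start a run.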
Author: LeetQuiz Editorial Team
A junior data engineer has configured a workload that submits the following JSON to the Databricks REST API endpoint 2.0/jobs/create:
{
  "name": "Ingest new data",
  "existing_cluster_id": "6015-954420-peace720",
  "notebook_task": {
    "notebook_path": "/Prod/ingest.py"
  }
}
Assuming all configurations and referenced resources are available, what is the outcome of running this workload three times?
A
The logic defined in the referenced notebook will be executed three times on the referenced existing all-purpose cluster.
B
The logic defined in the referenced notebook will be executed three times on new clusters with the configurations of the provided cluster ID.
C
Three new jobs named "Ingest new data" will be defined in the workspace, but no jobs will be executed.
D
One new job named "Ingest new data" will be defined in the workspace, but it will not be executed.