
Answer-first summary for fast verification
Answer: `fs`
The correct command group is **fs**. The `databricks fs cp` subcommand copies files or directories between the local filesystem and DBFS (or Unity Catalog volumes). To make a Python Wheel available for a job, you would run a command such as `databricks fs cp ./my_library.whl dbfs:/FileStore/jars/`.

**Analysis of other commands:**

* **configure**: Only sets up CLI authentication profiles and connection settings.
* **workspace**: Imports, exports, and manages notebooks and workspace folders; it does not transfer raw files to DBFS.
* **libraries**: Triggers the installation of a library onto a cluster from an existing source (such as Maven or a file already in DBFS), but it cannot perform the initial upload of the file itself.
* **jobs**: Manages job definitions, schedules, and runs; it provides no file-transfer functionality.
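As a sketch, the upload-and-verify workflow might look like the following. The wheel filename and DBFS target path here are illustrative; substitute your own:

```shell
# Copy the wheel from the local filesystem to DBFS
# (local and remote paths are illustrative)
databricks fs cp ./my_library.whl dbfs:/FileStore/jars/my_library.whl

# List the target directory to confirm the upload landed
databricks fs ls dbfs:/FileStore/jars/
```

Once the file exists in DBFS, a job or cluster library configuration can reference it by its `dbfs:/` path.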
Author: LeetQuiz Editorial Team
A data engineer has developed a custom Python library packaged as a Wheel file. After properly configuring the Databricks CLI, which command group should be used to upload this file from a local filesystem to DBFS (Databricks File System) so it can be referenced in a production job?
A. configure
B. workspace
C. fs
D. libraries
E. jobs