
Answer-first summary for fast verification
Answer: Provide the explicit storage path using the `LOCATION` clause or the `.option("path", ...)` method alongside `USING DELTA` when creating the table.
To create an **external (unmanaged)** table in Databricks, you must explicitly provide a storage path (such as an S3, ADLS, or GCS URI) using the `LOCATION` keyword in SQL or the `.option("path", ...)` parameter in the DataFrame API. When a table is created with a specific location, Databricks registers the metadata in the metastore while the data files remain at the external path. This satisfies the architectural policy because dropping an external table removes only the metadata, leaving the actual data files intact.

**Why the other options are incorrect:**

* Using `LOCATION` at the database level only sets the default root for managed tables; it does not make them external.
* Mounting storage makes it easier to reference paths but does not change a table's management status by itself.
* The `UNMANAGED` keyword does not exist in Spark SQL; the distinction between managed and external is determined by whether a `LOCATION` is provided.
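As a minimal sketch of the compliant approach, the DDL below creates an external Delta table; the table name and the bucket path `s3://my-bucket/delta/sales_external` are hypothetical placeholders:

```sql
-- Hypothetical table name and storage URI; ADLS (abfss://) or GCS (gs://) paths work the same way.
-- The LOCATION clause is what makes the table external: a later DROP TABLE
-- removes only the metastore entry, leaving the data files at this path.
CREATE TABLE sales_external (
  order_id BIGINT,
  amount   DOUBLE
)
USING DELTA
LOCATION 's3://my-bucket/delta/sales_external';
```

The DataFrame API equivalent is `df.write.format("delta").option("path", "s3://my-bucket/delta/sales_external").saveAsTable("sales_external")`, which likewise registers the table in the metastore while keeping the data at the external path.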
Author: LeetQuiz Editorial Team
A data architect has implemented a policy requiring all tables in the Lakehouse to be configured as external (unmanaged) Delta Lake tables. Which approach should a data engineer use to ensure compliance with this requirement?
**A.** Define the `LOCATION` keyword at the database level to set a global default storage path for all tables.

**B.** Ensure that all external cloud object storage is mounted during workspace configuration.

**C.** Include the `UNMANAGED` keyword in the `CREATE TABLE` statement when defining the schema.

**D.** Provide the explicit storage path using the `LOCATION` clause or the `.option("path", ...)` method alongside `USING DELTA` when creating the table.