What is the correct method to inject Python variables table_name and database_name into a SQL query for execution using PySpark?
A. spark.sql(f"SELECT * FROM [database_name].[table_name]")
B. spark.sql(f"SELECT * FROM {database_name}.{table_name}")
C. spark.sql(f"SELECT * FROM (database_name).(table_name)")
D. spark.sql(f"SELECT * FROM {database_name}+{table_name}")
E. spark.sql("SELECT * FROM .")
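Option B uses standard Python f-string interpolation: {database_name} and {table_name} are substituted with the variable values before the string is ever passed to spark.sql, so Spark receives a plain, fully formed query. Square brackets (A) and parentheses (C) are treated as literal characters, D places a literal "+" inside the SQL text, and E contains no placeholders at all. Below is a minimal runnable sketch of the f-string approach; the SparkSession setup, the demo_table view, and the use of the reserved global_temp database are illustrative assumptions, not part of the original question.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fstring-sql-demo").getOrCreate()

# Register a small example table as a global temp view; global temp views
# live in the reserved "global_temp" database. (Names here are illustrative.)
spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"]) \
     .createOrReplaceGlobalTempView("demo_table")

database_name = "global_temp"   # assumed example values
table_name = "demo_table"

# f-string interpolation builds the SQL text before spark.sql() sees it,
# so the query that actually runs is "SELECT * FROM global_temp.demo_table".
spark.sql(f"SELECT * FROM {database_name}.{table_name}").show()
```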