
What is the correct method to inject Python variables table_name and database_name into a SQL query for execution using PySpark?
A
spark.sql(f"SELECT * FROM [database_name].[table_name]")
B
spark.sql(f"SELECT * FROM {database_name}.{table_name}")
C
spark.sql(f"SELECT * FROM (database_name).(table_name)")
D
spark.sql(f"SELECT * FROM {database_name}+{table_name}")
E
spark.sql("SELECT * FROM .")
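
Answer: B. A Python f-string replaces each {name} placeholder with the variable's current value, so option B builds a valid fully qualified table reference before the string ever reaches spark.sql. Options A and C pass the bracketed or parenthesized names through as literal text, option D substitutes the values but joins them with a literal + into invalid SQL, and option E contains no variables at all. A minimal runnable sketch follows; the session setup and the database/table names are illustrative assumptions, not part of the question:

from pyspark.sql import SparkSession

# Hypothetical local session for demonstration purposes.
spark = SparkSession.builder.appName("fstring_sql_example").getOrCreate()

database_name = "sales_db"   # hypothetical database name
table_name = "orders"        # hypothetical table name

# The f-string resolves the {placeholders} before the string is passed
# to Spark, yielding: SELECT * FROM sales_db.orders
df = spark.sql(f"SELECT * FROM {database_name}.{table_name}")
df.show()

Note that plain string interpolation does not sanitize its inputs; if the names come from untrusted input, validate them against an allow-list before building the query.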