
How can you configure a Delta Lake table ingesting records from multiple Kafka topics (schema: key BINARY, value BINARY, topic STRING, partition LONG, offset LONG, timestamp LONG) to support access controls and the deletion of PII data?
A. All data should be deleted biweekly; Delta Lake's time travel functionality should be leveraged to maintain a history of non-PII information.
B. Data should be partitioned by the registration field, allowing ACLs and delete statements to be set for the PII directory.
C. Data should be partitioned by the topic field, allowing ACLs and delete statements to leverage partition boundaries.
D. Separate object storage containers should be specified based on the partition field, allowing isolation at the storage level.
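To make the partitioning choices above concrete, here is a minimal sketch of what partitioning by the `topic` column looks like in Delta Lake SQL. The table name `kafka_bronze` and the topic value `user_pii_events` are hypothetical placeholders, not part of the question:

```sql
-- Hypothetical ingest target, partitioned by the Kafka topic column.
-- Each topic lands in its own partition directory, so ACLs can be
-- applied per directory and deletes can align with partition boundaries.
CREATE TABLE IF NOT EXISTS kafka_bronze (
  key BINARY,
  value BINARY,
  topic STRING,
  partition LONG,
  offset LONG,
  timestamp LONG
)
USING DELTA
PARTITIONED BY (topic);

-- A delete whose predicate matches the partition column removes
-- entire partition files for that topic rather than rewriting rows.
DELETE FROM kafka_bronze WHERE topic = 'user_pii_events';
```

Because the `DELETE` predicate is exactly the partition column, Delta can drop whole files under that partition's directory, which is the mechanism option C relies on.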