
You are working on a data pipeline that processes data from a retail company, including inventory records with product information. You have been tasked with ensuring the data quality of the inventory records dataset. Describe the steps you would take to run data quality checks on this dataset, and explain how you would define data quality rules to ensure the data is up-to-date and accurate.
A
Run data quality checks by manually inspecting each inventory record and identifying any outdated or inaccurate information.
B
Use AWS Glue to run data quality checks by writing custom scripts that identify outdated or inaccurate information in the inventory records.
C
Define data quality rules using AWS Glue DataBrew by creating a new project, selecting the inventory records dataset, and specifying rules to ensure the data is up-to-date and accurate.
D
Ignore data quality checks and assume the data is up-to-date and accurate.
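For context on option C, below is a minimal sketch (Python with boto3) of how the AWS Glue DataBrew workflow could be scripted: register the dataset, define a ruleset of data quality rules, then run a profile job that validates the data against those rules. All bucket names, keys, ARNs, column names, and the specific check expressions are illustrative assumptions, not values given in the question; the console workflow described in option C accomplishes the same thing interactively.

# Minimal sketch of the DataBrew data quality workflow from option C.
# All names below (bucket, keys, ARNs, column names) and the specific
# check expressions are illustrative assumptions, not values from the question.
import boto3

databrew = boto3.client("databrew")

# 1. Register the inventory records file stored in S3 as a DataBrew dataset.
databrew.create_dataset(
    Name="inventory-records",
    Format="CSV",
    Input={
        "S3InputDefinition": {
            "Bucket": "example-retail-bucket",          # placeholder bucket
            "Key": "inventory/inventory_records.csv",   # placeholder key
        }
    },
)

# 2. Define data quality rules (accuracy and freshness checks). The check
#    expressions use DataBrew's rule expression grammar; the two shown here
#    are assumed examples.
databrew.create_ruleset(
    Name="inventory-quality-rules",
    TargetArn="arn:aws:databrew:us-east-1:123456789012:dataset/inventory-records",  # placeholder
    Rules=[
        {
            "Name": "quantity-non-negative",
            "CheckExpression": ":col1 >= :val1",
            "SubstitutionMap": {":col1": "`quantity`", ":val1": "0"},
        },
        {
            "Name": "last-updated-not-missing",
            "CheckExpression": ":col1 is not null",
            "SubstitutionMap": {":col1": "`last_updated`"},
        },
    ],
)

# 3. Run the checks with a profile job that validates the dataset against the ruleset.
databrew.create_profile_job(
    Name="inventory-quality-check",
    DatasetName="inventory-records",
    RoleArn="arn:aws:iam::123456789012:role/DataBrewServiceRole",  # placeholder role
    OutputLocation={"Bucket": "example-retail-bucket", "Key": "dq-results/"},
    ValidationConfigurations=[
        {
            # Placeholder ruleset ARN; in practice it can be looked up with describe_ruleset.
            "RulesetArn": "arn:aws:databrew:us-east-1:123456789012:ruleset/inventory-quality-rules",
            "ValidationMode": "CHECK_ALL",
        }
    ],
)
databrew.start_job_run(Name="inventory-quality-check")

This mirrors the steps in option C (create a project or dataset, define rules, validate) using managed DataBrew features, rather than hand-writing custom check scripts as in option B or inspecting records manually as in option A.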