
You are working on a data pipeline for a financial services company. The pipeline processes transaction records containing details of customer transactions and account balances, and you have been tasked with ensuring the quality of this dataset. Describe the steps you would take to run data quality checks on the transaction records dataset, and explain how you would define data quality rules to identify and resolve inconsistencies in account balances.
A
Run data quality checks by manually inspecting each transaction record and identifying inconsistencies in account balances.
B
Use AWS Glue to run data quality checks by writing custom scripts that flag inconsistencies in account balances based on specific conditions (see the first sketch after the options).
C
Define data quality rules in AWS Glue DataBrew by creating a new project, selecting the transaction records dataset, and specifying rules that identify and resolve data inconsistencies related to account balances (see the second sketch after the options).
D
Ignore data quality checks and assume the account balances are consistent.
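
To make option B concrete, here is a minimal sketch of a custom check as it might appear in an AWS Glue PySpark job. The S3 paths and the column names (balance_before, amount, balance_after) are assumptions for illustration, not part of the question; a real job would use the dataset's actual schema.

```python
# Hypothetical custom data quality checks for transaction records (option B).
# Paths and column names are assumed for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-quality-checks").getOrCreate()

# Assumed input location; a Glue job would typically read via the Data Catalog.
txns = spark.read.parquet("s3://example-bucket/transactions/")

# Rule 1: closing balances must never be negative.
negative = txns.filter(F.col("balance_after") < 0)

# Rule 2: the closing balance must reconcile with the opening balance plus
# the transaction amount (assumes decimal columns, so exact comparison is
# safe; floating-point columns would need a tolerance instead).
mismatched = txns.filter(
    F.col("balance_before") + F.col("amount") != F.col("balance_after")
)

print(f"negative balances:  {negative.count()}")
print(f"balance mismatches: {mismatched.count()}")

# Quarantine failing rows for investigation instead of silently dropping them.
negative.union(mismatched).dropDuplicates().write.mode("overwrite").parquet(
    "s3://example-bucket/quarantine/transactions/"
)
```

Quarantining rather than deleting failing rows preserves an audit trail, which matters for financial data.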
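Option C can also be automated. DataBrew supports data quality rulesets that are evaluated by a profile job; the sketch below, assuming the boto3 databrew client, a hypothetical dataset ARN, and an account_balance column, shows how such rules might be defined programmatically rather than through the console.

```python
# Hypothetical DataBrew ruleset for the transaction records dataset (option C).
# The ARN, ruleset name, and column name are assumptions for illustration.
import boto3

databrew = boto3.client("databrew")

databrew.create_ruleset(
    Name="transaction-balance-rules",
    TargetArn=(
        "arn:aws:databrew:us-east-1:123456789012:dataset/transaction-records"
    ),
    Rules=[
        {
            # Account balances must be non-negative.
            "Name": "balance-non-negative",
            "CheckExpression": ":col1 >= :val1",
            "SubstitutionMap": {":col1": "`account_balance`", ":val1": "0"},
            "Disabled": False,
        },
        {
            # Account balances must fall within a plausible range.
            "Name": "balance-within-range",
            "CheckExpression": ":col1 between :val1 and :val2",
            "SubstitutionMap": {
                ":col1": "`account_balance`",
                ":val1": "0",
                ":val2": "10000000",
            },
            "Disabled": False,
        },
    ],
)
```

Once the ruleset exists, it is attached to a DataBrew profile job through the job's validation configuration; the resulting report shows which rules passed or failed, pointing you at the rows to clean up in a DataBrew project.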