Your application needs to process credit card transactions and must comply with Payment Card Industry Data Security Standard (PCI DSS) regulations. To minimize the scope of PCI compliance while still enabling the analysis of transactional data and trends related to payment methods, how should you design your architecture?
A. Create a tokenizer service and store only tokenized data
B. Create separate projects that only process credit card data
C. Create separate subnetworks and isolate the components that process credit card data
D. Streamline the audit discovery phase by labeling all of the virtual machines (VMs) that process PCI data
E. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with the auditor
Explanation:
The correct answer is A: Create a tokenizer service and store only tokenized data. Tokenization replaces sensitive credit card data with unique, randomly generated tokens that have no exploitable relationship to the original card numbers. This significantly reduces the scope of PCI DSS compliance, since only the tokenization service handles cardholder data and must comply, rather than the entire application. Minimizing the footprint of sensitive data also simplifies auditing and ongoing compliance. Because analyzing transaction volumes and payment-method trends does not require the actual card numbers, tokenized data fully supports the analytics requirement.
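To illustrate the idea, here is a minimal sketch of a tokenizer service in Python. The class name, token format, and in-memory vault are illustrative assumptions, not a production design; a real deployment would back the vault with a hardened, access-controlled store that alone sits inside the PCI DSS boundary.

```python
import secrets


class TokenizerService:
    """Illustrative tokenizer sketch (hypothetical; a real vault must be
    hardened and access-controlled, and is the only PCI-scoped component)."""

    def __init__(self) -> None:
        # token -> card number; this mapping is the only place card data lives
        self._vault: dict[str, str] = {}

    def tokenize(self, card_number: str) -> str:
        # A random token has no mathematical relationship to the card number,
        # so it cannot be reversed without access to the vault.
        token = secrets.token_hex(16)
        self._vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # Only the payment-processing path should ever call this.
        return self._vault[token]


# The rest of the application stores and analyzes tokens only, keeping it
# outside PCI scope while still allowing per-card trend analysis.
svc = TokenizerService()
token = svc.tokenize("4111111111111111")
print(token != "4111111111111111")        # the stored value is not the PAN
print(svc.detokenize(token))              # recoverable only via the service
```

Because each token is stable for the lifetime of the mapping, analytics systems can still count transactions per token or per payment method without ever touching real card numbers.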