
Consider a scenario where you need to implement an image moderation solution using Azure AI Content Safety for an e-commerce website that allows users to upload product images. How would you configure the service to detect inappropriate content, ensure high accuracy, and manage the moderation workflow? Describe the process in detail, including any necessary Azure services and their roles.
A
Use Azure AI Content Safety directly without additional services, set low sensitivity thresholds, and manually review all images.
B
Integrate Azure AI Content Safety with Azure Functions for automated processing, set high sensitivity thresholds, and use Azure Storage for image management.
C
Deploy Azure AI Content Safety standalone, ignore sensitivity settings, and automatically approve or reject images based on default settings.
D
Configure Azure AI Content Safety with low accuracy settings, use Azure Blob Storage for image hosting, and manually intervene for every decision.
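To make the workflow in option B concrete, here is a minimal sketch of an event-driven moderation function, assuming the Azure Functions Python v2 programming model and the azure-ai-contentsafety SDK. The "uploads" container name, the CONTENT_SAFETY_ENDPOINT / CONTENT_SAFETY_KEY app settings, and the severity threshold of 4 are illustrative assumptions, not details given in the question.

```python
# function_app.py -- hypothetical blob-triggered moderator (Azure Functions Python v2 model).
import logging
import os

import azure.functions as func
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData
from azure.core.credentials import AzureKeyCredential

app = func.FunctionApp()

# Severity at or above this value flags the image for rejection or manual review.
# Image severities come back on a trimmed 0/2/4/6 scale; 4 here is an assumed threshold.
SEVERITY_THRESHOLD = 4


def _content_safety_client() -> ContentSafetyClient:
    # Setting names are placeholders; in practice keep the key in app settings or Key Vault.
    return ContentSafetyClient(
        os.environ["CONTENT_SAFETY_ENDPOINT"],
        AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
    )


@app.blob_trigger(arg_name="image", path="uploads/{name}", connection="AzureWebJobsStorage")
def moderate_product_image(image: func.InputStream) -> None:
    """Runs automatically whenever a product image is uploaded to the 'uploads' container."""
    client = _content_safety_client()
    result = client.analyze_image(
        AnalyzeImageOptions(image=ImageData(content=image.read()))
    )

    # Each analysed category (Hate, SelfHarm, Sexual, Violence) carries a severity score.
    flagged = [
        c for c in result.categories_analysis
        if c.severity is not None and c.severity >= SEVERITY_THRESHOLD
    ]

    if flagged:
        categories = ", ".join(str(c.category) for c in flagged)
        logging.warning("Image %s flagged (%s); routing to manual review.", image.name, categories)
        # e.g. move the blob to a quarantine container or enqueue a review task here.
    else:
        logging.info("Image %s passed moderation; safe to publish to the catalogue.", image.name)
```

In this pattern, Azure Blob Storage holds the uploaded images, the blob trigger gives automated processing without a separate polling service, and only images that exceed the severity threshold are routed to human reviewers, keeping the manual workload small.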