
You are tasked with implementing an image moderation solution using Azure AI Content Safety to automatically detect and filter inappropriate images from a large dataset stored in a cloud storage service. Describe the steps you would take to process the dataset with the Azure AI Content Safety service, and how you would handle any images that are flagged as inappropriate.
A
Use the Azure AI Content Safety service's REST API to send individual image URLs from the cloud storage service to the service for moderation, and manually review and handle any images flagged as inappropriate.
B
Use the Azure AI Content Safety service's REST API to send a batch of image URLs from the cloud storage service to the service for moderation, and automatically block any images flagged as inappropriate without further review.
C
Use the Azure AI Content Safety service's REST API to send a batch of image URLs from the cloud storage service to the service for moderation, and set up a workflow to review and handle any images flagged as inappropriate based on predefined criteria, including moving them to a separate storage container.
D
Use the Azure AI Content Safety service's REST API to send a batch of image URLs from the cloud storage service to the service for moderation, and automatically delete any images flagged as inappropriate without any review or handling process.
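For reference, below is a minimal sketch of the batch-and-quarantine pattern described in option C, written against the Azure AI Content Safety image:analyze REST operation and Azure Blob Storage. The environment variable names, the "images" and "quarantine" container names, the severity threshold, and the api-version shown are assumptions and placeholders, not values given in the question; adjust them to your own resources and moderation policy.

# Sketch only: batch-moderate blobs and quarantine flagged images for review.
# Container names, env vars, threshold, and api-version are assumptions.
import base64
import os

import requests
from azure.storage.blob import BlobServiceClient

CS_ENDPOINT = os.environ["CONTENT_SAFETY_ENDPOINT"]      # e.g. https://<resource>.cognitiveservices.azure.com
CS_KEY = os.environ["CONTENT_SAFETY_KEY"]
STORAGE_CONN = os.environ["STORAGE_CONNECTION_STRING"]
SEVERITY_THRESHOLD = 4                                    # example cut-off; tune to your policy

blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN)
source = blob_service.get_container_client("images")        # assumed source container
quarantine = blob_service.get_container_client("quarantine") # assumed review container


def analyze_image(image_bytes: bytes) -> dict:
    """Send one image to the Content Safety image:analyze operation."""
    resp = requests.post(
        f"{CS_ENDPOINT}/contentsafety/image:analyze",
        params={"api-version": "2023-10-01"},
        headers={"Ocp-Apim-Subscription-Key": CS_KEY},
        json={"image": {"content": base64.b64encode(image_bytes).decode()}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def is_inappropriate(analysis: dict) -> bool:
    """Flag the image if any harm category meets the severity threshold."""
    return any(
        c.get("severity", 0) >= SEVERITY_THRESHOLD
        for c in analysis.get("categoriesAnalysis", [])
    )


for blob in source.list_blobs():
    data = source.download_blob(blob.name).readall()
    if is_inappropriate(analyze_image(data)):
        # Move flagged images to the quarantine container instead of deleting
        # them, so they can be reviewed against predefined criteria.
        quarantine.upload_blob(blob.name, data, overwrite=True)
        source.delete_blob(blob.name)

Flagged images end up in the quarantine container rather than being deleted outright, which corresponds to the "review and handle based on predefined criteria" step that distinguishes option C from options B and D.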