Learn how to use Content Moderator, an AI workload that can detect images containing adult content or depicting gory, violent scenes, to implement an AI solution for social media companies.
Question
You work for a social media company, and you need to implement an AI solution that can detect images that contain adult content or depict gory, violent scenes before they are posted and shared. Which AI workload should you use?
A. Object detection
B. Form Recognizer
C. Content Moderator
Answer
C. Content Moderator
Explanation
The correct answer is C. Content Moderator. Content Moderator is an AI workload that can detect images containing adult content or depicting gory, violent scenes before they are posted and shared. It can also filter text and video for inappropriate content such as profanity, personal data, or hate speech. Content Moderator helps social media companies comply with their policies and applicable regulations, and protects their users from harmful or offensive content.
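To make the workflow concrete, the sketch below shows the kind of policy check an application might apply to the JSON returned by Content Moderator's image Evaluate operation, which scores images for adult and racy content. The endpoint placeholder, the threshold values, and the `is_image_allowed` helper are illustrative assumptions, not part of the exam question; only the response field names follow the service's documented Evaluate response.

```python
import json

# Placeholder resource endpoint -- substitute your own Azure resource values.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
EVALUATE_URL = ENDPOINT + "/contentmoderator/moderate/v1.0/ProcessImage/Evaluate"


def moderation_headers(subscription_key: str,
                       content_type: str = "image/jpeg") -> dict:
    """Headers for posting an image to the Evaluate operation."""
    return {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": content_type,
    }


def is_image_allowed(evaluate_response: dict,
                     adult_threshold: float = 0.5,
                     racy_threshold: float = 0.5) -> bool:
    """Apply a simple policy to an Evaluate response (hypothetical helper).

    The response contains classification scores between 0 and 1 plus boolean
    flags set by the service; this sketch blocks an image when either flag is
    set or either score crosses the chosen threshold.
    """
    if evaluate_response.get("IsImageAdultClassified") or \
       evaluate_response.get("IsImageRacyClassified"):
        return False
    adult = evaluate_response.get("AdultClassificationScore", 0.0)
    racy = evaluate_response.get("RacyClassificationScore", 0.0)
    return adult < adult_threshold and racy < racy_threshold


# Abbreviated sample response in the shape documented for Evaluate.
sample = json.loads("""{
    "AdultClassificationScore": 0.02,
    "IsImageAdultClassified": false,
    "RacyClassificationScore": 0.13,
    "IsImageRacyClassified": false
}""")
print(is_image_allowed(sample))  # True: both scores are below the thresholds
```

In a real deployment the image bytes would be POSTed to `EVALUATE_URL` with the headers above, and the resulting JSON passed to the policy check before the post is published.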
This Microsoft Azure AI Fundamentals AI-900 certification exam practice question and answer (Q&A), with detailed explanation and references, is available free and can help you pass the AI-900 exam and earn the Microsoft Azure AI Fundamentals certification.