NudeDetect is a Python-based tool for detecting nudity and adult content in images. This project combines the capabilities of the NudeNet library, EasyOCR for text detection, and the Better Profanity library for identifying offensive language in text.
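A pipeline like this might combine the three checks roughly as follows. This is a minimal sketch of the decision logic only: in the real tool, NudeNet supplies the per-class nudity scores, EasyOCR supplies the extracted text, and Better Profanity supplies the word list. The thresholds and the tiny stand-in word list here are illustrative assumptions, not NudeDetect's actual configuration.

```python
OFFENSIVE_WORDS = {"damn", "hell"}  # stand-in for Better Profanity's word list
NUDITY_THRESHOLD = 0.6              # assumed confidence cutoff, not NudeNet's default

def moderate_image(nudity_scores: dict[str, float], ocr_text: str) -> dict:
    """Flag an image from NudeNet-style class scores and OCR'd text."""
    nudity_flagged = any(s >= NUDITY_THRESHOLD for s in nudity_scores.values())
    words = (w.strip(".,!?") for w in ocr_text.lower().split())
    profanity_flagged = any(w in OFFENSIVE_WORDS for w in words)
    return {
        "nudity": nudity_flagged,
        "profanity": profanity_flagged,
        "safe": not (nudity_flagged or profanity_flagged),
    }

print(moderate_image({"EXPOSED_BREAST_F": 0.82}, "what the hell"))
```

Keeping the detector, OCR, and profanity stages separate like this makes each check independently tunable and testable.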
Step-by-step tutorial that teaches you how to use Azure AI Content Safety, the prebuilt AI service that filters content sent to users to safeguard them from risky or undesirable outcomes.
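Azure AI Content Safety scores text against four harm categories (Hate, SelfHarm, Sexual, Violence), each with a severity level on a 0–7 scale. A caller then blocks content whose severity meets a per-category threshold. The helper below sketches that thresholding step over a response-shaped dict; the threshold values are illustrative assumptions, not service defaults.

```python
# Per-category severity thresholds (assumed for illustration).
THRESHOLDS = {"Hate": 2, "SelfHarm": 2, "Sexual": 2, "Violence": 4}

def should_block(analysis: dict[str, int]) -> bool:
    """Return True if any category's severity meets its threshold."""
    return any(analysis.get(cat, 0) >= level for cat, level in THRESHOLDS.items())

print(should_block({"Hate": 0, "Violence": 6}))  # blocked on Violence
print(should_block({"Hate": 1, "Sexual": 1}))    # everything below threshold
```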
Study Buddy is a user-friendly AI-powered web app that helps students generate safe, factual study notes and Q&A on any topic. It features user accounts, study history, and strong content safety filters—making learning interactive and secure.
Impact Analyzer is a web app that helps you detect toxicity and analyze nuance in your writing before publishing, ensuring your content is respectful, clear, and aligned with your intent.
A 3-tier diagnostic application designed for hands-on learning about securing AI systems across identity, network, application, and content safety domains.
This tutorial demonstrates how to use the Google Cloud Natural Language API for text moderation. It provides a step-by-step guide to detecting and managing harmful content while promoting responsible AI practices.
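The Natural Language API's text-moderation method returns a list of moderation categories (e.g. "Toxic", "Insult", "Profanity"), each with a confidence score in [0, 1]. A sketch of filtering such a response, assuming a response-shaped list of dicts; the 0.5 cutoff is an illustrative choice, not an API default.

```python
def flagged_categories(categories: list[dict], threshold: float = 0.5) -> list[str]:
    """Return names of moderation categories whose confidence meets the threshold."""
    return [c["name"] for c in categories if c["confidence"] >= threshold]

# Shaped like a moderate-text response, with made-up scores for illustration.
response = [
    {"name": "Toxic", "confidence": 0.71},
    {"name": "Insult", "confidence": 0.12},
]
print(flagged_categories(response))  # ['Toxic']
```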
Real-time speech-to-text system with toxic content detection and filtering. Transcribes live audio using multiple ASR options while automatically detecting and masking harmful language.
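The masking step in a system like this runs on each transcript segment as it arrives from the ASR. A minimal sketch, assuming a tiny hard-coded word list for illustration; a real system would use a maintained profanity list or a toxicity classifier.

```python
import re

TOXIC_WORDS = {"idiot", "stupid"}  # assumed word list for illustration

def mask_transcript(segment: str) -> str:
    """Replace each toxic word in a transcript segment with same-length asterisks."""
    def _mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in TOXIC_WORDS else word
    return re.sub(r"[A-Za-z']+", _mask, segment)

print(mask_transcript("Don't be stupid about it"))  # Don't be ****** about it
```

Masking word-by-word keeps the segment length stable, which matters when transcripts are displayed live and later corrected in place.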