A free, open-source, privacy-focused browser extension that blocks “not safe for work” content, built with TypeScript and TensorFlow.js.
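As an illustration of how such an extension might work, here is a minimal content-script sketch assuming an NSFWJS-style load()/classify() API; the class names and the 0.7 threshold are assumptions, not values from the extension itself.

```ts
// Content-script sketch: classify each image on the page and blur flagged ones.
// Assumes an NSFWJS-style classifier; the 0.7 threshold is illustrative.
import * as nsfwjs from "nsfwjs";

async function blurExplicitImages(): Promise<void> {
  const model = await nsfwjs.load();
  for (const img of Array.from(document.images)) {
    if (!img.complete || img.naturalWidth === 0) continue; // skip unloaded images
    try {
      const predictions = await model.classify(img);
      const explicit = predictions.some(
        (p) => (p.className === "Porn" || p.className === "Hentai") && p.probability > 0.7
      );
      if (explicit) img.style.filter = "blur(25px)";
    } catch {
      // Cross-origin images served without CORS headers cannot be read into a tensor.
    }
  }
}

blurExplicitImages();
```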
A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.
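Basic usage might look like the following sketch, assuming the classifier exposes NSFWJS-style load()/classify() calls; the element id is hypothetical.

```ts
import * as nsfwjs from "nsfwjs";

// Classify a single <img> element; "photo" is a hypothetical element id.
const img = document.getElementById("photo") as HTMLImageElement;
const model = await nsfwjs.load();
const predictions = await model.classify(img);
// Example result shape: [{ className: "Neutral", probability: 0.92 }, ...]
console.log(predictions);
```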
This Node.js backend server provides AI-driven content moderation using the NSFW library (powered by TensorFlow) and OpenAI’s free moderation endpoint. It automatically detects and flags inappropriate content in images and text.
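A hypothetical sketch of the two moderation paths described: image checks through an NSFWJS-style classifier and text checks through OpenAI’s /v1/moderations endpoint. The route paths, field names, and flagging threshold are illustrative assumptions, not the repository’s actual API.

```ts
// Hypothetical sketch: an Express server combining both moderation paths.
// Route paths, field names, and the 0.7 threshold are illustrative assumptions.
import express from "express";
import * as tf from "@tensorflow/tfjs-node";
import * as nsfwjs from "nsfwjs";

const app = express();
const modelPromise = nsfwjs.load(); // load the image model once, reuse per request

// Text moderation via OpenAI's moderation endpoint.
app.post("/moderate/text", express.json(), async (req, res) => {
  const resp = await fetch("https://api.openai.com/v1/moderations", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ input: req.body.text }),
  });
  const data = await resp.json();
  res.json({ flagged: data.results[0].flagged });
});

// Image moderation via an NSFWJS-style classifier.
app.post("/moderate/image", express.raw({ type: "image/*", limit: "10mb" }), async (req, res) => {
  const model = await modelPromise;
  const image = tf.node.decodeImage(req.body, 3) as tf.Tensor3D;
  const predictions = await model.classify(image);
  image.dispose(); // free the tensor's native memory
  const flagged = predictions.some(
    (p) => (p.className === "Porn" || p.className === "Hentai") && p.probability > 0.7
  );
  res.json({ flagged, predictions });
});

app.listen(3000);
```

Loading the model once at startup, rather than inside the request handler, avoids paying the TensorFlow initialization cost on every call.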
An AI content moderation tool that detects and flags NSFW images and text.
An image analysis platform that generates automated insights using computer vision and machine learning.