infinitered / Nsfwjs: NSFW detection on the client side via TensorFlow.js
alganzory / HaramBlur: A browser extension that lets you navigate the web in line with your Islamic values by automatically detecting and blurring "haram" content, protecting your privacy and reducing browsing distractions.
nsfw-filter / Nsfw Filter: A free, open-source, privacy-focused browser extension that blocks "not safe for work" content, built with TypeScript and TensorFlow.js.
NsfwSpy / NsfwSpy.NET: A .NET image and video classifier, written in C#, for identifying explicit/pornographic content.
vladmandic / Nudenet: NSFW object detection for TFJS and NodeJS.
Priler / Samurai: A simple yet effective auto-moderation bot for Telegram, with reports, logs, a profanity filter, anti-spam AI, NSFW detection AI, and more.
minto5050 / NSFW Detection: A trained TensorFlow model for detecting nudity in images.
steelcityamir / Safe Content AI: A fast, accurate API for detecting NSFW images.
helloxz / Nsfw: A lightweight NSFW detection solution that supports private deployment and HTTP API calls.
andresribeiro / Nsfwjs Docker: A Docker-powered, self-hosted NSFW detection API.
TheHamkerCat / SpamProtectionRobot: An anti-spam/NSFW Telegram bot written in Python with Pyrogram.
SashiDo / Content Moderation Image Api: An NSFW image classification REST API for effortless content moderation, built with Node.js, TensorFlow, and Parse Server.
TheHamkerCat / NSFW Detection API: A REST API written in Python to classify NSFW images.
LAION-AI / LAION SAFETY: An open toolbox for NSFW and toxicity detection.
photoprism / Photoprism Vision: Computer Vision Playground ⚡️
amirzenoozi / Nsfw Classification: A simple NSFW classifier based on Keras and TensorFlow.
lakshaychhabra / NSFW Detection DL: A deep learning implementation for identifying NSFW images.
im-syn / SafeVision: A professional Python script designed to detect and blur nudity in both videos and images.
NsfwSpy / NsfwSpy.js: A JavaScript image classifier, written in TypeScript, for identifying explicit/pornographic content.
trumanwong / ComfyUI NSFW Detection: Detects whether images generated by ComfyUI are Not Safe For Work (NSFW) using a machine learning classifier; if an image is classified as NSFW, an alternative image is returned.