coxapp/nsfw_model

NSFW Detection Model

Trained on 60+ GB of data to identify images as one of:

  • drawings - safe for work drawings (including anime)
  • hentai - hentai and pornographic drawings
  • neutral - safe for work neutral images
  • porn - pornographic images, sexual acts
  • sexy - sexually explicit images, not pornography
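A model trained on these five class folders outputs a probability vector in a fixed class order. As a small hedged sketch (the class ordering below assumes Keras' alphabetical directory ordering — verify against your trained model's `class_indices`; the helper name is illustrative, not part of the repo):

```python
import numpy as np

# The five classes, in the alphabetical order Keras assigns when reading
# class subdirectories. This ordering is an assumption; check the trained
# model's class_indices to confirm it.
CLASSES = ["drawings", "hentai", "neutral", "porn", "sexy"]

def decode_prediction(probs, top=3):
    """Map a model's 5-way probability vector to (label, score) pairs,
    sorted by descending confidence."""
    probs = np.asarray(probs, dtype=float)
    if probs.shape != (len(CLASSES),):
        raise ValueError("expected %d class scores, got %r"
                         % (len(CLASSES), probs.shape))
    order = np.argsort(probs)[::-1][:top]  # indices of the top scores
    return [(CLASSES[i], float(probs[i])) for i in order]
```

For example, `decode_prediction([0.02, 0.01, 0.90, 0.04, 0.03])` ranks "neutral" first.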

Current Status:

90% accuracy with the following confusion matrix, based on Inception V3:

[image: nsfw confusion matrix]
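The accuracy figure is the fraction of predictions that land on the diagonal of the confusion matrix. A minimal numpy sketch of that relationship (function names here are illustrative, not the repo's code):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows are true classes, columns are predicted classes."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

def accuracy(m):
    """Overall accuracy: diagonal (correct) predictions over all predictions."""
    return np.trace(m) / m.sum()
```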

Review the _art folder for previous incarnations of this model.

Download

Please feel free to use this model in your products!

If you'd like to say thanks for creating this, I'll take a donation for hosting costs.

Repo Contents

Simple description of the scripts used to create this model:

  • train_inception_model.py - The code used to train the Keras-based, transfer-learned Inception V3 model.
  • visuals.py - The code used to create the confusion matrix graphic.
  • self_clense.py - The training data came down with some significant inaccuracies. self_clense.py let me use early iterations of the model to cross-validate errors in the training data in a reasonable amount of time. The better the model got, the better I could use it to clean the training data manually. Most importantly, this also let me clean the validation dataset and get a real indication of generalized performance.
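The transfer-learning setup behind train_inception_model.py can be sketched roughly as follows. This is a hedged outline, not the exact training code: the head size, dropout rate, and optimizer below are assumptions, and weights=None stands in for the ImageNet weights real training would start from.

```python
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Dense, Dropout, GlobalAveragePooling2D
from tensorflow.keras.models import Model

def build_model(n_classes=5, freeze_base=True):
    # Real training would use weights="imagenet"; None avoids the download here.
    base = InceptionV3(weights=None, include_top=False,
                       input_shape=(299, 299, 3))
    if freeze_base:
        for layer in base.layers:
            layer.trainable = False  # first train only the new head
    x = GlobalAveragePooling2D()(base.output)
    x = Dense(256, activation="relu")(x)  # assumed head size
    x = Dropout(0.5)(x)                   # assumed dropout rate
    out = Dense(n_classes, activation="softmax")(x)
    model = Model(base.input, out)
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Freezing the base and training only the new classification head first is the standard transfer-learning recipe; layers can be unfrozen later for fine-tuning.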

Extra Info

There's no easy way to distribute the training data, but if you'd like to help with this model or train other models, get in touch with me and we can work together.

My Twitter is @GantLaborde - I'm a School of AI Wizard in New Orleans. I also run the Twitter account @FunMachineLearn.

Learn more about me and the company I work for.

Special thanks to the nsfw_data_scraper for the training data.
