# NSFW Detection Machine Learning Model

Trained on 60+ GB of data to identify images in five categories:

- `drawings` - safe for work drawings (including anime)
- `hentai` - hentai and pornographic drawings
- `neutral` - safe for work neutral images
- `porn` - pornographic images, sexual acts
- `sexy` - sexually explicit images, but not pornography
This model powers NSFW JS (https://nsfwjs.com/).
## Current Status

93% accuracy, based on Inception v3; see the confusion matrix graphic in the repository.

Review the `_art` folder for previous incarnations of this model.
## Requirements

- keras (tested with versions > 2.0.0)
- tensorflow >= 2.1.0
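Assuming the package is published under the same name as its CLI entry point (an assumption; install from a repo checkout with `pip install .` otherwise), installation should be a one-liner:

```bash
# Assumed package name, matching the nsfw-predict CLI shown below
pip install nsfw-detector
```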
## Usage

For programmatic use of the library:

```python
from nsfw_detector import predict

model = predict.load_model('./nsfw_mobilenet2.224x224.h5')

# Classify a single image
predict.classify(model, '2.jpg')

# Classify multiple images at once
predict.classify(model, ['/Users/bedapudi/Desktop/2.jpg', '/Users/bedapudi/Desktop/6.jpg'])

# Classify every image in a directory
predict.classify(model, '/Users/bedapudi/Desktop/')
```
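`classify` returns a dictionary mapping each image path to per-category probabilities. The shape below is illustrative; the values are made up, not real output:

```python
{'2.jpg': {'drawings': 0.01,
           'hentai': 0.02,
           'neutral': 0.91,
           'porn': 0.03,
           'sexy': 0.03}}
```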
If you've installed the package, the command line tool should work, too:

```bash
nsfw-predict --saved_model_path mobilenet_v2_140_224 --image_source test.jpg
nsfw-predict --saved_model_path mobilenet_v2_140_224 --image_source images
```

Or run the prediction script directly from a checkout:

```bash
python3 nsfw_detector/predict.py --saved_model_path mobilenet_v2_140_224 --image_source test.jpg
```
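As a practical sketch of building on the calls above, here's a minimal batch filter. The helper name, threshold, and category grouping are our own choices for illustration, and it assumes `classify` returns the path-to-probabilities dictionary shown earlier:

```python
from nsfw_detector import predict

# Categories treated as "not safe for work" for filtering purposes.
# This grouping is an assumption; tune it for your use case.
NSFW_CATEGORIES = ('hentai', 'porn', 'sexy')

def flag_nsfw(model, image_source, threshold=0.5):
    """Hypothetical helper: return (path, score) pairs whose combined
    NSFW probability exceeds `threshold`."""
    results = predict.classify(model, image_source)
    flagged = []
    for path, scores in results.items():
        nsfw_score = sum(scores.get(c, 0.0) for c in NSFW_CATEGORIES)
        if nsfw_score > threshold:
            flagged.append((path, nsfw_score))
    return flagged

model = predict.load_model('./nsfw_mobilenet2.224x224.h5')
for path, score in flag_nsfw(model, '/path/to/images'):
    print(f'{path}: NSFW score {score:.2f}')
```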
## Download

Please feel free to use this model to help your products!

If you'd like to say thanks for creating this, I'll take a donation for hosting costs.

- Latest Models Zip (v1.1): https://github.com/GantMan/nsfw_model/releases/tag/1.1.0
- Original Inception v3 Model (v1.0)
- Original MobileNet v2 Model (v1.0)
## PyTorch Version

Kudos to the community for creating a PyTorch version with ResNet! https://github.com/yangbisheng2009/nsfw-resnet
## Training Folder Contents

A brief description of the scripts used to create this model:

- `inceptionv3_transfer/` - all the code to train the Keras-based Inception v3 transfer learning model. Includes `constants.py` for configuration and two scripts for actual training/refinement.
- `mobilenetv2_transfer/` - all the code to train the Keras-based MobileNet v2 transfer learning model.
- `visuals.py` - the code to create the confusion matrix graphic.
- `self_clense.py` - if the training data has significant inaccuracy, `self_clense` helps cross-validate errors in the training data in reasonable time; see the sketch after the example commands below. The better the model gets, the better you can use it to clean the training data manually.
For example:

```bash
cd training
python inceptionv3_transfer/train_initialization.py
python inceptionv3_transfer/train_fine_tune.py
python visuals.py
```
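To illustrate the self-cleaning idea, here's a hypothetical sketch (not the repo's actual `self_clense.py`): run the current model over labeled training images and surface the ones where the prediction disagrees with the assigned label, so likely mislabels get reviewed first. It assumes one subfolder per category and the `classify` output format shown earlier:

```python
import os
from nsfw_detector import predict

model = predict.load_model('./nsfw_mobilenet2.224x224.h5')

def find_suspect_labels(data_root):
    """Yield (path, assigned_label, predicted_label) for likely mislabels.

    Assumes data_root holds one folder per category,
    e.g. data_root/porn/*.jpg.
    """
    for label in sorted(os.listdir(data_root)):
        folder = os.path.join(data_root, label)
        if not os.path.isdir(folder):
            continue
        results = predict.classify(model, folder)
        for path, scores in results.items():
            predicted = max(scores, key=scores.get)
            if predicted != label:
                yield path, label, predicted

for path, assigned, predicted in find_suspect_labels('./training_data'):
    print(f'{path}: labeled {assigned}, model predicts {predicted}')
```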
There's no easy way to distribute the training data, but if you'd like to help with this model or train other models, get in touch with me and we can work together.
Advancements in this model power the quantized TFJS module on https://nsfwjs.com/
My Twitter is @GantLaborde; I'm a School of AI Wizard in New Orleans. I also run the Twitter account @FunMachineLearn.
Learn more about me and the company I work for.
Special thanks to nsfw_data_scraper for the training data. If you're interested in a more detailed analysis of the types of NSFW images, you could probably use this repo's code with that dataset.
If you need React Native, Elixir, AI, or machine learning work, check in with us at Infinite Red, who makes all these experiments possible. We're an amazing software consultancy that works worldwide!
## Cite

```bibtex
@misc{man,
  title={Deep NN for NSFW Detection},
  url={https://github.com/GantMan/nsfw_model},
  journal={GitHub},
  author={Laborde, Gant}
}
```
## Contributors
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome!
## Changes

### 1.1.1

- break out numpy (ndarray) function
- remove classic app run modes for argparse
- one more example in README for running
- turn down verbosity in image load via file
- fix requirements for a clean system (needs PIL)

### 1.1.0

- update to TensorFlow 2.1.0 and an updated MobileNet-based model

### 1.0.0