
# react-native-nsfw-filter
A React Native library for detecting NSFW (Not Safe For Work) content in images using TensorFlow.js. This package provides an easy-to-use interface for classifying images into categories like Drawing, Hentai, Neutral, Porn, and Sexy.
```bash
npm install react-native-nsfw-filter
```
✅ Now Available on npm! The package has been published and is ready for use.
You'll need to install these dependencies if they're not already in your project:
```bash
npm install @tensorflow/tfjs @tensorflow/tfjs-react-native expo-image-manipulator expo-gl @react-native-async-storage/async-storage react-native-fs
```
For Expo projects:
```bash
npx expo install expo-image-manipulator expo-gl @react-native-async-storage/async-storage react-native-fs
```
Note: For React Native CLI projects, you may need to run `cd ios && pod install` after installing `react-native-fs`.
Configure Metro to bundle the `.bin` model files by creating or updating `metro.config.js` in your project root:

```js
// metro.config.js
const { getDefaultConfig } = require("expo/metro-config");

/** @type {import('expo/metro-config').MetroConfig} */
const config = getDefaultConfig(__dirname);

// Add .bin to the asset extensions
config.resolver.assetExts.push("bin");

module.exports = config;
```
Import the TensorFlow.js React Native backend at the top of your entry file (`App.js` or `App.tsx`), then load the model and classify images:

```tsx
import "@tensorflow/tfjs-react-native";
import React, { useEffect, useState } from "react";
import { View, Button, Image, Text } from "react-native";
import * as ImagePicker from "expo-image-picker";
import { NSFWFilter, NSFWPrediction } from "react-native-nsfw-filter";

// Import your model files
const modelJson = require("./assets/model/model.json");
const modelWeights = [require("./assets/model/group1-shard1of1.bin")];

const App = () => {
  const [nsfwFilter, setNsfwFilter] = useState<NSFWFilter | null>(null);
  const [predictions, setPredictions] = useState<NSFWPrediction[]>([]);
  const [imageUri, setImageUri] = useState<string>("");

  useEffect(() => {
    const initializeFilter = async () => {
      const filter = new NSFWFilter();
      await filter.loadModel(modelJson, modelWeights);
      setNsfwFilter(filter);
    };
    initializeFilter();
  }, []);

  const pickAndAnalyzeImage = async () => {
    const result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.Images,
      allowsEditing: true,
      quality: 1,
    });

    if (!result.canceled && nsfwFilter) {
      const uri = result.assets[0].uri;
      setImageUri(uri);

      // Get detailed predictions
      const predictions = await nsfwFilter.classifyImage(uri);
      setPredictions(predictions);

      // Or just check if it's NSFW
      const isNSFW = await nsfwFilter.isImageNSFW(uri, 0.6);
      console.log("Is NSFW:", isNSFW);
    }
  };

  return (
    <View style={{ flex: 1, padding: 20 }}>
      <Button title="Pick and Analyze Image" onPress={pickAndAnalyzeImage} />
      {imageUri && (
        <Image source={{ uri: imageUri }} style={{ width: 200, height: 200 }} />
      )}
      {predictions.map((prediction, index) => (
        <Text key={index}>
          {prediction.className}: {(prediction.probability * 100).toFixed(2)}%
        </Text>
      ))}
    </View>
  );
};

export default App;
```
Advanced usage with custom options:

```ts
import { NSFWFilter, NSFWClass } from "react-native-nsfw-filter";

// Create filter with custom options
const nsfwFilter = new NSFWFilter({
  imageSize: { width: 224, height: 224 },
  topK: 3, // Only return top 3 predictions
});

// Load model
await nsfwFilter.loadModel(modelJson, modelWeights);

// Get specific class confidence
const pornConfidence = await nsfwFilter.getClassConfidence(
  imageUri,
  NSFWClass.Porn
);

// Check with custom threshold
const isNSFW = await nsfwFilter.isImageNSFW(imageUri, 0.8); // 80% threshold

// Clean up when done
nsfwFilter.dispose();
```
## NSFWFilter

The main class for NSFW content detection.

```ts
new NSFWFilter(options?: NSFWFilterOptions)
```

Options:

- `imageSize?: { width: number; height: number }` - Image size for model input (default: 224x224)
- `topK?: number` - Number of top predictions to return (default: 5)

Methods:

- `loadModel(modelJson: any, modelWeights: any[]): Promise<void>` - Load the NSFW detection model.
- `isModelLoaded(): boolean` - Check whether the model is loaded and ready for inference.
- `classifyImage(imageUri: string): Promise<NSFWPrediction[]>` - Classify an image and return predictions with probabilities.
- `isImageNSFW(imageUri: string, threshold?: number): Promise<boolean>` - Check whether an image is likely NSFW. The default threshold is 0.6 (60%).
- `getClassConfidence(imageUri: string, className: NSFWClass): Promise<number>` - Get the confidence score for a specific class.
- `dispose(): void` - Clean up the model and free memory.
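For orientation, here is a minimal sketch of a typical call sequence through this API. The `analyzeImage` helper and its logging are illustrative only, and the model asset paths are the ones assumed in the usage example above:

```ts
import { NSFWFilter, NSFWClass, NSFWPrediction } from "react-native-nsfw-filter";

const modelJson = require("./assets/model/model.json");
const modelWeights = [require("./assets/model/group1-shard1of1.bin")];

// Illustrative helper: load once, classify, then free the model.
export const analyzeImage = async (imageUri: string): Promise<boolean> => {
  const filter = new NSFWFilter({ topK: 5 });
  await filter.loadModel(modelJson, modelWeights);

  if (!filter.isModelLoaded()) {
    return false; // model failed to load; treat as "not flagged" here
  }

  // Full probability distribution over the five classes
  const predictions: NSFWPrediction[] = await filter.classifyImage(imageUri);
  console.log(predictions);

  // Confidence for one specific class
  const sexyScore = await filter.getClassConfidence(imageUri, NSFWClass.Sexy);
  console.log("Sexy confidence:", sexyScore);

  // Simple yes/no check with the default 0.6 threshold
  const flagged = await filter.isImageNSFW(imageUri);

  // Release the model's memory when finished
  filter.dispose();
  return flagged;
};
```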
## NSFWPrediction

```ts
interface NSFWPrediction {
  className: string;
  probability: number;
}
```
## NSFWClass

```ts
enum NSFWClass {
  Drawing = "Drawing",
  Hentai = "Hentai",
  Neutral = "Neutral",
  Porn = "Porn",
  Sexy = "Sexy",
}
```
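As a quick illustration of how these two types compose, the sketch below flags an image when any explicit class crosses a cutoff. The `isExplicit` helper, its choice of classes, and the 0.7 cutoff are examples, not part of the package:

```ts
import { NSFWPrediction, NSFWClass } from "react-native-nsfw-filter";

// Example policy: treat Porn and Hentai as explicit content.
const EXPLICIT_CLASSES: string[] = [NSFWClass.Porn, NSFWClass.Hentai];

const isExplicit = (predictions: NSFWPrediction[], cutoff = 0.7): boolean =>
  predictions.some(
    (p) => EXPLICIT_CLASSES.includes(p.className) && p.probability >= cutoff
  );
```

You would feed it the array returned by `classifyImage`.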
You'll need to include the model files in your project. The model consists of:

- `model.json` - Model architecture
- `group1-shard1of1.bin` - Model weights

Place these files in your `assets/model/` directory and import them as shown in the usage examples.
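For reference, assuming the standard single-shard model export named above, the project layout would look something like this:

```
your-app/
├── assets/
│   └── model/
│       ├── model.json
│       └── group1-shard1of1.bin
├── App.tsx
└── metro.config.js
```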
Call `dispose()` when you're done with the filter to free up memory, as sketched below.
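For instance, one way to guarantee cleanup is to tie the filter's lifetime to a component with a small custom hook. `useNSFWFilter` is a hypothetical helper (not part of the package) that reuses the model assets from the usage example above:

```tsx
import { useEffect, useState } from "react";
import { NSFWFilter } from "react-native-nsfw-filter";

// Model assets, as in the usage example above
const modelJson = require("./assets/model/model.json");
const modelWeights = [require("./assets/model/group1-shard1of1.bin")];

export const useNSFWFilter = (): NSFWFilter | null => {
  const [nsfwFilter, setNsfwFilter] = useState<NSFWFilter | null>(null);

  useEffect(() => {
    const filter = new NSFWFilter();
    let cancelled = false;

    filter.loadModel(modelJson, modelWeights).then(() => {
      if (!cancelled) setNsfwFilter(filter);
    });

    return () => {
      // Free the model's tensors when the owning component unmounts
      cancelled = true;
      filter.dispose();
    };
  }, []);

  return nsfwFilter;
};
```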
Contributions are welcome! Please feel free to submit a Pull Request.
License: MIT
This library is built upon the excellent work by Infinite Red and their NSFWJS project. This React Native implementation builds on that work, with fixes and optimizations specifically for React Native environments using TensorFlow.js React Native.