
newd analyzes images and identifies specific NSFW body parts with high accuracy. It can also optionally censor detected areas.
pip install newd
from newd import detect
# Standard detection with default settings
results = detect('path/to/image.jpg')
print(results)
# Faster detection with slightly reduced accuracy
results = detect('image.jpg', mode="fast")
# Adjust detection sensitivity
results = detect('image.jpg', min_prob=0.3) # Lower threshold catches more potential matches
# Combine options
results = detect('image.jpg', mode="fast", min_prob=0.3)
The detect() function accepts either an image file path or an image already loaded in memory (e.g. with cv2). Detection results are returned as a list of dictionaries:
[
    {
        'box': [x1, y1, x2, y2],     # Bounding box coordinates (top-left, bottom-right)
        'score': 0.825,              # Confidence score (0-1)
        'label': 'EXPOSED_BREAST_F'  # Classification label
    },
    # Additional detections...
]
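A typical follow-up step is to filter these dictionaries by confidence and label before acting on them. The sketch below is illustrative: the 0.6 threshold and the chosen label set are assumptions, not values required by the library.

from newd import detect

results = detect('image.jpg')

# Keep only confident detections of the labels we care about.
# The threshold and label set here are illustrative choices.
wanted_labels = {'EXPOSED_BREAST_F', 'EXPOSED_GENITALIA_M'}
confident = [d for d in results if d['score'] >= 0.6 and d['label'] in wanted_labels]

for d in confident:
    x1, y1, x2, y2 = d['box']
    print(f"{d['label']}: score={d['score']:.2f}, box=({x1}, {y1}, {x2}, {y2})")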
When importing newd for the first time, it will download a 139 MB model file to your home directory (~/.newd/). This happens only once.
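If you need to know whether the model is already cached (for example, before going offline), checking that directory is enough. The sketch below only inspects the directory; the exact model file name inside ~/.newd/ is not documented here, so it is not assumed.

from pathlib import Path

# newd stores its downloaded model under ~/.newd/ (per the note above)
model_dir = Path.home() / '.newd'

if model_dir.is_dir() and any(model_dir.iterdir()):
    print(f'Model cache found at {model_dir}')
else:
    print('No cached model yet; it will be downloaded on first import of newd')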
newd.censor() masks detected NSFW regions with solid black rectangles. Use it when you need to create a safe-for-work version of an image.
from newd import censor

# Censor all detected areas and write the result
censored_img = censor(
    'image.jpg',
    out_path='image_censored.jpg'  # file will be written to disk
)

# Only censor specific labels (e.g. female exposed anus and male genitalia)
selected_parts = ['EXPOSED_ANUS_F', 'EXPOSED_GENITALIA_M']
censored_img = censor(
    'image.jpg',
    out_path='image_censored.jpg',
    parts_to_blur=selected_parts
)
Function parameters:
Parameter | Type | Description |
---|---|---|
img_path | str / Path | Source image or path. |
out_path | str / Path, optional | Destination path; if omitted, the result can still be obtained via the return value when visualize=True. |
visualize | bool, default False | If True, the censored numpy.ndarray image is returned for display (e.g. with cv2.imshow). |
parts_to_blur | List[str], optional | Restrict censoring to the given label names. When empty, all detected labels are censored. |
If neither out_path nor visualize=True is supplied, the function exits early because there is nowhere to deliver the censored image.
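To work with the censored image in memory rather than on disk, you can pass visualize=True and display the returned array yourself. The following is a minimal sketch; the cv2 calls are standard OpenCV usage and not part of newd.

import cv2
from newd import censor

# Ask censor() to return the censored image as a numpy.ndarray
censored_img = censor('image.jpg', visualize=True)

if censored_img is not None:
    cv2.imshow('censored', censored_img)  # display with OpenCV
    cv2.waitKey(0)
    cv2.destroyAllWindows()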
FAQs
Nudity detection through deep learning
We found that newd demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.