Image classifier for detecting lewd images
This project provides Bumble's Private Detector, a pretrained image classifier for detecting lewd content. It is aimed at developers and researchers who need to deploy or fine-tune NSFW image detection, and is built on EfficientNet-v2.
How It Works
The model utilizes an EfficientNet-v2 architecture, trained on a proprietary dataset of lewd images. This approach leverages transfer learning for efficient and accurate classification of potentially sensitive content. The model is released in SavedModel format, facilitating straightforward deployment across various platforms.
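Because the release is a standard SavedModel, it can be loaded directly with TensorFlow 2 and its exported signatures inspected before wiring it into a pipeline. A minimal sketch, assuming the directory name used in the Quick Start below; signature names are not documented in the README and may vary:

```python
import tensorflow as tf

# Load the released SavedModel; "saved_model/" matches the Quick Start path.
model = tf.saved_model.load("saved_model/")

# List any exported serving signatures with their input/output specs.
# The model may instead be intended to be called directly as a tf.function,
# in which case this mapping can be empty.
for name, fn in model.signatures.items():
    print(name)
    print("  inputs: ", fn.structured_input_signature)
    print("  outputs:", fn.structured_outputs)
```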
Quick Start & Requirements
```bash
conda env create -f environment.yaml
conda activate private_detector
python3 inference.py --model saved_model/ --image_paths <path_to_images>
```
The release ships as private_detector.zip, an archive containing the SavedModel and checkpoint files.
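For programmatic use instead of the bundled inference.py, the SavedModel can also be called from Python. The sketch below is only an outline: the input resolution, scaling, and call convention are assumptions rather than documented behaviour, so defer to inference.py in the repo for the actual preprocessing pipeline.

```python
import tensorflow as tf

IMAGE_SIZE = 480  # assumed EfficientNet-v2 input resolution; not confirmed by the README

def load_image(path: str) -> tf.Tensor:
    """Decode an image and resize/scale it into a model-ready tensor (assumed preprocessing)."""
    data = tf.io.read_file(path)
    image = tf.image.decode_image(data, channels=3, expand_animations=False)
    image = tf.image.resize(image, (IMAGE_SIZE, IMAGE_SIZE))
    return tf.cast(image, tf.float32) / 255.0

model = tf.saved_model.load("saved_model/")
batch = tf.expand_dims(load_image("example.jpg"), axis=0)  # shape [1, H, W, 3]
scores = model(batch)  # assumed to return per-image lewd-content scores
print(scores)
```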
Maintenance & Community
This is an open-source release from Bumble. The repository was last updated about a year ago and appears inactive; community engagement channels (e.g., Discord, Slack) are not specified in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The model is trained on an internal dataset, and its performance on diverse datasets may vary. The specific dataset composition and potential biases are not detailed.