Quickly identify whether a given image or GIF contains sensitive or NSFW (Not Safe For Work) content using a pre-trained AI model. Check user-uploaded images and GIFs before using them in your app. This plugin is a tool for identifying NSFW content, but Bubble's terms on acceptable content still apply.
Usage Features:
- Automatically analyze images and GIFs using a pre-trained AI to detect sensitive or NSFW content (see the sketch after this list).
- Seamlessly check user-uploaded media before displaying or saving it.
- Enhance user safety and content moderation with minimal setup.
- Works with both static images and animated GIFs.
- Returns classification results as plugin states for easy workflow integration.
- Fast detection powered by efficient server-side processing.
- Can be used in combination with Bubble’s workflows to reject or flag inappropriate uploads.
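
To illustrate the general idea of pre-trained, client-side NSFW classification, here is a minimal TypeScript sketch using the open-source nsfwjs library. This is only an illustration of the approach: the plugin's actual model, hosting, and thresholds are not documented here, and the 0.7 cut-off below is an assumed value, not the plugin's.

```ts
// Minimal sketch of pre-trained NSFW classification in the browser.
// nsfwjs is used purely as an illustration; the plugin's actual model
// and processing pipeline may differ.
import '@tensorflow/tfjs'; // peer dependency of nsfwjs
import * as nsfwjs from 'nsfwjs';

async function classifyUpload(img: HTMLImageElement): Promise<boolean> {
  const model = await nsfwjs.load();              // load the pre-trained model
  const predictions = await model.classify(img);  // [{ className, probability }, ...]

  // Treat high-confidence "Porn" or "Hentai" predictions as NSFW (illustrative threshold).
  return predictions.some(
    (p) => (p.className === 'Porn' || p.className === 'Hentai') && p.probability > 0.7
  );
}

// Usage: pass the <img> element holding the user's upload.
const uploaded = document.getElementById('uploaded-image') as HTMLImageElement;
classifyUpload(uploaded).then((isNsfw) => {
  console.log(isNsfw ? 'Flagged as NSFW' : 'Looks safe');
});
```

In the plugin itself, none of this code is needed: you point the element at the uploaded file and read the exposed states in your workflow.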

FAQ
What does this plugin do?
It analyzes uploaded images and GIFs using AI to detect NSFW (Not Safe For Work) or sensitive content before displaying or saving them in your app.
Can it detect NSFW content in both images and GIFs?
Yes, the plugin supports detection for both static images and animated GIFs.
How do I use the detection result?
The plugin provides detection results as exposed states that you can use in workflows to approve, reject, or flag content (a sketch of that kind of decision logic follows this FAQ).
Is this plugin enough to fully moderate content in my app?
The plugin helps identify unsafe content, but you are still responsible for ensuring compliance with Bubble's terms and your platform's content policies.
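
In Bubble, the approve/reject/flag decision is built with workflow conditions on the plugin's exposed states, so no code is required. The hedged TypeScript sketch below only illustrates the shape of such decision logic; the score fields and thresholds are hypothetical assumptions, not the plugin's documented states.

```ts
// Hypothetical decision helper mirroring an approve / flag / reject workflow.
// The score fields and thresholds are illustrative, not the plugin's API.
type NsfwScores = { porn: number; hentai: number; sexy: number; neutral: number };
type Decision = 'approve' | 'flag' | 'reject';

function moderate(scores: NsfwScores): Decision {
  const explicit = Math.max(scores.porn, scores.hentai);
  if (explicit > 0.85) return 'reject';                    // clearly unsafe: block the upload
  if (explicit > 0.5 || scores.sexy > 0.8) return 'flag';  // borderline: send to manual review
  return 'approve';                                        // safe enough to display or save
}

// Example: a borderline upload gets flagged for review.
console.log(moderate({ porn: 0.6, hentai: 0.1, sexy: 0.2, neutral: 0.3 })); // "flag"
```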

Other plugins from Zeroqode:
- Progress Bar Detection Plugin for Bubble
- Touch and Mouse Events Detection Plugin for Bubble
- Dark Mode Detection Pro Plugin for Bubble
- Face, Body and Pose Detection Plugin for Bubble
Support
If you still have questions or unresolved issues, you can contact us.
Risk-free Trial
The most risk-free way to try out this plugin is to subscribe to it. If you unsubscribe a few days later, you will be charged on a pro-rata basis. For example, if the plugin's monthly price is $5, you'd pay only about 17¢ per day ($5 / 30 days).