Image Moderation API
Moderate user images with one API call and clear categories.
Use the same policy system you already use for text.
What it detects
- Nudity & sexual content
- Violence & gore
- Self-harm
- CSAM
- Scams & fraud
- Custom rules
Why developers choose Vettly
- Image and video under the same endpoint
- Policy thresholds per category
- Webhooks for async review
- Evidence storage and audit trails
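Per-category thresholds like those above are typically expressed as a policy document. The fragment below is a hypothetical sketch of what such a policy could look like; the field names (`thresholds`, `review`, `block`, `webhook_url`) are illustrative, not Vettly's actual schema:

```json
{
  "policy": "marketplace-safe",
  "thresholds": {
    "sexual":   { "review": 0.60, "block": 0.90 },
    "violence": { "review": 0.70, "block": 0.95 }
  },
  "webhook_url": "https://yourapp.example/webhooks/vettly"
}
```

Splitting each category into a lower "review" and a higher "block" threshold lets borderline content go to human review while clear violations are rejected automatically.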
Example request

```bash
curl -X POST https://api.vettly.dev/v1/check \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "https://example.com/image.jpg", "contentType": "image"}'
```

Example response

```json
{
  "flagged": true,
  "action": "review",
  "categories": {
    "sexual": 0.88,
    "violence": 0.04
  },
  "policy": "marketplace-safe",
  "latency_ms": 318
}
```

Compared to point solutions
Vettly pairs detection with actions, dashboards, and policy control so your team can ship faster.
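Given the response shape shown above, a client usually branches on the `action` field. A minimal Python sketch, assuming the three actions are `allow`, `review`, and `block` (the handler name and app-side states are illustrative):

```python
def handle_moderation(result: dict) -> str:
    """Map a moderation check response to an app-side decision.

    Returns one of "published", "queued", or "rejected".
    """
    action = result.get("action", "allow")
    if action == "block":
        return "rejected"   # hide the image immediately
    if action == "review" or result.get("flagged"):
        return "queued"     # route to the human review queue
    return "published"      # safe to show

# Example: the response above carries action="review".
response = {
    "flagged": True,
    "action": "review",
    "categories": {"sexual": 0.88, "violence": 0.04},
    "policy": "marketplace-safe",
}
print(handle_moderation(response))  # prints "queued"
```

Treating `flagged` as a fallback trigger for review means an unfamiliar or missing `action` value still fails safe into the queue rather than publishing.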
Get an API key
Start making decisions in minutes with a Developer plan and clear upgrade paths.
Start on Developer