CSAM Response Policy
This policy explains Vibble’s zero-tolerance approach to Child Sexual Abuse Material (CSAM), including rapid removal, mandatory reporting, and cooperation with law enforcement.
1. Zero Tolerance
CSAM and the sexual exploitation of minors are strictly prohibited on Vibble. This includes real, staged, edited, and AI-generated material that depicts or sexualizes minors.
2. Detection & Prevention
Vibble uses multiple safeguards:
- Hash matching against known CSAM databases (a simplified sketch follows this list).
- AI-based detection of suspicious patterns and media features.
- High-priority review of child-safety user reports.
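
As an illustration of the first safeguard, the sketch below shows hash-based blocklist matching in its simplest form. This is not Vibble's implementation: the function names and blocklist entry are hypothetical, and production systems generally rely on perceptual hashes (such as PhotoDNA or PDQ) issued under agreement by child-safety organizations, which also match re-encoded or lightly edited copies. A cryptographic digest stands in here only to keep the example self-contained.

```python
import hashlib

# Hypothetical sketch only; real matching uses perceptual hashes
# provided by child-safety organizations, not plain SHA-256 digests.

def media_digest(media_bytes: bytes) -> str:
    """Hex digest identifying an uploaded file's exact bytes."""
    return hashlib.sha256(media_bytes).hexdigest()

def matches_known_csam(media_bytes: bytes, blocklist: set[str]) -> bool:
    """True if the upload's digest appears in the known-match blocklist."""
    return media_digest(media_bytes) in blocklist

# Usage: screen an upload before it is stored or served.
blocklist = {"2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"}
flagged = matches_known_csam(b"example upload bytes", blocklist)
print("quarantine the upload and escalate" if flagged else "no known match")
```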
3. Immediate Actions
When CSAM is detected or reported:
- Content is removed or blocked from access as quickly as possible.
- Accounts involved are suspended or permanently banned.
- Relevant data is preserved for legal and investigative purposes (a sketch of such a preservation record follows).
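
The preservation step can be made concrete with a minimal record type. This is a sketch under assumed requirements: every field name is hypothetical, and what must actually be preserved, and for how long, is dictated by law and legal counsel in each jurisdiction.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# All field names are hypothetical; actual preservation duties (scope,
# retention period, chain of custody) vary by jurisdiction.

@dataclass(frozen=True)
class PreservationRecord:
    content_hash: str      # digest of the removed or blocked media
    account_id: str        # account that uploaded or shared the material
    detected_at: datetime  # when detection or the user report occurred
    action_taken: str      # e.g. "removed", "blocked", "account_banned"
    preserved_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

record = PreservationRecord(
    content_hash="2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    account_id="acct-0001",
    detected_at=datetime.now(timezone.utc),
    action_taken="removed",
)
```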
4. Mandatory Reporting
As required by law, Vibble reports CSAM to the appropriate child-safety authorities and law enforcement bodies in the relevant jurisdictions, such as national hotlines and cybercrime units.
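
For illustration, the sketch below assembles the kind of report payload a trust-and-safety pipeline might hand to a hotline's intake channel. The structure and field names are assumptions for this example only; real submissions must follow the receiving authority's own schema and legal requirements.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shape for illustration; real reports must follow the
# receiving authority's intake format, not this one.

@dataclass
class HotlineReport:
    content_hash: str   # links the report to the preserved evidence
    account_id: str     # subject account for law-enforcement follow-up
    jurisdiction: str   # determines which authority receives the report
    summary: str        # brief description of how the material was found

report = HotlineReport(
    content_hash="2c26b46b...",  # truncated for display; illustrative only
    account_id="acct-0001",
    jurisdiction="US",
    summary="Flagged by hash matching during upload screening.",
)

payload = json.dumps(asdict(report))  # serialized for the intake channel
```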
5. Grooming & Solicitation
Attempts to solicit sexual content from minors, grooming behaviors, or any exploitative conduct toward minors are treated as CSAM-adjacent violations and handled with maximum severity.
6. Reporting CSAM
Users must not download or distribute suspected CSAM. Instead, report it immediately:
- Child Safety & CSAM: safety@vibble.org
- Emergency Escalation: emergency@vibble.org
7. User Safety & Support
Where possible, Vibble provides guidance and resources for victims and concerned users, including referrals to child-protection organizations.
8. Policy Updates
Vibble will continuously evolve its CSAM detection, reporting, and response capabilities in partnership with industry and government organizations.