CSAM Response Policy
This policy outlines Vexor’s zero-tolerance approach to Child Sexual Abuse Material (CSAM), mandatory reporting obligations, detection systems, grooming prevention, and global law-enforcement cooperation. These rules apply to all users, creators, advertisers, employees, and third-party partners.
1. Zero-Tolerance Statement
Vexor maintains an absolute zero-tolerance policy toward Child Sexual Abuse Material, attempted exploitation of minors, grooming behavior, or any form of child endangerment. Violations are treated as severe criminal activity and result in:
- Immediate removal of the content
- Permanent account termination
- Preservation of evidence for law enforcement
- Mandatory reporting to appropriate authorities
This prohibition covers real depictions of minors, AI-generated imagery of minors, digitally manipulated images, and any other material that sexualizes, exploits, or harms children.
2. Mandatory Reporting
Vexor complies with all global child-protection laws and reporting mandates. All confirmed CSAM is immediately reported to:
- NCMEC – U.S. National Center for Missing & Exploited Children (CyberTipline)
- Local and international child-safety agencies
- National and regional law-enforcement entities
We also comply with preservation requests, subpoenas, warrants, and cross-border cooperation through mutual legal assistance treaty (MLAT) frameworks as required by law.
3. Detection & Removal
Vexor uses multiple layers of safety technology and review processes, including:
- AI-driven classifiers trained to flag suspected CSAM, exploitation, and related high-risk content
- PhotoDNA and NCMEC hash-matching to detect known illegal content
- Automated real-time scanning upon upload or during livestreams
- Specialized human moderators trained for sensitive high-risk content
Content suspected of child harm receives the highest priority in our moderation queues and is escalated within minutes for removal and reporting.
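The hash-matching layer described above can be illustrated with a simplified sketch. This is a hypothetical, minimal analogue only: it uses an exact cryptographic hash (SHA-256), whereas production systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding. The hash set, function names, and the placeholder digest below are illustrative, not part of any real NCMEC or PhotoDNA interface.

```python
import hashlib

# Placeholder known-hash list. In production, entries come from vetted
# sources such as NCMEC and industry hash-sharing programs, never from
# hardcoded values. The entry below is the well-known SHA-256 digest of
# an empty byte string, used here only so the sketch is testable.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(data: bytes) -> str:
    """Return the lowercase hex SHA-256 digest of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """True if the upload's hash appears in the known-hash list,
    meaning the file is byte-identical to previously identified content."""
    return file_sha256(data) in KNOWN_HASHES
```

A match at this stage would trigger the removal, preservation, and reporting steps described above rather than any user-facing action. Note that exact hashing only catches byte-identical files, which is why perceptual hashing and classifier layers are also needed.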
4. Grooming & Predatory Behavior
Vexor actively monitors for patterns of grooming, manipulation, or predatory conduct. Behaviors that trigger investigation include:
- Attempting to coerce minors into sharing private information
- Requesting sexual images, videos, or conversations from minors
- Attempting to move communication off-platform to evade detection
- Engaging in manipulative behavior intended to exploit or endanger minors
- Making persistent or unwanted contact with presumed minors
Accounts engaging in grooming or predatory behavior are immediately suspended and escalated for law-enforcement review.
5. User Reporting Tools
Users can report suspected CSAM or potential child exploitation through any of the following channels:
- In-app reporting tools (available on all videos, profiles, and messages)
- Emailing the Vexor Safety Team: safety@vexor.to
- Emergency escalation for imminent threats: emergency@vexor.to
All CSAM-related reports are treated as high-priority emergencies and reviewed immediately by specialist teams.
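The priority handling described above can be sketched as a severity-ordered queue: child-safety reports are always reviewed before lower-severity reports, regardless of arrival order. The severity levels, category names, and functions below are hypothetical, shown only to illustrate the ordering principle.

```python
import heapq

# Lower number = higher priority; child-safety reports always sort first.
SEVERITY = {"csam": 0, "violence": 1, "spam": 2}

queue: list[tuple[int, int, str]] = []
_counter = 0  # tiebreaker: preserves arrival order within one severity level

def enqueue(report_type: str) -> None:
    """Add a report to the review queue, ordered by severity then arrival."""
    global _counter
    heapq.heappush(queue, (SEVERITY[report_type], _counter, report_type))
    _counter += 1

def next_report() -> str:
    """Pop the highest-priority report for moderator review."""
    return heapq.heappop(queue)[2]

enqueue("spam")
enqueue("csam")
enqueue("violence")
# A "csam" report is reviewed first even though it arrived after "spam".
```

The tiebreaker counter matters: without it, two reports at the same severity would be compared by their payloads, and within a severity level moderators should see the oldest report first.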
6. Contact
CSAM Reporting: safety@vexor.to
Emergency Escalation: emergency@vexor.to