UK Online Safety Act Compliance Statement

This statement explains how Vexor complies with the UK Online Safety Act (OSA), including obligations around illegal content removal, youth protection, risk assessments, transparency, and user redress mechanisms. Vexor is committed to full alignment with Ofcom's regulatory standards.

1. Introduction

The UK Online Safety Act establishes a comprehensive regulatory framework for digital platforms operating in the United Kingdom. It requires companies to implement preventative and responsive safety systems, provide clear and accessible reporting processes, and protect users—especially minors—from illegal or harmful online content.

As a platform hosting user-generated content, Vexor falls within the OSA's scope as a user-to-user service and adheres to all applicable regulatory requirements, including:

  • Mandatory safety risk assessments
  • Proactive detection and mitigation of illegal content
  • Enhanced protections for minors and vulnerable individuals
  • Transparency reporting and audit readiness
  • User complaints, appeals, and dispute resolution pathways

Vexor works collaboratively with Ofcom, safety organizations, and UK regulatory partners to maintain platform integrity and user wellbeing.

2. Illegal Content Detection

Under the OSA, Vexor must take robust measures to detect, prevent, and remove illegal content. The platform uses a hybrid moderation model combining AI detection, automated removal workflows, and trained human reviewers.

Illegal content removed under OSA compliance includes (but is not limited to):

  • CSAM – Child Sexual Abuse Material (zero tolerance)
  • Terrorism, radicalization, or extremist propaganda
  • Threats, harassment, blackmail, or incitement to violence
  • Fraud, scams, financial deception, impersonation schemes

Where appropriate, Vexor securely preserves evidence and escalates cases to UK law enforcement via the appropriate channels (e.g., the NCA or CEOP).

3. Child Safety Duties

The Online Safety Act places heightened obligations on platforms to protect minors. Vexor implements advanced, multi-layered safeguards designed to reduce exposure to harm and prevent exploitation of young users.

  • Age Verification Systems: Automated and manual checks aligned with OSA standards.
  • Default Privacy Protections: Private accounts for under-16 users; restricted discovery.
  • Feature Limitations: Messaging, gifting, and live features restricted for minors.
  • AI Grooming Detection: Behavioral models that identify patterns of predatory intent.
  • Human Safety Teams: Specialist moderators trained in child protection protocols.

Vexor also complies with UK safeguarding expectations and safety-by-design requirements.

4. Risk Assessments

In accordance with its duties under the OSA, as overseen by Ofcom, Vexor conducts formal risk assessments covering:

  • Child safety vulnerabilities and exposure points
  • Illegal content dissemination risks
  • Algorithmic amplification and recommender-system risk profiles
  • Self-harm, suicide, and harmful behavioral content risks
  • Potential for platform misuse, evasion, or systematic abuse

These assessments are reviewed periodically, updated alongside product changes, and integrated into platform-wide safety engineering processes.

5. Complaints & Appeals

Users located in the United Kingdom may submit:

  • Content removal complaints
  • Appeals against moderation actions
  • Reports of illegal or harmful content
  • Concerns regarding child safety violations

All complaints are reviewed by trained moderation teams, and appeal decisions include clear explanations in line with OSA transparency obligations. Where applicable, users may also pursue independent dispute resolution mechanisms recognized in the UK.

6. Contact

For questions or regulatory correspondence regarding UK Online Safety Act compliance:

Ofcom Compliance Office: ukcompliance@vexor.to
Legal Team: legal@vexor.to
