Child Safety & Exploitation Policy
1) Scope & who we are
This policy explains how Mad2Moi prevents, detects, and responds to any behavior or content that sexualizes, exploits, or endangers minors across the entire platform (profiles, media, lives, messages, groups, ads, and any future features).
- Adults-only: You must be 18 or older to use Mad2Moi.
- Jurisdiction: We cooperate with competent authorities and recognized hotlines in any country where we operate.
2) Zero-tolerance statement
We prohibit and will act on any of the following:
- Images, videos, audio, text, emojis, or links that depict, sexualize, or otherwise exploit minors (anyone under 18), including AI-generated or “deepfake” depictions.
- Grooming or solicitation of a minor, sextortion, trafficking, or any attempt to arrange sexual contact with a minor.
- Age-play or role-playing that sexualizes minors (e.g., “teen”, “schoolgirl/boy”, “under 18”), even if presented as fiction.
- Any content suggesting incest, familial sexualization, or child endangerment.
- Attempts to obtain sexual imagery from a person believed to be a minor.
Where CSAM or grooming is confirmed, the content is removed, accounts are terminated, device/payment identifiers may be blocked, and cases are reported to authorities and hotlines.
3) 18+ verification & prevention by design
- Age gate & verification: Government ID plus liveness checks may be required when risk signals are present (suspicious activity, reports, random audits).
- Profile & media controls: Prohibited terms (e.g., teen, underage, minor, schoolgirl/boy, under 18) are blocked in bios, usernames, hashtags, captions, and live titles.
- Content filtering: Images/videos are screened with automated models and industry-standard hashing to prioritize suspected CSAM for immediate human review.
- Grooming detection: Rate limits, spam/link controls, and behavior-based risk scoring in messaging surface grooming signals.
- User tools: Easy Block, Mute, and Report actions on every profile, message, and media viewer.
4) How to report concerns (in-app & email)
- Use the in-app “Report” button (available on profiles, messages, photos/videos, and lives). Choose a category: Minor or suspected minor, Potentially illegal image/video (CSAM), Grooming / sexual solicitation, or Other.
- Or email [email protected] with details and links/screenshots (do not forward illegal imagery).
Emergency: If a child is in immediate danger, contact your local law enforcement first, then inform us.
5) Our response & timelines (SLA)
- P0 – Imminent harm / suspected CSAM: Immediate action. Content is hidden or removed and the account is frozen; human review begins at once; lawful evidence is preserved; external escalation proceeds without delay.
- P1 – Grooming / solicitation / age-play: Target within 24 hours. Content is removed; the account is restricted or banned after verification.
- P2 – Other safety concerns: Target within 72 hours. Investigation and enforcement as appropriate.
6) What happens after you report
- Triage & containment (hide content, lock account, restrict messaging).
- Evidence preservation (timestamps, IPs, device info, metadata) under strict access control.
- External escalation to competent law-enforcement and/or recognized hotlines (e.g., INHOPE members; NCMEC where applicable).
- Outcome: Permanent bans; device/phone/payment fingerprints may be blocked to prevent evasion.
- Victim-first approach: where lawful, support for the victim is coordinated through the appropriate authorities.
7) Data handling & privacy (child-safety cases)
- Minimization: Collect/store only what’s necessary to investigate and comply with the law.
- Security: Encryption in transit and at rest; restricted, audited access.
- Retention: CSAM evidence kept only as required by law; other safety reports typically 12–24 months for repeat-offender detection/compliance, then minimized/anonymized.
- User rights: Access/erasure may be limited where it would interfere with crime prevention or legal duties.
8) Enforcement & repeat offenders
- Permanent bans for CSAM or grooming.
- Device/identifier bans may be applied subject to law and platform rules.
- Appeals: Narrow appeal path for non-CSAM safety actions where legally appropriate. No appeal for confirmed CSAM.
9) Staff training & accountability
Safety, moderation, and support teams receive regular training on child-safety recognition, trauma-informed practices, and legal escalation. We run post-incident reviews and update tools, keywords, and workflows accordingly.
10) Your responsibilities
- Do not use Mad2Moi if you are under 18.
- Do not post, request, share, or search for any content sexualizing minors.
- Report anything suspicious using in-app tools or by emailing [email protected].
11) Contact
Child Safety Team
Email: [email protected]
We aim to acknowledge P0 reports immediately and other child-safety reports within 24 hours.
12) Updates to this policy
We may update this page to reflect legal or operational changes. Material changes will be announced in-app and on our website. The latest version is always available at this URL.