Child Safety & CSAM Prevention Standards

Our zero‑tolerance approach and safeguards to protect minors across AnglerIn

Last updated: 10/16/2025

AnglerIn maintains a zero‑tolerance policy toward child sexual abuse material (CSAM) and any sexual exploitation of children. These externally published standards explain how we prevent, detect, report, and remove CSAM and how users can contact us. This page is intended to satisfy Google Play’s child safety standards requirement and to clearly communicate our practices to users, partners, and authorities.

Scope

  • Applies to all AnglerIn products: websites, mobile apps, APIs, and community features.
  • Covers user‑generated content (UGC), including images, videos, text, profiles, messages, comments, and listings.

Prohibited Content & Conduct

The following are strictly forbidden and will result in immediate account action and escalation to authorities:

  • Any CSAM: imagery, video, audio, text, or links depicting or describing sexual exploitation of a minor.
  • Sexualization of minors, grooming, solicitation, or encouragement of exploitation/trafficking.
  • Attempts to share, request, trade, sell, or distribute CSAM or tools to obtain it.
  • Impersonation of a minor for sexual purposes or attempts to contact a minor with sexual intent.

Prevention & Moderation

  • Age‑appropriate design: AnglerIn is intended for users aged 13 and older (or the higher minimum age required by applicable local law). We do not allow accounts for children below the applicable age.
  • On‑platform safeguards: Default privacy settings for new accounts, restricted messaging for suspicious activity, and block/report tools on posts, profiles, and messages.
  • Automated detection: Machine‑assisted signals and industry hash‑matching (where supported) to flag suspected CSAM and harmful behavior; a simplified illustration of hash‑matching follows this list.
  • Human review: A trained Trust & Safety team reviews escalations and takes enforcement actions promptly.
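
For readers unfamiliar with the term, the sketch below shows the general idea behind hash‑matching in a deliberately simplified form: the digest of an uploaded file is compared against a list of hashes supplied by child‑protection organizations, and a match is escalated rather than published. It is illustrative only and not AnglerIn's actual pipeline; the function names and KNOWN_HASHES placeholder are assumptions, and a cryptographic digest stands in for the perceptual (e.g., PhotoDNA‑style) hashes used in real industry matching.

    import hashlib

    # Illustrative sketch only — not AnglerIn's production pipeline. Industry
    # matching typically relies on perceptual hashes supplied by recognized
    # child-protection organizations; a cryptographic digest is used here
    # purely to keep the example short. KNOWN_HASHES is a hypothetical placeholder.
    KNOWN_HASHES: set = set()

    def digest_of_file(path: str) -> str:
        """Compute a SHA-256 digest of an uploaded file, streaming in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def should_escalate(path: str) -> bool:
        """Return True when an upload matches a known hash and must go to human review."""
        return digest_of_file(path) in KNOWN_HASHES

In practice, a match feeds into the response process described below: access to the content is disabled, evidence is preserved, and a report is filed where legally required, rather than relying on automated removal alone.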

Reporting CSAM or Child Safety Concerns

Report content or accounts via the in‑app Report option on posts, profiles, and messages, or email us directly.

  • Contact: support@anglerin.com
  • What to include: Links, usernames, screenshots, and a brief description (do not download or share illegal content).

Our Response Process

  • Immediate action: We disable access to suspected CSAM upon detection or report and preserve evidence securely.
  • Review time: Triage begins as soon as we receive a report, with priority handling for CSAM (target: within hours).
  • Mandatory reporting: Where legally required (e.g., via NCMEC in the U.S. or relevant local authority), we file reports and cooperate with law enforcement.
  • Account enforcement: Permanent bans for CSAM; additional device/IP measures may be applied.

Safeguarding Minors

  • We prohibit the public display of minors' personal contact details and encourage privacy‑preserving settings.
  • We limit features that could expose minors to unsolicited contact; users can block and restrict others at any time.
  • We provide educational tips in‑app about staying safe online and recognizing grooming behavior.

Data Handling & Retention

  • Evidence of violations is retained as required to meet legal obligations and to support investigations.
  • Otherwise, we minimize data collection and retain personal information only as long as necessary, in accordance with our Privacy Policy.

Appeals

If you believe your content or account was removed in error (and the matter does not involve CSAM), contact support@anglerin.com from the email address registered to your account to request a review.

Law‑Enforcement & NGO Cooperation

We respond to valid requests from law enforcement and collaborate with recognized child‑protection organizations, consistent with applicable law and user privacy.

Contact

Designated point of contact for CSAM prevention practices and compliance:

Email: support@anglerin.com

For details on general data practices, please see our Privacy Policy.