Combatting CSAM

HornyWildGirls fights the creation and distribution of child sexual abuse material (CSAM).

HornyWildGirls is building the safest digital media platform in the world. We do not tolerate CSAM on our platform, and we actively work to block it. The creation or distribution of CSAM is immoral, illegal, and a violation of our Terms of Service and our Acceptable Use Policy.

We have a dedicated team of people who work around the clock to prevent and swiftly remove any suspected CSAM from our platform.

What is CSAM?

  • CSAM is any image or video of sexually explicit conduct, including nudity, involving a person under 18 years old. These images amount to child sexual abuse and exploitation.

How often does CSAM appear on HornyWildGirls?

  • HornyWildGirls aggressively targets and reports people who create, or try to use our platform to distribute, CSAM.
  • Incidents of suspected CSAM make up less than 0.0002% of all content submitted by creators for posting on HornyWildGirls. We report all suspected incidents of CSAM to NCMEC. Each suspected image or video is blocked and removed from the platform while we investigate it. Many of those suspected images turn out not to be CSAM, and many are blocked before they ever appear on HornyWildGirls.

How does HornyWildGirls identify CSAM on its platform?

  • We continuously scan our platform to prevent the posting of CSAM. All our content moderators are trained to identify and swiftly report any suspected CSAM.
  • Before content can appear on a newsfeed, we inspect it with state-of-the-art digital technologies to check whether the content is allowed on the platform. All content that passes this initial review is then manually reviewed by our trained human moderators within 24 hours. Our trained moderators identify, and immediately escalate, any content which they suspect may be CSAM.

What is HornyWildGirls looking for when trying to identify CSAM?

  • Before any content can appear on a creator’s newsfeed, we compare it against databases and tools used by law enforcement to prevent the distribution of known CSAM (a simplified sketch of this kind of hash matching appears after this list).
  • CSAM can be harder to identify if it is “new” CSAM, meaning CSAM that is not already part of the databases and tools used by law enforcement. So we closely inspect images, text, and sound files to look for possible “new” CSAM. Our technology and our people work together to report suspected CSAM which has not previously been identified. When confirmed, we pass this information to law enforcement and non-governmental organizations to help identify the perpetrators.
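
HornyWildGirls does not publish the details of its matching technology, but screening uploads against curated hash lists is the standard industry approach. The sketch below is a minimal, hypothetical Python illustration of the idea: an upload's hash is checked against a placeholder list of known-bad hashes, and anything that passes is queued for human review. Real deployments use perceptual hashes (such as PhotoDNA) distributed by NCMEC and law-enforcement partners, which also match re-encoded or lightly altered copies; the use of SHA-256 and every name and value below are assumptions made for illustration only.

    import hashlib

    # Placeholder standing in for a database of hashes of known CSAM, such
    # as those distributed to platforms by NCMEC. Real systems use perceptual
    # hashes (e.g. PhotoDNA), which also match altered copies; a plain
    # SHA-256 matches exact bytes only. The value below is a dummy entry.
    KNOWN_BAD_HASHES: set[str] = {"0" * 64}

    def sha256_of_file(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def screen_upload(path: str) -> str:
        """Block an upload whose hash matches the known-material list;
        everything else is queued for human review before it can appear."""
        if sha256_of_file(path) in KNOWN_BAD_HASHES:
            return "blocked_and_reported"   # removed and reported to NCMEC
        return "queued_for_human_review"    # manual review within 24 hours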

What happens when HornyWildGirls finds suspected CSAM on its platform?

  • When we identify suspected CSAM, we immediately remove it and make a “CyberTipline” report to the National Center for Missing & Exploited Children (NCMEC). You can find the number of reports made to NCMEC by HornyWildGirls and other digital media companies here.
  • NCMEC reviews these reports and shares them with the appropriate law enforcement around the world so they can combat CSAM. We work closely with law enforcement to investigate, prosecute, and punish any person who tries to use our platform to create or distribute CSAM.
  • We immediately investigate any user who tries to share suspected CSAM on HornyWildGirls and we take appropriate action. We ban anyone who tries to create or distribute CSAM on HornyWildGirls.

How can HornyWildGirls tell whether a direct message or other private post contains CSAM? Are these posts encrypted?

  • HornyWildGirls does not use end-to-end encryption. Everything on the site is visible to our team of trained reviewers. There are no “hidden” posts, “secret” areas, or disappearing messages on HornyWildGirls.
  • We can review and remove any image or video shared on HornyWildGirls at any time, including in all direct messages.

Does HornyWildGirls’ subscription model enable the distribution of CSAM?

  • No. Our subscription model makes it more difficult for people to create and distribute CSAM.
  • To subscribe to content, or to post content on HornyWildGirls, users need to pass our strict identity verification checks. This means that, unlike on many other digital media platforms, we know the legal identity of all our users. No one can post anonymously on HornyWildGirls.
  • Because our users are not anonymous, and because we do not use end-to-end encryption, HornyWildGirls users are less likely to try to create or distribute CSAM. If any user does try to create or distribute CSAM on our platform, we know who they are, we report them, and we ban them from HornyWildGirls.

How do I report suspected CSAM?

  • Each post and account on HornyWildGirls has a report button. If you see any content on HornyWildGirls which you suspect could be CSAM, please immediately click the report button or tell us what you saw by emailing support.

How can I trust that HornyWildGirls takes this issue seriously?

  • HornyWildGirls is committed to building the safest social media platform in the world. We take responsibility for our actions, and we regularly publish data which demonstrates the steps we are taking. This data is published in our Transparency Reports, which you can find here.
  • We have also gone a step further and put in place an independent third-party monitor to review the processes we use to fight the creation and distribution of CSAM.

What else do you do to prevent the creation or distribution of CSAM?

  • We work closely with governments, regulators, law enforcement, non-governmental organizations, charities, and other companies in the fight against CSAM.
  • We receive and act on intelligence from our safety partners about emerging trends relating to child safety and offender prevention. We also work proactively with researchers and analysts to identify the tactics, techniques, and procedures used by bad actors who create and distribute CSAM online, so that we can stop them from succeeding.
  • As well as working closely with NCMEC to report any instances of CSAM on our platform through NCMEC's CyberTipline, we are also part of NCMEC's Take It Down initiative. Take It Down is a support service for individuals under the age of 18 who are worried about self-generated explicit content (which is considered CSAM) being shared online. Individuals can use the tool to 'hash' an image they are concerned may be shared and prevent it from being posted online; a simplified sketch of that hashing step follows below. You can learn more at Take It Down (ncmec.org). If you would like to know more about our fight against the creation and distribution of CSAM, please email us.
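
The key design point of Take It Down is that the image itself never leaves the person's device; only a one-way fingerprint (a hash) is submitted, and participating platforms block uploads that match it. The short Python sketch below illustrates that idea only; the actual service uses its own hashing scheme, and the file name and use of SHA-256 here are assumptions made for illustration.

    import hashlib

    # Illustration of the Take It Down idea: the image is hashed locally,
    # and only this one-way fingerprint ever leaves the device. The real
    # service uses its own hashing scheme; SHA-256 is an assumption here.
    with open("photo.jpg", "rb") as f:   # hypothetical local file
        fingerprint = hashlib.sha256(f.read()).hexdigest()

    # The digest, not the image, is what would be submitted.
    print(fingerprint)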