Child Safety at Adrenaline Realm

A plain-English summary of how we keep child exploitation material off the platform, what we automate, and what we won't ever do. The full legal version lives in Section 8 of the Privacy Policy.

Why this exists

Adrenaline Realm is built for car enthusiasts. But once you let strangers upload images on the internet, you inherit a responsibility you didn't ask for: making sure no part of your platform becomes a place where children are harmed.

We take that responsibility seriously. Every major platform — Meta, Snap, TikTok, Discord — has tools in place to detect child sexual abuse material (CSAM). Federal law (18 U.S.C. § 2258A) requires online platforms to report this material to the authorities when they find it. We follow the same standards.

What we do

An automatic scan at upload

When you post a photo or video, our backend runs an industry-standard scan on the file. The scan creates a one-way "fingerprint" of the image and compares it against established databases of fingerprints of known child exploitation material maintained by child safety organizations.

A few important details:

  • The fingerprint is one-way. It cannot be turned back into your image — it's a hash, not a copy.
  • The scan happens inside our own infrastructure. Your image is not sent to any third party as part of the scan.
  • The scan only looks for known material. It does not classify nudity, violence, drugs, or anything else — only specific fingerprints of content already documented in child safety databases.

For nearly every upload, the scan finishes in seconds with no match and your post publishes normally. You won't notice it.
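
For readers who want a concrete picture of the hash-and-compare step, here is a minimal sketch in Python. Every name in it (KNOWN_FINGERPRINTS, fingerprint, scan_upload) is invented for illustration, and a real deployment uses perceptual hashes, which still match after resizing or re-encoding, rather than the plain SHA-256 shown here.

    import hashlib

    # Illustrative stand-in for the fingerprint databases maintained by
    # child safety organizations. Real systems use perceptual hashes.
    KNOWN_FINGERPRINTS: set[str] = set()

    def fingerprint(file_bytes: bytes) -> str:
        """One-way fingerprint: it cannot be turned back into the image."""
        return hashlib.sha256(file_bytes).hexdigest()

    def scan_upload(file_bytes: bytes) -> bool:
        """True only if the file matches known material; nothing else is classified."""
        return fingerprint(file_bytes) in KNOWN_FINGERPRINTS

The sketch makes the same point as the bullets above: the only question the scan can answer is whether a fingerprint already appears in the database.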

What happens if something matches

If the scan flags a match, the following happens automatically:

  • The content is removed from public view immediately. No one else on the platform can see it.
  • The file is moved to a secure, isolated storage system that nothing public can reach. It stays there as legally required evidence.
  • Our human safety team is alerted to review it.

A trained member of our team then reviews the metadata (not the image directly) to verify the match is real. If it is, we report it to the federally designated reporting authority — in the U.S., NCMEC's CyberTipline — as the law requires. NCMEC forwards verified reports to law enforcement.
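
To make the ordering concrete, here is a hedged sketch of that pipeline in Python. Upload, SafetyQueue, and handle_match are hypothetical names, not our actual code; what matters is the sequence, and that no step notifies the uploader.

    from dataclasses import dataclass, field

    @dataclass
    class Upload:
        id: str
        public: bool = True
        quarantined: bool = False

    @dataclass
    class SafetyQueue:
        pending_review: list[str] = field(default_factory=list)

    def handle_match(upload: Upload, queue: SafetyQueue) -> None:
        """Automatic steps on a confirmed hash match (illustrative only)."""
        upload.public = False                    # removed from public view at once
        upload.quarantined = True                # held in isolated evidence storage
        queue.pending_review.append(upload.id)   # human review before any report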

Why we don't tell the uploader

If your upload is flagged, we will not notify you, your followers, or anyone else. There's no "Your post was reported" notification. There's no error message.

This isn't to be unfriendly — it's the law and it's the right thing to do. Notifying someone whose content matched a known CSAM hash would tip off a potential offender and let them destroy evidence, delete other accounts, or harm a child further. Every major platform handles it the same way.

What we will never do

  • Scan private DMs for anything unrelated to safety. The hash scan that runs at upload is the entire scope of automated content checks. We do not read your messages.
  • Share your data with third parties for marketing or advertising. Not now, not later. The only data sharing we do in this area is the legally required report to the designated child safety reporting authority when there is a confirmed match.
  • Hand over data voluntarily to law enforcement. Outside of the narrow reporting requirement above, we only respond to lawful, properly served legal process (subpoenas, court orders, search warrants). Each request gets legal review on its own merits.
  • Let automation make the final call. Every match that may result in an external report is reviewed by a human on our team first. We do not auto-submit reports to authorities.

If you see something on Adrenaline Realm

If you encounter content on the platform that you believe sexually exploits a child:

  1. Do not screenshot, save, share, or download it. Even with good intentions, possessing this material is itself illegal in most jurisdictions. Don't put yourself at risk.
  2. Report it in-app immediately. Tap the three-dot menu (⋯) on the post or comment, choose Report, and select "Child safety" as the reason. Submit. Our safety team reviews these reports as the highest priority.
  3. You can also report directly to the authorities. The U.S. national tip line for reporting child exploitation online is the CyberTipline at report.cybertip.org. It is operated by the National Center for Missing & Exploited Children (NCMEC).
  4. If a child is in immediate danger: Call 911 (or your local emergency number) before doing anything else. Emergency services move faster than any tip line.

If you or someone you know is a survivor

If you or someone close to you has been affected by child sexual exploitation or abuse, organizations such as the National Center for Missing & Exploited Children (NCMEC, missingkids.org) exist to help.

Where to learn more

If you have a question this page doesn't answer, email safety@adrenalinerealm.com directly. We read every message.