ForumGuard reviews user posts for potential safeguarding risks and explains the reasoning behind each flag — helping you build proportionate systems, protect users, and maintain auditable records in line with the Online Safety Act.
Detects signals linked to self-harm, abuse, grooming, and mental-health distress, so moderators can intervene early and proportionately.
Highlights risk factors referenced in the Online Safety Act (e.g. hate speech, terrorism, illegal activity) with explainable, audit-friendly flags.
See which harms affect your community most, track volumes over time, and export reports for stakeholders and audits.
ForumGuard assists moderation and does not provide legal advice.
A quick call to understand your community, moderation setup, and goals. No commitment.
We work with your community or technical team to connect via RSS or API, and we ingest only the data needed for triage.
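As a rough illustration of the "only what's needed to triage" principle, the sketch below parses a standard RSS 2.0 feed and keeps just three fields per post. The feed shape and field choices here are illustrative assumptions, not ForumGuard's actual integration contract.

```python
# Minimal sketch (illustrative, not the real ForumGuard pipeline):
# pull posts from a forum's RSS feed and keep only the fields
# needed for triage and an auditable record.
import xml.etree.ElementTree as ET

def extract_triage_items(rss_xml: str) -> list[dict]:
    """Parse RSS XML and return only minimal triage fields per post."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        items.append({
            "id": item.findtext("guid"),           # stable reference for the audit trail
            "text": item.findtext("description"),  # post body to screen for risk signals
            "published": item.findtext("pubDate"), # when the post appeared
        })
    return items

# Tiny inline feed to show the shape of the output:
sample = """<rss version="2.0"><channel>
<item><guid>post-1</guid><description>Hello forum</description><pubDate>Mon, 01 Jan 2024 00:00:00 GMT</pubDate></item>
</channel></rss>"""
print(extract_triage_items(sample))
```

Dropping usernames, emails, and other profile data at ingestion keeps the pipeline proportionate: the moderation system sees the content it must assess, and nothing more.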
Get up and running quickly, with immediate assurance and a clear evidence trail for reviewers and audits.
Tell us about your forum and safeguarding goals. We’ll show how ForumGuard can support you.