Safeguarding and safety
Professional AI designed for child protection, not entertainment
The crisis is here: Young people are turning to AI chatbots
New research reveals the urgent need for safe, monitored AI support in schools.
40% already using AI for support
Almost 4 in 10 young people aged 11-18 have turned to AI chatbots for advice, support or companionship. (OnSide Youth Charity, 5,035 respondents, November 2025)
19% find it easier than real people
Nearly 1 in 5 young people say it is easier to talk to AI than a real person. 9% feel embarrassed talking to adults. 6% have no one else to talk to.
76% exposed to harmful content
Three quarters of young people have been exposed to upsetting content online - a 7% increase from 2024. Fake news (+4%), hate speech (+4%), and sexual content (+6%) are all rising.
Professional tool for safeguarding support
Quinly is not a consumer app adapted for schools. It is a professional tool designed from day one to support your safeguarding work with AI-powered crisis detection and UK compliance.
Constitutional AI safety
Every response filtered through Claude 4 Sonnet's Constitutional AI, specifically designed to protect vulnerable children, not maximise engagement.
Crisis detection and signposting
Quinly detects 30 crisis categories. It suggests trusted adults first, then signposts UK support services including Childline, Samaritans, CEOP, the Revenge Porn Helpline, and NHS mental health services.
Zero data retention
Stateless architecture means no conversation history and no personal data stored. Just anonymous aggregate analytics for your Designated Safeguarding Lead (DSL) dashboard.
Not a substitute friend or DSL
Quinly opens with "I'm not a real person." It is a calm first voice that helps a child put feelings into words and points them towards a trusted human, never a replacement for a counsellor, friend or Designated Safeguarding Lead.
Soft break reminder
After a sustained period of conversation (15 minutes by default, configurable per school for younger pupils), Quinly gently suggests a short break and a chat with a trusted adult. The conversation is never interrupted or ended.
Crisis detection and response
Real-time crisis identification with immediate professional signposting
30 crisis categories
AI & Deepfake harm, suicide and self-harm, sexual abuse, county lines, trafficking, cyberbullying, eating disorders, domestic abuse, and more.
1,000+ trigger phrases
Covering British colloquialisms and youth slang. Context-aware to prevent false positives whilst catching genuine crises.
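To make the idea of context-aware phrase matching concrete, here is a hedged sketch. The trigger phrases, categories, and the reported-speech guard below are invented for illustration; they are not Quinly's real trigger list or detection logic.

```python
import re

# Tiny illustrative trigger list (assumption, not the real 1,000+ phrases).
TRIGGERS = {
    "self_harm": ["hurt myself", "end it all"],
    "bullying": ["they keep picking on me"],
}

# One simple context guard: strip quoted/reported speech such as
# 'in the film someone says "..."', a common source of false positives.
REPORTED_SPEECH = re.compile(r'"[^"]*"')


def detect_categories(message: str) -> set[str]:
    cleaned = REPORTED_SPEECH.sub(" ", message.lower())
    return {category
            for category, phrases in TRIGGERS.items()
            if any(phrase in cleaned for phrase in phrases)}
```

A production system would layer many such guards (negation, slang variants, surrounding sentence context) on top of the raw phrase match, which is what "context-aware" means in practice.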
UK service signposting
Provides information about Childline (0800 1111), Samaritans (116 123), CEOP, StayAlive app, and NHS mental health services when appropriate.
Anti-grooming architecture
Stateless design prevents harmful relationships from forming
Stateless design
Zero conversation history stored between sessions. Cannot build the persistent relationships that enable grooming.
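A stateless request cycle can be sketched as follows. The function names and the anonymous counter are assumptions for this example, not Quinly's code; the point is structural: each reply is computed from the incoming message alone, and the message is discarded afterwards.

```python
from collections import Counter

# Anonymous aggregates only: counts, never content or identifiers.
aggregate_counts: Counter[str] = Counter()


def generate_reply(message: str) -> str:
    # Stand-in for the model call. It sees only the current message;
    # there is no stored history to retrieve.
    return "I'm not a real person, but I'm listening."


def handle_message(message: str) -> str:
    reply = generate_reply(message)
    aggregate_counts["messages"] += 1  # no transcript, no child profile
    return reply                       # the message itself is discarded
```

Because nothing links one session to the next, there is no accumulated relationship state for a bad actor to exploit, which is the anti-grooming property the page describes.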
No personal data
Anonymous and aggregate analytics only. No child profiles, no behavioural tracking, no data to breach.
Privacy by design
100% UK DPA 2018 and Children's Code compliant. Zero data retention is not a feature; it is our architecture.
Professional oversight
Real-time dashboard for your safeguarding team
Real-time dashboard
Designated Safeguarding Leads see crisis patterns, sentiment trends, and emerging risks across the school community.
Severity scoring
CRITICAL, HIGH, MEDIUM, LOW classification helps DSLs triage concerns and allocate resources effectively.
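The four tiers named above lend themselves to a simple triage ordering. This sketch is illustrative only: the numeric values and the sort helper are assumptions, but they show how a dashboard could surface CRITICAL items first.

```python
from enum import IntEnum


class Severity(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4


def triage(concerns: list[tuple[str, Severity]]) -> list[tuple[str, Severity]]:
    # Most severe first, so a DSL sees CRITICAL concerns at the top.
    return sorted(concerns, key=lambda concern: concern[1], reverse=True)
```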
Anonymous aggregate data
Insights without identification. Your safeguarding team gets the intelligence they need whilst protecting pupil privacy.
Evidence-based safety
Real-world validation: Quinly has supported 3,289 child conversations (July 2025 to January 2026) with:
- Zero incidents of harmful content
- Zero grooming behaviours
- Zero inappropriate responses
- 100% appropriate crisis signposting to Childline, Samaritans, and UK support services
This is what child-first design looks like in practice.
Why don't DSLs see individual conversations?
It is the most common question we get from safeguarding leads, and the answer matters.
Anonymity is the unlock
Every UK and international study of child helplines (Childline, Kooth, The Mix) shows the same pattern. The moment a child suspects a named adult at their school might read their words, they stop telling the truth. Showing transcripts to DSLs would gut the very thing that makes Quinly work.
The law requires data minimisation
The UK Children's Code (Standard 7) and the DfE Generative AI Product Safety Standards (January 2026) both require that we collect and share only what is strictly necessary. The same safeguarding outcome can be reached with anonymous patterns, so individual transcripts are not lawful to share.
Quinly is a child's tool, not a staff tool
CPOMS and MyConcern are concern-recording systems used by adults observing children, and they rightly name names. Quinly is a private support tool used by the child themselves, in the same category as Childline. Childline does not email school DSLs after a call, and neither do we.
What DSLs do get: pattern intelligence that drives action
The dashboard gives you the school-wide picture you need to act, without ever exposing what an individual child typed.
Category trends over time
See, for example, that bullying disclosures are up 40% this month, or that sextortion mentions have appeared for the first time. That is the cue for an assembly, a staff briefing, or extra eyes on the corridors.
Time-of-day and day-of-week patterns
Sunday evening spikes on anxiety. Monday morning spikes on home-life concerns. Friday afternoon spikes on peer conflict. These patterns tell you when and where your safeguarding capacity is most needed.
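Patterns like these fall out of counting anonymous events by weekday and hour. A minimal sketch, assuming timestamps are the only input (no content, no identity); the names are illustrative, not Quinly's implementation.

```python
from collections import Counter
from datetime import datetime


def bucket_by_time(events: list[datetime]) -> Counter[tuple[int, int]]:
    """Count events per (weekday, hour) bucket.

    weekday() follows Python's convention: Monday=0 ... Sunday=6.
    Only the timestamp is used, so the output is inherently anonymous.
    """
    return Counter((event.weekday(), event.hour) for event in events)
```

A Sunday-evening spike on anxiety would show up as an elevated count in the (6, 18)-(6, 21) buckets relative to the rest of the week.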
High-severity volume tracking
Counts of CRITICAL-tier disclosures (suicide, self-harm, abuse, sextortion) over time, benchmarked against the national pilot baseline of 3,289 conversations across six schools.
Language and cohort breakdowns
See which of Quinly's five supported languages are being used, and which year groups are engaging most. Useful for inclusion planning, EAL support, and PSHE curriculum design.
How the individual safeguarding loop closes
For any child in crisis, Quinly does the thing that actually works. It tells the child, in plain language, to go and speak to their Designated Safeguarding Lead, gives them Childline's number, and tells them how to call 999 if they are in immediate danger. The child then walks into the DSL's office and has the disclosure conversation with a human being. That is the safeguarding pathway Ofsted and KCSIE expect.
Quinly is the bridge to the DSL, not a surveillance tool that hands them a file. The school gets the radar. The DSL has the conversation.