Universities, K-12 districts, EdTech vendors, and online learning platforms run on student data, classroom AI tools, and a long tail of third-party software. FERPA, COPPA, GDPR, and DPDP minor-data rules turn every plugin and every vendor into an audit obligation. Safeguard makes that obligation a live query.
Regulator, parent, and operational expectations are collapsing into one continuous student-data evidence requirement.
US student records are governed by FERPA, under-13 platforms by COPPA, and the rest of the world by GDPR and DPDP minor-data rules. Each of these expects continuous, auditable evidence — not a one-page policy PDF refreshed at accreditation time.
Teachers, students, and administrators have all adopted AI tooling. Emerging US state-level and EU AI Act rules now treat classroom AI as a regulated category. An institution without an AI-BOM and policy enforcement is exposed by default.
Grant-funded research data — health, genetics, NIH-flagged, EU Horizon — has residency and disclosure obligations that follow the data, not the institution. Preprint servers and shared notebooks routinely leak the very data the grant office promised to control.
Student information systems and EdTech vendors are now a regular ransomware target. Districts and universities are publicly named when SIS, LMS, or assessment vendors are compromised — and parents read those headlines first.
Lino runs locally on faculty and IT laptops. Code review, syllabus drafting, and lesson-plan agents never send student data to a third-party cloud. The institution keeps custody of FERPA-scoped records by construction.
Every classroom AI tool — chatbot, grader, tutor agent — gets an AI-BOM with model SHA, training-data scope, and policy attestation. Parents, regulators, and accreditors can be answered with a query, not a memo.
Districts, K-12 schools, and university units operate under different rules. Policy is scoped per institution and per program — COPPA for K-5, FERPA for higher ed, EU GDPR for international campuses — without one global toggle.
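Per-scope policy like this can be sketched as a small lookup keyed on institution and program. Everything below — the `PolicyScope` shape, the scope names, the regime labels — is an illustrative assumption, not Safeguard's actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch: policy is resolved per institution/program pair,
# never through one global toggle. All names here are illustrative.

@dataclass(frozen=True)
class PolicyScope:
    institution: str
    program: str
    regimes: frozenset  # e.g. {"COPPA", "FERPA"} for K-5, {"GDPR"} for an EU campus

SCOPES = [
    PolicyScope("riverdale-usd", "k5", frozenset({"COPPA", "FERPA"})),
    PolicyScope("riverdale-usd", "6-12", frozenset({"FERPA"})),
    PolicyScope("state-university", "undergrad", frozenset({"FERPA"})),
    PolicyScope("state-university", "eu-campus", frozenset({"GDPR"})),
]

def regimes_for(institution: str, program: str) -> frozenset:
    """Resolve which regulatory regimes apply to one institution/program pair."""
    for scope in SCOPES:
        if (scope.institution, scope.program) == (institution, program):
            return scope.regimes
    raise KeyError(f"no policy scope registered for {institution}/{program}")
```

The point of the shape: a K-5 program and an EU campus under the same umbrella resolve to different regimes without either one's setting bleeding into the other.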
Most districts rely on the same five or six EdTech vendors. A single transitive dependency in an LMS plugin, an SIS connector, or an assessment runtime cascades across hundreds of institutions before procurement even notices.
Pre-mapped control narratives and evidence in the formats your accreditor, district counsel, and parent body already expect.
Per-institution control plane, classroom-AI MCP-server allowlisting, vendor trust packets for procurement, and a parent-facing audit log export.
Each district, university, or K-12 unit gets a logically isolated control plane. No cross-tenant queries, no shared key material, no leakage between institutions sharing the same EdTech vendor.
Classroom AI tools register as MCP servers behind a policy gate. Capability scoping, prompt audit, and per-grade-level allowlists are enforced before any student-facing prompt is dispatched.
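A policy gate of that shape reduces to two checks before dispatch: is the server allowlisted for this grade band, and does it request only permitted capabilities? The sketch below is a hypothetical reconstruction — the server names, grade bands, and capability labels are assumptions, not Safeguard's MCP gate.

```python
# Hypothetical sketch of a per-grade-level MCP allowlist gate.
# Server names, bands, and capability labels are illustrative assumptions.

GRADE_ALLOWLIST = {
    "k5": {"reading-tutor"},
    "6-8": {"reading-tutor", "math-grader"},
    "9-12": {"reading-tutor", "math-grader", "code-assistant"},
}

# Capabilities any classroom tool may hold; raw student-record access is never on the list.
PERMITTED_CAPS = {"prompt", "read_assignment"}

def dispatch_allowed(server: str, grade_band: str, capabilities: set) -> bool:
    """Deny unless the server is allowlisted for the band and requests
    no capability beyond what classroom tools are permitted."""
    if server not in GRADE_ALLOWLIST.get(grade_band, set()):
        return False
    return capabilities <= PERMITTED_CAPS
```

Deny-by-default is the design choice: an unknown server, an unknown grade band, or an over-broad capability request all fall through to `False` before any student-facing prompt leaves the gate.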
Every EdTech vendor onboarded by the institution gets a signed trust packet — SBOM, AI-BOM, residency, FERPA / COPPA posture. Procurement runs a query, not a 90-day questionnaire.
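A procurement query over a trust packet can be sketched as a predicate over its fields. The packet layout, field names, and the digest standing in for a real signature below are all assumptions for illustration, not Safeguard's packet format.

```python
import hashlib
import json

# Hypothetical trust-packet check. Field names ("sbom", "ai_bom",
# "attested_regimes") and the digest scheme are illustrative assumptions.

def packet_digest(packet: dict) -> str:
    """Canonical digest over the packet body, standing in for a real signature."""
    body = json.dumps(packet, sort_keys=True).encode()
    return hashlib.sha256(body).hexdigest()

def passes_procurement(packet: dict, required_regimes: set) -> bool:
    """The '90-day questionnaire' collapsed to one predicate: the packet
    must carry an SBOM, an AI-BOM, and attestations covering every
    regime the institution requires."""
    return (packet.get("sbom") is not None
            and packet.get("ai_bom") is not None
            and required_regimes <= set(packet.get("attested_regimes", [])))
```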
Read-only export endpoint for the records a parent or guardian is entitled to see. FERPA disclosure, COPPA opt-in lineage, and AI-tool usage history — generated on demand, signed, and retained.
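An on-demand, signed export of that kind can be sketched with an HMAC over a canonical record body. The record fields and the HMAC scheme are illustrative assumptions, not Safeguard's wire format.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Hypothetical parent-facing export. Field names and the HMAC signing
# scheme are assumptions chosen for the sketch.

def build_export(student_id: str, records: list, key: bytes) -> dict:
    """Assemble the disclosure body and sign it so the recipient can
    verify it was not altered after generation."""
    body = {
        "student_id": student_id,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        # e.g. FERPA disclosures, COPPA opt-in lineage, AI-tool usage history
        "records": records,
    }
    payload = json.dumps(body, sort_keys=True).encode()
    return {"body": body,
            "signature": hmac.new(key, payload, hashlib.sha256).hexdigest()}

def verify_export(export: dict, key: bytes) -> bool:
    """Recompute the signature over the body; constant-time compare."""
    payload = json.dumps(export["body"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, export["signature"])
```

Signing at generation time is what makes "generated on demand, signed, and retained" auditable: any later edit to the retained copy fails verification.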
Students treat the classroom AI as an adversarial sandbox. Prompt injection through worksheets, shared assignments, and uploaded files can bypass guardrails and exfiltrate teacher or student data.
A handful of vendors run the SIS, LMS, and assessment layer for thousands of districts. One compromised dependency in any of them cascades through the entire EdTech supply chain.
Grant-funded research datasets leak through preprint servers, public notebooks, and unvetted plugin connectors. NIH and EU Horizon obligations live with the data, not the institution.
The SIS is the highest-value target on campus. Attendance, grades, IEPs, financial-aid records — a single ransomware event can shut down a district and end up on the front page.
Numbers from institutional deployments. Same accreditor, same EdTech stack, dramatically less spreadsheet.
| Metric | Before Safeguard | With Safeguard |
|---|---|---|
| FERPA audit prep | 6 weeks | 1 day |
| EdTech vendor monitoring | Quarterly | Continuous |
| Classroom AI governance audit prep | 3 weeks | 1 hour |
| Tooling footprint | 5 vendors | 1 (free tier for accredited institutions) |
| Alert noise | ~75% | ~5% |
| Vendor questionnaire turnaround | 10 days | 4 hours |
| Parent disclosure prep | Manual | Automated |
Talk to the team about FERPA evidence pipelines, COPPA guardrails for classroom AI, and a deployment shape that lives inside your institution's perimeter.