Legislators, researchers, schools, journalists, funders — if you need independent, research-backed, tech-experienced guidance on youth digital safety, we want to talk. Read our Ethical Research Constitution →
We provide legislative testimony, help draft laws, and give regulators the independent expertise they need to hold platforms accountable. If you're working on youth digital safety legislation, we can help you ask the right questions and make the right demands.
We partner with academic and research organizations on age assurance, parental controls, AI safety, harm taxonomies, and youth wellbeing. We bring inside-platform experience to research design — we know what the systems actually measure and where the data has historically been missing.
We work with school systems on AI use policies, digital safety curricula, and educator guidance grounded in behavioral evidence. We also partner with schools to pilot our teen mentorship program, in which older students teach younger ones practical digital safety skills.
Court cases, academic findings, and industry developments that affect children deserve clear, honest public explanation. Anneke is available for interviews, op-eds, and expert commentary on children's digital safety, AI, social media, and platform accountability.
CLARA Standards is seeking funding to complete the Digital Safety Companion app, scale the teen mentorship program, and expand our research and advocacy work. We accept funding from foundations, impact investors, and family safety funders. Technology company funding is accepted under strict independence protections. Full governance →