About CLARA Standards

Independent. Research-backed. Built by people who know these systems from the inside.

CLARA Standards was founded on a simple premise: children's digital safety requires independent voices — people who understand how these platforms actually work, who don't depend on tech industry funding, and who can speak plainly about what's broken and what can fix it.

CLARA Standards (nonprofit, EIN 41-4174040) and CLARA PBC (public benefit corporation, EIN 41-3927635) are structurally separate entities, incorporated in New Jersey in 2026.

Anneke Buffone, PhD
Founder

Anneke is a behavioral and computational social scientist. Before founding CLARA Standards, she spent nearly seven years at Meta, with her final two years focused on cross-platform age assurance research across Instagram, Facebook, and WhatsApp. That experience showed her that technical and behavioral influences on child safety must be studied together — and that the work must be done independently of the industry it evaluates.


Courtney Froehlig, PhD
Policy Lead

Courtney built TikTok's product risk models — the systems that evaluated safety exposure across features before launch. She led the policy assessments that translated behavioral risk into engineering requirements. Her deep knowledge of where platform safety policy breaks down is exactly the kind of inside experience that independent evaluation needs.

Zack Prager
Co-Founder, Growth Engineer

Zack leads all technical development for CLARA Standards, including the Digital Safety Companion app, platform scoring infrastructure, and the data systems underlying CLARA Standards' research work. He holds a Master of Applied Positive Psychology (MAPP) from the University of Pennsylvania — an unusual pairing of engineering and psychological science that shapes how CLARA Standards' tools are built.

Current Work & Affiliations

Where we're active now.

Harm Taxonomy Development
Collaborative research with USC Neely Center, Stanford Social Media Lab, and NYU Stern (Jonathan Haidt's group) on developing a comprehensive harm taxonomy for children's digital experiences.
Active
Youth AI Usage Research
One grant submitted and two major grants in progress with Dr. Lyle Ungar (UPenn Psychology) on youth AI usage patterns and their safety implications for families and educators. Collaborator: Stanford Social Media Lab.
In Progress
Parent Seminars on Social Media & AI Safety
Two seminars planned for parents of middle and high school students, covering social media safety, AI safety, and parental controls.
Active
Future Proofing Human Flourishing Advisory Board
Anneke Buffone serves as an advisory board member for Innovate Edu's Future Proofing Human Flourishing initiative.
Active
California Privacy & Consumer Protection Committee
Testified before the California Assembly Privacy & Consumer Protection Committee on age assurance evidence, circumvention research, and the effectiveness of parental controls. Watch the hearing (Anneke's testimony begins at 1:11:00)
Completed — March 2026
All Tech Is Human — Parental Controls Podcast
A conversation with David Polgar and Vaishnavi J on parental controls research and the limits of current safety systems. Watch on YouTube
Published
Integrity Institute Member
Anneke Buffone is a member of the Integrity Institute, a peer community of integrity professionals from 80+ major technology companies across 23 countries — engineers, researchers, and policy experts working toward a digital world that helps people, societies, and democracies thrive.
Active
Montclair Digital Life Committee
Contributing to digital safety policy and education at the municipal level.
Ongoing

Collaborators & Affiliations

Lyle Ungar, PhD
Professor of Computer and Information Science, University of Pennsylvania
Dunigan Folk
Researcher, UPenn Psychology
Stanford Social Media Lab
Anneke Buffone, affiliated researcher
USC Neely Center
Collaborative research on harm taxonomy and platform safety
Future Proofing Human Flourishing Advisory Board
Innovate Edu NYC

Independence & Ethics

A firewall between research and revenue.

CLARA Standards operates as two separate legal entities: a 501(c)(3) nonprofit that conducts all research, scoring, and advocacy, and a public benefit corporation (PBC) that builds consumer products like the Reclaim & Rewire mentorship app. The nonprofit's research can never be edited, delayed, or suppressed by the PBC — or by any funder, partner, or platform we evaluate.

Our Ethical Research Constitution makes this binding: we pre-register studies before collecting data, open-source our methods, publish all findings regardless of who they implicate, and disclose every conflict of interest. No tech company funds our work. No funder gets to see results before the public does. If we find something uncomfortable, we publish it anyway.

Read the full constitution →