CLARA Standards

Hey Parents—If You're Lost, It's OK

Part 1 of 3

Have empathy for yourself.

If you feel like giving up, that's understandable.

If you already gave up—if your 12-year-old has a CoverStar or TikTok or an Instagram account they weren't supposed to have for another year—we're not here to judge you. We know the controls are often not working the way they're supposed to.

You might feel your own phone use is out of control. For some of us it's the news apps we cannot stop checking. For some of us it's phone games or online games. For some it's email. For some our vice is Instagram or Slack. Most of us have something we feel we can't quite control. In a 2022 Gallup survey, 58% of U.S. smartphone users said they use their phone too much. Pew Research Center found in 2025 that 37% of U.S. adults visit Facebook several times a day, and 33% say the same of YouTube. In Common Sense Media's 2019 survey, 61% of parents felt their teen was addicted to their mobile device. Research on “technoference” links heavy parent technology use with poorer parent-child interactions, and a later systematic review found that parents' use of technology in a child's presence was associated with poorer cognitive and psychosocial outcomes and higher screen time among young children. (McDaniel & Radesky, 2018; Toledo-Vargas et al., 2025)

So before you beat yourself up about your kid's screen time, it is worth looking honestly at your own relationship with your phone too. This isn't a guilt trip. It's a recognition that parents and kids are being pulled by many of the same systems.

You might feel like a failure because you signed the Wait Until 8th pledge and now your 12-year-old has a phone with Instagram on it. We get it. You're not alone.

And honestly? You never had a chance.

Not because you're uninformed—but because even serious researchers are not offering identical advice.

Jonathan Haidt argues that smartphones and social media are a major driver of worsening adolescent mental health, and that families, schools, and communities have to act collectively, not one by one. His four proposed norms are no smartphones before high school, no social media before 16, phone-free schools, and more independence, free play, and responsibility in the real world. Sonia Livingstone does not say phones are harmless. Her point is that the evidence is more mixed and more context-dependent than simple doom narratives suggest. In her work, what matters is the context of use: what children do online, with whom, and under what conditions. Her research has repeatedly shown that digital media often create opportunities and risks at the same time, which is why she emphasizes not only limits, but also safer design, stronger regulation, better defaults, age-appropriate protections, and children's rights to privacy and autonomy. In her framing, the phone can be both the problem and the solution. (LSE Research for the World; EU Kids Online comparative report; Children and Screens, 2025)

So the cleanest summary is not “Haidt says take away the phone and Livingstone says it's fine.” That's too crude. A fairer summary is this: Haidt emphasizes delay, collective restriction, and rebuilding offline childhood. Livingstone emphasizes context, proportionality, safer design, and protecting children's agency as well as their safety. They overlap more than people sometimes admit: both are worried about platform incentives, and both think adults and policymakers should do more than tell individual families to solve this alone.

They're probably both partly right. But that doesn't help you at 9 p.m. on a Tuesday when your daughter is crying because her friend group moved to a new app and she's on the outside, and you have no idea how to find or set the parental controls or whether you should forbid the app altogether.

Apps are not neutral tools.

Their business models reward attention and engagement, which creates a built-in tension with child safety. Researchers including Livingstone have documented how the pressure to compete for users' attention extends to children, and regulators are increasingly saying the same thing in policy language. In its child-safety rules, Ofcom said in 2025 that services using recommender systems must configure them to filter harmful content from children's feeds, alongside other protections. In March 2026, Ofcom added that 72% of children ages 8 to 12 were still accessing sites and apps with a minimum age of 13.

Parents then end up carrying a lot of the burden, but the tools they get are inconsistent and often underused. A 2025 Family Online Safety Institute survey found that 51% of parents use parental controls on tablets, 47% use them on smartphones, and 35% use them on video game consoles. That does not necessarily mean parents are careless. It may also mean the tools are fragmented, confusing, easy for kids to get around, or too time-consuming to manage well.

A critical, often-missed point is that parents are being asked to solve problems that are partly structural. Age checks remain weak. Safety tools vary widely by product. And the burden of figuring out what works often lands on families after the product is already in circulation.

The people who work on child safety at these companies are, by and large, people who care. Most are parents themselves. Most chose this work because they want kids to be safer online. This isn't a story about bad individuals.

But large platforms are structured so that any meaningful youth-safety proposal has to cross many teams — safety, growth, legal, engineering, product, international — each with different incentives and timelines. Getting all of them to align on a substantial change is hard. Getting any one of them to block it is easy. The proposals that survive this process are often the smallest, least disruptive versions of what was originally intended.

Then there's implementation. Any meaningful change to how kids are protected touches age assurance, growth, international policy, legal exposure, the creator ecosystem, ads, gaming, devices, video, and the feed. Each of those belongs to a different team with different priorities, and that is the coordination problem described above playing out at the level of a single feature.

And what comes out the other end often has no real teeth. The parental controls are rarely meaningfully tailored by cultural context, by product type, or by the difference between a 9-year-old and a 15-year-old. There's a kind of reasoning common in product development where a feature that's hard to reverse is treated as too risky to ship at all. A parental control that's hard to update as a child ages can be used as a reason not to build the control in the first place — even though the fix is simple: flag the account each time the child has a birthday and prompt the parent to review the settings. The reasoning isn't really technical. It's institutional risk aversion dressed up as product philosophy.
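For readers on the product side, it's worth showing just how small that birthday-review mechanism is. Here is a minimal sketch in Python — hypothetical function and parameter names, not any platform's real API — of the core check: has the child had a birthday since the last settings review?

```python
from datetime import date

def needs_settings_review(birthdate: date, last_review: date, today: date) -> bool:
    """Return True if the child has had a birthday since the last settings review.

    Illustrative only; a real system would also handle the Feb 29 edge case
    and time zones, and would fire a notification rather than return a bool.
    """
    # Find the child's most recent birthday on or before today.
    bday_this_year = birthdate.replace(year=today.year)
    if bday_this_year <= today:
        last_birthday = bday_this_year
    else:
        last_birthday = birthdate.replace(year=today.year - 1)
    # A review is due if none has happened since that birthday.
    return last_review < last_birthday

# Child born June 1, 2013; settings last reviewed January 15, 2025.
# By September 1, 2025 a birthday has passed, so a review prompt should fire.
print(needs_settings_review(date(2013, 6, 1), date(2025, 1, 15), date(2025, 9, 1)))  # prints True
```

The only mild subtlety is computing the most recent birthday; everything downstream of this check is a routine notification trigger of the kind these platforms already run at enormous scale.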

So what parents get is a one-size-fits-all tool that doesn't really fit anyone. Features built primarily to announce rather than to be used — controls that exist on paper but are too complex, fragmented, or unsupported to meaningfully change what kids experience. And setting up and actually monitoring all the available controls, on the device and in each app, is basically a part-time job.

None of this makes it acceptable. If the breakdown is structural, then expecting these companies to fix it from the inside alone is a fantasy. I had been ready to leave Meta for a while before I actually did — the structural problems I'm describing weren't things I could fix from inside, and the gap between what the research showed and what shipped had become untenable. What I took with me was a conviction that the most important work in youth digital safety has to happen outside the walls of these companies, with independent evaluation, independent standards, and independent accountability. That's part of what CLARA exists to build. Not because the people inside are bad. Not because one company is uniquely negligent — they all have versions of this problem. But because the most important work is structurally hard to do from inside, and the kids who get hurt in the gap between what's announced and what's actually deployed are real.

That's the world you're parenting in. The experts disagree. The tech companies are still shipping products faster than families can make sense of them. And you're left holding a phone you're not sure you should have bought, wondering if your kid will be damaged by having it or left out without it.

So let's start here: you were set up to fail.

Why parents cave.

Because here's what happened after you signed that pledge. Your kid switched schools. One friend got WhatsApp. Another got TikTok. And one by one, the dominoes fell. Your kid started begging. All. The. Time. “It's social suicide if you don't let me have one. And I can be more independent—I can call you after school. Please, momma.”

And the data backs up why you caved. Among parents whose child has a smartphone, Pew found in 2025 that 92% say staying in contact is a reason they allow one. The same report found that lower-income parents are more likely to say they allow a smartphone so their child doesn't feel left out: 37%, compared with 27% of middle-income parents and 23% of upper-income parents. That is the bind many families are in. The same device that creates risk can also solve practical coordination problems and reduce social exclusion. Or, in Livingstone's phrase, the phone can be both the problem and the solution.

Have empathy for your kid.

Even when they click on the pop-up that says their phone has been infected by 3,000 viruses and will self-destruct and sell their grandma—and they hand over your information because if it were true, wouldn't it be better to act fast? They didn't know that was a scam.

Their brains are still developing. According to the National Institute of Mental Health, the brain continues maturing into the mid-to-late 20s, and the prefrontal cortex—the region involved in planning, prioritizing, and good decisions—is among the last parts to fully mature. That's not an excuse. It's biology.

Their world is smaller than you think.

We've built a world where kids can't just ride a bike to a friend's house. They're in dance, soccer, and tutoring all day and all weekend. Between overscheduling and overprotection, we've engineered away a lot of the unstructured social time that earlier generations took for granted—and then we blame phones for filling the gap.

When your daughter says texting her friends or making videos on CoverStar is her lifeline, she might be telling you the truth. For some kids, the phone is not just entertainment; it is how friendship is maintained inside tightly managed lives. Livingstone's work has consistently argued that digital media can create opportunities and risks at the same time, and that for some children digital contact partly compensates when in-person opportunities are limited. (LSE Research for the World; EU Kids Online)

That doesn't make phones an uncomplicated good. It just means that many kids are telling the truth when they say the phone helps them stay connected. Pew found in 2025 that 74% of teens say social media makes them feel more connected to what is going on in their friends' lives, 63% say it gives them a place to show their creative side, and 52% say it makes them feel more accepted or supported. At the same time, 39% say it makes them feel overwhelmed by drama, 31% say it makes them feel excluded by friends, and 27% say it makes them feel worse about their own life. Digital connection can be socially useful and emotionally costly at the same time.

Kids can be bullied online—and they can also become the bully. Would you know?

Bullying no longer lives only in the cafeteria. The National Telecommunications and Information Administration notes that cyberbullying can happen through social media, online games, websites, instant messaging, chat rooms, text messages, and forums.

The best current numbers are higher than the dated statistics that still circulate in a lot of parent writing. Pew reported in 2022 that 46% of U.S. teens had been bullied or harassed online. The Cyberbullying Research Center's 2023 national sample found that about 55% of teens ages 13 to 17 reported lifetime cyberbullying victimization, and its 2025 sample found about 58%. The 2025 data also found that 24.5% said they had cyberbullied others at some point in their lives, and 16.1% said they had done so in the previous 30 days. So yes, your kid may be a target. But they may also be participating in harm.

So yes, check the group chats. But also ask about the gaming. Ask about Discord. And ask yourself: would you know if your kid was the bully?

Have grace when they sneak.

If your kid goes behind your back to download an app you said no to—but every single one of their friends has it—take a breath before you react. Kids are not only testing limits; they are operating in an environment where getting around limits is often easy. In Ofcom's 2024 media literacy report, 40% of children ages 8 to 17 said they had given a fake age to access a new app or site. And in March 2026, Ofcom said 72% of children ages 8 to 12 were still using sites or apps with a minimum age of 13.

They're not criminal masterminds. They're kids navigating an environment designed by billion-dollar companies to be hard to resist.

If they post publicly when you told them not to, don't let it slide. But don't just punish—have a real conversation. Hold the boundary. Explain why the boundary exists. Ask what social pressure is driving the behavior. Let them see your real worry for them, and make the conversation as important as the consequence.

In Part 2, we get into what you can actually do—the practical stuff that works, the age-gating problem nobody's solved, and why your kid's need for independence might be the most important thing the phone debate is missing.

Sources