Who Is Responsible for Kids’ Safety on Smartphones? Spotlight on Apple and Google
📱 Age Verification Laws Set to Reshape App Store Practices
Momentum is building worldwide to hold app stores like Apple’s App Store and Google Play accountable for verifying users’ ages before allowing app downloads. This marks a significant shift in digital policy, especially regarding child safety on smartphones.
Previously, the focus of age verification was on individual apps or websites. But new legislation suggests that app marketplaces themselves may soon bear the responsibility — a move that could reshape how users access content on mobile devices.
🛡️ The Growing Debate: Who Should Protect Children Online?
Eric Goldman, a professor of law at Santa Clara University, believes the tech industry is currently facing significant internal disagreement over responsibility for children’s digital safety.
“Right now, it feels like everyone is shifting the blame rather than offering real solutions,” he noted.
States like Texas, Louisiana, and Utah have already passed laws requiring app stores to verify a user’s age when creating an account. Singapore has also implemented similar regulations. These laws are expected to come into effect next year, while other U.S. states and Congress consider similar legislation.
🔐 Finding the Right Balance: Age Verification vs. Digital Rights
Although the intent is to protect children, critics warn that mandatory age checks may compromise user privacy and freedom of speech. In the UK, for example, some users now need to verify their age via facial recognition technology.
Moreover, skeptics question whether such laws will genuinely reduce access to adult content, especially when users can bypass restrictions using VPNs (Virtual Private Networks) or alternative browsers.
👨‍👩‍👧‍👦 Tech Giants Face Pressure — and Push Back
Interestingly, some of the momentum behind these laws comes from within the tech industry. Companies like Meta (formerly Facebook) support shifting responsibility to app stores, potentially reducing their own legal exposure.
However, Apple and Google — operators of the largest app stores — oppose the regulations. They argue the rules are too broad and would force them to collect unnecessary personal information.
Kareem Ghanem, Google’s Director of Public Policy, stated:
“A weather app doesn’t need to know if a user is a child. App developers are in the best position to determine what content is age-appropriate.”
Patchwork Laws Create Regulatory Confusion
As of now, several U.S. states have enacted laws aiming to control how minors access social media and mobile applications, and 24 states require age verification to access adult content. With inconsistent rules across states, many developers fear a fragmented regulatory landscape.
To address this, some industry groups advocate for a centralized age verification system, rather than having every app or site create its own. Peter Chandler, who leads the tech group Internet Works, which includes platforms such as Reddit and Roblox, highlighted the need for straightforward solutions.
“It’s unreasonable to expect parents to navigate a maze of unique age checks. A unified solution is more practical.”
🧒 Apple and Google’s Proposed Solutions

Although Apple and Google have expressed concerns about broad legal mandates, both companies have signaled support for a collaborative industry standard to enhance online safety for kids. Each backs a system in which app stores share a user’s general age range with apps — information that parents could submit voluntarily.
Apple is preparing to roll out its own “age assurance” technology, designed to share age-related information with apps without collecting sensitive user data.
Apple has likened the app store ecosystem to a shopping mall, arguing that only vendors selling sensitive products should be required to verify customer age, not every store visitor.
“Just as a liquor store in a mall must check IDs, we believe high-risk apps should verify age — but not every visitor to the mall needs to provide their birthdate just to access the food court.”
☎️ Behind the Scenes: Industry Lobbying Heats Up
In May, Apple CEO Tim Cook reportedly contacted Texas Governor Greg Abbott, urging him to reconsider or veto the state’s app store bill. Despite the outreach, the bill was signed into law.
Meanwhile, some legal scholars warn that these laws may infringe on the First Amendment, as they could limit adults’ access to constitutionally protected content. While tech giants haven’t launched lawsuits yet, the debate continues.
In the past, the U.S. Supreme Court struck down broad internet censorship laws. More recently, however, the Court upheld a Texas law requiring age verification for online adult content — a ruling that could influence future decisions on internet regulation.
Federal Action Still in Limbo
Congress has yet to pass nationwide regulations addressing online child safety. Proposed legislation like the Kids Online Safety Act aims to impose a “duty of care” on digital platforms, but it has stalled in the House despite passing the Senate.
Other lawmakers, including Sen. Mike Lee (R-UT) and Rep. John James (R-MI), are working on bills that would regulate harmful content — such as violence and explicit material — via app stores.
🔮 The Future of Online Safety: Who Will Take the Lead?
The debate over who is responsible for children’s safety online is far from over. While some advocate for app-level control, others push for industry-wide standards led by the app marketplaces.
With rising pressure from lawmakers, tech companies, and parents alike, it’s clear that the age verification debate will shape the future of digital policy — and possibly, the way the internet works.