If you’re a parent in the United States, it’s easy to feel overwhelmed by the sheer number of laws and regulations around kids’ privacy and safety. If it feels like there are constant developments and new rules being proposed, that’s because there are! There’s also a lot of noise to cut through. The goal of this blog is to lay out what the rules are today, and what they might look like in the future.
If there’s any law you might have already heard of in the kids’ privacy space (especially if you’re a fan of the show Silicon Valley), it’s probably COPPA. Originally passed in 1998, it’s one of the oldest laws on the books addressing kids’ online privacy in the US, and in fact one of the first kids’ online privacy laws, period. At a high level, COPPA was intended to be a tool to empower parents. It requires operators of online services that are directed to children, or that have actual knowledge that children are using them, to get a parent’s verifiable consent before collecting any personal information from a child under the age of 13. Because of COPPA, many services that appeal to both children and adults use an “age screen” or “age gate” to ask users to provide their age; the operator then uses that age to give the user an age-appropriate experience (we’ll sketch what that logic can look like in code in a moment).

Despite its age, COPPA remains one of the biggest heavy hitters in the world of kids’ online protection. Headline-grabbing enforcement actions and fines over the past few years, such as $5.7 million against Musical.ly (now known as TikTok), $170 million against YouTube, $20 million against Xbox, and $520 million against Epic Games (maker of Fortnite), plus loads of smaller enforcements, show that this law still has serious teeth. The US Federal Trade Commission (FTC) is the agency charged with enforcing COPPA, and it also has the authority to issue regulations and rules that clarify the law and keep it current with modern technology. The FTC is actually in the process of updating the COPPA Rule right now, to try to address a host of new issues. One area of focus is how to define a “service directed to kids”: as we’ve seen over the years, even if a company doesn’t want kids to use its service, that doesn’t exempt it from following the law if kids show up anyway.

Despite COPPA’s longevity, many people say it doesn’t go far enough to protect kids in light of today’s technology. Critics say it’s too easy for kids to simply lie about their age to get access to adult-oriented services, or that it’s too difficult and confusing for parents to sign off on every single online service their kid might want to access. However, the FTC has another trick up its sleeve…
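First, though, here’s the promised sketch of basic age-gate routing. This is a minimal illustration under assumed names and thresholds, not any particular service’s implementation; real COPPA compliance also involves things like neutral age-screen design and an actual verifiable parental consent flow.

```typescript
// Minimal age-gate sketch (illustrative only; names and flow are assumptions).
// COPPA's key threshold: under 13, an operator must obtain verifiable
// parental consent before collecting personal information.

const COPPA_AGE_THRESHOLD = 13;

type Experience = "parental-consent-required" | "standard";

// Compute age in whole years from a self-declared date of birth.
function ageInYears(dob: Date, today: Date = new Date()): number {
  let age = today.getFullYear() - dob.getFullYear();
  const birthdayPassed =
    today.getMonth() > dob.getMonth() ||
    (today.getMonth() === dob.getMonth() && today.getDate() >= dob.getDate());
  return birthdayPassed ? age : age - 1;
}

// Route a new user into an age-appropriate experience.
function routeUser(dob: Date): Experience {
  return ageInYears(dob) < COPPA_AGE_THRESHOLD
    ? "parental-consent-required" // hold data collection until a parent consents
    : "standard";
}
```

The hard part, of course, is everything this sketch leaves out: a kid can simply type in a different birthday, which is exactly the criticism noted above.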
This is where things get a little more nuanced. Looking back at the December 2022 enforcement action against Epic Games, we see a new trend possibly emerging. The FTC acknowledged that COPPA applied only to the children playing Fortnite who were under 13. Still, the FTC noted that teenagers under 18 were also playing Fortnite, and were allegedly subject to abuse and harassment from strangers over in-game voice and text chat, which Epic had turned on by default when players started the game. In addition to its authority to enforce COPPA, the FTC has fairly broad authority to go after companies that engage in “unfair or deceptive acts or practices in or affecting commerce.” The FTC used this separate authority to allege that Epic Games’ business practices were unfair to teens, and its settlement requires Epic to ensure that, going forward, voice and text communications are off by default for anyone under 18. This emphasis on “privacy by default” was a first for the FTC, and it signaled a paradigm shift in how we think about protecting kids in games. In some ways, it also brought the US more in line with regulators in other countries, who have started to think more holistically about how to protect kids online, shifting the conversation away from parental consent and toward imposing obligations on the online providers themselves. But wait: here’s where we hit a bit of a snag.
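Before we get to the snag, it helps to see how simple “privacy by default” is to express in code. The sketch below uses assumed names and an assumed under-18 threshold, echoing the approach in the Epic settlement: voice and text chat start off for minors and only turn on through an explicit opt-in.

```typescript
// Illustrative "privacy by default" sketch (assumed names and thresholds).
// Voice and text chat start OFF for anyone under 18 and are only
// enabled by an explicit, deliberate opt-in.

interface ChatSettings {
  voiceChatEnabled: boolean;
  textChatEnabled: boolean;
}

// Defaults are derived from the user's age at account creation.
function defaultChatSettings(age: number): ChatSettings {
  const isMinor = age < 18;
  return {
    voiceChatEnabled: !isMinor, // off by default for minors
    textChatEnabled: !isMinor,
  };
}

// A later, deliberate opt-in (by the teen or a parent, depending on the flow).
function enableChat(settings: ChatSettings): ChatSettings {
  return { ...settings, voiceChatEnabled: true, textChatEnabled: true };
}
```

The design point is that safety is the starting state, and the riskier configuration requires a deliberate choice, rather than the other way around.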
Right around the time of the Epic Games FTC enforcement, the state of California passed a new law, the California Age-Appropriate Design Code Act (CA AADCA). Modeled after similar legislation in the UK, this law imposed a duty on online businesses whose services are likely to be accessed by children to consider the holistic “best interests” of children when designing their products and services. The law also included an obligation to implement reasonable “age assurance,” i.e., to adopt methods (proportionate to the risks) to ensure that users are telling the truth about their age when signing up for a service. Internet free-speech advocacy group NetChoice sued in federal court in California to stop enforcement of the law, arguing that the CA AADCA violated the US Constitution. The district court judge agreed, and issued an injunction. See, the US Constitution provides strong protections for people in the US, even kids, to access and engage in protected speech. The district court held that if the law effectively required users to provide a photo ID, or even scan their face, in order to access sites with protected speech, it could have a chilling effect on both kids’ and adults’ ability to access crucial information on the internet. The injunction is currently being appealed. What makes this case so important is that it cuts to arguably the biggest conundrum in protecting kids online today: we want our kids to be safe, but we also don’t want to keep them from experiencing the world and forming positive bonds with other people online, and we don’t want to accidentally cut off adults’ access to the internet in the process.
There are plenty of proposals out there to change kids’ privacy laws, and all of them are complicated. Bills have been proposed for a “COPPA 2.0,” which would, for example, raise the age threshold for consent from 13 to 16, as well as the Kids Online Safety Act (KOSA), which, like the CA AADCA, seeks to impose new obligations on online platforms to protect kids. Any new law is likely to face constitutional challenges, especially if the government uses it to attempt to suppress protected speech; there are certainly those in the US who would like to use new kids’ safety laws as justification to attack online providers of resources for LGBTQ+ kids. In the meantime, the tech industry is doing its best to step up its efforts, rolling out new parental controls and privacy-protective features to keep kids safe. And k-ID is part of that effort, working to streamline the parental consent and permissioning process for both parents and developers. Finding the right balance is tricky, but it’s worth it!