The UK Children’s Code: Protecting Kids’ Data

Steve Wood, Director of PrivacyX Consulting and former UK Deputy Information Commissioner, gives the lowdown on the UK Children’s Code.

Why children’s personal data matters
Harmful content and contact are often front of mind as risks to children online, but how personal data is used to target and profile children is becoming a crucial factor in how those risks manifest.

Personal data is collected from children when they sign up for services; it tracks how they move from one service to another and records their likes and engagements with content and people. This data builds up a profile. It is estimated that by the time children reach the age of thirteen, 72 million data points will already exist about them in the advertising space alone. Children’s personal data can also be exposed through public settings, and their data informs which videos and posts are recommended to them. Protecting data is therefore a key part of protecting kids online.
How the law protects personal data
In the UK, the General Data Protection Regulation (GDPR) took effect in 2018. While the UK has had a data protection law for over 30 years, the GDPR was an important step forward – it contained stronger sanctions, including fines of up to 4% of global turnover.  Crucially, it also contained provisions that made clear that children’s personal data required additional protection and organisations using this data must take steps to address the specific risks. 

To put this into practice, the law also required the UK Data Protection Regulator, the Information Commissioner, to create a special code of practice dedicated to setting out the detailed standards that organisations must follow in protecting children’s data online. This was a world first. The law also made clear that the Code must ensure the best interests of the child are reflected in the guidance. Children were defined as anyone under 18.

The Code also had to reflect “age-appropriate design” – this meant that organisations providing online services likely to be accessed by children had to design in protections at the outset and bear in mind the different developmental ages of children. It is formally called the Age-Appropriate Design Code, or the Children’s Code for short.  
What does the Children’s Code cover?
The Code covers a range of digital platforms, such as social media, gaming and streaming services likely to be used by children.

The Code is not about keeping children off the Internet – it’s about providing them with an Internet where services are designed with protection built in for their data, which gives them and their parents greater confidence in how they can learn, explore and play online.

The Code sets out 15 standards that organisations have to conform to. Some examples of what organisations have to do:

  • Understand the age of their users so that they can ensure age-appropriate protections are in place for their personal data. For example, if an online platform knows the age of its users, it can switch certain features off by default, such as settings that allow direct messages from strangers or the sharing of location information with the wider world. Services that pose higher levels of risk may have to use tools such as age verification or age estimation to check the age of the user.
  • Transparency and fairness. Information must be presented in a way children can understand, and children must not be unfairly nudged into giving away data or changing privacy settings. For example, children must be able to understand the implications of changing a setting that allows them to post information publicly; this may need to be explained differently for a 12-year-old than for a 17-year-old.
  • Profiling and detrimental uses of children’s data. Online services that profile children, e.g. by placing them in a category indicating they may like a certain type of video, can only do so if they have assessed the harm to children and put safeguards in place.
If organisations don’t follow the standards in the Code, the regulator in the UK can investigate and, where the breaches of the law are serious, sanction the organisations. Sanctions can include fines or notices forcing them to change their practices. Parents or children can also complain if they think their data protection rights have been breached. The Code came into force in the UK in 2021.
Can you give some examples of what organisations have done to comply?
Here are just a few examples:

  • Instagram, Snapchat, YouTube and TikTok have all stopped using targeted advertising for children on their services. This means that advertising cannot be based on data about what children like or interact with.
  • Instagram, Snapchat and TikTok have all changed their settings so that strangers cannot send direct messages to children.
  • Instagram, Snapchat and TikTok have made changes to the way their recommendation systems present content to children and have taken additional steps to ensure that harmful content related to issues such as self-harm and weight loss is not recommended.
  • YouTube has switched off its autoplay mode, so that children are not nudged into spending more time online than may be healthy.
What about the rest of the world?
Many of the online services mentioned above have decided to make the changes to their platforms globally, so users outside of the UK have also benefited.
      
A number of other countries are looking at developing similar codes and laws. For example, California has passed a similar code into law, and Australia has planned a code as well. In the EU, the Digital Services Act contains a number of important provisions that protect children’s data, and the EU will shortly issue its own code.