Steve Wood, Director of PrivacyX Consulting and former UK Deputy Information Commissioner, gives the lowdown on the UK Children’s Code.
Harmful content and contact are often front of mind as risks to children online, but how personal data is used to target and profile children is becoming a crucial issue in how those risks manifest themselves. Personal data is collected from children when they sign up for services; it tracks how they move from one service to another and records their likes and engagements with content and people. This data builds up a profile. It is estimated that by the time children reach the age of thirteen, 72 million datapoints will already exist about them in the advertising space alone. Children’s personal data can be exposed through public settings, and their data will inform what types of videos and posts are recommended to them. Protecting data is therefore a key part of protecting kids online.
In the UK, the General Data Protection Regulation (GDPR) took effect in 2018. While the UK has had a data protection law for over 30 years, the GDPR was an important step forward: it contained stronger sanctions, including fines of up to 4% of global turnover. Crucially, it also contained provisions making clear that children’s personal data requires additional protection and that organisations using this data must take steps to address the specific risks. To put this into practice, the law also required the UK data protection regulator, the Information Commissioner, to create a special code of practice dedicated to setting out the detailed standards that organisations must follow in protecting children’s data online. This was a world first. The law also made clear that the Code must ensure the best interests of the child are reflected in the guidance, with children defined as anyone under 18. The Code also had to reflect “age-appropriate design”: organisations providing online services likely to be accessed by children had to design in protections at the outset and bear in mind the different developmental ages of children. It is formally called the Age-Appropriate Design Code, or the Children’s Code for short.
The Code covers a range of digital platforms likely to be used by children, such as social media, gaming and streaming services. The Code is not about keeping children off the Internet: it’s about providing them with an Internet where services are designed with protection for their data built in, giving them and their parents greater confidence in how they can learn, explore and play online. The Code sets out 15 standards that organisations have to conform to. Here are just a few examples of what organisations have to do:
Many of the online services mentioned above have decided to make the changes to their platforms globally, so users outside the UK have also benefited. A number of other jurisdictions are looking at developing similar codes and laws: California, for example, has passed a similar code into law, and Australia is planning a code as well. In the EU, the Digital Services Act contains a number of important provisions that protect children’s data, and the EU will shortly issue its own code.