Last week the ICO turned its attention to age assurance in relation to the Children’s Code, publishing the Information Commissioner’s opinion on age assurance and issuing a call to the public for evidence on the use of age assurance tools.
The Children’s Code came into force on 2 September 2021 and applies to providers of Information Society Services (ISS) that are likely to be accessed by children, whether apps, websites or social media platforms. According to one Ofcom report, in 2019 half of all 10-year-olds owned a smartphone and 24% of 3- and 4-year-olds had their own tablet. Given that children have been stuck at home for (most of) the past 18 months, and the ever-growing use of technology, these figures are likely to have risen since. As the statistics suggest, children cannot be kept out of the digital world, so the Code seeks to protect them from within it by ensuring that it is designed with children in mind. The Code contains 15 standards, one of which is ‘age appropriate application’. This covers both assurance that children cannot access adult or inappropriate content and estimation of users’ ages.
The opinion sets out how the Commissioner expects ISS providers to conform to the ‘age appropriate application’ standard. ISS providers have sought clarity on the level of ‘age certainty’ required and on how to collect this data while also complying with the data minimisation principle. This is often a tricky balancing act. The answer: there is no ‘one size fits all’ solution, and a risk-based approach is recommended.
Irrespective of the risk level, the ICO recommends applying all of the Code’s standards to all users, not just to children. Of course, this may not be practical. Failing that, if the risk to children is high (for example, where marketing content is personalised based on a child’s data), the ISS provider should introduce age assurance measures that give the highest level of certainty about users’ ages. This could be achieved with tools that require users to verify their age, rather than merely self-declare it. If the risk to children is lower, the ISS provider should introduce measures that give “a level of certainty on the age of child users” proportionate to the potential risks.
The opinion outlines the different age assurance approaches. From highest age certainty to lowest, these include:
- age verification – this requires a user to verify their age, for example by providing documentation. It is appropriate for high-risk ISS activities.
- age estimation – estimating a person’s age, usually using AI technology or other algorithmic means.
- account confirmation – this method is used by many ‘video on demand’ services and requires an account holder to confirm the ages of other users.
- self-declaration of age – asking the user to enter their date of birth or to confirm that they are over 18.
Most of us have encountered self-declaration of age, and perhaps account confirmation, and we recognise the limited protection these provide. For lower-risk ISS activities, however, the need to protect must be balanced against the proportionate processing of data, so such approaches may be appropriate. Essentially, each provider will need to establish its level of risk and select an approach that is proportionate to it. The level of risk can be determined by undertaking a Data Protection Impact Assessment (DPIA).
Alongside the opinion, the ICO has called for evidence to develop and maintain its knowledge of age assurance technology. By keeping up with developments, it hopes to strengthen its ability to regulate effectively and fairly, and to deliver an industry standard in age assurance. The survey poses questions on categories including emerging approaches to age assurance and estimation, data protection risks and economic considerations, and is open until 9 December 2021.
The Commissioner recognises that age assurance may require processing personal data beyond that involved in delivering a core service. However, the risks to children online are very real.