Brief refresher on the Children’s Code: In 2020, the ICO published its Age appropriate design: a code of practice for online services (the “Code”). The Code sets out 15 standards applicable to information society services (“ISS”) aimed at or likely to be accessed by children, requiring the “best interests” of the child to be a primary consideration in the design of products and services. We have published an article setting out the aims of the Code and the practical steps organisations should take to ensure compliance, which you can find here.

As a starting point, all organisations need to assess whether the Code applies to their existing or anticipated services. If the Code does apply, the ICO expects organisations to be able to evidence their compliance. Where it is decided that the Code does not apply, the ICO expects organisations to document the reasons for that decision. This emphasises the importance of evidentiary documentation, but raises questions as to how ISS providers should carry out the assessment in practice in order to determine which category they fall into for compliance purposes. The answer is straightforward if a service is wholly or significantly intended for use by children, but what about services run as ‘adult-only’, for example: would these still need to comply with the Code? In determining whether they fall within the scope of the Code, ISS providers are expected to conduct routine self-evaluation, which involves completing risk and accountability checks before providing a service and retaining documentary evidence of those checks. The ICO has published useful guidance on what it means for a service to be “likely to be accessed by children”. The key takeaways are summarised below.

  • If you are not sure whether children access your service: As an ISS, you must assess whether there is evidence that the existing users of your service include children, and keep your decision under periodic review. If you conclude that your site is not likely to be accessed by children, an effective way of demonstrating this, and of limiting the likelihood of it changing, is to put functional age-assurance measures in place. A tick-box self-declaration page is not sufficient evidence.
  • Conducting a DPIA: To satisfy accountability requirements, ISS providers must routinely conduct a data protection impact assessment (“DPIA”) to determine whether they are offering online services to children and to assess, in a manner proportionate to the risks the service presents to children, whether children are likely to access those services. In conducting this assessment, the following factors are considered:
    • the types of information collected;
    • the volume of information;
    • the intrusiveness of any profiling;
    • whether decision-making or other actions follow from profiling; and
    • whether information is shared with third parties.

A practical example of how this feeds into the assessment criteria can be seen in the case of a popular pornography site. When children access such a service, their personal data is likely to be processed through third-party cookie data sharing and profiling that is not switched off by default and is not in the best interests of the child. A DPIA would conclude that the processing is likely to result in a high risk to the rights and freedoms of any children who access the site, and the site would therefore fall within the scope of the Code.
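The ICO does not prescribe a format for recording these factors. Purely by way of illustration, the short Python sketch below shows one way the factors listed above might be captured and weighed in a proportionate manner; the field names, scoring scale and “high risk” threshold are assumptions made for this example only and are not drawn from the Code or the ICO guidance.

```python
# Illustrative only: a hypothetical way of recording the DPIA factors listed above.
# Field names, scoring and the "high risk" threshold are assumptions, not ICO requirements.
from dataclasses import dataclass


@dataclass
class DpiaFactorRecord:
    data_types: list[str]             # types of information collected
    records_per_year: int             # volume of information
    profiling_intrusiveness: int      # 0 (none) to 3 (highly intrusive), hypothetical scale
    decisions_follow_profiling: bool  # whether decisions or other actions follow from profiling
    shared_with_third_parties: bool   # whether information is shared with third parties

    def likely_high_risk_to_children(self) -> bool:
        """Crude illustrative flag; a real DPIA requires a reasoned, documented assessment."""
        score = self.profiling_intrusiveness
        score += 1 if self.decisions_follow_profiling else 0
        score += 1 if self.shared_with_third_parties else 0
        score += 1 if self.records_per_year > 1_000_000 else 0
        return score >= 3


# Hypothetical values mirroring the adult-site scenario described above.
record = DpiaFactorRecord(
    data_types=["viewing history", "device identifiers"],
    records_per_year=5_000_000,
    profiling_intrusiveness=3,
    decisions_follow_profiling=True,
    shared_with_third_parties=True,
)
print(record.likely_high_risk_to_children())  # True -> service likely within scope of the Code
```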

  • If you carry advertising targeted at children: You are likely to fall within the scope of the Code if advertisements on your service platform (including third-party advertisements) are directed at, or likely to appeal to, children.
  • If a child accesses your age-gating page only and does not access the rest of the site: If you have an age-gating page to prevent access by under 18s, your service will not be covered by the Code if:
    • you only use the age-gating page to ensure that children are not accessing your adult site;
    • the measures are robust and effective to prevent under 18s accessing the service; and
    • it does not allow access to parts of your adult site before age assurance occurs.
  • If your content, design features and activities are of a nature that can appeal to children: Examples include cartoons, animation, music or audio content, incentives for children’s participation, digital functionalities such as gamification, and the presence of children, or of influencers or celebrities popular with children. The presence of these characteristics points towards your service being likely to be accessed by children and therefore falling within the scope of the Code.
  • Ascertaining whether the number of child users is ‘significant’: The number of UK child users may be considered significant either (a) in absolute terms or (b) in relation to the proportion it represents of the total UK users of the service or of the total number of children in the UK. If option (b) is used, this should be assessed on the basis of current official UK population data and cross-referenced with evidence gathered from any age-profiling tools used (see the illustrative sketch after this list).
  • If you run an adult-only service and have ascertained that children are likely to access your service: There are two routes to be considered if this applies to your service:
    • apply the principles of the Code to all users to satisfy the risk-based approach; or
    • apply and evidence appropriate age assurance measures to wholly restrict access by under 18s so that they are no longer likely to access your service.
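To make the two limbs of the ‘significant number’ assessment concrete, the Python sketch below works through option (a) and option (b) using entirely hypothetical figures; the user counts and the UK child population estimate are placeholders to be replaced with your own age-profiling evidence and current official UK population statistics, and whether any resulting figure is ‘significant’ remains a documented judgement call rather than a fixed threshold.

```python
# Illustrative only: the two ways described above of assessing whether the number of
# UK child users is "significant". All figures are hypothetical placeholders.
uk_users_of_service = 500_000        # total UK users of the service (hypothetical)
estimated_uk_child_users = 40_000    # estimate from age-profiling tools (hypothetical)
uk_child_population = 14_000_000     # rough placeholder; use current official figures

share_of_service_user_base = estimated_uk_child_users / uk_users_of_service
share_of_uk_children = estimated_uk_child_users / uk_child_population

print(f"(a) Absolute number of UK child users: {estimated_uk_child_users:,}")
print(f"(b) As a share of the service's UK users: {share_of_service_user_base:.1%}")
print(f"(b) As a share of all UK children: {share_of_uk_children:.2%}")
# Whether either figure is "significant" is a judgement to be documented, not a fixed threshold.
```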
Author

Vin leads our London Data Privacy practice and is also a member of our Global Privacy & Security Leadership team. He brings more than 22 years of experience in this specialist area, advising clients across data-rich sectors including retail, financial services/fin-tech, life sciences, healthcare, proptech and technology platforms.

Author

Chiemeka works as a privacy specialist in Baker McKenzie's Intellectual Property & Technology Practice Group and is based in the firm's London office. He is a Nigerian-qualified lawyer who focuses on data protection, privacy, and technology transactions.

Author

Marilyn is an associate in the Intellectual Property, Data and Technology team based in London. She joined Baker McKenzie as a Trainee Solicitor in September 2020 and was admitted as a solicitor in England and Wales in September 2022. During her training, Marilyn was seconded to Baker McKenzie's Dubai office for six months and later to Google's commercial legal team for six months.