Out-Law News 3 min. read
16 May 2022, 1:15 pm
The European Commission plans to establish a code of conduct to better protect children in the virtual world, working together with the digital industry.
This is part of the Commission's new strategy for a better internet for kids, adopted last week. The strategy outlines plans and envisaged actions intended to make digital services more age-appropriate and aligned with children's best interests.
The new plans build on the European strategy for a better internet for children adopted in 2012. The Commission said it was necessary to adopt a new strategy because digital technologies and the way children use them "have changed dramatically" since 2012, as children use smartphones and other devices much more often and at a much earlier age.
As an essential step towards increasing the safety of children online, the Commission plans to facilitate an EU code of conduct on age-appropriate design, which will address the "lack of effective age verification, the gathering of personal data and the commercial manipulation of children as well as the need for child-appropriate communication". The Commission said it will involve the digital industry, policymakers, civil society and children in developing the code. It also said that all digital products and services likely to be used by children should respect "fair and basic design features", such as easily understandable, age-appropriate and accessible terms and conditions, instructions and warnings, and "simple mechanisms to report harm".
Both the strategy and the planned code of conduct build on the Digital Services Act (DSA) recently agreed by EU lawmakers, which contains new safeguards for the protection of minors. For example, advertising targeted at children using any of their personal data will be prohibited under the DSA. The DSA will also enable the Commission to invite providers of very large online platforms "to participate in codes of conduct and ask them to commit themselves to take specific risk mitigation measures." Platforms will not be obliged to participate in such codes of conduct; participation will be voluntary. However, if they decide to participate, their compliance with the code will be reviewed.
The strategy also highlights the need for a European standard on age verification. The Commission said it will encourage EU member states to introduce electronic IDs for minors to strengthen effective age verification methods; this will form part of the planned European digital identity wallet. By also introducing "new and more effective" technical solutions for digital age verification, the Commission aims to prevent children from easily accessing inappropriate content such as pornography or violent material. According to the Commission, "this type of standard will clarify what is expected from industry when age verification is required on any online tools and services."
Nils Rauer of Pinsent Masons in Frankfurt said the envisaged code of conduct would become an important cornerstone in the endeavour to build adequate media literacy amongst minors. "However, it certainly requires social flanking by parents and notably educational institutions such as schools teaching children how to use digital media sensibly. This is also reflected in the Commission’s overall strategy."
The strategy also envisages investigating the impact of neuro-marketing on children, in order to help national consumer authorities better assess how commercial influencing techniques may affect children.
The EU's code of conduct will not be the first of its kind: the UK introduced a code of conduct for the protection of children in the digital world in 2020. The children’s code, officially called the age appropriate design code (AADC), was the first statutory code of practice in the world addressing the use of children’s data. It came into force on 2 September 2020 with a 12-month transition period and aims to ensure that children’s privacy and wellbeing are protected online. The code applies to online services that are likely to be accessed by children under 18, even if children are not the service’s target audience. Both UK-based and non-UK companies are subject to the code if their online services are likely to be accessed by children in the UK. The code comprises 15 flexible standards for consideration by online services to improve and protect the experience of children accessing the service.
"In respect of oversight in the EU, the Irish Data Protection Commission (DPC) has the lead," data law expert Andre Walter of Pinsent Masons said. Late last year, the DPC published the fundamentals for a child-oriented approach to data processing (94-page PDF/1.5 MB). "The DPC Fundamentals will serve as the 'EU blueprint' on this topic. The European Data Protection Board (EDPB) embraced them and included guidelines on children's data as part of its work programme for 2022."
With its fundamentals, the Irish DPC highlights, among other things, that the GDPR requires that individuals be given certain information about the use of their personal data by organisations processing their data, and that this information must be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language. It has stressed that clarity is particularly important where the information is being provided to a child. The DPC also emphasises that organisations should use child-friendly language to explain to children exactly what they are doing with their personal data.
There are already examples in the EU of services being penalised for providing information incomprehensible to children: last year, the Chinese social media app TikTok was fined €750,000 by the Dutch data protection authority after its English-language privacy notice was deemed insufficiently clear for children in the Netherlands to understand.