
Out-Law News

Digital Services Act wins MEPs’ approval, with changes

Online intermediaries face strict new requirements, including around the removal of illegal or harmful content, the sale of illegal products or services, targeted advertising, and the way they design their interfaces, under proposed new EU laws endorsed by MEPs.

The Digital Services Act (DSA) proposals approved by the European Parliament on Thursday 20 January contain significant amendments to the initial version of the DSA proposed in late 2020 by the European Commission. Talks are now anticipated between the Parliament and the Council of Ministers, the EU’s other law-making body, on finalising the text. The Council agreed its negotiating position in November last year.

The draft DSA, as approved by MEPs, proposes a tiered approach to regulation, with a series of baseline requirements that all providers of intermediary services would need to adhere to, and more specific regulations applicable to online hosting providers and online platforms on top. Further obligations are envisaged for ‘very large online platforms’.

Very large online platforms will, under the proposals endorsed by MEPs, be subject to a specific duty to “effectively and diligently identify, analyse and assess … the probability and severity of any significant systemic risks stemming from the design, algorithmic systems, intrinsic characteristics, functioning and use made of their services” in the EU.

The draft text provides specific examples of the systemic risks the platforms should assess in their risk assessments. These include risks such as the dissemination of illegal content, the malfunctioning of their service and any “actual and foreseeable negative effects on the protection of public health”.

The risk assessments have to be completed at least annually or before new services are launched and, once they are complete, the platforms must put in place “reasonable, transparent, proportionate and effective mitigation measures, tailored to the specific systemic risks” identified.

While the draft DSA contains examples of the mitigating measures very large online platforms might implement, MEPs have added a proposed amendment that would make clear that the requirements to apply those measures “shall not lead to a general monitoring obligation or active fact-finding obligations”. The E-Commerce Directive from 2000, which the DSA is designed to enhance, already prohibits general monitoring obligations from being imposed on intermediaries.

Among the other proposed changes that would affect very large online platforms are draft new requirements around tackling so-called ‘deep fakes’.

“Where a very large online platform becomes aware that a piece of content is a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful (deep fakes), the provider shall label the content in a way that informs that the content is inauthentic and that is clearly visible for the recipient of the services,” according to the draft DSA approved by the European Parliament.

Other significant proposals adopted by MEPs are plans to prohibit intermediaries from using “the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice”. The draft text cites alleged practices that “exploit cognitive biases and prompt recipients of the service to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose”.

A list of specific actions that intermediaries must refrain from taking in the context of their online interface and recipients’ decisions and choices includes “giving more visual prominence to any of the consent options when asking the recipient of the service for a decision”.

MEPs have also moved to toughen the proposed new requirements around targeted advertising.

Under the plans, online platforms would be required to provide “meaningful information, including information about how their data will be monetised” to recipients of their service to enable those users to make informed decisions on whether to consent to the processing of their personal data for the purposes of advertising.

Platforms would be prohibited from disabling users’ access to “the functionalities of the platform” if they refuse to consent to the processing of their personal data for the purposes of advertising. The use of “targeting or amplification techniques that process, reveal or infer personal data of minors” for the purpose of displaying advertisements would also be prohibited.

The use of “special categories of data”, which, under data protection law, includes information about a person’s race or ethnicity, political opinions, religious beliefs, health and sexual orientation, would also not be permitted for the purposes of “targeting individuals” if the MEPs’ proposals are adopted.

A new right for recipients of intermediary services to seek compensation from the service providers is also envisaged under the revised DSA proposals. This right would apply “against any direct damage or loss suffered due to an infringement by providers of intermediary services of obligations established under [the DSA]”.
