EU Parliament nails down agreement on child sexual abuse regulation

The main political groups of the EU Parliament reached an agreement on the draft law to prevent the dissemination of online child sexual abuse material (CSAM) on Tuesday (24 October).

The proposed regulation aims to prevent and combat CSAM by requiring digital platforms in the EU to detect and report such material. The draft law attracted criticism because, in its original form, it would empower judicial authorities to ask interpersonal communication services like WhatsApp or Gmail to scan people’s private messages for suspected content.

The agreed text, seen by Euractiv, focuses on the EU Centre and detection orders. The European Parliament’s Committee on Civil Liberties, Justice and Home Affairs is expected to adopt the file on 13 November, paving the way for the last stage of the legislative process.

EU Centre

The EU Centre will be a central hub of expertise to help fight child sexual abuse in the EU. A compromise text in September focused on changes to the body’s role; the current text introduces further changes.

The EU Centre will enjoy “the most extensive legal capacity accorded to legal persons” under the law of each member state.

Previous versions of the document suggested that the EU Centre should be located in The Hague, in the Netherlands, but its seat has been a debated part of the file. The new version does not explicitly say where the Centre should be located but states that the location cannot affect its tasks or the recruitment process.

Moreover, it has to be somewhere the Centre can be set up on-site once the regulation enters into force. The location must ensure “a balanced geographical distribution of EU institutions”, as well as sustainability, digital security, and connectivity “with regards to physical and IT infrastructure and working conditions”.

The Centre will also be able to search for CSAM in “publicly accessible content”, similar to a web crawler, a bot used by search engines, for example, to collect content so that it can appear in search results. OpenAI also uses this technology for ChatGPT.

The EU Centre must also be independent and have a Fundamental Rights Officer to oversee how its tasks are carried out.

Europol

Europol can request information from the Centre; if the request is considered “necessary and proportionate”, the exchange has to go through “an available secure exchange communication tool, such as the Secure Information Exchange Network Application” (SIENA), an information exchange platform already used by Europol, member states, and third parties.

If the EU Centre finds that a report is “unfounded”, the report has to be forwarded to Europol “in accordance with the Union law.”

Access to personal data processed in Europol’s information system can only “be granted on a case-by-case basis, upon submission of an explicit and justified request, which documents the specific purpose”.

Europol can therefore only transfer such data to the EU Centre when it is strictly necessary and “proportionate to the specified purpose”. In recent years, Europol was at the centre of an investigation by the European Data Protection Supervisor over data processing practices outside its mandate, which was later revamped.


Encryption and detection orders

Encryption has been perhaps the most debated aspect of the draft law’s detection orders, which would be issued to require communication services to detect suspected CSAM.

Digital rights organisations, experts, and messaging apps like Signal or WhatsApp said this tool would break end-to-end encryption, weakening data security and privacy rights.

According to the new text, the technologies used to detect CSAM must be “audited independently as per their performance”, whether provided by the EU Centre or developed by the provider using them.

Since the regulation has always been framed as technology-neutral, there are no specific requirements on which technology should be used.

The document also says that “the EU Centre shall decide the extent of the audit that will be made publicly available.” This, however, should not apply to end-to-end encryption.

Detection orders should also target a specific group of users, for example, “subscribers to a specific channel of communication”. Even so, there must be “reasonable grounds of suspicion” of a link to child sexual abuse before they can be targeted.

App stores and age verification

The role of software application stores in tackling CSAM has also been under discussion for a while.

App store providers designated under the Digital Markets Act will be obliged, for apps that children are not permitted to use, to make reasonable efforts to ensure that this is actually the case. MEPs detailed specific criteria for age verification systems.

Age verification systems can also be put in place but are not mandatory, except in the case of porn platforms.

[Edited by Luca Bertuzzi/Nathalie Weatherald]