Contracts

6 min read
2/6/2025

Monitoring algorithms: is it in the host's interest to go beyond the law?

Recent case law has clarified the contours and limits of the absence of any general monitoring obligation on hosting providers. Since the European e-Commerce Directive of June 8, 2000, hosts have benefited from a liability exemption with regard to the monitoring of hosted content.

Recognized as mere technical intermediaries conveying information, hosts cannot be held liable for the illicit content they store.

This is, however, on condition that they had no knowledge of the content's illicit nature or that, once aware of it, they acted promptly to remove it.

This principle was transposed into domestic law in Article 6 of the Law for Confidence in the Digital Economy of June 21, 2004 (the so-called LCEN Law), which confirms the absence of a general obligation to monitor hosted content.

This exemption rests on a fundamental principle: hosts should bear a responsibility proportionate to the resources they actually have available for monitoring content.

However, additional obligations were introduced by the law of August 16, 2022 on the dissemination of terrorist content online, which reinstated an injunction procedure requiring terrorist content to be removed from the Internet within one hour.

The evolution of the digital landscape has prompted the European legislature to adjust this legal framework while retaining its basic principles. The DSA Regulation of October 19, 2022 thus reaffirms the absence of any general monitoring obligation on intermediary service providers (Article 8), and provides a liability exemption for each type of intermediary service, subject to conditions:

  • The “mere conduit” service provider is not liable for the information transmitted, provided that it does not initiate the transmission, does not select the recipient, and does not select or modify the information transmitted.

  • The provider of caching services likewise benefits from this exemption for the automatic, intermediate and temporary storage of information, provided that it does not modify the content, complies with the conditions of access and with the updating rules widely recognized in the industry, does not interfere with the lawful use of standard technologies, and acts promptly to remove or disable access to the stored data as soon as it learns that the data has been removed at the source or that a court or administrative authority has ordered its removal.

  • Finally, the hosting service provider is not liable for content stored at a user's request, provided that it has no actual knowledge of the content's illicit nature or, as soon as it is informed of it, acts promptly to remove the content or block access to it.

As a result of this change in the regulatory context and the sharp increase in contentious content, hosts tend to protect themselves further by adapting the contractual stipulations that frame their service offering, tightening the content monitoring and moderation requirements imposed on platforms, often beyond the minimum legal obligations.

Reinforcement of the host's obligations to monitor and remove illegal content through the general conditions of use (GCU)

Contractual freedom, a fundamental principle of the law of obligations, allows parties' commitments to be tailored to their specific needs, including in the field of online hosting. The Court of Cassation recently affirmed, in a ruling dated January 15, 2025 (Cass. com., Jan. 15, 2025, no. 23-14.625), that “Article 6, I, of the law of June 21, 2004 has neither the object nor the effect of preventing the parties to a contract from agreeing that the hosting provider will be subject to an obligation to monitor the content it stores or publishes, nor of prohibiting the provision of a contractual sanction, such as termination, in the event of failure to meet this obligation”.

In this case, the clause stipulated that the host undertook not to host illicit content, particularly content infringing intellectual property rights, and authorized its co-contractor to suspend or terminate the service in the event of an alert concerning such content.

Even if this obligation was not expressly characterized as a monitoring obligation, its effect was the same: the host had to implement sufficient technical means to prevent the presence of illicit content.

However, neither the law of June 21, 2004 nor Regulation (EU) 2022/2065 (DSA) imposes such a proactive detection obligation on hosting providers. The clause was therefore a voluntary contractual arrangement, legally admissible as long as the obligation it instituted remained materially feasible.

Along the same lines, in a ruling dated September 4, 2024 (Cass. com., Sept. 4, 2024, no. 22-12.321), the Court of Cassation had already upheld the validity of a clause in a hosting provider's general terms and conditions providing for the termination, without notice, of a paid referencing service in the event of a serious or repeated breach by the co-contractor.

Although the clause gave the platform unilateral decision-making power, the judges found that it did not create a significant imbalance within the meaning of Article L. 442-1, 2° of the French Commercial Code. Indeed, the clause enabled the platform to exercise a contractual termination right in order to ensure compliance with the requirements arising from the regulation of digital services.

The central role of the GCU in the presentation and implementation of hosting providers' service offerings

The DSA requires hosts to act promptly to prevent the dissemination of illicit content, as soon as they become aware of it. In addition, the same regulation imposes an obligation on hosts to be transparent about the rules governing the use of their services.

In line with this logic, Recital 45 of the DSA specifies that hosting providers must set out in a clear and up-to-date manner, within their GCU, the reasons likely to justify a restriction on access to their services.

This includes content moderation policies, the types of sanctions that may be imposed, and the grounds that may lead to their application.

As a result, the general terms and conditions must indicate the measures the host implements to comply with this obligation to prevent the dissemination of illicit content. In any event, the transparency expected is intended to ensure that users are duly informed of their rights and obligations, as well as of the potential consequences of their actions on the online service.

This requirement is all the more important given that, in practice, the rigorous application of the GCU can produce particularly far-reaching effects, such as the deletion of a user's account.

An example is provided by a recent case decided by the Paris Court of Appeal (CA Paris, Jan. 24, 2025, no. 21/10238).

A lawyer found that his Google account, along with his access to Google Drive, had been suspended after the host detected child pornography files in a folder relating to a criminal case he was handling in the course of his professional activity.

Despite the protests of the lawyer, who argued that holding these files was legitimate in the context of his defense duties, the judges found that Google had committed no fault. The company had merely applied its GCU, which expressly provided for the deactivation of accounts where illicit content is detected, without Google being required to assess the context or any possible justification for holding the files.
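For illustration, detection of this kind typically relies on comparing file fingerprints against a database of already-identified material (tools such as Microsoft's PhotoDNA or Google's CSAI Match work on this principle, using perceptual rather than cryptographic hashes). The following minimal Python sketch, whose names and hash values are entirely hypothetical, shows why such a mechanism is inherently context-blind: a match triggers the sanction provided for in the GCU, whatever the reason the file is held.

```python
import hashlib

# Hypothetical database of fingerprints of files already identified as
# illicit. Real systems use perceptual hashes so that re-encoded or
# slightly altered copies still match; SHA-256 is used here only to
# keep the sketch self-contained.
KNOWN_ILLICIT_HASHES: set[str] = {"0" * 64}  # placeholder entry

def fingerprint(path: str) -> str:
    """Compute the SHA-256 digest of a stored file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_account(stored_files: list[str]) -> str:
    """Apply the GCU mechanically: any match suspends the account.

    Note what is absent: no notion of who holds the file or why
    (for instance, a lawyer's defense brief), which is precisely the
    point made by the Paris Court of Appeal ruling discussed above.
    """
    for path in stored_files:
        if fingerprint(path) in KNOWN_ILLICIT_HASHES:
            return "account_suspended"
    return "ok"
```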

Faced with the multiplication of contentious content online and the increase in user reports, hosts often choose to strengthen their own means of detecting illicit content, having users accept these measures through the GCU.

This gives them additional leeway to block problematic user accounts and remove fraudulent ads, actions that can be carried out using detection and moderation algorithms, along the lines sketched below.
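As a purely illustrative sketch (the patterns and actions below are invented for the example; real moderation pipelines combine hash matching, machine-learning classifiers and user reports rather than keywords alone), such an algorithm can be reduced to a classification rule applied to each piece of content:

```python
import re

# Hypothetical pattern lists, one per severity level.
BLOCK_PATTERNS = [r"\bterror(?:ism|ist)\b", r"\bhuman trafficking\b"]
REVIEW_PATTERNS = [r"\bharass(?:ment|ing)?\b"]

def moderate(text: str) -> str:
    """Classify user content as 'block', 'review' or 'allow'."""
    lowered = text.lower()
    # Severe categories are removed automatically, as the GCU provides.
    if any(re.search(p, lowered) for p in BLOCK_PATTERNS):
        return "block"
    # Ambiguous categories are escalated to a human moderator.
    if any(re.search(p, lowered) for p in REVIEW_PATTERNS):
        return "review"
    return "allow"

if __name__ == "__main__":
    print(moderate("Job offer: join our human trafficking network"))  # block
    print(moderate("He keeps harassing me in the comments"))          # review
    print(moderate("Holiday photos from Brittany"))                   # allow
```

The coarseness of such a rule is also where the legal risk discussed below arises: the more the operator selects, ranks or curates what happens to content, the closer it comes to the "active role" that forfeits host status.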

Hosting companies moderate this content in a measured way, in line with public order and accepted standards of morality. This is the case with Google, which reserves the right to remove all or part of any content of a terrorist or child pornographic nature, content facilitating human trafficking, or content inciting harassment.

However, despite such precautions, the use of algorithms that detect specific keywords can lead to a requalification of the host's legal status.

Indeed, according to a ruling handed down by the Court of Justice of the European Union on March 23, 2010 (Google France, joined cases C-236/08 to C-238/08), a host loses the benefit of its status once it plays an active role in processing the data, for instance by selecting or prioritizing certain content.

Thus, in order to retain their status as hosts in the legal sense, providers must ensure that their GCU do not impose obligations that could lead to their being treated as content publishers, the latter being fully liable for the content published on their services.

Furthermore, one may legitimately wonder about the true intentions of certain hosts when they undertake to monitor and moderate certain content: is it really a matter of guaranteeing that no content contrary to public order or morality circulates on their online services? To illustrate the point, here is an excerpt from the Google Drive Terms of Service that is sure to amuse careful readers of such documents:
