Monitoring algorithms: is it in the host's interest to go beyond the law?
Recent case law has clarified the contours and limits of the absence of any general monitoring obligation on hosts. Since the European e-commerce directive of June 8, 2000, hosts have enjoyed a liability exemption with regard to the monitoring of hosted content.
Recognized as mere technical conduits of information, hosts cannot be held liable for the illicit content they store.
However, this is on condition that they had no knowledge of the content's illicit nature or, once they gained such knowledge, that they acted promptly to remove it.
This principle was transposed into domestic law in Article 6 of the Law for Confidence in the Digital Economy of June 21, 2004 (the so-called LCEN Law), which confirms the absence of a general obligation to monitor hosted content.
This exemption rests on a fundamental principle: hosts should bear a responsibility proportionate to the means they actually have for monitoring content.
However, additional obligations were added by the law of August 16, 2022 on the dissemination of terrorist content online, which introduced an injunction procedure requiring the removal of terrorist content from the Internet within one hour.

The evolution of the digital landscape has prompted European legislators to adjust the legal framework, while retaining the basic principles. For example, the European DSA Regulation of October 19, 2022 reaffirms the absence of a general obligation on intermediary service providers to monitor (Article 8). Indeed, the regulation provides for a regime of non-liability for the various types of intermediary services, under certain conditions:
- The “mere conduit” service provider is not liable for the information transmitted, provided that it does not initiate the transmission, does not select the recipient, and does not select or modify the transmitted data.
- The provider of caching services also benefits from this exemption for the automatic, intermediate and temporary storage of information, provided that it does not modify the content, respects the conditions of access, complies with the updating rules commonly recognized in the sector, does not interfere with the lawful use of standard technologies, and acts promptly to remove the data or make it inaccessible as soon as it becomes aware of its removal at the initial source or of a judicial or administrative order to that effect.
- Finally, the hosting service provider is not liable for content stored at a user's request, as long as it has no actual knowledge of its illicit nature or, as soon as it is informed, promptly removes it or blocks access to it (a minimal sketch of this notice-and-takedown logic follows this list).
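To make the hosting regime concrete, here is a minimal, purely hypothetical Python sketch of the notice-and-takedown logic the third condition implies: the host is only exposed to liability once it has actual knowledge of the illicit content and fails to act promptly. All names are illustrative, and the 24-hour "promptness" window is an assumption for the example, not a figure drawn from the DSA.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch of the hosting exemption logic: the host is only
# exposed to liability once it has actual knowledge of the illicit
# content and fails to act promptly. All names are illustrative.

@dataclass
class HostedContent:
    content_id: str
    notified_at: Optional[datetime] = None  # when the host gained actual knowledge
    removed_at: Optional[datetime] = None   # when access was removed or blocked

# Illustrative "promptness" window: the DSA sets no fixed deadline for
# ordinary illicit content (unlike the one-hour rule for terrorist content).
PROMPT_WINDOW = timedelta(hours=24)

def exemption_applies(item: HostedContent, now: datetime) -> bool:
    """Return True if the host can still rely on the liability exemption."""
    if item.notified_at is None:
        return True   # no actual knowledge: exemption applies
    deadline = item.notified_at + PROMPT_WINDOW
    if item.removed_at is not None and item.removed_at <= deadline:
        return True   # acted promptly after notice: exemption preserved
    return now <= deadline  # still within the window to act
```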
As a result of this change in the regulatory context and the sharp increase in contentious content, hosts tend to protect themselves further by adapting the contractual terms that frame their service offering, strengthening content monitoring and moderation on their platforms, often beyond the minimum legal obligations.

Reinforcement of the host's obligations to monitor and remove illegal content through the general conditions of use (GCU)
Contractual freedom, a fundamental principle of the law of obligations, allows parties' commitments to be tailored to their specific needs, including in the field of online hosting. The Court of Cassation recently affirmed, in a ruling dated January 15, 2025 (Cass. com., Jan. 15, 2025, no. 23-14.625), that “Article 6, I, of the law of June 21, 2004 has neither the object nor the effect of preventing the parties to a contract from agreeing that the hosting provider will be subject to an obligation to monitor the content it stores or publishes, nor of prohibiting the provision of a contractual sanction, such as termination, in the event of failure to meet this obligation”.
In this case, the clause stipulated that the host undertook not to host illicit content, particularly that infringing intellectual property rights, and authorized its co-contractor to suspend or terminate the service in the event of an alert concerning such content.
Even though this obligation was not expressly characterized as a monitoring obligation, its effect was the same: the host had to implement sufficient technical means to prevent the presence of illicit content.
However, neither the law of June 21, 2004 nor Regulation (EU) 2022/2065 (DSA) impose such a proactive detection obligation on hosting providers. This clause therefore represented a voluntary contractual arrangement, legally admissible, as long as the obligation it institutes remains materially feasible.
Along the same lines, in a ruling dated September 4, 2024 (Cass. com., Sept. 4, 2024, no. 22-12.321), the Court of Cassation had already accepted the validity of a clause in a hosting provider's general terms and conditions providing for the termination without notice of a paid referencing service in the event of a serious or repeated breach by the co-contractor.
Although the clause gave the platform unilateral decision-making power, the judges found that it did not create a significant imbalance within the meaning of Article L. 442-1, 2° of the French Commercial Code. Indeed, this clause enabled the platform to exercise a contractual right of termination in order to ensure compliance with the requirements arising from the regulation of digital services.

The central role of the GCU in the presentation and implementation of hosting providers' service offerings
The DSA requires hosts to act promptly to prevent the dissemination of illicit content, as soon as they become aware of it. In addition, the same regulation imposes an obligation on hosts to be transparent about the rules governing the use of their services.
In line with this logic, Recital 45 of the DSA specifies that hosting providers must set out in a clear and up-to-date manner, within their GCU, the reasons likely to justify a restriction on access to their services.
This includes content moderation policies, the types of possible sanctions, and the grounds that may lead to their application.
As a result, the general terms and conditions must indicate the measures the host implements to comply with this obligation to prevent the dissemination of illicit content. In any event, the transparency expected is intended to ensure that users are duly informed of their rights and obligations, as well as of the potential consequences of their actions on the online service.
This requirement is all the more important given that, in practice, the rigorous application of the GCU can produce particularly far-reaching effects, such as the deletion of a user's account, for example.
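By way of illustration only, the traceability this transparency requirement implies could be modeled as a simple record associating each sanction with the GCU clause and ground relied upon. The following Python sketch is purely hypothetical; the sanction types, field names, and clause reference are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

# Hypothetical record of a restriction decision, of the kind a host could
# keep to document that a sanction rests on a ground announced in its GCU.
# All field and clause names are invented for the example.

class Sanction(Enum):
    CONTENT_REMOVAL = "content_removal"
    ACCOUNT_SUSPENSION = "account_suspension"
    ACCOUNT_TERMINATION = "account_termination"

@dataclass
class RestrictionDecision:
    user_id: str
    sanction: Sanction
    gcu_clause: str      # the GCU clause relied upon
    ground: str          # the factual ground for the restriction
    decided_at: datetime

decision = RestrictionDecision(
    user_id="user-123",
    sanction=Sanction.ACCOUNT_SUSPENSION,
    gcu_clause="GCU art. 12.3 - illicit content",
    ground="automated detection of prohibited material",
    decided_at=datetime.now(),
)
```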
An example is provided by a recent case decided by the Paris Court of Appeal (CA Paris, Jan. 24, 2025, no. 21/10238).
A lawyer found his Google account suspended, along with his access to Google Drive, after the host detected child pornography files in a folder relating to a criminal case he was handling in the course of his professional activity.
Despite the protests of the lawyer, who argued that holding these files was legitimate in the context of his defense duties, the judges found that Google had committed no fault. The company had merely applied its GCU, which expressly provided for the deactivation of accounts where illicit content is detected, without being required to assess the context or any justification for holding the files.
Faced with the multiplication of contentious content online and the growing number of user reports, hosts often choose to strengthen their own means of detecting illicit content and to have users accept these measures through the GCU.
This gives them additional leeway to block problematic user accounts or remove fraudulent ads. These actions can be carried out using detection and moderation algorithms, as sketched below.
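As a rough illustration of the kind of automated flagging just mentioned, here is a deliberately simplistic, hypothetical keyword-based filter in Python. Real moderation pipelines rely on far more sophisticated techniques (hash matching, machine-learning classifiers) combined with human review; the keywords and names below are invented for the example.

```python
import re

# Deliberately simplistic, hypothetical keyword-based moderation filter.
# Real platforms combine hash matching, machine-learning classifiers and
# human review; this only illustrates the automated flagging discussed above.

BLOCKED_PATTERNS = [
    re.compile(r"\bcounterfeit\b", re.IGNORECASE),           # illustrative keyword
    re.compile(r"\bterrorist\s+propaganda\b", re.IGNORECASE),
]

def flag_for_review(text: str) -> list[str]:
    """Return the patterns matched, for escalation to a human moderator."""
    return [p.pattern for p in BLOCKED_PATTERNS if p.search(text)]

# Content whose flag list is non-empty is queued for human review rather
# than removed automatically, to limit the risk of over-blocking.
if flag_for_review("listing for counterfeit handbags"):
    print("queued for human review")
```

Note that this sketch only flags content for human review rather than removing it automatically: as discussed below, an overly active automated curation of content can put the host's protected legal status at risk.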
Hosts exercise this control in a measured way, consistent with public order and morality. This is the case with Google, which reserves the right to remove all or part of any content of a terrorist or child pornographic nature, content facilitating human trafficking, or content inciting harassment.
However, despite such precautions, the use of algorithms that detect specific keywords can lead to a requalification of the host's legal status.
Indeed, according to a ruling handed down by the Court of Justice of the European Union on March 23, 2010, a host loses the benefit of this status once it plays an active role in processing the data, for example by selecting or prioritizing certain content.
Thus, in order to retain their status as hosts in the legal sense, hosts must ensure that their GCU do not impose obligations that could lead them to be equated with content publishers, the latter being fully responsible for the content published on their services.
Furthermore, we can legitimately wonder about the true intentions of certain hosts when they undertake to monitor and moderate certain content: is it really a question of guaranteeing that no content contrary to public order or morality circulates on their online services? To illustrate the point, here is an excerpt from the Google Drive Terms of Service that is sure to amuse careful readers of such documents:
