The European Commission has integrated the revised Code of conduct on countering illegal hate speech online (Code of conduct+) into the Digital Services Act (DSA).
This Code of conduct+ strengthens the conditions applicable to signatory online platforms regarding how they deal with content deemed illegal hate speech according to EU law and Member States’ laws.
Following this integration into the DSA, these online platforms can be encouraged to adopt complementary measures if regular assessments show that they systematically fail to comply with the Code of conduct+. Such measures can be defined by the European Commission and the DSA Board, in which the BIPT participates as Digital Services Coordinator (DSC) together with representatives of the other Belgian competent authorities.
Why this revision?
A revision was necessary to strengthen the performance indicators, including the processing timelines. It was also necessary to facilitate the integration of the Code of conduct+ into the DSA, taking into account the objectives and particularities of the Act, for example the articulation of the role of the Monitoring Reporters with that of the Trusted Flaggers.
What is the relationship between the Code of conduct+ and the DSA?
The Code of conduct+ contributes to the proper application of the DSA by its signatories without affecting the precedence of the obligations the DSA imposes on all providers of intermediary services, including online platforms, whether they are signatories or not. There is thus a clear hierarchy between the DSA, a regulatory instrument directly applicable in all Member States, and the Code, which sets out voluntary commitments to which only its signatories subscribe.
Who signed the Code of conduct+?
Dailymotion, Facebook, Instagram, Jeuxvideo.com, LinkedIn, Microsoft-hosted consumer services, Snapchat, Rakuten Viber, TikTok, Twitch, X and YouTube.
Does the Code of conduct+ apply to platforms other than the signatories?
When designated by the European Commission pursuant to the DSA, providers of very large online platforms (more than 45 million users in the EU) may adhere to the Code on a voluntary basis. Smaller online platforms are also free to make the same commitments. However, only very large online platforms will be subject to annual audits conducted in accordance with the DSA and may be asked by the Commission and the Board to implement additional measures following these audits.
What are the commitments made by the online platforms?
The commitments made by the online platforms under the Code of conduct+ are mainly:
- Authorise Monitoring Reporters to carry out regular monitoring of their processing of hate speech notices;
- Apply their best efforts to review at least two thirds of these notices within 24 hours;
- Carry out precise and specific actions to ensure transparency regarding the measures taken to reduce the volume of hate speech disseminated via their services;
- Cooperate with various stakeholders such as experts and civil society organisations;
- Raise awareness among users regarding illegal hate speech.
Who can become a Monitoring Reporter?
Not-for-profit or public entities with expertise on illegal hate speech in at least one EU Member State. Trusted Flaggers designated under the DSA may also become Monitoring Reporters.
What is the difference between Monitoring Reporters and Trusted Flaggers?
The status of “Trusted Flagger” is awarded, on request, by the national Digital Services Coordinators to organisations that meet strict criteria of expertise, independence and objectivity, as defined in Article 22 of the DSA. This is a long-term role, which ends either when the Trusted Flagger decides to relinquish it or when the national Digital Services Coordinator revokes it. Online platforms are required to process their notices as a priority so that decisions can be taken as quickly as possible. Notices from Trusted Flaggers may relate to any content falling within their recognised expertise which they consider to be illegal.
The status of “Monitoring Reporter” is awarded by the Commission and the signatories to entities with expertise in hate speech in at least one Member State. Monitoring Reporters take part in a monitoring exercise lasting a maximum of six weeks per year. This is a short-term role. Their notices only target content inciting hatred. The signatory online platforms will apply their best efforts to review at least two thirds of these notices within 24 hours of receipt.
The two roles are complementary and can be combined if all the criteria of Article 22 of the DSA are met.
At what level are the competent Belgian authorities involved?
As the national Digital Services Coordinator and a competent authority, the BIPT participates, together with representatives of the other Belgian competent authorities, in the DSA Board chaired by the European Commission. In this capacity, the Belgian competent authorities take part in the assessment of codes of conduct, in this instance a code that should help mitigate the risks of dissemination of illegal hate speech. Moreover, they contribute to defining, together with the Commission and the other members of the Board, the measures that online platforms would be encouraged to adopt in the event of systematic non-compliance with the Code of conduct+.
Please note that, at national level, all Belgian competent authorities may receive complaints regarding infringements of the DSA, for example a failure by an online platform to follow up on an order or notice, including those relating to incitement to hatred.
However, all complaints regarding platforms established outside Belgium will be forwarded to the Digital Services Coordinator of the Member State of establishment, accompanied by an opinion where appropriate.