Potency of the European Union's Code of Practice on Disinformation

Questions concerning the potency of the European Union's Code of Practice on Disinformation.

In this post I will discuss the European Union's Code of Practice on Disinformation (hereafter referred to as the Code of Practice) and how it relates to what are termed Very Large Online Platforms (VLOPs), such as Twitter. In the last month or so there has been much finger-pointing and moral indignation on the part of the mainstream media with respect to Twitter and its alleged disregard for the Code of Practice. Much of this narrative echoes the European Union's (EU) rhetoric concerning the consequences for Twitter if it does not comply with the voluntary Code of Practice, within the context of the European Parliament's (EP) enactment of the Digital Services Act (DSA).

I will attempt to untangle the rhetoric and apparent dichotomy in the narrative, and answer the question of how enforceable the rules enacted under the DSA are in practice.

I will begin by describing how an influential, publicly funded mainstream media news outlet conveys this narrative to its readers. I am referring to the British Broadcasting Corporation (BBC). Reading the article discussed below, one could be left with the impression that Twitter is being confrontational in its defiance of legal commitments. I intend to test the veracity of this narrative by showing that the Code of Practice allows a degree of flexibility for VLOPs such as Twitter.

On 27 May 2023, BBC News published an article written by Francesca Gillett entitled: Twitter pulls out of voluntary EU disinformation code.

Ms. Gillett opens her commentary by reporting on the adversarial tone taken by the EU with respect to Twitter:

“Twitter has pulled out of the European Union's voluntary code to fight disinformation, the EU has said. Thierry Breton, who is the EU's internal market commissioner, announced the news on Twitter - but warned the firm new laws would force compliance. ’Obligations remain. You can run but you can't hide,’ he said.”

From this we can see that Mr. Breton immediately sows an element of confusion by emphasising both the voluntary and the obligatory nature of the Code of Practice. Ms. Gillett proceeds to quote Mr. Breton further in his warning to Twitter:

“Twitter will be legally required to fight disinformation in the EU from 25 August, he said, adding: ‘Our teams will be ready for enforcement.’”

In the above, Mr. Breton asserts that a legal requirement will be imposed on Twitter. So what we have here, according to Ms. Gillett, is a voluntary code that will, on Mr. Breton's account, result in enforced compliance under the Code of Practice.

So, how does Ms. Gillett reconcile this apparent dissonance in language? I will label it the voluntary-enforceability dissonance.

Ms. Gillett writes that many information technology corporations, big and small, such as Meta, TikTok, Google, Microsoft and Twitter, had signed up to the Code of Practice.

In regard to this I refer you to the European Commission (EC) publication of June 16, 2022 entitled: 2022 Strengthened Code of Practice on Disinformation.

In her article, Ms. Gillett summarises the intent of the 2022 Strengthened Code of Practice on Disinformation. She writes:

“The code was launched in June last year, and aims to prevent profiteering from disinformation and fake news, as well as increasing transparency and curbing the spread of bots and fake accounts.”

In the following comment Ms. Gillett writes that it is up to the firms concerned to choose which pledges under the Code of Practice to act upon:

“Firms that sign the code can decide which pledges to make, such as cooperating with fact-checkers or tracking political advertising.”

At this point Ms. Gillett does not resolve the voluntary-enforceability dissonance. Twitter does seem to have a certain degree of latitude in terms of the pledges it adopts. Once a pledge is made, however, we need to determine whether it is enforceable.

Instead of helping us ascertain this enforceability, Ms. Gillett goes on to paint a negative narrative of Twitter's behaviour under Elon Musk's ownership. Concerning Twitter's actions around the moderation of content, Ms. Gillett writes:

“Under Elon Musk's ownership, moderation at Twitter has reportedly been greatly reduced - which critics say has allowed an increase in the spread of disinformation.”

As evidence of Twitter seemingly fudging its pledge to combat disinformation, Ms. Gillett informs us:

“The social media giant used to have a dedicated team that worked to combat coordinated disinformation campaigns, but experts and former Twitter employees say the majority of these specialists resigned or were laid off.”

Furthermore, Ms. Gillett informs us that the BBC diligently did its own check in April 2023, which revealed that the:

 “….  BBC found hundreds of Russian and Chinese state propaganda accounts were thriving on Twitter.”

Ms. Gillett bases these comments on a previous BBC analysis entitled: Twitter staff cuts leave Russian trolls unchecked, authored by Grigor Atanesian of the BBC's Global Disinformation Team.

From all these comments we see that Twitter is accused of reducing its efforts in the area of content moderation. Ms. Gillett does not, however, say that Twitter is breaking its pledge; Twitter is not literally stopping all its efforts to moderate content.

Indeed, Ms. Gillett informs us that Elon Musk defends himself by saying that the level of misinformation has dropped since he took ownership of Twitter in October 2022. So, we see that Twitter is applying a level of discretion in the area of countering misinformation.

Ms. Gillett then goes on to inform us about how the code will be enforced:

“Alongside the voluntary code, the EU has also brought in a Digital Services Act - a law which obliges firms to do more to tackle illegal online content.”

So, the voluntary-enforceability dissonance may be resolved by the European Parliament's Digital Services Act. The operative word used is “obliges”. Perhaps this Act, which came into force on November 16, 2022, resolves the dichotomy.

Here is a link to the EP's briefing paper on the DSA, in PDF form, entitled: Digital services act (europa.eu).

So, what enforceable obligations is Ms. Gillett referring to? She writes:

“From 25 August, platforms with more than 45 million monthly active users in the EU - which includes Twitter - will have to comply legally with the rules under the DSA.”

So, the obligation takes the form of legal compliance with the DSA rules. I will try to unpack the substance of this legal compliance.

Ms. Gillett's comment is based on a Press Release entitled: Digital Services Act: Commission designates first set of Very Large Online Platforms and Search Engines, published on April 25, 2023.

The Press Release opens with the following proclamation:

“Today, the Commission adopted the first designation decisions under the Digital Services Act (DSA), designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users.”

Twitter is listed as one of the VLOPs. On the basis of this, Ms. Gillett writes:

“The law will mean Twitter will have to have a mechanism for users to flag illegal content, act upon notifications "expeditiously" and put in measures to address the spread of disinformation.”

What is meant by “…. Twitter will have to have a mechanism…”? What is the basis of this compulsion?

Referencing the AFP news agency, Ms. Gillett writes that this agency had, the previous Friday, quoted a European Commission official as saying: “If (Elon Musk) doesn't take the code seriously, then it's better that he quits.”

What is behind this threat? What would happen if Mr. Musk does not take the code seriously? What happens next for Twitter after August 25, 2023? Can full compliance, as opposed to the minimal efforts of which Twitter is accused, be enforced? And what body will determine whether full compliance is being actioned?

At this point, can it be argued that the voluntary-enforceability dissonance has been resolved?

Let's have a closer look at what the EC has to say about the DSA's enforcement. I refer you to a Questions and Answers press briefing, Questions and Answers: Digital Services Act (europa.eu), posted by the EC on April 25, 2023.

The first question, amongst the many referenced on this web page, is:

“What is the Digital Services Act?”

The EC writes: 

“The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others.”

Regarding these obligations the EC informs us:

“The Digital Services Act is a regulation that is directly applicable across the EU.”

The list of obligations that these intermediaries face includes many worthwhile commitments, among them improved consumer services, new user rights and increased transparency regarding advertising.

Concerning VLOPs the EC declares:

“Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures. Platforms must mitigate against risks such as disinformation or election manipulation, cyber violence against women, or harms to minors online. These measures must be carefully balanced against restrictions of freedom of expression, and are subject to independent audit.”

The use of the phrases “must mitigate” and “carefully balanced” in reference to disinformation, election manipulation and the other harms listed is interesting. It is an acknowledgement that these kinds of risks cannot be totally eliminated, and that decisions must take free speech into account. The reference to independent audits of the risk management measures themselves is also an admission that the whole process surrounding abuse detection not only requires a nuanced approach but also affords VLOPs a substantial degree of discretion.

There are other obligations on the list, including the following one, which sets out the EC's and EU's supervision and enforcement role with respect to VLOPs:

“A unique oversight structure. The Commission is the primary regulator for very large online platforms and very large online search engines (reaching 45 million users), while other platforms and search engines will be under the supervision of Member States where they are established. The Commission will have enforcement powers similar to those it has under anti-trust proceedings. An EU-wide cooperation mechanism will be established between national regulators and the Commission”

The EC's enforcement power is specified as being similar to that exercised under EU anti-trust proceedings. What the similarities are, and how this cooperation mechanism will work, is not spelt out. Here is a link to a web page, Antitrust (europa.eu), that outlines the EU's antitrust laws.

With respect to the following question:

“Does the Digital Services Act define what is illegal online?”

According to the EC itself, the answer is no. In this regard, the EC writes that the new rules only:

“…. set out EU-wide rules that cover detection, flagging and removal of illegal content, as well as a new risk assessment framework for very large online platforms and search engines on how illegal content spreads on their service.”

As to what constitutes illegal content, the EC explains that this is defined:

“…. in other laws either at EU level or at national level – for example terrorist content or child sexual abuse material or illegal hate speech is defined at EU level. Where a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.”

Another question on this web page is:

“What penalties will businesses face if they do not comply with the new rules?”

In response to this question the EC explains:

A “new enforcement mechanism” has been put in place that is based on “…. national and EU-level cooperation…. “.  This mechanism will “…. supervise how online intermediaries adapt their systems to the new requirements.”

Under the enforcement mechanism each EU member state “…. will need to appoint a Digital Services Coordinator, an independent authority which will be responsible for supervising the intermediary services established in their Member State and/or for coordinating with specialist sectoral authorities.”

Each nationally based Digital Services Coordinator (DSC) “…. will impose penalties, including financial fines.”

With respect to the penalties themselves each EU member state will be required to “…. clearly specify the penalties in their national laws in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive to ensure compliance.”

The degree to which the member states have done this is not clear, which intimates that the legislative work to define the penalties within the member states is not yet complete.

Apart from the EU member-state-based DSCs, the EC reserves an enforcement prerogative with respect to VLOPs and very large online search engines (VLOSEs). Moreover, at the EC level the penalties are clearly specified:

“…. the Commission will have direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of the global turnover of a service provider.”

The enforcement mechanism is not limited to fines: the Digital Services Coordinators and the Commission will have the power to require immediate action where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.

On June 7, 2023, RTE published an article by Ethan Shattock (Maynooth University) entitled: What are Twitter's legal obligations on stopping disinformation?, which sheds some light on the DSA enforcement matter.

Mr. Shattock summarises the “you can run but you can't hide” stance taken towards Twitter by the EU internal market commissioner, Thierry Breton:

“Highlighting legal requirements which come into effect on August 25th, the Commissioner’s message was clear. Large social media platforms will not have voluntary discretion to tackle—or refuse to tackle—the spread of disinformation under new EU law. This poses timely questions surrounding what Twitter’s departure from the EU Code of Practice on Disinformation means and the platform’s obligations under the Digital Services Act which have potential relevance in this area.”

In response to this tough language ruling out voluntary discretion, Mr. Shattock asks: “Does the Digital Service Act give Twitter any wiggle room?” He points out that there is an element of ambiguity in the DSA regarding VLOP obligations:

“While Twitter's obligations to combat 'systemic risks’ are intended to include problems like disinformation, the true extent of these obligations under the EU’s Digital Services Act remains unclear.”

Mr. Shattock continues by pointing out that the measures to be adopted by VLOPs are not directly specified:

“Importantly, this law does not directly specify that Very Large Online Platforms must adopt measures to remove disinformation as part of their obligations to combat systemic risks.”

The DSA grants an element of discretion to VLOPs in how they determine the suitable actions needed to fight the risks associated with disinformation. These measures could range from complex changes to how the platforms operate to doing very little. Mr. Shattock writes:

“Digital Services Act provides discretion for platforms such as Twitter to decide on appropriate measures to combat risks such as disinformation. Such measures under this legislation may simply include limiting advertisements, but could also involve considerable modifications to how platforms control what content their users are exposed to.”

Mr. Shattock points out that even though the DSA allows a VLOP a level of discretion, it does provide the EC with the capability to oversee whatever approaches Twitter ends up adopting. This would be done through the appointment of independent auditors, whose role will be to evaluate VLOPs' compliance with their legal obligations to mitigate risk and to report on the kinds of measures they are taking to safeguard platform users from disinformation.

However, Mr. Shattock also notes that the DSA leaves Twitter an amount of freedom in the measures it takes towards effectively defining and countering the systemic risks of disinformation.

Mr. Shattock concludes his article with the following observation:

“Going forward, the role of the European Commission and independent auditors in assessing compliance with the Digital Services Act will be crucial in providing a clear picture of how platforms such as Twitter are tackling—or failing to tackle—systemic risks such as disinformation.”

What the current debate in the media has not entirely resolved is the voluntary-enforceability dissonance. The fiery rhetoric of Mr. Breton, echoed by Ms. Gillett, draws our attention away from the subtlety inherent in the Code of Practice and the DSA with respect to the obligations imposed on VLOPs.

Regarding how this debate unfolds, and the degree to which the regulatory aspects are structured and operate in practice, we will have to wait and see what happens after August 25, 2023.

References

The Digital Services Act package | Shaping Europe’s digital future (europa.eu)
