Special Report
09 2021

Disinformation affecting the EU: tackled but not tamed

About the report: Disinformation is a serious concern in organised societies. Social media and new technologies have increased the scale and speed with which false or misleading information can reach its audiences, whether intended or unintended. The EU action plan against disinformation was relevant at the time it was drawn up but incomplete. Even though its implementation is broadly on track and there is evidence of positive developments, it has not delivered all its intended results. We make recommendations to improve the coordination and accountability of EU actions against disinformation and the operational arrangements of the European External Action Service’s strategic communications division and its task forces. We recommend increasing Member States’ involvement in the rapid alert system and improving the monitoring and accountability of online platforms. We also point to the need for an EU media literacy strategy that includes combatting disinformation, and for steps to enable the European Digital Media Observatory to achieve its ambitious objectives.
ECA special report pursuant to Article 287(4), second subparagraph, TFEU.

This publication is available in 23 languages and in the following format: PDF

Executive summary

I

Disinformation has been present in human communication since the dawn of civilisation and the creation of organised societies. What has changed in recent years, however, is its sheer scale and the speed with which false or misleading information can reach its intended and unintended audiences through social media and new technologies. This may cause public harm.

II

The European Council, in its conclusions of 28 June 2018, invited the EU’s High Representative for Foreign Affairs and Security Policy and the Commission to present an action plan with specific proposals for a coordinated response to disinformation. The EU action plan against disinformation, presented on 5 December 2018, includes ten specific actions based on four priority areas or ‘pillars’, and consolidates the EU’s efforts to tackle disinformation. EU spending on tackling disinformation has been relatively low to date: €50 million between 2015 and 2020.

III

The purpose of the audit was to assess whether the EU action plan against disinformation was relevant when drawn up and delivering its intended results. The audit covered the period from the run-up to the adoption of the EU action plan against disinformation in December 2018 until September 2020. This report is the first comprehensive, independent assessment of its relevance and the results achieved. Overall, we conclude that the EU action plan was relevant but incomplete, and even though its implementation is broadly on track and there is evidence of positive developments, some results have not been delivered as intended.

IV

The EU action plan contains relevant, proactive and reactive measures to fight disinformation. However, even though disinformation tactics, actors and technology are constantly evolving, the EU action plan has not been updated since it was presented in 2018. It does not include comprehensive arrangements to ensure that any EU response against disinformation is well coordinated, effective and proportionate to the type and scale of the threat. Additionally, there was no monitoring, evaluation and reporting framework accompanying the EU action plan, which undermines accountability.

V

The European External Action Service’s three strategic communications task forces have improved the EU’s capacity to forecast and respond to disinformation in neighbouring countries. However, they are not adequately resourced or evaluated, and their mandates do not cover some emerging threats.

VI

The EUvsDisinfo project has been instrumental in raising awareness about Russian disinformation. However, the fact that it is hosted by the European External Action Service raises some questions about its independence and ultimate purpose, as it could be perceived as representing the EU’s official position. The rapid alert system has facilitated information sharing among Member States and EU institutions. Nevertheless, Member States are not using the system to its full potential for coordinating joint responses to disinformation and common action.

VII

With the code of practice, the Commission has established a pioneering framework for engagement with online platforms. We found that the code of practice fell short of its goal of holding online platforms accountable for their actions and their role in actively tackling disinformation.

VIII

The report also highlights the absence of a media literacy strategy that includes tackling disinformation, and the fragmentation of policy and actions to increase capacity to access, understand and interact with media and communications. Finally, we found there was a risk that the newly created European Digital Media Observatory would not achieve its objectives.

IX

Based on these conclusions, we recommend that the European External Action Service and the Commission:

  • improve the coordination and accountability of EU actions against disinformation (European External Action Service and Commission);
  • improve the operational arrangements of the StratCom division and its task forces (European External Action Service);
  • increase participation in the rapid alert system by Member States and online platforms (European External Action Service);
  • improve the monitoring and accountability of online platforms (Commission);
  • adopt an EU media literacy strategy that includes tackling disinformation (Commission);
  • take steps to enable the European Digital Media Observatory to fulfil its ambitious objectives (Commission).

Introduction

01

The European Commission1 defines “disinformation” as the “creation, presentation and dissemination of verifiably false or misleading information for the purposes of economic gain or intentionally deceiving the public, which may cause public harm”. Such public harm includes threats to democratic political and policy-making processes as well as to the protection of EU citizens' health, the environment or security.

02

The Commission’s definition of disinformation excludes misleading advertising, reporting errors, satire and parody, or clearly identified partisan news and commentary. Unlike hate speech or terrorist material, for example, false or misleading information is not illegal on its own.

03

The EU’s legitimacy and purpose rest on a democratic foundation, which depends on an informed electorate expressing its democratic will through free and fair elections. Any attempt to maliciously and intentionally undermine or manipulate public opinion therefore represents a grave threat to the EU itself. At the same time, combatting disinformation represents a major challenge, as it should not impair the freedom of opinion and expression enshrined in the EU Charter of Fundamental Rights.

04

The term “disinformation” emerged in the early 20th century and has been used extensively ever since. In recent years, the internet has amplified the scale and speed with which false information is reaching audiences, often anonymously and at minimal cost.

05

EU efforts to combat disinformation date back to March 2015, when the European Council2 invited the EU’s High Representative for Foreign Affairs and Security Policy (the “High Representative”), in cooperation with the Member States and EU institutions, “to develop an action plan on strategic communication to address Russia’s ongoing disinformation campaigns”. This led to the creation, within the European External Action Service (EEAS), of the strategic communications division (“StratCom”) and the first of its task forces – the East strategic communications task force – with a mandate to counter disinformation originating outside the EU (from Russia) and to design and disseminate positive strategic communications in the EU’s eastern neighbourhood. In 2017, two more StratCom task forces were created: one for the southern neighbourhood and another for the Western Balkans (see also paragraphs 45-49).

06

In late 2017, the Commission set up a high-level expert group to offer concrete advice on tackling disinformation. The group delivered its report in March 20183, which formed the basis for the Commission’s “Communication on tackling online disinformation: a European approach”4 (April 2018). This Communication outlined key overarching principles and objectives to guide action to raise public awareness about disinformation, as well as the specific measures the Commission intended to take.

07

The European Council, in its conclusions of 28 June 20185, invited the EU’s High Representative for Foreign Affairs and Security Policy and the Commission to “present an action plan by December 2018 with specific proposals for a coordinated EU response to the challenge of disinformation, including appropriate mandates and sufficient resources for the relevant EEAS Strategic Communications teams”.

08

Based on the April 2018 communication, the Commission published an EU action plan against disinformation in December 2018 (hereinafter: the “EU action plan”). It sets out ten specific actions based on four priority areas or ‘pillars’ targeting society as a whole, as shown in Table 1 below.

Table 1

The pillars and actions of the EU action plan against disinformation

I. Improving the capabilities of EU institutions to detect, analyse and expose disinformation
  (1) Strengthen the strategic communications task forces and EU delegations with additional resources (human and financial) to detect, analyse and expose disinformation activities
  (2) Review the mandates of the South and Western Balkans strategic communications task forces

II. Strengthening coordinated and joint responses to disinformation
  (3) By March 2019, establish a rapid alert system among Member States and EU institutions that works closely with other existing networks (such as NATO and the G7)
  (4) Step up communication before the 2019 European Parliament elections
  (5) Strengthen strategic communications in the Neighbourhood

III. Mobilising the private sector to tackle disinformation
  (6) Closely and continuously monitor the implementation of the code of practice to tackle disinformation, including pushing for rapid and effective compliance, with a comprehensive assessment after 12 months

IV. Raising awareness and improving societal resilience
  (7) With Member States, organise targeted campaigns to raise awareness of the negative effects of disinformation, and support the work of independent media and quality journalism
  (8) Member States should support the creation of teams of multi-disciplinary independent fact-checkers and researchers to detect and expose disinformation campaigns
  (9) Promote media literacy, including through Media Literacy Week (March 2019) and rapid implementation of the relevant provisions of the Audiovisual Media Services Directive
  (10) Ensure effective follow-up of the Elections Package, notably the Recommendation, including monitoring by the Commission of its implementation

Source: ECA, based on the EU action plan.

09

There is no EU legal framework governing disinformation apart from Article 11 of the Charter of Fundamental Rights, on freedom of expression and information, and a series of policy initiatives. Responsibility for combatting disinformation lies primarily with the Member States6. The EU's role is to support the Member States with a common vision and actions aimed at strengthening coordination, communication and the adoption of good practices. Annex I presents the main departments and offices of the EU institutions involved in the implementation of the EU action plan. As Annex II shows, EU spending on tackling disinformation has been relatively low to date: €50 million between 2015 and 2020.

10

In December 20197, the Council confirmed that the EU action plan “remains at the heart of the EU’s efforts” to combat disinformation, and called for it to be reviewed regularly and updated where necessary. It also invited the EEAS to reinforce its strategic communication work in other regions, including sub-Saharan Africa. The European Parliament has also on numerous occasions expressed the importance of strengthening efforts to combat disinformation8.

11

In early 2020, almost immediately following the COVID‑19 outbreak, an unprecedented wave of misinformation, disinformation and digital hoaxes appeared on the internet, which the World Health Organization described as an “infodemic”9. This posed a direct threat to public health and economic recovery. In June 2020, the European Commission and the High Representative published a communication entitled “Tackling COVID‑19 disinformation - Getting the facts right”10, which looked at the steps already taken and concrete actions to follow against disinformation regarding COVID‑19.

12

On 4 December 2020, the Commission presented the “European democracy action plan”11, part of which is dedicated to strengthening the fight against disinformation. It builds on existing initiatives in the EU action plan against disinformation. The Commission also issued a proposal for a Digital Services Act12, proposing a horizontal framework for regulatory oversight, accountability and transparency of the online space in response to emerging risks.

13

Figure 1 provides a timeline of the main EU initiatives since 2015.

Figure 1

Timeline of main EU initiatives against disinformation

Source: ECA.

Audit scope and approach

14

This audit report comes two years after the adoption of the EU action plan against disinformation. It is the first comprehensive, independent assessment of its relevance and the results achieved, thereby contributing to the regular review of the EU action plan requested by the Council.

15

The aim of our audit was to ascertain whether the EU action plan against disinformation is relevant and delivering its intended results. To answer this question, we addressed two sub-questions:

  • Is the EU action plan relevant for tackling disinformation and underpinned by a sound accountability framework?
  • Are the actions in the EU action plan being implemented as planned? To answer this sub-question, we assessed the status of the actions under each of the four pillars of the plan.

16

The audit covered the period from the run-up to the adoption of the EU action plan against disinformation in December 2018 until September 2020. Where possible, this report also refers to recent developments in this area after this date, such as the Commission’s presentation of the European democracy action plan and the proposal for a Digital Services Act in December 2020 (see paragraph 12). However, as these documents were published after we had completed our audit work, they are beyond the scope of this audit.

17

The audit included an extensive desk review and analysis of all available documentation on the structures put in place, and the actions planned and implemented through the EU action plan. We sent a survey to the 27 Member States’ Rapid Alert System contact points, with a response rate of 100 %. We also held meetings with numerous stakeholders such as the EEAS and relevant Commission directorates-general (DGs), the European Parliament, the Council, Commission representations, NATO, the NATO-affiliated Strategic Communication Centre of Excellence in Latvia, national authorities, online platforms, journalist and fact-checking organisations, audio-visual media services regulating bodies which provide advice to the Commission, academics and experts, project managers/coordinators and an external expert.

18

We also assessed 20 out of 23 projects the Commission identified as being directly related to fighting disinformation through media literacy. Annex III summarises our assessment of these projects.

Observations

The EU action plan against disinformation was relevant when drawn up, but incomplete

19

For this section, we examined whether the EU action plan had been relevant when first drawn up, i.e. whether it addressed the needs identified by experts and other stakeholders. We also assessed whether it had been reviewed and updated. We looked at the events and sources it was based on, and assessed whether it contains adequate coordination arrangements for communication and the elements necessary for measuring implementation performance and ensuring accountability.

The EU action plan was broadly consistent with experts’ and stakeholders’ views on disinformation

20

Fighting disinformation is a highly technical domain that requires input and expertise from a diverse range of professionals. Public consultation is also essential to establish the views and priorities of stakeholders and to understand the threat better.

21

We found that the Commission had relied on appropriate external expertise and undertaken a comprehensive public consultation13 as a basis for the EU action plan. The EU action plan largely addressed the suggestions and concerns expressed in these documents.

22

At the time the EU action plan was published in December 2018, it presented a structured approach to address issues requiring both reactive (debunking and reducing the visibility of disinformation content) and proactive longer-term efforts (media literacy and measures to improve societal resilience). It emphasised the goal of protecting the upcoming 2019 European elections, as well as long-term societal challenges requiring the involvement of many different actors.

23

Except in the area of media literacy, the Commission identified concrete actions to follow up the main recommendations from the report of the independent ‘High-level expert group on fake news and online disinformation’ (HLEG). The HLEG, consisting of 39 experts from different backgrounds, was set up in January 2018 by the Commission to advise on policy initiatives to counter disinformation online. Its report, together with the Commission’s April 2018 communication, formed the basis for the EU action plan.

24

Further evidence of the EU action plan’s relevance is that its actions sought to engage a broad array of key stakeholders in the area, including not only EU institutions and Member States, but also others such as the private sector, civil society, fact-checkers, journalists and academia.

The EEAS and the Commission did not establish clear coordination arrangements to implement the EU action plan

25

The EU action plan against disinformation was not accompanied by an overall coordination framework to ensure that any EU response is effective and proportionate to the type and scale of the threat. Defining and coordinating communication workflows, for example, would make it possible to identify when to work together and in partnership with local actors and civil society to raise awareness about disinformation threats.

26

A communication strategy ensures a coherent response when different actors are involved. Each of the EU action plan’s four pillars is under the responsibility of a different Commission DG or the EEAS. This creates the risk of them ‘working in silos’ (i.e. in parallel without cooperation or coordination) when it comes to communication, without any single body being in charge or having complete oversight of all communication to combat disinformation.

27

The Commission’s Directorate-General for Communication (DG COMM) is responsible for external communication for the institution. DG COMM’s management plan for 2019 recognises its role in tackling disinformation and highlights the need for cooperation among DGs and other institutions, referring in particular to the Directorate-General for Communications Networks, Content and Technology (DG CNECT) and the Joint Research Centre. However, there is no mention of the EEAS or the StratCom task forces, which are also very much involved in positive communication and fighting disinformation.

28

DG COMM has established an internal network against disinformation (IND). Its aims include improving coordination of communication activities on tackling disinformation, establishing an online repository of validated rebuttals, systematically detecting disinformation and coordinating the response, and promoting positive messaging. Eleven meetings took place between its inception in May 2018 and January 2020. These meetings are broadly inclusive, with representatives from many Commission services and representations, the EEAS and other institutions, as well as other experts. However, the meetings so far have only involved representatives sharing information on action taken, with no connection to policy-making and no evidence of concrete follow-up actions or decisions taken to make the IND an effective coordination mechanism.

29

DG COMM’s 2019 management plan had only one indicator related to disinformation (out of 102) and this only measured the number of IND meetings.

30

In addition, the Commission representations play a vital role in the Commission’s external communication through positive messaging and outreach, media briefings, debunking myths and fighting disinformation, and are expected to participate actively in the IND. Their myth-busting activities are listed on a dedicated page on each representation’s official website. These pages were often difficult to access because their location varied between representations: some (e.g. Greece and Spain) placed them under the news section, while others (e.g. Poland and Ireland) did not. They were also not updated regularly. Some pages provided limited, often anecdotal information, and no statistics were available on the number of visitors to these pages.

31

Finally, DG COMM was developing a central web-based disinformation hub, a centralised portal bringing together all aspects of the EU institutions’ work on disinformation. Its launch was due in early 2020, but the project was cancelled for reasons that remain unclear.

A piecemeal monitoring and reporting framework and the lack of long-term funding undermine the EU action plan’s accountability

32

To ensure accountability, an action plan needs clear objectives and time-bound actions, accompanied by a number of indicators for monitoring performance. Provisions for financing, regular reporting, evaluation and revisions are also essential elements of an action plan.

33

Some objectives in the EU action plan are worded generically (“step up”, “strengthen”), which does not lend itself to measurement. There are no overall KPIs for the EU action plan as a whole. In addition, half of the actions (actions 1, 2, 4, 5 and 8) have no KPIs and are either not clearly defined or not time-bound (see also Annex IV).

34

The timeframe of the actions varies between short- and long-term, and some are concrete and time-bound (e.g. “by March 2019, the Commission and the High Representative, in cooperation with Member States, will establish a Rapid Alert System”), while others are vague (e.g. “Member States should significantly strengthen their own communication efforts on Union values and policies”).

35

The EU action plan was not accompanied by a dedicated monitoring and evaluation framework (this applies also to the recently published European democracy action plan). There was no provision to evaluate the plan as a whole and there has been no overall evaluation to date. Feedback from its implementation in the Member States is not recorded centrally and not aggregated. Each representation carries out its own communication campaign and collects statistics, but we did not find any evidence of those statistics being used by the Commission for lessons learned, best practice, or as a baseline. There is no reporting beyond stating that some activities belong to the category of efforts against disinformation. For example, tools such as the Commission’s internal myth-busting wiki or the newsletter against disinformation are not monitored in terms of Member State engagement (e.g. using surveys, user statistics or indicators).

36

The Commission and the EEAS regularly report to various Council working parties and preparatory bodies on progress in implementing actions under specific pillars of the EU action plan. However, this reporting is not in the public domain and does not cover the whole EU action plan.

37

Although there have been separate reports on specific aspects of the EU action plan (an assessment of the code of practice and a Commission report on the EU elections), only one report on the implementation of the EU action plan as a whole has been published. This was on 14 June 2019, six months after the presentation of the EU action plan itself.

38

Although this first implementation report covered all pillars of the EU action plan, it has a number of shortcomings:

  • it does not provide any measure of performance;
  • except for the code of practice, reporting for each pillar is mostly in a general narrative form and there is no detailed reporting for each action;
  • there is no reporting annex on individual projects linked to the EU action plan;
  • there is no indication of when to expect the next implementation report.

39

Fighting disinformation is a constantly evolving area, which would merit regular reporting. The Joint Communication on tackling COVID‑19 disinformation recognised the need to develop regular reporting14.

40

The EU action plan lacks a dedicated financial plan covering the costs of all activities assigned to different entities. Financing comes from different sources and there are no provisions to secure funding in the long term, even though some of the events mentioned in the EU action plan are recurring. Annex II presents the budget allocated to the different actions to combat disinformation. It shows that the main source of funding is different every year and exposes a lack of financial planning (see also paragraphs 50-51). The Commission and the EEAS do not always earmark expenditure linked to fighting disinformation (it has no specific intervention code) – such information has been compiled only for this audit. Figure 2 gives an overview of all EU funding against disinformation from 2015 to 2020 (it does not include activities that contribute indirectly to combatting disinformation, namely the pro-active communications activities in the EU’s neighbourhood).

Figure 2

All EU funding against disinformation 2015-2020

Source: ECA, based on Commission and EEAS information.

41

Furthermore, the EU action plan has not yet been updated since it was presented in 2018. For example, some actions are linked only to the 2019 European elections or the 2019 Media Literacy Week, both of which have passed. Disinformation is constantly evolving. The tactics used, the technology behind disinformation campaigns, and the actors involved are all constantly changing15. The Council also highlighted the need to regularly review and update the EU action plan (see paragraph 10).

42

Although the COVID‑19 disinformation communication (June 2020), the European democracy action plan and the proposal for a digital services act extend certain actions originally set out in the EU action plan, they cannot be considered a comprehensive update thereof. Moreover, having actions pursuing similar objectives in different action plans and initiatives makes coordination more complex, increasing the risk of inefficiencies.

The implementation of the EU action plan is broadly on track, but has revealed a number of shortcomings

43

This section assesses the implementation of the actions under each of the four pillars of the EU action plan and the extent to which they have improved the way the EU tackles disinformation.

The strategic communications task forces play an important role but are not adequately staffed or funded to deal with emerging threats

44

Under pillar I of the EU action plan, we examined the EEAS StratCom task forces. We looked at their mandate and determined whether they were adequately staffed and sufficiently funded. In this context, we also examined the role and position of EUvsDisinfo, an EU flagship project against disinformation.

The StratCom task forces’ mandates do not adequately cover the full range of disinformation actors

45

Besides improving the EU’s capacity to forecast and respond to external disinformation activities (their mandates do not cover disinformation generated within the EU), the StratCom task forces have also contributed greatly to effective communication and the promotion of EU policies in neighbouring regions.

46

The mandates of the three StratCom task forces (East, Western Balkans and South) have grown out of a series of Council Conclusions, with differences in tasks and focus. For example, the mandate of the East StratCom task force specifically covers the task to “challenge Russia’s ongoing disinformation campaigns”16. Its mandate was thus defined in terms of a single, external malign actor rather than in terms of protecting Europe from disinformation regardless of its source.

47

This was not the case for the other two StratCom task forces, whose original focus was stepping up communication activities in their respective regions. The South StratCom task force was established to cover the EU’s southern neighbourhood and the Gulf region, while the Western Balkans task force was established to enhance strategic communications in that region. Prior to the December 2019 Council Conclusions17, addressing disinformation was not the central priority of either task force. Only the East StratCom task force had the explicit objective of strengthening the capacity to forecast, address and respond to disinformation. Table 2 below sets out each StratCom task force’s objectives at the time of the audit.

Table 2

Comparison of the StratCom task forces’ objectives

East
  • Effective communication and promotion of EU policies towards the eastern partnership countries (Armenia, Azerbaijan, Belarus, Georgia, Moldova and Ukraine)
  • Strengthening the overall media environment in the EU’s eastern neighbourhood, including support for media freedom and strengthening independent media
  • Improving EU capacity to forecast, address and respond to disinformation activities by Russia

Western Balkans
  • Strengthen EU communications in the region to support EU enlargement policy and the stabilisation and association process
  • Contribute to informed debate about the EU and the region, including by addressing disinformation

South
  • Effective and strategic communication and promotion of EU policies
  • Bridge the policy-communications gap
  • Supporting and promoting media freedom in the region
  • Address misperceptions about the EU and disinformation in the region

Source: EEAS.

48

The three StratCom task forces are distributed widely across different regions and cover different agents of disinformation. Nevertheless, the StratCom task forces’ media monitoring activities focus extensively on Russian international media, Russian official communication channels, proxy media, and/or media inspired/driven by the Russian narrative, operating in the EU and its neighbourhood. However, according to EEAS analysis, other actors such as China have emerged to varying degrees as prominent disinformation threats. In this respect, the new European Parliament committee on foreign interference (INGE) has also held hearings to discuss potential threats from third countries18.

49

The StratCom task forces’ mandate is a political one, which does not specifically spell out their policy objectives and is not underpinned by a firm legal foundation. Under action 2 of the EU action plan, the High Representative was meant to review the mandates of the South and Western Balkans StratCom task forces (but not the East task force). However, this review was never carried out. The EEAS considers that the Council Conclusions adopted in December 2019, by stating explicitly that “all three Task Forces should be able to continuously detect, analyse and challenge disinformation activities”19, provide a sufficient basis for (re-)affirming their mandates. The Council also invited the EEAS to assess its needs and options for expanding to other geographical areas, which demonstrates that political support exists to broaden the scope of the StratCom task forces.

The StratCom task forces do not have a dedicated and stable funding source

50

When it was set up in 2015, the East StratCom task force was not provided with any resources of its own and was funded out of the administrative expenditure of the EEAS and the Commission’s Service for Foreign Policy Instruments. The EU action plan boosted the funding available for the EEAS StratCom task forces. In fact, strategic communications is the only part of the action plan whose specific budget has increased. As Figure 2 above shows, the budget for the StratCom task forces and strategic communications has increased nearly fourfold since the action plan was adopted.

51

Even though disinformation is not just a short-term threat, the StratCom task forces do not have a stable funding source, which could threaten their sustainability. For example, a significant source of funding for the StratCom task forces has been a European Parliament preparatory action called ‘StratCom Plus’ (see Annex II). By their nature, preparatory actions are designed to prepare new initiatives, such as policies, legislation and programmes. Figure 3 illustrates how the added funding has been allocated to improving different capabilities.

Figure 3

Funding of different EEAS StratCom capacities by the 'StratCom Plus' preparatory action (2018-2020)

Source: ECA, based on EEAS data.

52

The importance of funding and adequate resourcing has been stressed on numerous occasions20, including by the European Parliament21, Member States22 and civil society23. However, stakeholders differ in their opinions about how to prioritise the available EU funding to fight disinformation. Based on our interviews, some Member States feel greater emphasis should be placed on analysing and monitoring those sources and actors to which disinformation can be more easily attributed. Others feel more funding should be allocated to positive communication.

Staffing needs not yet satisfied

53

The EU action plan envisaged strengthening the StratCom division by adding 11 positions ahead of the European elections, recruiting permanent officials in the medium term and new staff in the EU delegations “to reach a total increase of 50-55 staff members” by the end of 2020. The recruitment plan has been implemented in three phases: (1) redeployment of contract staff within the EEAS; (2) recruitment of staff to the StratCom team; and (3) addition of staff across 27 EU delegations in the EU neighbourhood.

54

The StratCom division is still in the process of recruiting and deploying staff. As of October 2020, it had 37 staff and had therefore not yet reached the total increase of 50-55 staff, as stated in the EU action plan. One reason why it has been difficult to meet this target is that many StratCom staff are seconded from the Council, the Commission and the Member States and some have been withdrawn from their secondments.

55

Almost all staff reinforcements (including all the redeployments) have been contract staff: the EEAS acknowledged that it is not easy to recruit permanent officials with the expertise and skills needed to perform the required duties. However, despite their important contribution, contract staff have a maximum tenure of six years.

56

The other group that makes up a significant portion of StratCom capacity is seconded national experts. These have primarily supported the work of the East task force and, more recently, the Western Balkans StratCom task force as well. As well as benefitting the EEAS, their secondment benefits their home countries by allowing them to gain expertise and build deeper connections with the EEAS. However, overreliance on secondments creates staffing uncertainty and, through frequent turnover, a periodic loss of knowledge, potentially undermining the building of institutional memory and expertise.

57

In light of the COVID‑19 pandemic and the additional workload it has created for the task forces, there is a risk that the EEAS, with the current distribution and number of staff, will have insufficient capacity to keep pace with new trends and developments like emerging threat sources and disinformation strategies and tactics. Moreover, the Council’s request to reinforce strategic communications work in other regions (see paragraph 10) may further stretch its limited staffing capacity.

58

Effective data analysis is critical not only for monitoring, detecting and analysing disinformation, but also as a basis for sound, evidence-based strategic insights and policy-making. At the time of the audit, the StratCom division’s data analysis cell comprised full-time in-house analysts, supported by external contractors. The cell, launched in mid-2019, supports the work of the StratCom task forces and the rapid alert system (RAS) under pillar II of the EU action plan. Its analyses are predominantly on-demand and ad hoc, and its work is not integrated in a structured manner into the work of all the StratCom task forces. Although relying on external contractors can provide flexibility, in-house capacity is critical for providing sensitive analysis at short notice and for building institutional memory and expertise.

Measuring the impact of the StratCom task forces’ work remains a challenge

59

The greatest challenge in strategic communications remains measuring vulnerability to disinformation and the impact both of disinformation itself and of efforts to understand, analyse and respond to it. The Commission uses opinion polling as one way to assess how effectively strategic communications influence perceptions about the EU. However, it is difficult to attribute the results of this polling to EU actions.

60

Beyond the communication campaigns, the StratCom task forces did not comprehensively measure the impact of their work. Furthermore, none of them had an evaluation function to assess their effectiveness and identify areas for improvement.

The hosting of EUvsDisinfo by the EEAS creates uncertainty about the project’s ultimate purpose

61

EUvsDisinfo is the public face and flagship of the EU’s efforts to combat disinformation, and the East task force’s main anti-disinformation product. It has a searchable, open-source database, with more than 9 700 disinformation cases as of 1 October 2020. Key materials on the EUvsDisinfo website are published in five EU languages; the rest is in English and Russian only. According to the EEAS, the original focus on Russian disinformation provided the foundation for a unique and pioneering approach; there are no comparable initiatives by Member State governments.

62

From its beginnings in 2015, EUvsDisinfo has steadily built up visibility online (see Figure 4 below) by cataloguing, analysing and publishing examples of Russian disinformation. Many stakeholders confirmed that EUvsDisinfo has been instrumental in raising awareness and influencing perceptions about the Russian disinformation threat to the EU and its Member States.

Figure 4

EUvsDisinfo: Number of visitors, unique page views and Twitter and Facebook followers

Source: ECA, based on EEAS information.

63

However, EUvsDisinfo has also faced criticism in the past. For example, it was censured by the Dutch parliament24 in 2018 for erroneously attributing Russian disinformation to a Dutch domestic publication. Furthermore, some cases published on EUvsDisinfo do not represent a threat to EU democracies.

64

Looking ahead, the future role and mission of EUvsDisinfo are unclear, beyond producing more examples of Russian disinformation – a threat that is now well established and acknowledged. Despite the EEAS’s claims that EUvsDisinfo is independent and does not represent the EU’s official position, its location within the EEAS makes this questionable. This raises the question of whether such an instrument should be hosted and run by a public authority (such as the EEAS) or whether it should be run under the responsibility of a civil society organisation.

The RAS has brought Member States together but has not lived up to its full potential

65

The establishment of the rapid alert system (RAS) is the key action under pillar II of the EU action plan (action 3). As the plan states, “The first hours after disinformation is released, are critical for detecting, analysing and responding to it”. The RAS was established in March 2019, by the deadline set in the plan. It has two key elements: a network of national contact points and a web-based platform whose aim is “to provide alerts on disinformation campaigns in real-time through a dedicated technological infrastructure” in order to “facilitate sharing of data and assessment, to enable common situational awareness, coordinated attribution and response and ensure time and resource efficiency”25. The EEAS provides the secretariat for the RAS and hosts the website.

Although a useful tool for sharing information, the RAS had not yet issued alerts at the time of the audit and has not been used to coordinate common action

66

In order to work effectively, the RAS must be able to issue timely alerts, coordinate common attribution and response and facilitate exchange of information between Member States and EU institutions. We examined whether the RAS had been operational before the 2019 European elections, as stipulated in the EU action plan. We also assessed its activity and the level of engagement of its participants.

67

We found that the RAS was set up quickly in March 2019 as envisaged in the EU action plan. It has brought together Member States and EU institutions, and has facilitated information sharing, but it had not issued alerts at the time of the audit and has not coordinated common attribution and response as initially envisaged.

68

Most stakeholders consulted during our audit had a positive view of the RAS. For them, it fills an important gap in the counter-disinformation ecosystem by creating a community. This was also confirmed by our survey of Member States: the RAS allows them to share information, obtain new insights and mutually strengthen their capabilities. Figure 5 below presents the aspects most valued by the national RAS contact points.

Figure 5

Member States’ ratings of importance of elements of the RAS

Source: ECA.

69

Despite this positive opinion of the RAS as an information exchange tool, we found no evidence that the information shared through the RAS had triggered any substantial policy developments at Member State level. Building common situational awareness remains a work in progress for the RAS, hampered by the absence of harmonised and consistent definitions (starting with the term ‘disinformation’ itself), differing views on sources, responses and levels of preparedness, and the lack of a common risk assessment.

70

When the RAS was launched, providing real-time alerts to react swiftly to disinformation campaigns was considered its primary purpose, driven by the urgency of the upcoming European elections. For the StratCom team, however, the key aim was to bring practitioners together and develop a community, as no such mechanism previously existed in the EU. These differing motives have obscured understanding among stakeholders and the wider public of what the RAS does.

71

An alert mechanism has been developed, which can be used in extremely urgent cases, but it had not been activated at the time of the audit. A threshold for triggering the alert system has been defined in qualitative terms: a disinformation campaign that has ‘transnational significant impact’ (i.e. a targeted attack on several countries). However, a quantitative assessment of this threshold is not possible.
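To illustrate why a purely qualitative threshold is hard to apply consistently, the sketch below shows what a quantitative reading of ‘transnational significant impact’ might look like. It is entirely hypothetical: the country count, the reach figures and the `Campaign` structure are invented for illustration, and the RAS defines no such metrics.

```python
# Purely hypothetical illustration -- the RAS has no quantitative threshold.
# This sketch shows one possible operationalisation of 'transnational
# significant impact': a campaign reaching a minimum audience in at least
# a minimum number of Member States.
from dataclasses import dataclass

@dataclass
class Campaign:
    narrative: str
    reach_by_country: dict[str, int]  # estimated audience per Member State

def meets_alert_threshold(c: Campaign,
                          min_countries: int = 3,
                          min_reach: int = 100_000) -> bool:
    """Both threshold parameters are invented for this illustration."""
    affected = [cc for cc, reach in c.reach_by_country.items()
                if reach >= min_reach]
    return len(affected) >= min_countries

c = Campaign("example narrative",
             {"DE": 250_000, "FR": 180_000, "PL": 120_000, "MT": 4_000})
print(meets_alert_threshold(c))  # True: three countries above the floor
```

Even this toy version shows the difficulty: the audience-reach estimates it relies on would themselves have to be produced and agreed by the participants, which is precisely the kind of common measurement the RAS lacks.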

72

In addition to the alert function, the RAS was conceived to help attribute disinformation attacks to their source and promote a coordinated response. However, this coordination capacity of the RAS has not been tested.

Activity and engagement in the RAS are driven by a limited number of Member States

73

The RAS brings together the contact points of the Member States, the Intelligence and Situation Centre of the EEAS, the European Commission (especially DGs CNECT, JUST and COMM), the European Parliament and the General Secretariat of the Council. Representatives from NATO and the G7 Rapid Response Mechanism also participate in the rapid alert system. Outside experts, including from civil society and online platforms, are also sometimes present during RAS meetings. Generally, meetings of the national contact points take place every quarter, but the level of engagement varies between different Member States. Most activity is driven by one third of the Member States, which participate more regularly and are more active in the meetings.

Latest statistics point to a downward trend in activity levels

74

The statistics generated by the platform point to a number of trends. Firstly, activity is driven by a small number of core users, with other users being much more passive. Secondly, since its launch, activity levels have peaked around two main events: the European elections and the early weeks following the general lockdown in mid-March 2020. In the case of the latter, activity has since come down, stabilising at the end of August 2020 at around half its May levels.

75

The user statistics point to a downward trend in activity levels. For example, average daily views – even in the specific parts of the RAS that focus on COVID‑19 disinformation – have declined, as shown in Figure 6. In addition, the number of actively engaged users has been in steady decline since the European elections in late May 2019. While these metrics do not tell the full story, they indicate clearly that the platform is not living up to its full potential.

Figure 6

Average number of RAS users from March 2019 to March 2020

Source: EEAS StratCom. The two drops in user numbers are the result of a policy, introduced in August, of deactivating user accounts that have been inactive for more than three months.

Cooperation with online platforms and existing networks is mostly informal

76

According to the EU action plan, online platforms should cooperate with the RAS contact points, in particular during election periods, to provide relevant and timely information. However, there is no protocol establishing cooperation between the RAS and the online platforms, and as the StratCom team does not monitor the number of cases flagged, it is not possible to assess the RAS’s performance in this area.

The code of practice made online platforms adopt a stance against disinformation but stopped short of making them accountable

77

One of the main reasons why disinformation is so acute is the widespread use of the internet, combined with the advent of new technologies and the ever-increasing use of online platforms to access information. This greatly facilitates the creation, amplification and dissemination of false information. According to the Digital Economy and Society Index, in 2020, 85 % of EU citizens were internet users. Most platforms monetise their services through their handling of personal data (mainly via advertising models). This has also created fertile ground for disinformation actors, allowing them to target their actions more precisely.

78

In the case of online platforms, disinformation spreads mostly as a result of users sharing false information, which the platforms’ content-ranking algorithms can then amplify. These algorithms are driven by the platforms’ business models and privilege personalised and popular content, as it is more likely to attract attention. Disinformation also affects web search results, which further hinders users in finding and reading trustworthy online information26. Picture 1 below shows an example of the search predictions provided by an online platform for the term “The EU is”, almost all of which are negative.

Picture 1

Example of predictions from an online platform when searching for “The EU is”

Source: ECA actual internet search on 18 October 2019 at 11:55 (GMT +1).
Google is a trademark of Google LLC.
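The engagement-driven ranking described in paragraph 78 can be sketched in a few lines. This is a toy illustration, not any platform’s actual algorithm: the weights, the personalisation multiplier and the `Post` fields are all invented, but the sketch shows how optimising purely for predicted engagement, with no check on accuracy, can push sensational false content to the top of a feed.

```python
# Toy sketch of engagement-based ranking -- NOT any real platform's
# algorithm; all weights and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    comments: int
    matches_user_interests: bool  # crude stand-in for personalisation

def engagement_score(post: Post) -> float:
    # Shares weighted highest: they drive further amplification.
    score = post.likes + 3 * post.shares + 2 * post.comments
    if post.matches_user_interests:
        score *= 1.5  # personalised content is privileged
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing here checks accuracy -- only predicted attention.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Sober policy analysis", likes=40, shares=2,
         comments=5, matches_user_interests=False),
    Post("Sensational false claim", likes=30, shares=25,
         comments=40, matches_user_interests=True),
])
print(feed[0].title)  # the sensational item ranks first
```

The point of the sketch is the objective function: because false but emotive content tends to generate more shares and comments, a ranking that optimises only for engagement amplifies it by design.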

79

Fake accounts, internet trolls and malicious bots also contribute to the dissemination of false information.

The CoP provides a framework for the Commission to interact with social media platforms

80

Following the April 2018 Commission communication and the HLEG proposals, the Commission decided to engage with online platforms and other trade associations on the subject of disinformation. This led to the creation of the code of practice (CoP) (see Annex V), adopting a voluntary approach based on self-regulation by the signatories. The CoP was signed in October 2018, before being incorporated into the EU action plan under pillar III. Presently it has 16 signatories.

81

The CoP committed online platforms and trade associations representing the advertising sector to submit reports to the European Commission setting out the state of play regarding measures taken to comply with their commitments. These measures ranged from ensuring transparency in political advertising to closing fake accounts and preventing purveyors of disinformation from making money. The Commission closely monitored their compliance with these commitments.

82

Most of the stakeholders interviewed during the audit stressed that the Commission’s engagement with online platforms was a unique and necessary initiative. Many of those from outside the EU whom we consulted are observing the Commission’s efforts closely. They see the EU as the first global player trying to strike the delicate balance between protecting freedom of expression and limiting the spread of harmful disinformation.

83

The CoP provided a framework for the Commission to interact with social media platforms ahead of the EU elections in May 2019 and later during the COVID‑19 pandemic to mitigate the negative effects of the related “infodemic”. Box 1 presents EU efforts to limit COVID‑19 related disinformation through the CoP (see also Annex VI).

Box 1

EU efforts to limit the COVID‑19 “infodemic” through the CoP

In March 2020, when the impact of the COVID‑19 pandemic became more apparent, meetings took place between the Commission and the social media platforms. The Commission requested that the platforms give more prominence to information from authoritative sources and remove false advertising.

In June 2020, the European institutions published a joint communication entitled “Tackling COVID‑19 disinformation - Getting the facts right”. The communication highlights the role of the EU action plan.

The signatories to the CoP presented their efforts through dedicated reporting, which was published in September27 and October 202028. Below are some examples of these efforts, taken from the platforms’ reports:

  • Google blocked or removed over 82.5 million COVID-19-related advertisements in the first eight months of 2020, and in August 2020 alone Microsoft Advertising prevented 1 165 481 ad submissions related to COVID‑19 from being displayed to users in European markets.
  • In August 2020, over 4 million EU users visited authoritative sources on COVID‑19 as identified by search queries on Microsoft’s Bing. Facebook and Instagram reported that more than 13 million EU users visited their COVID‑19 “information centre” in July and 14 million in August.
  • Facebook displayed misinformation warning screens associated with COVID‑19-related fact-checks on over 4.1 million items of content in the EU in July and 4.6 million in August.

Platforms have different moderation policies. Their reports have different formats and data are difficult to compare, as the terminology used by the companies differs. Facebook analyses “coordinated inauthentic behaviour” and “influence operations” while Twitter reports on “manipulative behaviour”. While Google and Microsoft reported removing millions of advertisements, Twitter stated that it did not find a single promoted tweet containing misinformation. Despite these discrepancies, the Commission considered that “In general, the reports provide a good overview of actions taken by the platforms to address disinformation around COVID‑19”.

The assessment of the CoP revealed limitations in the reporting requirements

84

A number of reviews and evaluations have assessed the CoP. They revealed several shortcomings concerning the way the Commission established reporting requirements for the signatories to the CoP (see Box 2). These evaluations had not led to any changes in the CoP.

Box 2

Evaluations of the code of practice

An initial assessment of the CoP, before it was signed, was made by the Sounding Board of the multi-stakeholder Forum on disinformation29 on 24 September 2018. It stated that: “…the “Code of practice”, as presented by the working group, contains no common approach, no clear and meaningful commitments, no measurable objectives or KPIs, hence no possibility to monitor progress, and no compliance or enforcement tool: it is by no means self-regulation, and therefore the Platforms, despite their efforts, have not delivered a Code of Practice.” Some elements of this opinion are still relevant today and were reflected in subsequent assessments and evaluations of the CoP.

The European Regulators Group for Audiovisual Media Services (ERGA) presented an opinion on the CoP in April 202030. It identified three main weaknesses:

  • a lack of transparency about how the signatories are implementing the CoP;
  • the overly general content and structure of the CoP’s measures;
  • the limited number of signatories to the CoP.

The Commission completed its own evaluation of the CoP in May 2020. Its overall conclusion was that the CoP had produced positive results31. The report stressed that the CoP had created a common framework and had improved cooperation between policymakers and the signatories. The main weaknesses it identified were:

  • its self-regulatory nature;
  • lack of uniformity of implementation (uneven progress in monitoring);
  • lack of clarity around its scope and some of the key concepts.

The signatories themselves did not manage to prepare an annual review of the CoP as initially agreed. As the signatories do not have a common representative, coordination is time-consuming and informal, and reaching consensus on how this annual review will take place and who will conduct it has proven difficult.

In September 2020, the Commission presented a staff working document32 taking stock of all evaluations of the CoP to date. It recognised that “it remains difficult to precisely assess the timeliness, comprehensiveness and impact of platforms’ actions”. The Commission has also identified the need for “commonly-shared definitions, clearer procedures, more precise and more comprehensive commitments, as well as transparent key performance indicators (KPIs) and appropriate monitoring”.

85

Our work confirms that signatories’ reporting varies, depending on their level of commitment and whether they are an online platform or a trade association. In addition, the online platforms’ reports are not always comparable and their length varies considerably.

86

This variation among signatories to the CoP also proved to be a problem for setting overall KPIs. These KPIs made it possible to monitor the actions of some signatories, but not all. For example, under the heading “Integrity of services”, the Commission proposed the indicator “Number of posts, images, videos or comments acted against for violation of platform policies on the misuse of automated bots”. This output indicator is relevant for specific online platforms only.

87

According to the Commission’s own analysis of the CoP signatories’ reports, the metrics provided so far satisfy only output indicators. For example, platforms report that they have rejected advertisements, or removed a number of accounts or messages that were vectors of disinformation in the context of COVID‑19 (see also Box 1). If this reported information is not put into context (i.e. by comparing it, in time, against baseline data and other relevant information such as the overall creation of accounts), and the Commission cannot verify its accuracy, it is of limited use.
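The point about context can be shown with a small, purely hypothetical calculation: a rising absolute removal count (an output indicator) can mask a falling enforcement rate once it is compared against a baseline such as the overall number of accounts created in the same period. All figures below are invented for illustration and are not taken from any platform’s CoP report.

```python
# Hypothetical figures for illustration only. A raw removal count looks
# impressive, but relating it to a baseline can tell a different story.
def removal_rate(removed: int, baseline: int) -> float:
    """Share of the baseline population acted against."""
    return removed / baseline

# Period 1: 1.2 million fake accounts removed, 100 million accounts created.
# Period 2: 1.5 million removed -- but 200 million created.
p1 = removal_rate(1_200_000, 100_000_000)
p2 = removal_rate(1_500_000, 200_000_000)

# The absolute count rose, yet the relative enforcement intensity fell.
print(f"{p1:.2%} -> {p2:.2%}")  # 1.20% -> 0.75%
```

This is the gap between output and outcome indicators in a nutshell: without a verified baseline, a larger number of removals is compatible with disinformation actually becoming more prevalent on the platform.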

88

The assessment of the CoP conducted on behalf of the Commission not only looks at the current state of the reporting but also recommends possible metrics for the future. The document proposes two levels of indicators:

  1. ‘structural’ indicators for the code as a whole, measuring overall outcomes, the prevalence of disinformation online and the code’s impact in general. These help to monitor, at a general level, whether disinformation is on the rise, stable or declining;
  2. tailored ‘service-level’ indicators, broken down by pillar, to measure each individual signatory platform’s results in combatting disinformation.

At the time of the audit, the Commission had not provided signatories with any new reporting template or more meaningful new indicators.

89

The issues described above show that online platforms are not held accountable for their actions and their role in actively tackling disinformation.

Lack of a coherent media literacy strategy and fragmentation of EU actions dilutes their impact

90

Pillar IV of the EU action plan focuses on raising awareness and strengthening societal resilience to disinformation. It aims to improve media literacy actions, such as the 2019 Media Literacy Week, and to support independent media and investigative journalists. It also calls on Member States to rapidly implement the media literacy provisions of the Audio-Visual Media Services Directive and create teams of multi-disciplinary independent fact-checkers in light of the 2019 European elections.

91

‘Media literacy’ refers to the skills, knowledge and understanding that allow citizens to use media effectively and safely, and that equip them with the critical thinking skills needed to exercise judgment, analyse complex realities and distinguish between opinion and fact33. Media literacy sits at the intersection of education policy and the EU’s digital agenda, and responsibility for it lies with the Member States. The Commission’s role is to foster collaboration and facilitate progress in the area. Disinformation does not respect borders, however, so developing common tools and sharing best practices at EU level are important.

92

In order to assess the actions under this pillar, we examined the 2019 EU Media Literacy Week event, and whether there was a well-defined strategy for the various initiatives in this area. We reviewed the Commission’s report on the 2019 European elections34 and assessed 20 projects directly related to media literacy and fighting disinformation.

93

The Commission’s report on the 2019 European elections stated that “[while] manipulative efforts recurrently focused on politically sensitive topics and targeted EU audiences ahead of the elections, no large-scale covert interference operation in the 2019 elections has been identified so far”.

Member States do not participate evenly in the EU Media Literacy Week

94

European Media Literacy Week is a series of actions to raise awareness about media literacy across the EU (see Box 3). It is unclear how it reflects a coherent EU media literacy strategy, however; although it includes some high-level discussions, it mainly serves to showcase some specific EU and Member State initiatives. The 2020 edition was to be jointly organised by the Commission and the Council, which could have further stimulated Member State action and participation. However, it was cancelled due to COVID‑19.

Box 3

European Media Literacy Week

The 2019 Media Literacy Week was one of two specific media literacy actions in the EU action plan. It took place in March 2019 in Brussels and in Member States, and included a high-level conference. Over 320 events were organised during the week itself, and a total of 360 events had been organised by the end of September 2020.

Nearly half of all events took place in France, with Belgium (mostly Brussels) a distant second. A small number of Member States did not host any events at all – as the geographical distribution of the events shows (see picture opposite). As would be expected, most events took place around the time of the official launch. However, no further statistics are available on the number of people these events reached, their thematic distribution and the extent to which they dealt specifically with disinformation.

Source: ECA, based on Commission data.

There is no overarching media literacy strategy that includes tackling disinformation

95

We found there were a multitude of EU and Member State initiatives that addressed media literacy, and a plethora of policy documents. This is also clear from the Council Conclusions on media literacy35, which include an annex with the main policy documents. However, these actions are not coordinated under an overarching strategy for strengthening societal resilience, particularly in media literacy, which would include tackling disinformation. While actions to address Member States’ specific challenges on media literacy are also important in order to achieve local impact, EU support for media literacy lacks the following underlying elements, which would be conducive to its sound financial management:

  • a regular update, in cooperation with the Media Literacy Expert Group (MLEG), of the most important media literacy practices and actions in the EU and Member States (the Council of Europe produced such a mapping in 2016, the first of its kind; however, it has not been updated since then36);
  • clear objective-setting based on systematic and regular research into media literacy and the impact of media and digital platforms, accompanied by a set of indicators to measure performance;
  • the necessary coordination mechanisms to create synergies and avoid overlap between initiatives and actions under, for example, the Audio-Visual Media Services Directive, the digital education action plan, the Creative Europe Programme, the digital competence framework and skills agenda, the recently published European democracy action plan and the Digital Services Act, the media and audiovisual action plan, etc.;
  • unified monitoring of EU media literacy initiatives.

96

For the next multiannual financial framework (2021-2027), according to the Commission, approximately €14 million in EU funding from the Creative Europe programme37 – €2 million per year – has been earmarked to support media literacy. However, as the Council Conclusions on media literacy also state, additional funding sources will need to be developed.

Most projects examined produced tangible results, but many did not demonstrate sufficient scale and reach

97

Of the 20 projects we assessed, ten were funded under Horizon 2020, and the other ten were pilot projects and preparatory actions funded by the European Parliament (see table in Annex III).

98

The ‘Media Literacy for all’ call for proposals, launched in 2016 by the European Parliament, includes pilot projects and preparatory actions for co-financing innovative start-up ideas from across the EU in the field of media literacy. A pilot project runs for two years, followed by a three-year preparatory action. The Horizon 2020 projects are research and innovation projects covering highly technical aspects of tackling disinformation, such as the use and detection of bots or developing a new generation of content verification tools.

99

Our analysis (see Annex III) found tangible results in 12 of the 20 projects. The most positive results were achieved by projects that built on the results of previous projects to produce fact-checking tools, or by projects that created teaching and learning material against disinformation (see Box 4).

Box 4

Examples of EU-funded projects achieving positive results

Based on the theoretical research of project 2, which investigated how the information generated by algorithms and other software applications is shared, project 1 produced, as a proof of concept, an interactive web tool designed to help increase transparency around the nature, volume, and engagement with false news on social media, and to serve as a media literacy tool available to the public.

Project 11 set out to create an educational, multilingual (eight EU languages), crowdsourced online platform for teaching and learning about contemporary propaganda. This action was accompanied by sets of contextualised educational resources, and online and offline workshops and seminars for teachers, librarians and media leaders. The project was well set up and produced tangible results, with active participation from six EU countries. Even though the project ended on 1 January 2019, its platform and resources remain active.

100

However, we identified shortcomings in 10 out of the 20 projects, mostly concerning their small scale and limited reach. Seven projects did not reach, or are unlikely to reach, their intended audience, and the results achieved by three projects were difficult to replicate, which limited their impact. Box 5 presents a number of projects illustrating these issues:

Box 5

Examples of EU-funded projects with limited reach or scale of actions

Project 10 set out to create a system for automatically detecting false information from the way it spreads through social networks. The project was successful: an online platform soon recruited the project manager and the project team, and acquired the technology. This is evidence of a well-targeted research project that produced good results. However, the technology’s subsequent acquisition by an American online platform has limited the audience that would benefit from it and does not contribute to the development of an independent EU capability in this sector.

One other project (14) focused on the representation of women in the media. It took the form of an online portal presenting the news that women journalists and editors found most relevant in their regions, while also attempting to fact-check news on women’s and minority issues. Although the project reached a considerable audience via Facebook and Twitter with its gender theme, its main output was a website that is no longer available.

Another project (16) was also supposed to develop social skills and critical thinking. It was composed of various heterogeneous parts that focused on creativity, with no clear connection between them and a loose conceptual link to media literacy. For example, children in schools created animation or simple games about cleaning a school gym or protecting a vending machine. The planned activities cannot be easily reproduced.

101

Overall, there was little evidence of any comparative analysis of project results – especially in terms of what had worked and why. Nor was there much evidence of the Commission coordinating the exchange of best practice and media literacy material across the EU. An evaluation framework is also lacking. Such a framework is critical to the long-term development of societal resilience, as it ensures that lessons learned feed directly into future actions, policy and strategy. Obtaining evidence of the direct impact of media literacy measures is difficult, and efforts are still at an early stage of development38. The Council Conclusions on media literacy also called for the development of systematic criteria and evaluation processes, and a uniform and comparative methodology for Member States’ reporting on the development of media literacy39.

The SOMA and EDMO projects attracted limited interest from media literacy experts and fact-checkers

102

As the EU action plan states under pillar IV, independent fact-checkers and researchers play a key role in furthering the understanding of the structures that sustain disinformation and the mechanisms that shape how it is disseminated online. The Commission financed the Social Observatory for Disinformation and Social Media Analysis (SOMA) project, a digital platform with the aim of forming the basis for a European network of fact-checkers. SOMA is funded under Horizon 2020, with an overall budget of nearly €990 000. The project started in November 2018 and is scheduled to end on 30 April 2021.

103

Our analysis revealed that SOMA managed to attract only two fact-checkers recognised by the International Fact Checking Network40. At the time of the audit (October 2020), SOMA had 48 registered members. Several of those we contacted admitted to never using the SOMA platform. While the technology behind SOMA has been evaluated positively, the project is still not widely used by the fact-checking community.

104

Well before the end of the SOMA project, and without waiting for an evaluation to collect and apply lessons learned, the Commission in June 2020 launched the first phase (worth €2.5 million, running until the end of 2022) of the European Digital Media Observatory (EDMO). It aims to strengthen societal resilience by bringing together fact-checkers, media literacy experts, and academic researchers to understand and analyse disinformation, in collaboration with media organisations, online platforms and media literacy practitioners.

105

SOMA and EDMO therefore have partially overlapping objectives, and most contractors are involved in both projects at the same time. SOMA’s evaluators suggested merging the two projects, but no formal links between them have yet been established. There is also a risk of overlap in their financing, as both projects are based on and use the same technology and commercial products.

106

EDMO has been presented as a holistic solution to tackling many of the societal challenges around disinformation. However, as EDMO is still in its infancy, its visibility among stakeholders remains limited, according to its management. It is too early to judge EDMO’s effectiveness but, given this limited awareness among stakeholders, its achievements may not match its overly ambitious goals. Its current focus is on building up the necessary infrastructure, and it will need more resources to succeed in its aims.

107

The media literacy experts we interviewed commented that the media literacy community did not feel sufficiently engaged with EDMO. EDMO’s advisory board features a broad array of expertise from across academia and journalism, reflecting the EU action plan’s key emphasis on strengthening fact-checking and supporting journalism. However, the media literacy community and civil society, who could provide useful links between academia and policy-making, are underrepresented (two out of 19 experts).

Conclusions and recommendations

108

We examined whether the EU action plan against disinformation was relevant when drawn up and delivering its intended results. We conclude that the EU action plan was relevant but incomplete, and even though its implementation is broadly on track and there is evidence of positive developments, some results have not been delivered as intended.

109

We found that the EU action plan was consistent with experts’ and stakeholders’ views and priorities. It contains relevant, proactive and reactive measures to fight disinformation. However, even though disinformation tactics, actors and technology are constantly evolving, the EU action plan has not been updated since it was presented in 2018. In December 2020, the Commission published the European democracy action plan, which includes actions against disinformation, without clarifying exactly how it relates to the EU action plan against disinformation (see paragraphs 20-24 and 41-42).

110

The EU action plan does not include coordination arrangements to ensure that EU responses to disinformation are coherent and proportionate to the type and scale of the threat. Each of the EU action plan’s pillars is under the responsibility of a different Commission Directorate-General or the European External Action Service, without any single body being in charge or having complete oversight of communication activities (see paragraphs 25-31).

111

There is no monitoring, evaluation and reporting framework accompanying the EU action plan and the European democracy action plan, which undermines accountability. In particular, the plans include generic objectives that cannot be measured, several actions that are not time-bound and no provisions for evaluation. Only one report on the implementation of the EU action plan has been published, with limited information on performance. Without a comprehensive, regular review and update, it is difficult to ensure that EU efforts in this area are effective and remain relevant. Furthermore, there was no comprehensive information on the funding sources and the estimated costs of the planned actions (see paragraphs 32-40).

Recommendation 1 – Improve the coordination and accountability of EU actions against disinformation

The European Commission should improve the coordination and accountability framework for its actions against disinformation by incorporating:

  a) clear coordination and communication arrangements between the relevant services implementing the EU actions against disinformation;
  b) a dedicated monitoring and evaluation framework containing clear, measurable and time-bound actions, as well as indicators to measure performance, and evaluation provisions;
  c) regular reporting on the implementation of the actions, including any necessary updates;
  d) a summary of the main funding sources and expenditure made in implementing the actions.

Timeframe: end of 2021 for recommendations a) and b) and mid 2023 for recommendations c) and d)

112

Under pillar I of the EU action plan, the three EEAS strategic communication task forces have improved the EU’s capacity to forecast and respond to disinformation activities, and have contributed substantially to effective communication and the promotion of EU policies in neighbouring countries. However, the task forces’ mandates do not adequately cover the full range of disinformation actors, including new and emerging threats (see paragraphs 45-49).

113

Staffing of the task forces largely depends on the secondment of national experts, making it more difficult for the EEAS to manage and retain staff. The StratCom division has not yet met its recruitment targets, and the COVID‑19 crisis has created an additional workload. Furthermore, the task forces have no evaluation function to assess their effectiveness and identify areas for improvement (see paragraphs 53-58 and 60).

Recommendation 2 – Improve the operational arrangements of the StratCom division and its task forces

The EEAS should:

  a) bring emerging disinformation threats to the Council’s attention, and then review and clarify the policy objectives to be achieved by the strategic communications division and its task forces;
  b) reach the recruitment targets set in the EU action plan;
  c) focus its human resources on the most sensitive tasks, such as threat analysis and threat evolution, and outsource less sensitive communication activities that cannot be done in-house due to staff shortages;
  d) undertake regular evaluations of the task forces’ operational activities, beyond their communication campaigns.

Timeframe: mid 2022

114

EUvsDisinfo has been instrumental in raising awareness about Russian disinformation. However, its placement inside the EEAS raises some questions about its independence and ultimate purpose, as it could be perceived as representing the EU’s official position (see paragraphs 61-64).

115

Under pillar II, the EEAS quickly set up the rapid alert system (RAS). We found that the RAS had facilitated information sharing among Member States and EU institutions. However, the RAS has never issued an alert and has therefore not been used to coordinate joint attribution and response as initially envisaged. Furthermore, the latest statistics show that activity and engagement in the RAS are driven by a limited number of Member States, and that activity levels are trending downwards. Cooperation with online platforms and existing networks is mostly informal, and there is no protocol establishing cooperation between the RAS and the online platforms (see paragraphs 65-76).

Recommendation 3 – Increase participation in the RAS by Member States and online platforms

The EEAS should:

  a) request detailed feedback from Member States on the reasons for their low level of engagement, and take the necessary operational steps to address them;
  b) use the RAS as a system for joint responses to disinformation and for coordinated action, as initially intended;
  c) propose to the online platforms and the Member States a framework for cooperation between the RAS and the online platforms.

Timeframe: mid 2022

116

The single action under pillar III concerns ensuring continuous monitoring of the code of practice (CoP). The CoP sets out a number of voluntary measures to be taken by online platforms and by trade associations representing the advertising sector. With the CoP, the Commission has created a pioneering framework for engagement with online platforms. During the initial stages of the COVID‑19 pandemic, the CoP led the platforms to give greater prominence to information from authoritative sources.

117

Our assessment of the CoP and the Commission’s evaluations revealed that the platforms reported differently, depending on their level of commitment. Moreover, the metrics that the platforms are required to report measure only outputs. Because the platforms do not provide access to data sets, the Commission cannot verify the reported information. Consequently, the CoP falls short of its goal of holding online platforms accountable for their actions and their role in actively tackling disinformation (see paragraphs 77-89).

Recommendation 4 – Improve the monitoring and accountability of online platforms

Building on recent initiatives such as the new European democracy action plan, the Commission should:

  a) propose additional commitments to the signatories to address weaknesses identified in the evaluations of the code of practice;
  b) improve the monitoring of the online platforms’ activities to tackle disinformation by setting meaningful KPIs;
  c) establish a procedure for validating the information provided by online platforms.

Timeframe: end of 2021

118

Under pillar IV of the EU action plan, we found a multitude of EU and Member State media literacy initiatives and a plethora of policy documents, none of them organised under an overall media literacy strategy that includes tackling disinformation (see paragraph 95).

119

Most activities during the 2019 EU Media Literacy Week took place in just two Member States, which substantially limited the initiative’s awareness-raising potential. Our analysis of a sample of 20 projects addressing disinformation found tangible results in 12 projects. Most positive results were achieved by projects that built on the results of previous projects to produce fact-checking tools or teaching material. The key shortcomings identified in 10 projects relate to the small scale and reach of their activities (see Box 3, paragraphs 94 and 97-101).

Recommendation 5 – Adopt an EU media literacy strategy that includes tackling disinformation

The Commission should adopt a media literacy strategy that includes combatting disinformation as an integral part. In order to better target disinformation through media literacy actions and reduce their fragmentation, this strategy should include:

  a) a regular update, in cooperation with the Media Literacy Expert Group (MLEG), of the most important media literacy practices and actions in the EU and Member States;
  b) clear objective setting based on systematic and regular research into media literacy and the impact of media and digital platforms, accompanied by a set of indicators to measure performance;
  c) the necessary coordination mechanisms to create synergies among projects.

Timeframe: end of 2022

120

Independent fact-checkers and researchers play a key role in furthering the understanding of disinformation. So far, the Commission’s efforts to develop a European network of fact-checkers (the Social Observatory for Disinformation and Social Media Analysis (SOMA)) have not attracted much interest from this community. The European Digital Media Observatory (EDMO) is supposed to replace SOMA, but both projects were running in parallel at the time of the audit. EDMO’s visibility among stakeholders is still limited, which contrasts with its ambition of providing a holistic solution to the societal challenges around disinformation. Furthermore, the media literacy community and civil society, who could provide important links between academia and policy-making, are not well represented on EDMO’s advisory board (see paragraphs 102-107).

Recommendation 6 – Take steps to enable EDMO to fulfil its ambitious objectives

To ensure that EDMO meets its ambitious objectives, the Commission should:

  a) collect lessons learned from the SOMA project once it has ended and apply them in EDMO;
  b) increase the representation of media literacy and civil society experts on EDMO’s advisory board;
  c) increase awareness of EDMO among stakeholders, in particular fact-checkers and media literacy experts.

Timeframe: end of 2021

This Report was adopted by Chamber III, headed by Mrs Bettina Jakobsen, Member of the Court of Auditors, in Luxembourg on 27 April 2021.

For the Court of Auditors

Klaus-Heiner Lehne
President

Annexes

Annex I — Main EU institution departments and offices fighting disinformation

Responsibilities
European External Action Service (EEAS) Responsible for the StratCom Task Forces: the East StratCom Task Force (since 2015), the South Task Force covering the Middle East and North Africa region (since May 2017), and the Western Balkans Task Force (since July 2017). The EEAS is also responsible for maintaining and moderating the digital platform of the Rapid Alert System, and for strengthening detection and analytical capabilities. In 2020, it created a new Strategic Communications Division, comprising the Task Forces and other relevant capabilities.
The Directorate-General for Communications Networks, Content and Technology (DG CNECT) Leads the activities relating to the Code of Practice, the creation of an independent network of fact-checkers and researchers, and actions in support of media literacy and cybersecurity. DG CNECT also implements projects financed by the European Parliament and research and innovation projects financed under the Horizon 2020 programme.
Directorate-General for Communication (DG COMM) Responsible for the Network against Disinformation (an internal forum for close collaboration across all Directorates-General, the EEAS, the European Parliament and EU Representations in Member States). Also responsible for proactive and objective communication on EU values and policies, and for public awareness-raising activities.
Directorate-General for Justice and Consumers (DG JUST) Contributed on fundamental rights and democracy aspects to the main outputs relating to disinformation. Led the preparation of the Commission’s electoral package, which was issued in September 2018 (the “Election package”).
Secretariat General of the European Commission Tasked with coordinating the implementation of the actions from the EU action plan.
European Parliament (DG Communications) Part of the internal coordination network, drafts counter narratives and funds preparatory actions to be implemented by the EEAS and MS via direct management.
Commission Representations Provide locally-tailored messaging, including specific tools to counter myths and disseminate facts
European Parliament liaison offices

Source: European Commission.

Annex II — EU spending on actions against disinformation (in euros)

Entity Budget line Financed under Title Budget Allocation Total
2015 2016 2017 2018 2019 2020
EEAS 19 06 01 Commission prerogative (FPI) Information Outreach on EU External Relations 298 200 298 200
19 06 77 01 Preparatory Action (FPI) Preparatory Action "StratCom Plus" 1 100 000 3 000 000 4 000 000 8 100 000
1200 EEAS Contract staff 1 187 000 1 128 942 2 098 697 2 159 748 6 574 387
2214 EEAS Strategic Communication Capacity 800 000 2 000 000 2 000 000 4 800 000
DG CNECT 09 04 02 01 Horizon 2020 Leadership in information and communications technology 3 115 736 2 879 250 10 885 524 16 880 510
09 03 03 CEF - Telecom European Digital Media Observatory 2 500 000 2 500 000
09 05 77 04 Pilot Project Pilot Project "Media literacy for all" 245 106 500 000 745 106
09 05 77 06 Preparatory Action Preparatory Action "Media literacy for all" 499 290 500 000 500 000 1 499 290
ERC 08 02 01 01 Horizon 2020 Strengthening frontier research in the European Research Council 1 980 112 1 931 730 149 921 150 000 4 211 763
FPI 19 02 01 00 IcSP Countering disinformation in southern and eastern Ukraine 1 934 213 1 934 213
DG JUST 33 02 01 REC programme Study on the impact of new technologies on free and fair elections 350 000 350 000
33 02 01 REC programme Promotion activities for EU citizenship rights (e.g. an event for the cooperation network on elections or related to the citizenship report) 376 000 376 000
34 02 01 REC programme Studies and research into specific areas concerning Union Citizenship (Network of Academics and others) 434 000 434 000
DG COMM 16 03 02 03 Operational budget Online and written information and communication tools 91 603 62 249 153 852
16 03 01 04 Operational budget Communication of the Commission Representations, Citizens’ Dialogues and ‘Partnership’ actions 132 000 132 000
08 02 05 00 Corporate budget Horizontal activities of Horizon 2020 110 000 110 000
DIGIT for EEAS 26 03 77 09 Preparatory action "Data Analytics Solutions for policy-making" 251 421 251 421
TOTAL 5 095 848 2 176 836 4 716 171 14 563 756 10 634 134 12 163 997 49 350 742

Source: ECA on information provided by the Commission and the EEAS.

Annex III — Assessment of projects against disinformation (Pilot Projects, Preparatory Actions, H2020)

Project number Type of Project Countries Project Duration (Actual) Status Direct link with other projects Grant Amount (in euro) Was the Commission's monitoring adequate? Criterion 1 Relevance to disinformation Criterion 2 Tangible and sustainable results Criterion 3 Sufficient scale and reach
1 H2020 United Kingdom (targets: Germany, France, Poland, Sweden, United Kingdom / Brazil, Canada, China, Mexico, Russia, Ukraine, United States, Taiwan) Jan. 2016 - Dec. 2020 Ongoing Yes 1 980 112 There is both continuous reporting and independent reporting in the form of an audit and a scientific report.     The project produced mainly research papers. Most presentations of these papers have taken place outside the EU. With the UK having left the EU, it is unclear how this research will still benefit the EU.
2 H2020 United Kingdom July 2017 - Jan. 2019 Ended Yes 149 921 There is both continuous reporting and independent reporting in the form of an audit.     There is no indication that the project will go beyond a proof of concept and, if it does, it is not clear whether the public will benefit from it as much as the private sector if the end product is commercialised.
3 H2020 Greece Jan. 2016 - Dec. 2018 Ended Yes 3 115 737 There is an independent expert opinion and review reports.     The tool produced by the project catered mostly to experts and was not user friendly enough for the general public (two subsequent projects were needed to refine results and to enhance the scale and reach of the project).
4 H2020 Italy Jan. 2018 - Dec. 2020 Ongoing No 2 879 250 The project is still ongoing. There is continuous reporting with a first assessment.   There is a weakness, as one of the software components is dated and the project is not using state-of-the-art methods in this area.  
5 H2020 Ireland, Greece, Italy, Cyprus, Austria, Portugal, Romania, United Kingdom Jan. 2018 - Nov. 2021 Ongoing Yes 2 454 800 An independent, remote review was carried out in July 2020 which was facilitated by DG CNECT.   A well-managed project but some corrective action is needed to focus on key components, and a more detailed elaboration of dissemination and exploitation. Weaknesses in the implementation of the dissemination and business exploitation strategies.
6 H2020 Czechia, Ireland, Spain, Austria Dec. 2018 - Nov. 2021 Ongoing Yes 2 753 059 An independent, remote review is being carried out (started in August 2020).     There is uncertainty as to how the centralised online platforms will engage with the tool.
7 H2020 France, Italy, Poland, Romania, United Kingdom Dec. 2018 - Nov. 2021 Ongoing Yes 2 505 027 Three individual evaluations were conducted in December 2019 and an overall assessment was conducted in February 2020. Additionally, between January and April 2020, a project review was carried out.      
8 H2020 Denmark, Greece, Italy Nov. 2018 - April 2021 Ongoing Yes 987 438 The Project was reviewed by 3 independent monitors, and evaluated by the Project Officer.   The project is still ongoing in parallel with a similar project in the field.  
9 H2020 Belgium, Bulgaria(C), Germany, Greece, France, United Kingdom Dec. 2018 - Nov. 2021 Ongoing Yes 2 499 450 No input and no coordinating effort from the Commission in the beginning.   Results are being tested at the prototype stage. This can entail risks. No guidance from the Commission, and ideas on how results will be sustainable are limited to partner initiatives linked with their own individual contacts/partners/clientele.  
10 H2020 Switzerland/United Kingdom Sep. 2018 - Nov. 2019 (initially Feb. 2020) Ended Yes 150 000 At the request of the Court, the project officer worked well to collect the information needed to establish how the results were exploited.     Results exploited mainly by an American company.
11 Pilot Project Belgium, Romania, France, Croatia, Poland, Finland, United States Jan. 2018 – Jan. 2019 Ended No 125 000 The project was monitored with various qualitative and quantitative indicators.      
12 Pilot Project Spain, Italy, Malta, Portugal, United Kingdom 2016 Ended Yes 171 057 Commission monitoring was not evident.   Sustainable training courses were only developed in one country out of five. Limited reach of the project's results.
13 Pilot Project Belgium, Greece, Spain, Italy, Latvia, Lithuania, Hungary, Malta, Austria, Poland, Portugal, Romania, Slovakia 2017 Ended No 118 445 Continuous reporting, a technical report and an independent final evaluation were produced.     Sustainability concerns. In its final self-assessment, the project highlighted the absence of an overall media literacy strategy.
14 Pilot Project Poland July 2018 – June 2019 Ended No 127 590 There is only a one-page assessment, which does not analyse the outcomes. The project mixes fact-checking with women's rights and sexism, and the relevance to disinformation is weak. The website created by the project is no longer operational.  
15 Pilot Project Belgium, Austria, Portugal 2017 Ended No 122 815     The brainstorming and white papers were done but there is no toolkit. The project was aborted due to insolvency of the coordinator.
16 Pilot Project Denmark, Ireland, Greece, Cyprus, Portugal July 2018 – June 2019 Ended No 131 150 There is no proof of ongoing monitoring. The final assessment is 133 words long and does not include any recommendations. The project deals with creative thinking in general. The outputs/outcomes are not measurable. The project was a stand-alone exercise and cannot be easily reproduced or continued.
17 Preparatory Action Belgium, Bulgaria, Germany, Spain, Croatia, Romania, Italy, Latvia July 2019 - Aug. 2020 (extension discussed) Ongoing No 124 546,72 An interim technical implementation report was produced.      
18 Preparatory Action Denmark, Germany, Spain, France, Italy, Netherlands, Poland, Finland 2018 Ongoing Yes 214 556 The project monitored actions closely with clearly defined indicators.      
19 Preparatory Action Spain, France, Romania, Sweden 2018 Ongoing Yes 159 380 The project is still ongoing and the technical report is of good quality. There will also be an independent report.      
20 Preparatory Action Greece, Spain, Lithuania, Finland Aug. 2019 - Aug. 2020 (extension discussed) Ongoing No 86 630 The reporting required only one mid-evaluation report after seven months. Some documentation was not immediately available and had to be sent by post.     The project is finding it difficult to bridge a financing gap.
Legend: Not fulfilled / Partly fulfilled / Fulfilled / N/A

Source: ECA.

Annex IV — Assessment of the actions included in the EU action plan on Disinformation

EU Action Plan Is the Action clearly defined? Is the Action time bound? Responsibility
Action 1: With a view to the 2019 European Parliament elections in particular, but also with a longer-term perspective, the High Representative, in cooperation with the Member States, will strengthen the Strategic Communication Task Forces and Union Delegations through additional staff and new tools which are necessary to detect, analyse and expose disinformation activities. Member States should, where appropriate, also upgrade their national capacity in this area, and support the necessary increase in resources for the Strategic Communication Task Forces and Union delegations. Yes – to hire more people and acquire tools With a view to the 2019 European elections but also in a longer-term perspective The High Representative and Member States
Action 2: The High Representative will review the mandates of the Strategic Communications Task Forces for Western Balkans and South to enable them to address disinformation effectively in these regions. Yes – to review the mandate No deadline The High Representative
Action 3: By March 2019, the Commission and the High Representative, in cooperation with Member States, will establish a Rapid Alert System for addressing disinformation campaigns, working closely with existing networks, the European Parliament as well as the North Atlantic Treaty Organisation and G7’s Rapid Response Mechanism. Yes – to establish the Rapid Alert System March 2019 The Commission and the High Representative
Action 4: With a view to the upcoming European elections, the Commission, in cooperation with the European Parliament, will step up its communication efforts on Union values and policies. Member States should significantly strengthen their own communication efforts on Union values and policies. No – it is not clear what “step up communication efforts” means – more articles, more press releases? With a view to the 2019 European elections The Commission in cooperation with the European Parliament and Member States
Action 5: The Commission and the High Representative, in cooperation with Member States, will strengthen strategic communications in the Union’s neighbourhood. No – it is not clear how the strategic communications can be strengthened No deadline The Commission and the High Representative, in cooperation with Member States
Action 6: The Commission will ensure a close and continuous monitoring of the implementation of the Code of Practice. Where needed and in particular in view of the European elections, the Commission will push for rapid and effective compliance. The Commission will carry out a comprehensive assessment at the conclusion of the Code’s initial 12-month period of application. Should the implementation and the impact of the Code of Practice prove unsatisfactory, the Commission may propose further actions, including actions of a regulatory nature. Yes – the Commission needs to monitor and conduct a comprehensive assessment Where needed and in particular in view of the 2019 European elections. The Commission will carry out a comprehensive assessment at the conclusion of the Code’s initial 12-month period of application The Commission
Action 7: With a view especially to the 2019 European elections, but also to the longer term, the Commission and the High Representative, in cooperation with the Member States, will organise targeted campaigns for the public and trainings for media and public opinion shapers in the Union and its neighbourhood to raise awareness of the negative effects of disinformation. Efforts to support the work of independent media and quality journalism as well as the research into disinformation will be continued in order to provide a comprehensive response to this phenomenon. Partially – the Commission needs to launch campaigns and trainings, however it is not clear what efforts it should make to support media and research With a view especially to the 2019 European elections, but also to the longer term The Commission and the High Representative, in cooperation with Member States
Action 8: Member States, in cooperation with the Commission, should support the creation of teams of multi-disciplinary independent fact-checkers and researchers with specific knowledge of local information environments to detect and expose disinformation campaigns across different social networks and digital media. No – it is not clear how the MS should support the creation of fact-checker and researcher teams No deadline The Commission and Member States
Action 9: As part of the Media Literacy Week in March 2019, in cooperation with the Member States, the Commission will support cross-border cooperation amongst media literacy practitioners as well as the launch of practical tools for the promotion of media literacy for the public. Member States should also rapidly implement the provisions of the Audio-visual Media Services Directive, which deal with media literacy. No – it is not clear how the Commission is to support cross border cooperation amongst media literacy practitioners during one week. The Action Plan encouragement to implement the AudioVisual Medial Service Directive is just that. It has no legal power over the deadlines imposed in the directive (end of 2022). March 2019 The Commission and Member States
Action 10: In view of the upcoming 2019 European elections, Member States should ensure effective follow-up of the Elections Package, notably the Recommendation. The Commission will closely monitor how the Package is implemented and where appropriate, provide relevant support and advice. No – it is not clear how the election package should be followed up – evaluation, legal changes? With a view to the 2019 European elections The Commission and Member States

Source: ECA.

Annex V — The code of practice on disinformation

The code of practice on disinformation (CoP) is a set of commitments to fight disinformation, agreed on a voluntary basis by representatives of online platforms, leading social networks, advertisers and the advertising industry. The CoP was encouraged and facilitated by the European Commission. This was the first time major industry players had decided to act together to address the spread of online disinformation.

The initial signatories in October 2018 included Facebook, Google, Twitter and Mozilla, as well as a number of advertisers and advertising industry bodies. Microsoft signed the CoP in May 2019, followed by TikTok in June 2020. Presently there are 16 signatories.

The CoP consists of a series of commitments under five pillars:

  • scrutiny of ad placements;
  • political advertising and issue-based advertising;
  • integrity of services;
  • empowering consumers;
  • empowering the research community.

It also includes an annex identifying best practices that signatories will apply to implement the CoP’s commitments. The signatories prepared individual roadmaps to implement the CoP.

As the CoP is voluntary, there are no penalties for failure to honour these commitments. Therefore, it is important to monitor the signatories’ progress in implementing their commitments. Between January and May 2019, the European Commission verified the implementation of these commitments by Facebook, Google and Twitter, particularly in relation to the integrity of the European elections. The three platforms reported on a monthly basis on action they had taken in relation to the scrutiny of ad placements, transparency of political and issue-based advertising, and fake accounts and malicious use of bots. The Commission also invoked the CoP to request five sets of reports from the signatories on action they had taken to counter disinformation during the COVID‑19 pandemic.

These reports are available on the Commission’s website, together with the Commission’s assessment.

Annex VI — Chronology of the main EU actions as a response to the 2020 COVID-19 pandemic and “infodemic”

Source: ECA.

Acronyms and abbreviations

CoP: Code of Practice

EDMO: European Digital Media Observatory

EEAS: European External Action Service

EMLW: European Media Literacy Week

ERGA: European Regulators Group for Audio-visual Media Services

EU action plan: EU action plan against disinformation

EUD: EU Delegations

EUvsDisinfo: The flagship project of the European External Action Service’s East StratCom Task Force. It was established in 2015 to better forecast, address, and respond to the Russian Federation’s ongoing disinformation campaigns affecting the EU, its Member States, and countries in the shared neighbourhood.

HLEG: Independent High Level Expert Group on Fake News and Online Disinformation

HWP: Horizontal Working Party on Enhancing Resilience and Countering Hybrid Threats

IE: Information Environment

IND: Internal Network on Disinformation

RAS: Rapid Alert System

SOMA: Social Observatory for Disinformation and Social Media Analysis

StratCom: Strategic Communication

Glossary

Algorithm: A process or set of rules applied by a computer in calculations or other problem-solving operations.

Bot: An automated software application that is programmed to do certain tasks.

Disinformation: The communication of false or misleading information for the purpose of deceit.

Infodemic: An excessive amount of information – some accurate and some not – that can be used to obscure or distort the facts.

Internet troll: A person who posts insults, often laced with profanity or other offensive language, on social networking sites.

Media literacy: Capacity to access, understand and interact with media and communications.

Misinformation: The communication of false or misleading information, whether in good faith or for the purpose of deceit.

Strategic Communications: Coordinated and coherent communication by an organisation in pursuit of specific goals.

Wiki: A collaborative website on which any user can add and edit content.

Replies of the Commission and the EEAS

Executive summary

I

The European Union (EU) has recognised disinformation and foreign interference as an important challenge to democracy and society, at EU level as well as globally. Since 2015, it has increased its efforts to tackle these phenomena – to that end, different policy documents have been issued, including the 2018 Action Plan against Disinformation (2018 Action Plan). The EU continues to assess the ever-changing threats in this field and aims to support partner countries around the globe to address them.

The EU aims to adapt its approach to these developments to ensure a coherent, comprehensive and effective framework to tackle disinformation and foreign interference, in full respect of fundamental rights and freedoms.

III

The 2018 Action Plan remains, together with other policy documents, one of the core EU policy pillars on tackling disinformation. At the time of its adoption, and still today, the Action Plan is one of a kind, looking at disinformation from different angles and identifying core areas for action. It allows for an immediate short-term response to disinformation campaigns and long-term investment in strengthening societal resilience.

It is a comprehensive document that focusses exclusively on disinformation and thus highlights the importance that the EU attaches to this challenge. It also underlines the EU’s aim to work with all stakeholders, including from civil society and private industry, to develop a whole-of-society approach. The value of international cooperation is highlighted, forming the basis for increased cooperation with key partners like NATO and the G7.

IV

Building on the 2018 Action Plan and the steps taken to implement it, the Commission and the European External Action Service (EEAS) have issued policy documents such as the 2020 Joint Communication on COVID-19 Disinformation (JOIN(2020) 8 final) and the European Democracy Action Plan (EDAP). These documents build on the achievements stemming from the 2018 Action Plan and reiterate many of its points; while not a formal update to the 2018 Action Plan, they are regarded as its further development. The EDAP sets out measures to promote free and fair elections, strengthen media freedom and counter disinformation, taking into account both the 2018 Action Plan and the 2020 Joint Communication on COVID-19 Disinformation. It reiterates many of the calls of the 2018 Action Plan, including cooperation with international partners and the need to further enhance analytical capabilities at European and national level. It also takes into account the progress made on the basis of the 2018 Action Plan to date and should therefore be considered an extension of the policy framework foreseen in the 2018 Action Plan. The EDAP and the 2020 Joint Communication on COVID-19 Disinformation were adopted after the conclusion of the audit and were therefore outside its scope.

V

The EEAS considers that the absence of an overarching, single mandate from the Council has not impeded the development or the operation of the strategic communications division and its Task Forces, including the adaptation to new and emerging issues. Any new consolidated mandate would need to reflect the same level of political support as originally reflected in the European Council mandate of 2015.

VI

The EEAS underlines the uniqueness of the EUvsDisinfo project, which was set up in direct implementation of the 2015 mandate given by the European Council. The EUvsDisinfo project is of great value to the EEAS and the EU institutions as a whole in raising awareness of the ever-evolving threat of disinformation campaigns. As the disinformation challenge and related threats evolve, it is only natural to revisit the approach taken on a regular basis.

The EEAS considers that the establishment of the Rapid Alert System (RAS) was an important element in developing its counter-disinformation strategy. In fact, the RAS is the only forum in the EU institutional set-up where disinformation experts from the EU institutions and Member States work together to address disinformation-related issues. The EEAS welcomes the active participation of a large number of Member States and invites others to intensify their activity.

VII

The assessment of the Code of Practice on Disinformation published in September 2020 (the Code of Practice) acknowledged that it should be further improved in several areas. The findings of the assessment supported two Commission policy initiatives adopted at the end of 2020 – the EDAP and the Digital Services Act (DSA) – which aim inter alia to strengthen the fight against disinformation by establishing a co-regulatory framework for transparency and accountability, thus addressing the identified shortcomings.

VIII

The first phase of supporting and observing grassroots media literacy projects through the Media Literacy for All pilot project and preparatory action revealed a fragmented landscape, albeit one mirroring the needs of Member States. Since then, the Commission has further developed its media literacy work programme in the framework of the Creative Europe Programme, in order to scale up the relevant actions in this area and share the results across Member States and across cultural and linguistic borders. Targeted, smaller-scale initiatives and a more harmonised pan-European approach are both necessary and can complement each other. The Commission does not consider that a certain degree of fragmentation, which mirrors the Member States’ heterogeneous approaches, presents a risk per se, but it acknowledges the need for coordination and is addressing this.

The European Digital Media Observatory (EDMO) was launched only in June 2020 and has been developing fast into its current advanced phase.

IX

First bullet point – The Commission and the EEAS accept recommendations 1 (a) and 1 (d), and partially accept recommendations 1 (b) and 1 (c).

Second bullet point – The EEAS accepts recommendations 2 (a), (b), (c) and (d).

Third bullet point – The EEAS accepts recommendations 3 (a), (b) and (c).

Fourth bullet point – The Commission accepts recommendations 4 (a) and (b), and partially accepts recommendation 4 (c).

Fifth bullet point – The Commission accepts recommendations 5 (a), (b) and (c).

Sixth bullet point – The Commission accepts recommendations 6 (a), (b) and (c).

Introduction

01

The Commission and the EEAS point out that the challenge of disinformation and information manipulations continues to change and evolve rapidly. Definitions and precise characterisations are therefore being reviewed, also in recent policy documents such as the 2020 Joint Communication on COVID-19 Disinformation and the EDAP.

Observations

22

The Commission also adopted a package of measures to support free and fair elections in light of the 2019 elections, with recommendations addressed to Member States and national and European political parties, which included measures to address disinformation. As part of the implementation of these recommendations, the Commission organised three meetings of a specially created European Cooperation Network on Elections to exchange good practice and information among Member State competent authorities.

25

Although a formal framework was not set up, the Commission held regular coordination meetings at service and political level to ensure that information was shared about the implementation of the 2018 Action Plan.

28

The Network against Disinformation has helped build the capacity of the services to respond to disinformation, providing tools, competences and support to all the Directorates-General and the Representations of the EU that are part of it. It brought together previously disjointed actions, encouraged collaboration and provided a repository of common myths and the facts needed to rebut them. It fostered a culture of fact-checking and responding to myths and disinformation, and in so doing provided a more unified approach to the EU’s efforts to counter disinformation. Finally, the many awareness-raising activities raised the profile of the EU’s policy work against disinformation.

29

The Directorate-General for Communication (DG COMM) points out that its 2020 management plan includes three measurable indicators related to disinformation: the reach of awareness-raising activities fighting disinformation and the number of visits to anti-disinformation web pages (in addition to the number of meetings of the internal network against disinformation).

30

An overall modernisation of the relevant websites is ongoing, with a view to launching the new versions, including performance measurement, by the end of 2021.

31

DG COMM’s external communication efforts focused on COVID-19 and vaccine-related mis- and disinformation, through dedicated web pages. It was decided that the planned central web-based disinformation hub would be transformed into an internal tool, revamping the existing wiki.

40

The 2018 Action Plan is not a funding programme. As a result, the funds allocated to its implementation depend on the decisions taken in another framework (e.g. work programme of funding programmes like Horizon 2020). The total budget cannot be earmarked ex-ante.

41

The 2018 Action Plan remains one of the core guiding documents steering the EU’s response to disinformation, and it has not lost its relevance since its adoption. The threat has evolved, and so has the EU’s approach to tackling disinformation; this does not mean, however, that the four pillars identified in the 2018 Action Plan have lost any of their relevance. On the contrary, building on the 2018 Action Plan and the steps taken to implement it, the EU has issued policy documents like the 2020 Joint Communication on COVID-19 Disinformation and the EDAP. These documents build on the achievements stemming from the 2018 Action Plan and reiterate many of its points; while not a formal update, they are regarded as its further development. In particular, this refers to its rationale of strengthening societal resilience in the long term and protecting democracies by better understanding and defining the threat, and taking action to strengthen capabilities to monitor, analyse and expose disinformation.

42

The Commission notes that the EDAP, the DSA and the Media and Audiovisual Action Plan effectively propose to take over several actions and follow up on various points from the 2018 Action Plan. These initiatives are complementary and will be well coordinated, mitigating the risk of inefficiencies in addressing certain weaknesses of the 2018 Action Plan identified in this audit.

48

The EEAS considers that the Strategic Communications Division and its Task Forces pursue a risk-based approach in their monitoring activities, focusing on the actors and issues that are most prevalent. The EEAS also takes into account the rapidly changing and evolving threat landscape and regularly aims to review and adapt its processes and the work of the entire team accordingly. It points out that threat actors can be state or non-state actors, and it continuously strives to adapt its team and focus to the changing threat landscape.

49

While the EEAS considers that the current mandates given by the European Council and other Council formations have provided a good basis for the ongoing operations of the StratCom Division and its Task Forces so far, an updated consolidated mandate taking into account the changed threat landscape could be useful. Any such new consolidated mandate would need to be given at the level of the European Council, in line with the 2015 mandate.

51

The EEAS points out that, under the 2021 budget, the EEAS Strategic Communications Division has received a clear allocation of funds on different budget lines, which has nearly doubled its operating budget.

54

The EEAS points out that, in the 2019, 2020 and 2021 allocations, the Strategic Communications Division received a total of 52 new posts, including 27 Local Agents in Delegations, which will enable the EEAS to fulfil the Action Plan’s target.

55

EEAS common reply to paragraphs 55 and 56.

The EEAS would like to underline the crucial importance of recruiting expert and specialist knowledge for its work related to disinformation. Contractual agents and seconded national experts, even if their tenure is limited overall, play an important role in current operations, for example in the field of data analysis and of regional or linguistic knowledge.

58

The activities of the data team continue to be built up. They are meant to serve the entire Division and all work strands, and to adjust to any changes in the disinformation landscape. The data team takes a comprehensive approach, supporting the entire Division while also contributing to cooperation with international partners like the G7 and NATO. Its work additionally includes the constant further development of the methodological and conceptual framework underpinning the Division’s data work.

60

The EEAS strives to further improve the evaluation of all aspects of its operations. However, when it comes to the impact of counter-disinformation activities, there is no established methodology for measuring the impact of disinformation, or of the actions taken to tackle it. This reflects the challenges facing the entire community working to counter disinformation and foreign interference.

64

The EEAS considers EUvsDisinfo a well-established brand, clearly anchored in the EEAS’s role and activities. The focus on Russian disinformation stems from the very specific mandate given by the European Council in 2015. The EEAS is continuously adapting and expanding the scope and approach of these tools.

67

The EEAS considers that the establishment of the RAS was a major positive development to contribute to the EU’s capability to address disinformation and information manipulations.

69

The EEAS considers that the RAS has greatly contributed to better coordination between Member States and the European institutions by facilitating information exchange, better situational awareness and practical cooperation on communication activities.

70

The EEAS considers that the RAS has been built as a facilitating structure for the EU Member States and the European institutions, and it has proven very useful in this context. The operating principles and workflows have been jointly agreed, and the regular meetings held with the Points of Contact, as well as bilateral exchanges between RAS members and the EEAS, ensure a common understanding of the rationale behind the RAS. Members are also regularly consulted for their views on how the RAS can be further built and improved.

74

EEAS common reply to paragraphs 74 and 75.

The EEAS agrees that the level of activity on the RAS platform could be increased, especially by the Member States who have been less active in the past, and that there is much untapped potential. The fluctuations in activity can be explained by the importance of specific events, such as elections.

76

The EEAS underlines that there was no intention to formalise the cooperation between the RAS and the social media platforms. Notwithstanding the provisions of the Code of Practice, informal cooperation between the RAS and the platforms has been taken forward as appropriate.

84

Further to the assessment of the Code of Practice, the EDAP has set up a process to revise and strengthen the Code. The Commission will issue guidance and convene the signatories to strengthen the Code accordingly. Setting up a robust and permanent framework for monitoring the Code is also specifically announced in the EDAP. Furthermore, the DSA proposes a horizontal framework for regulatory oversight, accountability and transparency for online platforms. When adopted, the DSA will establish a co-regulatory backstop for the measures to be included in a revised and strengthened Code of Practice on disinformation. Better monitoring and accountability will thus be possible if and when the proposal for the DSA is adopted.

86

With regard to the absence of overall key performance indicators (KPIs) for all signatories, the Commission notes the difficulty of identifying KPIs capable of monitoring the actions of all signatories of the Code of Practice, given their very different nature and activities and their different levels of commitment: for example, the services provided by social media platforms differ significantly from those provided by search engines.

88

The 2020 Joint Communication on tackling COVID-19 disinformation listed the features that platforms had to report on in their monthly reports. The Commission asked platforms to take action to harmonise reporting and proposed to the platform signatories a detailed list of data points relevant to these features.

94

The 2019 edition of European Media Literacy Week was a comprehensive and coordinated initiative, featuring events at EU and national level, the showcasing of projects, and regulatory and forward-looking policy exchanges. The initiative was highly appreciated by the media literacy community.

96

In addition to the Creative Europe Programme, under which funding of approximately €14 million will represent a multiple of the pilot project and preparatory action funding, other programmes, such as Erasmus+ and the European Solidarity Corps, also actively support media literacy initiatives. Support will continue throughout the 2021-2027 period.

99

The Commission points out that projects may not produce “tangible results” but may nevertheless be highly successful. The projects aim to reach new areas, open new research avenues, pilot and test concepts and create new communities, and they are experimental by nature and design.

100

Horizon 2020 projects are chosen through an objective external evaluation against set criteria, and they are supervised through regular reviews by impartial reviewers. The evaluation and the reviews are carried out by selected, highly competent experts, whose opinion the Commission trusts with regard to best value for money. The Commission notes that the limited funding available plays a key role in determining the projects’ reach.

Box 5 – Examples of EU-funded projects with limited reach or scale of actions

One of the key objectives of Media Literacy for All was, in addition to tackling disinformation-related media literacy challenges, to reach hard-to-reach sectors of European societies, with an emphasis on the socially disadvantaged, minorities and those on the margins of society, for whom media literacy skills are especially important. The projects funded under Media Literacy for All provide valuable insights into how to engage productively with the above-mentioned groups.

Project 14 had a limited time span; hence the website has since been removed. Project 16 used established media literacy techniques, such as the development of video games, podcasts and video clips, with an overarching theme of empowerment through the development of critical evaluation skills and the creation of media content. The project also covered media literacy activities designed specifically for children with a migration background.

101

The Commission notes that media literacy is a difficult area in which to measure impact. On 26 March 2021, a task force was initiated in collaboration with the European Regulators Group for Audiovisual Media Services (ERGA) and the Media Literacy Expert Group (MLEG) to review possibilities for useful KPIs, with first results expected by the end of 2021.

Notably, a comprehensive evaluation of the implementation of Media Literacy for All (2016-2020) is not possible before all projects are completed in spring 2022. Nevertheless, the success and impact of the projects are assessed systematically at individual project exit meetings, through annual coordination meetings of all ongoing projects, and via presentations to the Member States at MLEG meetings. This is part of the process of verifying the projects’ impact.

103

The Commission acknowledges that the Social Observatory for Disinformation and Social Media Analysis (SOMA) did not attract many fact-checkers recognised by the International Fact-Checking Network. However, the Commission notes that the number of disclosed members now stands at 55 and that members of the SOMA network have so far produced 25 investigations into disinformation campaigns across Europe.

105

The Commission acknowledges a partial overlap between SOMA and EDMO. This overlap, and the fact that the two projects rely on the same technological solution, ensures a smooth migration of the fact-checking community from SOMA to EDMO. Moreover, before ending, the SOMA project will pass the lessons learned on to EDMO. The Commission notes that the technological platform provided by EDMO, while based on the same technology, will provide extended functionalities to a much wider network of fact-checking and research organisations than SOMA. The Commission will carefully monitor the projects to avoid double funding.

107

The Commission notes that the media literacy community will be further involved in EDMO through the national hubs, operational from summer 2021, which will have to carry out specific media literacy tasks.

Conclusions and recommendations

109

The EDAP includes a dedicated section on countering disinformation and foreign interference. The Commission and the EEAS consider this an evolution of the policy framework proposed in the 2018 Action Plan. The EDAP reiterates many of the calls of the 2018 Action Plan, such as strong international cooperation, and references inter alia the RAS, which the 2018 Action Plan introduced for the first time. It also calls for more obligations and accountability for online platforms, setting out next steps, including the co-regulatory framework established by the DSA, which follows up on pillar 3 of the 2018 Action Plan. It acknowledges the progress made in monitoring and analysing disinformation and calls for a more detailed framework and methodology to build on the achievements following the 2018 Action Plan. In the view of the Commission and the EEAS, it is therefore inherent in the text of the EDAP that it takes the 2018 Action Plan fully into account and is considered its natural follow-up.

110

Coordination among all services for the implementation of the 2018 Action Plan was carried out by the secretariat of the Security Union Task Force, and regular reporting on the progress made was included in the Security Union Progress Reports. In the context of the pandemic, coordination of policies in the field of disinformation was ensured by the Secretariat General in a dedicated Inter-Service Group, which also helped prepare the 2020 Joint Communication on COVID-19 disinformation and its follow-up. In the context of the EDAP, a new Inter-Service Group has been created under the leadership of the Secretariat General to coordinate services’ work, including on disinformation. In addition, the EDAP foresees the establishment of a clear protocol to pull together knowledge and resources quickly in response to specific situations.

Recommendation 1 – Improve the coordination and accountability of EU actions against disinformation

The Commission and the EEAS accept recommendation 1 (a).

As also announced in EDAP, the EU institutions will further ensure that their internal coordination on disinformation is strengthened with a clear protocol to pull together knowledge and resources quickly in response to specific situations.

The Commission and the EEAS partially accept recommendation 1 (b).

Following the adoption of EDAP, the Commission and the EEAS are monitoring the implementation of actions against disinformation in the context of that action plan. As pointed out, the EDAP builds on the 2018 Action Plan and further develops many aspects of it. In addition, the Commission and the EEAS would like to stress that, given the policy nature of some actions, it is difficult to set a single set of performance indicators. Relevant legislative proposals will provide separate evaluation frameworks.

The Commission and the EEAS partially accept recommendation 1 (c).

The Commission and the EEAS will review the implementation of the EDAP in 2023, a year ahead of the European Parliament elections, including the reporting arrangements to follow.

The Commission and the EEAS accept recommendation 1(d).

Recommendation 2 – Improve the operational arrangements of the StratCom division and its task forces

The EEAS accepts recommendation 2 (a) and will continue to brief and update different Council formations, taking into account their views in further refining policy objectives and approaches.

The EEAS accepts recommendation 2 (b) and points out that recruitment targets as spelled out in the 2018 Action Plan are expected to be reached in 2021.

The EEAS accepts recommendation 2 (c).

The EEAS accepts recommendation 2 (d).

114

The EEAS underlines the uniqueness of the EUvsDisinfo project, which was set up in direct implementation of the 2015 mandate given by the European Council. The EUvsDisinfo project is of great value to the EEAS and the EU institutions as a whole in raising awareness of the ever-evolving threat of disinformation campaigns. As the disinformation challenge and related threats evolve, it is only natural to revisit the approach taken on a regular basis.

Recommendation 3 – Increase participation in the RAS by Member States and online platforms

The EEAS accepts recommendation 3 (a) and would like to underline that many operational aspects are in the competence of Member States.

The EEAS accepts recommendation 3 (b) and underlines that the changes already implemented will further facilitate joint responses.

The EEAS accepts recommendation 3 (c).

117

Regarding the provision of data to the Commission, the Commission notes that the Code of Practice is a self-regulatory tool voluntarily adopted by its signatories to address disinformation activities running on their services. For the time being, there is no legal framework that obliges the Code’s signatories to give the Commission access to the data sets. This is an example of the limited powers of the Commission in this field.

However, as indicated, the EDAP sets out the next steps to strengthen the Code of Practice, including by issuing guidance in anticipation of the co-regulatory backstop that will be put in place with the DSA. The DSA proposes a horizontal framework for regulatory oversight, accountability and transparency of the online space in response to emerging risks. The EDAP also sets out how to establish a robust and permanent framework for monitoring the Code of Practice.

Even better monitoring will be possible once the proposal for the DSA is adopted. The DSA notably includes provisions requiring very large platforms to establish regular monitoring frameworks for the relevant risks and to submit their risk assessments and risk mitigation measures to independent audits.

Recommendation 4 – Improve the monitoring and accountability of online platforms

The Commission accepts recommendation 4 (a) and notes that it has already started implementing it in the framework of the EDAP. The Commission has also addressed the disinformation problem in the proposed DSA, which should strengthen the obligations of transparency, accountability and monitoring.

The Commission will shortly issue guidance to address the shortcomings of the Code of Practice, including KPIs and benchmarks to better monitor its effectiveness. The Commission also stresses that its currently limited competences in this field preclude it from addressing the complex societal challenge posed by disinformation on its own.

The Commission accepts recommendation 4 (b) and will implement it in the framework of the EDAP.

This will include setting up a robust framework for recurrent monitoring of the Code of Practice. As also set out in the EDAP, the strengthened Code of Practice will provide for monitoring the effectiveness of platforms’ policies on the basis of a new methodological framework that includes principles for defining KPIs.

The Commission partially accepts recommendation 4 (c). It will assess the possibilities for establishing a procedure for validating the information provided by online platforms in full respect of the currently purely self-regulatory character of the Code of Practice. At the same time, it notes that the Commission’s proposal for the DSA proposes a horizontal framework for regulatory oversight, accountability and transparency of the online space in response to the emerging risks, which includes provisions for very large platforms to establish regular monitoring frameworks related to the relevant risks and to submit their risk assessments and risk mitigation measures to independent audits.

Recommendation 5 – Adopt an EU media literacy strategy that includes tackling disinformation

The Commission accepts recommendation 5 (a) and notes that its implementation has already started.

The Commission accepts recommendation 5 (b) and notes that its implementation has already started.

The Commission accepts recommendation 5 (c) and notes that its implementation has already started.

120

The Commission acknowledges that SOMA did not attract many fact-checkers recognised by the International Fact Checking Network.

The Commission would like to note that the number of disclosed members of SOMA now stands at 55, including members active in fact-checking, and that members of the SOMA network have so far produced 25 investigations into disinformation campaigns across Europe.

The Commission notes that EDMO had started its operations only 4-5 months before the auditors contacted EDMO’s management. At that time, EDMO still had limited operational capacity to reach stakeholders. EDMO’s operational capacity is increasing considerably: several meetings and surveys with stakeholders have already taken place, and the EDMO national hubs, operational from summer 2021, will further extend its reach. Funding earmarked under the Digital Europe Programme will ensure sufficient resources to achieve EDMO’s objectives.

The Commission notes that the media literacy community will be further involved in EDMO through the national hubs, operational from summer 2021, which will have to carry out specific media literacy tasks.

Recommendation 6 – Take steps to enable EDMO to fulfil its ambitious objectives

The Commission accepts recommendation 6 (a). The SOMA project will provide EDMO with a handover package with the lessons learned during its activity.

The Commission accepts recommendation 6 (b) and, while respecting EDMO’s independence, will suggest that it increase the representation of media literacy and civil society experts on its advisory board.

The Commission accepts recommendation 6 (c) and notes that EDMO already organised a workshop with the fact-checking community on 9 October 2020 and will continue its outreach activities towards that community. Moreover, the media literacy community will be further involved in EDMO through the national hubs – to be established in the second half of 2021 – which will also carry out specific media literacy tasks.

Audit team

The ECA’s special reports set out the results of its audits of EU policies and programmes, or of management-related topics from specific budgetary areas. The ECA selects and designs these audit tasks to be of maximum impact by considering the risks to performance or compliance, the level of income or spending involved, forthcoming developments and political and public interest.

This performance audit was carried out by Audit Chamber III External action, security and justice, headed by ECA Member Bettina Jakobsen. The audit was led by ECA Member Baudilio Tomé Muguruza, supported by Daniel Costa de Magalhaes, Head of Private Office, and Ignacio Garcia de Parada, Private Office Attaché; Alejandro Ballester Gallardo, Principal Manager; Emmanuel-Douglas Hellinakis, Head of Task; and Piotr Senator and Alexandre Tan, Auditors. Michael Pyper provided linguistic support.

Endnotes

1 Communication on tackling online disinformation, COM(2018) 236 final of 26 April 2018.

2 EUCO 11/15 (Point 13) European Council meeting (19 and 20 March 2015) – Conclusions.

3 Final report of the High Level Expert Group on Fake News and Online Disinformation.

4 See footnote 1.

5 EUCO 9/18 European Council meeting (28 June 2018) – Conclusions.

6 Articles 2-6 of the Treaty on the Functioning of the European Union.

7 Council Conclusions on the “Complementary efforts to enhance resilience and counter hybrid threats”, paragraph 30 (10 December 2019).

8 European Parliament resolution of 15 June 2017 on Online platforms and the Digital Single Market (2016/2276(INI)); European Parliament resolution of 3 May 2018 on media pluralism and media freedom in the European Union (2017/2209(INI)); European Parliament resolution of 25 October 2018 on the use of Facebook users’ data by Cambridge Analytica and the impact on data protection (2018/2855(RSP)); European Parliament resolution on foreign electoral interference and disinformation in national and European democratic processes (2019/2810(RSP)).

9 World Health Organisation “Let’s flatten the infodemic curve”.

10 JOIN(2020) 8 final, 10 June 2020.

11 Communication on the European Democracy Action Plan, COM(2020) 790 final of 3 December 2020.

12 Proposal for a regulation of the European Parliament and of the Council on a single market for digital services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final of 15 December 2020.

13 European Commission public consultation on fake news and online disinformation.

14 Tackling COVID-19 disinformation – Getting the facts right, p. 7, JOIN(2020) 8 final of 10 June 2020.

15 “Automated tackling of disinformation”, EPRS study, March 2019.

16 European Council Conclusions on external relations (19 March 2015), Council document EUCO 11/15.

17 Council Conclusions on Complementary efforts to enhance resilience and counter hybrid threats (10 December 2019), Council document 14972/19.

18 European Parliament Hearing on foreign interference on 25 and 26 January 2021: China, Iran, India and Saudi Arabia and UAE; European Parliament Hearing on Foreign Interference on 1 February 2021: Turkey and Russia.

19 Complementary efforts to enhance resilience and counter hybrid threats – Council Conclusions (10 December 2019), Council document 14972/19.

20 June 2018 European Council Conclusions and December 2019 Council Conclusions on complementary efforts to enhance resilience and counter hybrid threats.

21 P8_TA(2019) 0187, P9_TA(2019) 0031.

22 Information Manipulation: a Challenge for Our Democracies. A report by the Policy Planning Staff (CAPS, Ministry for Europe and Foreign Affairs) and the Institute for Strategic Research (IRSEM; Ministry of the Armed Forces) France, August 2018.

23 For example, Democratic Defense against Disinformation, Atlantic Council, February 2018; GMF Policy Paper No.21, August 2019; Winning the Information War, CEPA, August 2016.

24 Officiële bekendmakingen (official publications of the Netherlands).

25 RAS terms of reference.

26 Automated Tackling of Disinformation, EPRS study, March 2019.

27 First baseline reports – Fighting COVID-19 disinformation Monitoring Programme.

28 Second set of reports – Fighting COVID-19 disinformation Monitoring Programme.

29 Multi-stakeholder forum on disinformation.

30 ERGA Report on disinformation: Assessment of the implementation of the Code of Practice.

31 Commission’s own evaluation of the code of practice.

32 Assessment of the Code of Practice on Disinformation – Achievements and areas for further improvement.

33 Audiovisual Media Services Directive (AVMSD) (EU) 2018/1808.

34 COM(2020) 252 final of 19 June 2020.

35 See Annex to Council conclusions on media literacy in an ever-changing world (2020/C 193/06).

36 Mapping of media literacy practices and actions in EU-28, European Audio-visual Observatory, Council of Europe, January 2016.

37 Out of a total proposed budget of €61 million of the Cross-sectoral strand of the Creative Europe programme (budget line 07 05 03).

38 Exploring Media Literacy Education as a Tool for Mitigating Truth Decay, RAND Corporation, January 2019.

39 Council conclusions on media literacy in an ever-changing world (2020/C 193/06).

40 Created by the Poynter Institute. Presently, it has 82 active members.

41 https://ec.europa.eu/info/sites/info/files/soteu2018-cybersecurity-elections-recommendation-5949_en.pdf; Reported in https://ec.europa.eu/info/files/com_2020_252_en.pdf_en

Timeline

Event Date
Adoption of Audit Planning Memorandum (APM) / Start of Audit 4.2.2020
Official sending of draft report to Commission (or other auditee) 4.3.2021
Adoption of the final report after the adversarial procedure 27.4.2021
Commission’s (or other auditee’s) official replies received in all languages 25.5.2021

Contact

EUROPEAN COURT OF AUDITORS
12, rue Alcide De Gasperi
1615 Luxembourg
LUXEMBOURG

Tel. +352 4398-1
Enquiries: eca.europa.eu/en/Pages/ContactForm.aspx
Website: eca.europa.eu
Twitter: @EUAuditors

More information on the European Union is available on the internet (http://europa.eu).

Luxembourg: Publications Office of the European Union, 2021

PDF ISBN 978-92-847-5963-7 ISSN 1977-5679 doi:10.2865/337863 QJ-AB-21-008-EN-N
HTML ISBN 978-92-847-5945-3 ISSN 1977-5679 doi:10.2865/772885 QJ-AB-21-008-EN-Q

COPYRIGHT

© European Union, 2021.

The reuse policy of the European Court of Auditors (ECA) is implemented by Decision of the European Court of Auditors No 6-2019 on the open data policy and the reuse of documents.

Unless otherwise indicated (e.g. in individual copyright notices), the ECA’s content owned by the EU is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence. This means that reuse is allowed, provided appropriate credit is given and changes are indicated. The reuser must not distort the original meaning or message of the documents. The ECA shall not be liable for any consequences of reuse.

You are required to clear additional rights if a specific content depicts identifiable private individuals, e.g. in pictures of the ECA’s staff or includes third-party works. Where permission is obtained, such permission shall cancel and replace the above-mentioned general permission and shall clearly indicate any restrictions on use.

To use or reproduce content that is not owned by the EU, you may need to seek permission directly from the copyright holders.

Software or documents covered by industrial property rights, such as patents, trade marks, registered designs, logos and names, are excluded from the ECA’s reuse policy and are not licensed to you.

The European Union’s family of institutional Web Sites, within the europa.eu domain, provides links to third-party sites. Since the ECA has no control over them, you are encouraged to review their privacy and copyright policies.

Use of European Court of Auditors’ logo

The European Court of Auditors logo must not be used without the European Court of Auditors’ prior consent.

GETTING IN TOUCH WITH THE EU

In person
All over the European Union there are hundreds of Europe Direct Information Centres. You can find the address of the centre nearest you at: https://europa.eu/european-union/contact_en

On the phone or by e-mail
Europe Direct is a service that answers your questions about the European Union. You can contact this service.

FINDING INFORMATION ABOUT THE EU

Online
Information about the European Union in all the official languages of the EU is available on the Europa website at: https://europa.eu/european-union/index_en

EU Publications
You can download or order free and priced EU publications at: https://op.europa.eu/en/web/general-publications/publications. Multiple copies of free publications may be obtained by contacting Europe Direct or your local information centre (see https://europa.eu/european-union/contact_en)

EU law and related documents
For access to legal information from the EU, including all EU law since 1952 in all the official language versions, go to EUR-Lex at: http://eur-lex.europa.eu/homepage.html?locale=en

Open data from the EU
The EU Open Data Portal (https://data.europa.eu/euodp/en/home) provides access to datasets from the EU. Data can be downloaded and reused for free, both for commercial and non-commercial purposes.