Author: Ethan Shattock, PhD Candidate at Maynooth University
- Free Elections and Free Expression in a Digital Democracy
As a “cornerstone of democracy”, elections facilitate collective decision making by citizens. Katz describes elections as “the institutional framework for aggregating expressed preferences into a collective decision”. Richards posits that “an input mechanism” that enables “citizen involvement in the political process” is an “essential democratic function”. In addition to being recognised as a precondition for democracy, the need for a robust electoral process has been enshrined in human rights instruments. Article 3 of Protocol 1 of the European Convention on Human Rights guarantees the right to free elections, and this may impose “a positive obligation on states to secure free elections”. As well as requiring elections, a democracy must ensure that free expression is protected for its citizens in their decision making. This is also a salient element of international human rights instruments, with the right to free expression protected under Article 10 of the ECHR. In an increasingly digital setting, concerns over free expression are pervasive in discussions about the role of social media in democracy. While free expression is widely discussed in conversations about potential regulation of social media platforms, the need to secure free elections has recently become a key cause for concern among technological and governmental stakeholders. Social media allows for the rapid spread of information. While much of this information can be useful and informative, growing concerns relate to how false information circulated on social media may misinform and deceive voters. In 2019, 61 per cent of Irish people were at least somewhat concerned about what news is real, above the EU average of 51 per cent. Moreover, a 2018 survey showed that only 28 per cent of Irish respondents “understood the role of algorithms”, reflecting a “limited understanding of how news appears in their social media feeds”.
This presents a challenge for democracy and demands action from Irish policy makers.
The protection of free elections and the protection of free expression are two requirements that should be considered when determining how to respond to a widely discussed problem for contemporary democracy: the growing presence of “fake news”, or disinformation, on social media platforms. This problem is now recognised as posing a threat to democracy; however, discussions about how policies should be framed are at a very early stage. For Irish policy solutions to succeed, policy makers must ask how policies can be effective while remaining cognisant of human rights considerations. These considerations are relevant when looking at international human rights instruments and considering how their relevant provisions have been disregarded in existing responses to this growing democratic problem.
- Fake News and European Elections
Gelfert describes fake news as any “deliberate presentation of false or misleading claims as news, where these are misleading by design”. Allcott and Gentzkow characterise it as “news stories that have no factual basis but are presented as facts”. This type of “news” has fuelled fears among European voters about what news is true. A 2018 study across European member states found that “most respondents say they encounter fake news at least once a week”, while 37% “come across fake news every day or almost every day”. Furthermore, 85% of member state respondents “think that the existence of fake news is a problem in their country, at least to some extent”, and 83% said that its existence “is a problem for democracy in general”. The emergence of fake news threatens to compromise the integrity of elections in Europe. In examining the 2017 French presidential election, Ferrara tracks tweets from April to May 2017 related to candidates Marine Le Pen and Emmanuel Macron. Most users who engaged with the hashtag “Macronleaks” were non-French nationals with a “preexisting interest in alt-right topics”, suggesting a “possible existence of a black-market” for false news. In the run up to the 2018 Swedish general election, Hedman et al. find that “for every two links of professional news content shared Swedish users shared one junk news story”, with junk news accounting for 22% of all URLs shared, and that voters “shared a substantial amount of junk news in the run up to the 2018 Swedish General Election”. A vast amount of this information involved “various forms of extremist, sensationalist and conspiratorial material”. The problem is exacerbated by the fact that fake news stories gain traction as elections draw closer. In the Swedish election, bogus content was widely shared “in the final stages of a tightly-contested campaign”. Desigaud et al. find that the flow of “junk news” on social media was far heavier in the second round of voting in the 2017 French presidential election than in the first.
Technology was instrumental in this spread, in that “the proportion of traffic originating with highly automated accounts doubled between the first and second round of voting” and that “the ratio of links to professionally produced news content to other political content has gone from about 2 to 1 in the first round of voting to about 1 to 1 in the second round of voting.” In the same election, Ferrara highlights an “uptake in discussion on election day”.
Social media platforms such as Facebook inadvertently provide a space for this “news” to spread during elections. Marchal et al. state that on the platform, “individual junk news stories can still hugely outperform even the best, most important professionally produced stories, drawing as much as four times the volume of shares, likes, and comments.” In addition, “the most successful junk news stories” invoke “populist themes such as anti-immigration and Islamophobic sentiment”. This problem has also affected major referendums in Europe. After the victory of “Vote Leave” in the Brexit campaign, discussion arose about how fake news online influenced the slim margin of victory. In particular, misleading advertisements from the Leave campaign were persistently exposed to voters. Fake news was also found to have had a lasting effect on the 2018 abortion referendum in Ireland. Murphy et al. find that of 3,140 participants exposed to six news reports, two of which were false, almost half reported a memory of at least one false story.
After the 2017 French election, Emmanuel Macron vowed to take active steps to mitigate this problem, announcing: “if we want to protect liberal democracies, we must be strong and have clear rules.” Since then, the principal initiative at European Union level has been the issuing of Codes of Practice to guide efforts to combat fake news. The voluntary codes prescribe that focus should be directed towards low quality information that “may cause public harm.” The Commission outlined that the codes were “essential steps in ensuring transparent, fair and trustworthy online campaign activities ahead of the European elections in spring 2019.” However, these soft measures are likely the first step of what may develop into more robust legislative efforts to prevent fake news from affecting elections. Combatting fake news online is an important step in protecting current and future elections in Ireland and Europe. Consequently, existing laws must be examined in order to better inform policy makers for future efforts.
- Red Flags: Dangers to Free Expression in Existing Legal Measures
A number of governments have advanced legislation to curtail fake news online. Many responses have used misguided legislative instruments that harm free expression through the imposition of restrictive and punitive measures. In 2018, Kenyan cybercrime legislation was initiated to combat “false publication” and the “publication of false information.” Under the new legal regime, Kenyan citizens are prohibited from sharing “false, misleading or fictitious data”, with potential sentences of up to two years in prison and fines of $50,000 for violations. Despite attempts to encourage responsibility from social media platforms, provisions of this legislation were challenged in the Constitutional and Human Rights Division of the Kenyan High Court. The Committee to Protect Journalists objected to the law on the basis that it could disproportionately silence journalists and bloggers. Human rights group Article 19 argued that laws criminalising fake news are “extremely problematic” and are “frequently subject to abuse by the authorities due to the power they give the authorities to determine ‘truth’”.
In Malaysia, the adoption of the Anti-Fake News Act in 2018 demonstrates the imbalance between curtailing fake news and protecting free expression. The law defines fake news loosely, criminalising the publication of “any news, information data and reports which is or are wholly or partly false, whether in the form of features, visuals or audio recordings or in any other form capable of suggesting words or ideas”. The first arrest under the law involved a Danish citizen who complained about the delayed arrival of police officers after the shooting of a foreign national. Bangladesh introduced the Digital Security Act 2018 on foot of similar concerns. This reshaped the country’s information communications technology legal framework, with Section 17 of the Act banning the use of “digital media” to “intimidate individuals or cause harm to the state” and imposing a maximum sentence of 14 years for offences committed under this section. The circulation of negative political propaganda is also an offence, which could arise through contravening the state’s narrative on the liberation war in 1971, when Bangladesh achieved independence from Pakistan. These sweeping changes were condemned by Amnesty International, which highlighted that the government either repudiated or largely dismissed both the recommendations of journalistic stakeholders and the concerns of the general public. Although the Minister for Justice allowed consultation with the selected “Editors Council” to precede the final legislative drafting, recommendations were improperly implemented and, in some cases, absent from the final draft.
Human rights concerns have also arisen in more developed countries. In Germany, the Network Enforcement Act (NetzDG) was passed in 2017. Based on concerns about illegal and harmful content on social media platforms, the Act imposes numerous obligations on platforms. If “manifestly unlawful content” is found, platforms must “remove or block access” to it within 24 hours of receipt of a complaint. The Act’s enforcement elicited criticism after German comedian Sophie Passmann had her Twitter account blocked under the law for posting a satirical tweet. Rights group Article 19 cautioned that the framing of the law is “already setting a dangerous example to other countries that more vigorously apply criminal provisions to quash dissent and criticism, including against journalists and human rights defenders.” It has also been alleged that the law violates Article 52 of the EU Charter of Fundamental Rights, which stipulates that restrictions on freedoms such as freedom of expression must be necessary and proportionate. The Act prompted an open letter by six civil society organisations to the European Commission in its “guardian of the treaties” institutional capacity, arguing that in spite of potential success in reducing the spread of harmful content, the law “would unquestionably undermine freedom of expression and information.”
These regulatory measures purport to improve the quality of information on social media platforms. However, all have elicited human rights concerns, given their potential to be manipulated in ways that limit free expression. This is an important regulatory caveat that should guide future policy making in Ireland and in Europe.
- Rights Based Principles: Striking the Balance in Future Responses
Legal efforts to combat fake news should not view the protection of elections and the protection of free expression as mutually exclusive policy aims. Free expression under Article 10 is not an absolute right, with Article 10(2) stipulating that:
“The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.”
Human rights jurisprudence underscores the need to balance public policy needs with fundamental rights. In Soering v UK, it was stated that “inherent in the whole of the Convention is a search for a fair balance between the demands of the general interest of the community and the requirements of the protection of the individual’s fundamental rights”. Regulatory intervention in the electoral process can be justified on human rights grounds. As enunciated in Ždanoka v Latvia, the European Court of Human Rights “may endorse limited and strictly proportionate action by the state to protect its democracy, by finding an interference with electoral rights is justified”. Furthermore, the Declaration of Principles for International Election Observation states that “genuine democratic elections are an expression of sovereignty, which belongs to the people of a country, the free expression of whose will provides the basis for the authority and legitimacy of government”. Consequently, intervention in the electoral process based on the need to secure free elections must be cognisant of the potential effects that restrictive measures may have on free expression. These rights exist in a contemporary environment shaped by digital developments. In Cengiz and Others v Turkey, the blocking of access to YouTube was found to violate Article 10. The platform was recognised as facilitating “information of specific interest, particularly on political and social matters”. In light of the important function that digital platforms perform in sharing politically relevant and electoral information, legislative action must be carefully guided. Staksrud argues that reactive, “knee jerk regulation” of online problems will likely produce disproportionate regulatory responses. Expressing a similar caveat, critics fear that broadly drafted or excessively punitive content-based legislation may chill or otherwise affect free expression.
In this way, an important policy goal emerges: achieving a measured balance that prevents fake news while protecting free expression. This balance should inform the Irish legal framework, which is evolving in this area.
- Developments in the Irish Legal Framework
Ireland is at an important juncture in this regulatory area, with an evolving legal framework focusing on transparency in political advertising. Under existing electoral legislation, Section 140 of the Electoral Act 1992 stipulates that election advertisements and posters should bear “the name and address of the printer and the publisher thereof”, and it is an offence for advertisements not to bear such details. Section 22 of the Electoral Act 1997 requires the “disclosure of donations” for “political purposes”. In spite of these regulatory restrictions on traditional advertisements, equivalent requirements are lacking for online advertisements. This was the impetus behind the Social Media Online Advertising (Transparency) Bill 2017. The Bill addresses “online advertising”, defined as “any communication which is placed or promoted for a fee on an online platform”. In attempting to bring online advertising requirements into line with offline requirements, Part 2 of the Bill codifies a need for online political advertisements to display “transparency notices” showing funding information and targeted audiences “in a clear and conspicuous manner”. Fines would be imposed for failure to display transparency notices, and the use of “bot” accounts “to cause multiple online presences directed towards a political end to present as an individual account or profile on an online platform” would be proscribed. Moreover, the recognition of false information online as a threat to electoral security has been formally expressed in the 2018 Interdepartmental Report on the Security of the Electoral Process and Disinformation. While the report ascertained that the overall risk to Ireland’s electoral process is “relatively low”, the unique risks posed online were nonetheless identified as key concerns, in particular “the spread of disinformation online” and the risk of potential “cyber attacks”.
The conclusions of this report come at a time when calls for legislative and electoral reform permeate the European landscape. As well as the initiation of soft law mechanisms by the European Union, the Electoral Commission in the United Kingdom has called for “a very clear change in the law to make parties and campaigners say on the face of their advert, who they are, who’s paid for that advert and who is promoted.” Addressing the regulatory gaps between offline and online advertising is a fundamental step that will likely shape future Irish attempts to mitigate the effects of fake news online.
- Public and Private Actors: Collegial Efforts Going Forward
In protecting human rights while combatting fake news online, information about fake news must be open to the public. In this way, rights based principles such as transparency and accountability should shape future policy efforts, and these concepts should be seen as linked. Bovens highlights that public accountability necessitates “openness”: public accountability is “not rendered discretely, behind closed doors, but is in principle open to the general public.” Ferile stipulates that “openness” through accountability means that “the account giving is done in public” and relevant information “is open or at least accessible to citizens”. Meijer outlines that “transparency facilitates accountability when it actually presents a significant increase in the available information”. In describing the European Union as a “composite democracy”, Héritier describes a requirement of transparency for information access, stating that “transparency and access to information determine who has the right to know who the decision-makers are, what procedures they employ, what their areas of interest are, and what the consequences of their decisions are.”
The problem of fake news online requires “concrete solutions that can be readily implemented, tested, and refined.” Weber outlines that in the age of technology, “the implementation of new general principles” related to accountability may be necessary to achieve “a stable and foreseeable legal framework”. However, the implementation of principles and resulting measures will not be effective unless clear roles are delineated in response to technologically spawned problems. Consequently, the positions of relevant stakeholders are important when broaching future responses.
Principles of transparency and accountability are useful to enshrine in public policy responses to fake news online. However, the requisite stakeholders must implement these principles within a framework that details specific responsibilities. As Tambini notes when assessing public policy responses to fake news, “there is some evidence that the system is building in checks and balances”. However, such responses are at a “very early stage in this process”. Ireland stands at an important juncture in the fight against fake news online. At a regulatory level, transparency is a prominent feature of current development surrounding the legal framework of political advertising online. A communication issued by the European Commission in 2018 on tackling online disinformation advised that stakeholders ensure “transparency about sponsored content” and called for the requisite fact checkers and public authorities to “continuously monitor online disinformation.”
In spite of increased calls for transparency in political campaigning, such efforts could yield only piecemeal results in the absence of concrete measures to incorporate accountability mechanisms. As Rini identifies, effective solutions to “fake news” may involve “changes to institutions, such as social media platforms”. Relevant institutions and stakeholders must ensure that collegial responses are sincere and must be accountable if measures do not achieve a required level of success.
The Commission’s Codes of Practice advise that if “results prove unsatisfactory, the Commission may propose further actions, including of a regulatory nature.” In order both to assess the performance of the Codes of Practice and to posit how regulation may evolve, it must be possible to ascertain whether stakeholders are adhering to standards of practice and acting to the best of their capacity. Private stakeholders with the potential to initiate change in this area, and public institutions that represent the electorate, must be accountable both for taking sincere steps to curtail the spread of disinformation and for making any changes to the legal framework that may be necessary if efforts do not yield satisfactory outcomes.
Effective transparency is aided by effective accountability. As Lindstedt and Naurin find, “just making information available will not prevent corruption if such conditions for publicity and accountability as education, media circulation and free and fair elections are weak.” While attempts to ensure that political advertisements display funding sources are encouraging, attention should also focus on the quality of relevant information disseminated in the run up to elections. Bogus and misleading information presented online in the run up to elections must be addressed by the requisite independent body. Voters must be informed not only where online information is aimed at influencing votes, but also where that information is dubious and requires clarification or correction. This helps prevent voters from making decisions on the basis of fake news disseminated online, and underscores the need for accountability in public policy responses to fake news. The need to directly address false information that gains traction in the run up to elections should be a formative aspect of electoral oversight. To this end, a competent body must carry this out with independence and the expertise required to fulfil such a mandate during elections.
The 2018 interdepartmental report highlights the need for a new independent oversight body. It noted that the “absence of an Electoral Commission with a complete oversight role” is of key concern in Ireland and recommended that the government “expedite the establishment” of this Commission. Departing from the previous Referendum Commission, the new body will assume a consolidated supervisory role in elections and referendums. While it will entail traditional functions such as informing voters on electoral questions and encouraging turnout, the functional scope of the new Commission has not been conclusively defined. In a public consultation that discussed potential functions, the Irish Council for Civil Liberties (ICCL) cautioned that the Commission’s functions must be constructed with human rights in mind, highlighting that “any regulations in this area must be proportionate with regard to the right to freedom of expression”. Reflecting the potential for regulation to impede free expression, the ICCL also stated that efforts must ensure “dissent is not stifled by over-burdensome regulation.” The establishment of the Electoral Commission represents an opportunity to incorporate accountability into public policy responses to fake news. By engaging with and informing voters on the threats posed by disinformation, and by empowering voters to avoid being deceived, public accountability can manifest through the Commission.
A duty to inform voters already exists in the current body. The Referendum Commission states that in each referendum it aims to “explain to the public what the referendum proposal means”, and describes its primary role as imparting “factual information about referendums”. The Australian Electoral Commission, for its part, lists one of its functions as “targeted education and public awareness programs”. It is suggested that the new Irish Commission should assume a similar function, with particular awareness raised about disinformation in the run up to elections. Researchers positing responses to fake news have argued that “in the long term, education and media literacy are also critical”, helping voters become “more discerning users of online information”. The Irish government has also reiterated the importance of digital media literacy, and thus far public awareness initiatives have been established with the aim of advising how to carefully and critically navigate online news. These are positive steps as part of the existing framework, and must continue.
Non-governmental responses to disinformation have often involved fact checking mechanisms, with websites verifying disputed and bogus claims. Fact checking has been adopted by Facebook in conjunction with third parties in order to verify the authenticity of news stories, with selected stories “flagged as disputed”. Such stories will be downgraded in their news feed positioning. Despite this progress on social media platforms, active fact checking for political or “issue based” content online has not yet been raised as a potential part of the Electoral Commission’s functional scope. It is argued that the new Commission should have an active role in discerning factual and credible statements from dubious and poorly supported ones.
Sir Julian King observes that “the subversion of trusted channels to peddle pernicious and divisive content requires a clear-eyed response based on increased transparency, traceability and accountability.” As digital channels of information, social media companies are important stakeholders that will shape future responses. The digital architectures of social media facilitate “low cost, easy access, and rapid dissemination of information”, allowing fake news stories to travel with unprecedented speed and efficiency. Irrespective of whether the Electoral Commission’s functions involve fact checking, there will be a need for collaboration with social media platforms. Solutions that aim to stifle this spread will require technological expertise and resources to which these platforms primarily have access. The European Commission states that social media platforms “play an important role in speeding up the spread of such news and they enable a global reach.” Therefore, the Commission has outlined that “a comprehensive policy response must reflect the specific roles of different actors”, including “social platforms, news media and users”. The pressing need for technological expertise means that digital platforms will likely play a critical role in the process of countering disinformation.
Legislative efforts to secure transparency in online advertising are linked to the anonymity of online advertisements on social media platforms. Guess et al. frame “paid political advertising” on Facebook as a source of “political misinformation”, highlighting how “revelations about misleading Facebook ads, including Russian influence operations, have reinforced broader concerns about the lack of transparency in advertising on the site.” Digital platforms already assume responsibilities for content on their platforms. Platforms have “hosting” obligations to address “illegal activity or information” under the E-Commerce Directive. Platforms also curate the information and news stories that appear on their users’ news feeds. As Bunting notes, “big platforms are already deeply implicated in a kind of ‘regulation’ of news” through their algorithms and news feed selection, as well as their ability to “promote and suppress” chosen content. In this way, the involvement and related responsibilities of social media platforms with regard to fake news regulation would build on existing responsibilities and initiatives. The involvement of social media platforms in perpetuating harmful content has elicited calls for more robust accountability. In the United Kingdom, a Committee recommendation expressed that public bodies responsible for preventing fake news online “should have statutory powers to obtain any information from social media companies that are relevant to its inquiries”.
Social media platforms are taking a more active role in combatting disinformation. As Facebook CEO Mark Zuckerberg acknowledged, “our responsibility at Facebook is to amplify the good and mitigate the bad.” This is reflected in Facebook’s establishment of a “regional operations” hub in Dublin focused on “election integrity”. The new centre focuses on “the run-up to elections”, in an attempt to “add a layer of defence against fake news, hate speech and voter suppression.”
This represents a positive step by platforms in addressing disinformation and provides a potential area for collaborative efforts between the proposed Electoral Commission and online platforms. Transparency in this area should have the overarching aim of imparting and communicating information that can legitimately inform voters. Citizens must be educated on how information travels online in the run up to elections, and how to more readily discern credible information from spurious and poorly supported claims. This requires an emphasis on digital literacy and engagement of both online platforms and public bodies as part of collegial efforts to stem online disinformation in the run up to elections.
Social media platforms have become an integral repository for the information that fuels public opinion. However, the heightened presence of fake news online has spurred new concerns about how information is manipulated in the run up to elections. In this way, public opinion may come to rest on false claims, subverting informed voting in the electoral process. Irish and European regulatory discussions have recognised the regulatory gaps that manifest in online political advertising. At present, concrete public policy to prevent fake news online is in its early stages. While heightened acknowledgement of this particular threat to Irish elections is a positive step, subsequent regulatory responses should be framed with respect to human rights principles. In recognising the need to protect elections under Article 3 of Protocol 1 while also preserving free expression rights under Article 10 of the European Convention on Human Rights, responses must balance public needs with fundamental rights. In attempting to enshrine transparency and accountability in responses, policy makers should be cognisant of the poorly crafted responses that have failed to strike this balance, resulting in condemnation from human rights groups.
For effective transparency and accountability in public policy responses, stakeholder roles must be clearly delineated. As a consolidated electoral supervisory body, the proposed Electoral Commission should engage online platforms and take advantage of the unique position that Ireland has, given the proximity of Facebook’s efforts in its Dublin centre to combat fake news on the platform in the run up to elections. Synthesising policy responses with human rights principles will inform measured steps against fake news online while avoiding the ultimately harmful measures that have characterised other jurisdictions. In actively responding to this threat to democracy, efforts from stakeholders will need to build on encouraging initial steps and inculcate rights based principles into more concrete responses.
 Richard Katz, Democracy and Elections (Oxford 1997) <https://books.google.ie/books/about/Democracy_and_Elections.html?id=VNez0rhiE44C&redir_esc=y>
 Harris, O’Boyle and others, Law of the European Convention on Human Rights (4th edn, Oxford 2018) chapter 22, p 910.
 Robert A Dahl, ‘What Political Institutions Does Large-Scale Democracy Require?’ Political Science Quarterly Vol. 120, No. 2 (Summer, 2005), pp. 187-197 <https://www.jstor.org/stable/20202514?seq=1#page_scan_tab_contents>
 Freedom of expression in Europe: Case-law concerning Article 10 of the European Convention on Human Right (Council of Europe 2005) <https://www.echr.coe.int/LibraryDocs/DG2/HRFILES/DG2-EN-HRFILES-18(2007).pdf>
 Digital News Report Ireland 2019 <https://www.bai.ie/en/increase-in-number-of-irish-media-consumers-concerned-about-fake-news-on-the-internet-reuters-digital-news-
 Use of social media for news amongst Irish consumers declines while understanding of how news appears in their social media feeds remains low Broadcasting Authority of Ireland (14 Jun 2018) <https://www.bai.ie/en/use-of-social-media-for-news-amongst-irish-consumers-declines-while-understanding-of-how-news-appears-in-their-social-media-feeds-remains-low/>
 Axel Gelfert, ‘Fake News: A Definition’ (2018) Informal Logic Vol 38, No 1 <https://www.academia.edu/36258758/Fake_News_A_Definition>
 Hunt Allcott and Matthew Gentzkow, ‘Social Media and Fake News in the 2016 Election’ (2017) Journal of Economic Perspectives Vol 31, No 2, 211–236 <https://web.stanford.edu/~gentzkow/research/fakenews.pdf>
 Flash Eurobarometer 464, ‘Fake News and Disinformation Online’ (European Commission, April 2018) <http://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/ResultDoc/download/DocumentKy/82797>
 Emilio Ferrara, ‘Disinformation and Social Bot Operations in the Run Up to the 2017 French Presidential Election’. University of Southern California, Information Sciences Institute (2017) <https://arxiv.org/pdf/1707.00086>
 Freja Hedman and others, ‘News and Political Information Consumption in Sweden: Mapping the 2018 Swedish General Election on Twitter’ The Computational Propaganda Project, Oxford Internet Institute (2018) <https://comprop.oii.ox.ac.uk/research/sweden-election/>
 Louise Nordstrom, ‘Report: Swedes bombarded with “fake news” ahead of election’ The Local (2018) <https://www.thelocal.se/20180906/report-swedes-bombarded-with-fake-news-ahead-of-election>
 Jack Stubbs and Johan Ahlander, ‘Exclusive: Right-wing sites swamp Sweden with ‘junk news’ in tight election race’ Reuters (Sep 6 2018) <https://www.reuters.com/article/us-sweden-election-disinformation-exclus/exclusive-right-wing-sites-swamp-sweden-with-junk-news-in-tight-election-race-idUSKCN1LM0DN>
 Clementine Desigaud and others, ‘Junk News and Bots during the French Presidential Election (Round II)’ ,The Computational Propaganda Project, Oxford Internet Institute <https://comprop.oii.ox.ac.uk/research/junk-news-and-bots-during-the-french-presidential-election-round-ii/>
 Nahema Marchal and Bence Kollanyi ‘Junk News During the EU Parliamentary Elections: Lessons from a Seven-Language Study of Twitter and Facebook’ (Computational Propaganda Project 2019) <https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/05/EU-Data-Memo.pdf>
 Andrew Grice, ‘Fake news handed Brexiteers the referendum – and now they have no idea what they’re doing’ The Independent (Jan 18 2017) <https://www.independent.co.uk/voices/michael-gove-boris-johnson-brexit-eurosceptic-press-theresa-may-a7533806.html>
 Gillian Murphy and others, ‘Fake news can lead to false memories’ (2019) ScienceDaily <https://www.sciencedaily.com/releases/2019/08/19082108222.htm/>
 Angelique Chrisafis, ‘Emmanuel Macron promises ban on fake news during elections’ The Guardian (3 Jan 2018) <https://www.theguardian.com/world/2018/jan/03/emmanuel-macron-ban-fake-news-french-president>
 EU Code of Practice on Disinformation (European Commission 2018) <https://ec.europa.eu/digital-single-market/en/news/code-practice-disinformation>
 National Assembly of the Republic of Kenya <http://www.parliament.go.ke/index.php/the-national-assembly/house-business/hansard>
 Initiatives to Counter Fake News: Kenya <https://www.loc.gov/law/help/fake-news/kenya.php>
 Kenyan president should not sign cybercrime bill into law (Committee to Protect Journalists, 2018) <https://cpj.org/2018/05/kenyan-president-should-not-sign-cybercrime-bill-i.php>
 Kenya: Passage of flawed Computer and Cybercrimes Act threatens free expression (Article 19, 2018) <https://www.article19.org/resources/kenya-passage-of-flawed-computer-and-cybercrimes-act-threatens-free-expression/>
 Anti-Fake News Act 2018 (Malaysia), Interpretation, s 2.
 First person convicted under Malaysia’s fake news law (The Guardian, 30 April 2018) <https://www.theguardian.com/world/2018/apr/30/first-person-convicted-under-malaysias-fake-news-law>
 ‘Experts call for review of Digital Security Act’ (Dhaka Tribune 2018) <https://www.dhakatribune.com/bangladesh/law-rights/2018/09/30/experts-call-for-review-of-digital-security-act>
 ‘Bangladesh: New Digital Security Act imposes dangerous restrictions on freedom of expression’ (Amnesty 2018) <https://www.amnesty.org/en/latest/news/2018/09/bangladesh-new-digital-security-act-imposes-dangerous-restrictions-on-freedom-of-expression/>
 The Network Enforcement Act 2018, Section 3 Handling of Complaints about Unlawful Content (2) 2.
 ‘Free Speech vs. Censorship in Germany’ Politico (2018) <https://www.politico.eu/article/germany-hate-speech-netzdg-facebook-youtube-google-twitter-free-speech/>
 Ibid (No.115).
 Article 52, Scope of Guaranteed Rights, Charter of Fundamental Rights of the European Union, (2000/C 364/01) <http://www.europarl.europa.eu/charter/pdf/text_en.pdf>
 Maryant Perez, EU Action Needed: German NetzDG Draft Threatens Freedom of Expression (European Digital Rights 2017) <https://edri.org/eu-action-needed-german-netzdg-draft-threatens-freedomofexpression/>
 Article 10(2) < https://www.echr.coe.int/Documents/Convention_ENG.pdf>
 A 161 (1989).
 2006-IV; EHRR 478 GC.
 ‘Declaration of Principles for International Election Observation’, United Nations (Oct 27 2005) <http://eeas.europa.eu/archives/docs/eueom/pdf/declaration-of-principles_en.pdf>
 1 December 2015 (judgment).
 Section 140, Electoral Act 1992 <http://www.irishstatutebook.ie/eli/1992/act/23/section/140/enacted/en/html>
 Ibid (2).
 Section 22, Part 4, Electoral Act 1997 <http://www.irishstatutebook.ie/eli/1997/act/25/section/22/enacted/en/html>
 Online Advertising and Social Media (Transparency) Bill 2017 Part 1, 2.
 Ibid Section 4.
 Ibid Section 5
 Overview- Regulation of Transparency of Online Political Advertising in Ireland (2019) <https://www.gov.ie/en/policy-information/7a3a7b-overview-regulation-of-transparency-of-online-political-advertising-/>
 (n 123).
 Ewan Ferlie, The Oxford Handbook of Public Management (Oxford 2005).
 Albert Meijer, ‘Transparency’ (2014) The Oxford Handbook of Public Accountability.
 Adrienne Héritier, ‘Composite democracy in Europe: the role of transparency and access to information’ (2003) Journal of European Public Policy 10:5, 814–833.
 Daniel Fried and Alina Polyakova, ‘Democratic Defense Against Disinformation’ Atlantic Council (2018) <https://www.atlanticcouncil.org/images/publications/Democratic_Defense_Against_Disinformation_FINAL.pdf>
 Rolf H Weber, ‘Accountability in the Internet of Things’ (2011) Computer Law & Security Review 27, 133–138 <https://www.dhi.ac.uk/san/waysofbeing/data/governance-crone-weber-2011a.pdf>
 Damian Tambini, ‘Fake News: Public Policy Responses’ (2017) London School of Economics <http://eprints.lse.ac.uk/73015/1/LSE%20MPP%20Policy%20Brief%2020%20-%20Fake%20news_final.pdf >
 Tackling online disinformation: Commission proposes an EU-wide Code of Practice Brussels, 26 April 2018 <http://europa.eu/rapid/press-release_IP-18-3370_en.htm>
 C Lindstedt and D Naurin, ‘Transparency is not Enough: Making Transparency Effective in Reducing Corruption’ (2010) International Political Science Review 31(3), 301–322 <https://www.researchgate.net/publication/258142264_Transparency_Is_Not_Enough_Making_Transparency_Effective_in_Reducing_Corruption>
 (n 90) pg 18.
 ICCL Submission to the Public Consultation on the Establishment of an Electoral Commission No EC19.008
 ‘Who We Are: The Referendum Commission’ <https://www.refcom.ie/the-commission/who-we-are/>
 ‘How can we combat fake news? – The role of platforms, media literacy, and journalism’ Reuters Institute for the Study of Journalism (24 Mar 2017) <https://reutersinstitute.politics.ox.ac.uk/risj-review/how-can-we-combat-fake-news-role-platforms-media-literacy-and-journalism>
 Be Media Smart: A New Public Awareness Campaign Launched Webwise (Mar 19 2019) <https://www.webwise.ie/trending/be-media-smart-new-public-awareness-campaign-launched/>
 Hunt Allcott and Matthew Gentzkow, ‘Social Media and Fake News in the 2016 Election’ (2017) Journal of Economic Perspectives Vol 31, No 2.
 Working to Stop Misinformation and False News, Facebook (April 7 2017).
 Tackling online disinformation: Commission proposes an EU-wide Code of Practice (European Commission 2018) <https://europa.eu/rapid/press-release_IP-18-3370_en.htm>
 Kai Shu and others, ‘Fake News Detection on Social Media: A Data Mining Perspective’ (2017).
 ‘Fake News and Online Disinformation’ European Commission (2018).
 Andrew Guess and others. ‘Fake News, Facebook Ads and Misperceptions’ (2018) <http://www-personal.umich.edu/~bnyhan/fake-news-2018.pdf>
 Article 14, Directive 2000/31/EC of the European Parliament and of the Council
 Mark Bunting, ‘Regulating Online Platforms for Misinformation and Disinformation’ (2018) London School of Economics
 ‘Disinformation and ‘fake news’: Final Report: Government Response to the Committee’s Eighth Report’ <https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/2184/218402.htm>
 Mark Zuckerberg, ‘Building Global Community’ Facebook (Feb 16 2017)
Colin Gleeson, ‘Facebook to Set Up Dublin Centre to Combat ‘Fake News’’ Irish Times (Jan 28 2019) <https://www.irishtimes.com/business/technology/facebook-to-set-up-dublin-centre-to-combat-fake-news-1.3773265>