Dr Maria Murphy of the Department of Law at Maynooth University notes that the term ‘fake news’ is used to describe many related challenges in today’s popular discourse: "Typically, fake news involves the intentional spread of false information. When the distribution of false information is widespread, it has the potential to undermine the democratic process.
"As Linda Kiernan pointed out in a recent RTÉ Brainstorm article, ‘fake news’ is certainly not a new phenomenon. It cannot be denied, however, that the internet has created new opportunities for the spread of fake news and that the central role of social media networks has magnified that effect further in recent years."
While the complexity of an issue should not become an excuse to shirk the search for solutions, it should be acknowledged that there are many competing rights at play, and a real risk of unintended consequences if rash steps are taken in response to the very real concern surrounding fake news.
When discussing this issue, it is common to cite the removal of traditional media ‘gatekeepers’, the reduction of barriers to entry, the rapid decrease in the cost of information distribution, and the exponential increase in the number of available information sources as factors that contribute to the phenomenon of fake news.
These are sensible deductions. Yet, as Dr Murphy explains, "these changes in the media environment can also be hailed as positive developments that have facilitated the rise of citizen journalism, given previously under-represented voices a platform, and generally expanded the ability of individuals to express themselves and to access more information than was ever before possible. When viewed in this light, fake news is a story about freedom of expression and its limitations. The reality is that fake news is also – among other things – a data protection story."
A particularly deleterious element of the fake news distribution system is the micro-targeting of individuals with fake news stories designed to manipulate.
The Cambridge Analytica scandal illustrates how concerns about the manner in which our data is used are not addressed by the ‘nothing to hide’ arguments of old. In fact, the use of our data can have incredibly far-reaching effects on decisions made about us and on how society operates on a wider scale.
In the wake of recent discussions of the social media business model, the appropriateness of extending micro-targeting techniques popular in online commercial advertising to the political context has been questioned. Before considering the political aspect, it must be noted that there are also very real risks associated with micro-targeting – or behavioural advertising – in the commercial sector.
Consider an individual whose big data profile indicates that they have poor impulse control and low confidence. Is it ethical to use that data to present that individual (or individuals ‘like’ them) with an advertisement for a product chosen to play on those vulnerabilities, and to price the product at the maximum point an algorithm determines the individual can pay? Dr Murphy points out that the "potential collective harm of such targeting can be illustrated by considering the ethics of advertising housing only to people with certain characteristics indicative of a particular race or social class. The notion of collective harm is crucial to understanding the special risks created by political micro-targeting."
European Union data protection law recognises personal data that reveals political opinions as a special category of personal data that requires additional protection. The risks of micro-targeting in the political context have, of course, been well illustrated by revelations over the last number of years. Like fake news, however, the practice of political micro-targeting is also not new. While social media data is an essential part of the modern mix of data used to identify the target voter and to present them with the message most likely to appeal, data collected from comparatively familiar sources – such as surveys and petitions – remains important.
Many are now suggesting – and Facebook is indicating at least some acquiescence – that EU-like data protection and e-privacy rules should be applied globally. If implemented correctly, the EU data protection requirements for transparency, data minimisation, and purpose limitation could greatly mitigate the type of harms exposed by the Cambridge Analytica scandal.
It seems clear, however, that even if perfectly implemented, some risks remain. Facebook’s plans to increase the transparency of online political advertising and micro-targeting should be followed up with clarifying legislation and oversight proposals to protect the integrity of the democratic process. It must be acknowledged that political advertising is a form of free expression but the right to free expression must be considered alongside the right to free and fair elections.
Accordingly, specific regulatory obligations – in addition to data protection rules – may be necessary to maintain legitimacy and trust in the political process.
This event was part of Maynooth Week 2018. In June 2017 Maynooth University celebrated the 20th anniversary of its founding as an independent university. Maynooth Week 2018 was part of a year-long series of programmes and activities marking this milestone.