Healthy democracies are threatened by false information, but not everyone is keen on a cure
Our democracy relies upon the choices of an informed electorate. But today, it is threatened by mis- and disinformation. More and better fact-checking is needed, according to the University of Copenhagen's Isabelle Augenstein, who points to automated checking of web content as an important part of the solution.
With polling card in hand, a voter walks deliberately towards a local school where she will soon cast her ballot. Along the way, she reflects upon the information about candidates that she has been exposed to in various media during recent months: a lengthy newspaper article, several TV ads and a slew of articles online, along with various social media posts. They all vie for influence on her worldview. An opinion is formed and is manifested in her vote. But was it a well-informed vote?
Massive amounts of information flood society today. False information and claims are easily concealed within this torrent and can spread misunderstandings that widen chasms in society where there might otherwise have been a shared, fact-based understanding. This is why we as a society should redouble our fact-checking efforts, says Associate Professor Isabelle Augenstein from UCPH's Department of Computer Science. Augenstein researches misinformation and the possibilities of automated fact-checking.
"Democracy depends on informed voters, and thereby information that voters can trust. Unfortunately, not everything we read, see and hear should be allowed to influence our elections if democracy is to be legitimised by informed voters. The positive news is that we actually have good opportunities to curb misinformation, not least on the internet, by means of automated fact-checking, among other things," she says.
The cure is there, but the patient refuses
Isabelle Augenstein points to the possibility of introducing social media algorithms that fact-check posts for false claims. However, she also notes that while such "misinformation firewalls" are technically feasible, implementing them may be an uphill battle, as major interests could be eager to maintain the status quo.
"Powerful social media companies have financial interests in preserving all of the traffic on their platforms – including traffic generated by false information. So, enormous political muscle would be needed to enforce the fact-checking of content before it was published," she says, adding:
"People typically understand fact-checking as a process that happens after an article is published. But, it is far better if efforts are made beforehand, as research demonstrates that it is extremely difficult to change people's minds once they have read something untrue. This places extra pressure on journalists and politicians to have their facts straight before speaking," says Isabelle Augenstein.
She points out that financial pressures in the news media can make it difficult to find the resources needed for fact-checking, but that this only underscores the need for automated solutions.
"It is important to keep doing the work of digging into allegations. And therefore, it is also important that the development of automated fact-checking continues, because it can save time and money, as the process of checking whether something is true becomes faster,” she says.
Unfortunately, there are also actors with an interest in disseminating false information. Disinformation, that is, false information spread deliberately, for example by Russia to influence elections in Western democracies, is a widely recognised problem. Here, the effort to counter it remains more on the back burner.
"It's hard to keep up with this kind of dissemination because we're dealing with conscious actions. These are people who aren’t interested in being truthful. Here, the focus should be on combatting false content whenever it pops up," says Isabelle Augenstein.
A joint effort
Augenstein believes that we should enhance, not diminish, our fact-checking efforts as a society.
"If the task is neglected, it will become a problem for society because politicians won’t be able to be held accountable on a factual basis," she says, highlighting the cancelling of the Danish Broadcasting Corporation's (DR) Detector programme as one problematic example.
According to Augenstein, the fight against false information needs to be a broad-based effort in which professional communicators such as journalists and politicians check their claims – as well as those of others. It also necessitates a cultural shift whereby citizens become more critical and check up on whether the information they are sharing is actually truthful.
"If we are to meet the challenge that misinformation and disinformation pose to our society, it will require efforts across the board – from automated solutions with fact-checking algorithms to critical journalists and, not least, a cultural shift towards more critical information sharing among ordinary citizens," she says.
Here, young people are at the forefront, says the researcher, who emphasises that they are more used to being online and are therefore more adept at spotting misinformation. Nevertheless, everyone should be trained to detect false information.
"We should simply educate people about how to approach information critically. If we do so, the future doesn't look bleak," concludes Isabelle Augenstein.
Contact
Associate Professor Isabelle Augenstein
Department of Computer Science
University of Copenhagen
augenstein@di.ku.dk
+45 93 56 59 19
Journalist and Press Contact
Kristian Bjørn-Hansen
Faculty of SCIENCE
University of Copenhagen
kbh@science.ku.dk
+45 93 51 60 02