This week, as I write this, I was asked to celebrate Ada Lovelace Day by talking about online misinformation. It seems unfair to her: Lovelace was the first to imagine the power of software, but I don’t think even she was visionary enough to foresee its use to foment disagreement about basic facts.
My main goal was to compare the problem of online misinformation (we can’t use “fake news” anymore because of its reinvention as an all-purpose insult for high-quality media that offend certain people by pointing out the truth about them) to that of cybersecurity and to argue that it’s going to require just as broad and complex a response.
Cybersecurity has taken a while to catch up with the idea that computer science isn’t enough. Partly, this is because it’s such a young field; most of today’s senior security practitioners got their start because they were (for example) the only person in their company at the time who had ever configured a firewall (or was willing to try). A sea change began about 20 years ago, when usability experts such as Angela Sasse began to approach the topic from the point of view of the user. That work also brought a new focus on economics; bad security policies cost an enormous amount in lost productivity and user frustration. Worse, bad policies induce workarounds and therefore new vulnerabilities.
From there, the importance of incentives became clear…and that paved the way for the many cross-disciplinary, multi-disciplinary, and trans-disciplinary efforts that now include experts from crime science, mathematics, social sciences, psychology, and philosophy. Something like this mix will be required to make an impact on misinformation.
One reason is that, as in cybersecurity, the sources of misinformation, their motives, and their resources are widely varied. Cyber attackers may be teenagers using pre-written scripts, criminals trying to drain random strangers’ bank accounts, or, if you are unlucky, a highly motivated, well-resourced, state-sponsored actor who can overwhelm your ability to protect yourself. Similarly, misinformation may be promulgated by teens in Macedonia or Montana wanting pocket money, by political actors seeking to manipulate voters, or by foreign actors seeking to destabilise another country by sowing division. The responses to these different scenarios must be appropriately tailored.
Journalists tend to view all this narrowly and suggest that fact-checking is the solution. Fact-checking is important, particularly when the source of the misinformation is PR people or politicians who are straight-out lying to serve their own interests. We have been remarkably tolerant of institutionalised lying through PR for decades; see, for example, the 1997 book by John Stauber and Sheldon Rampton, Toxic Sludge Is Good for You. But fact-checking is, as the last five years have shown, utterly inadequate to deal with high-profile politicians and others who do not care whether what they say is true as long as it gets them attention, and who are happy to frame experts as enemies.
The resulting widespread distrust eventually becomes widespread damage. And that’s where we are now. As a Liverpudlian told BBC’s Newsnight in mid-October regarding lockdowns, people there don’t trust government in general to have their best interests at heart, and they certainly don’t trust this one. In another example, if you’ve been listening to US president Donald Trump talk big about how he was pushing vaccines through to production as fast as possible, telling a questioning pollster that you aren’t sure you’ll take a vaccine against coronavirus is entirely rational; it’s no longer a sign that you’re an anti-vaxxer spreading distrust against something with a long track record of safety.
In this distrustful environment, skeptics will have to change, too. The Skeptic’s new editor is Mike Marshall, well-known for his indefatigable activism. He co-founded the Merseyside Skeptics, led the 10:23 anti-homeopathy campaign, and created the QED annual conference. He is also executive director of the Good Thinking Society, where he’s looked into things like the money crowdfunding pours each year into dubious alternative cancer treatments. Marshall believes the new wave for skepticism is conspiracy theories. In this view, the 1980s and 1990s (I founded the magazine in 1987) were classical skepticism, debunking and examining the New Age clutch of beliefs, many of which were reinventions of old beliefs long discredited by science, such as astrology, ghosts, and so on. The second wave, starting in the late 1990s, when I ran the magazine again for a couple of years, moved into contested science. At the time, that included climate change, but I also recall disputes over whether mobile phones caused brain cancer, and so on. Now, the influence of the internet is pulling together stray fragments of paranoia, so we get the emerging 5G standard for mobile telephony somehow fused with the coronavirus, with George Soros and Bill Gates cast as evil masterminds despite their enormous philanthropic efforts.
Skepticism has always been a difficult sell, because what we are generally offering is the willingness to live with uncertainty. At a time when uncertainty is everywhere, it may be even harder to agree on which path leads to reason.