Cory Doctorow and Jessikka Aro in Conversation: Disinformation and Societal Resilience

22.04.2020

As we were unable to gather in person for 6 Degrees Montréal due to COVID-19, we convened two brilliant writers — Jessikka Aro and Cory Doctorow — for a discussion on the challenges that disinformation poses and the need for societal resilience.

Aro is a Finnish journalist and information warfare expert. Doctorow is a British-Canadian writer and activist. In a wide-ranging conversation, the two laid out the current “epistemological crisis” we face, how the West’s push for improved media literacy can be weaponized by disinformation experts, the rise of big tech, and the challenges of defining, addressing, and removing hate speech online.

This conversation has been edited for clarity and length.

6 Degrees: Jessikka, can you start with some reflection on your work and experience, and then you two can take it from there?

Jessikka Aro: I started investigating Russian trolls, and specifically their impact on citizens, in 2014. I very soon learned that Facebook, Twitter, and YouTube were enabling this state-sponsored information warfare, propaganda, and disinformation. Unfortunately, Facebook, Twitter, and YouTube have been really naive, even negligent. The companies have been more interested in securing their finances than our security. From my point of view, this all comes down to the rights of consumers. We as consumers, as the users of products, should be able to consume safe content. And at the moment, the content is not safe.

Cory Doctorow: I think that when we talk about disinformation, we put a lot of emphasis on the fact that people disagree about what’s true, and we don’t talk enough about how people know that things are true, which is to say, we over-theorize differences in beliefs, and under-theorize differences in epistemology. And I think that if we’re going to understand disinformation and why it’s effective right now, we need to examine our current epistemological crisis, the rise and rise of big tech, and the thing that enables the rise and rise of big tech, which is the rise and rise of monopoly and inequality.

I think that in a complex technological society, it’s not possible for individuals to assess the validity of everything they encounter just to survive the day. That has been the historic media literacy claim: do your own homework, ask yourself what the speaker has in it for them, read up on the sources, and find out if it’s true.

Danah Boyd has talked about how that traditional media literacy rule of thumb is actually an incredibly easy thing to weaponize. “Do your homework” has its place, but I think rather than rendering technical subjects legible to lay people – and we’re all lay people in subjects we’re not experts in – what we’ve historically done to navigate this complexity is to have legible processes. This involves experts presenting different views, adjudicating claims, showing conflicts of interest, and reassessing based on new evidence. And historically, that’s been how we know things: through a valid process. What we’ve experienced over the last 40 years is a de-legitimization of that process, and that has created fertile ground for believing in conspiracies.

This collapse of trust is what disinformation campaigns exploit, and while we should be trying to figure out how to get people to stop believing untrue things, we really have to reassure people – through action that makes power more pluralistic, accountable, and attuned to the truth – that what our truth-seeking exercises conclude is true is actually true. That way, people can rely on official accounts of truth – advice about vaccines, advice about economics – rather than turning to cults of personality, where you find someone who seems like they know what they’re talking about and believe them instead of looking to the experts.

JA: During my investigations, I found regular Finnish people who had changed their ideas, attitudes, or behaviour specifically because they were impacted by Russian troll accounts with fake profiles and by pro-Kremlin blogs. We’re seeing now in this global health crisis that Russia is aggressively pushing more and more conspiracy theories about coronavirus: for example, that it was actually the United States that created this virus, or that the virus is a way to attack China. The Western world is trying to contain the spread of this virus, and meanwhile Russia is attacking us with trolls.

CD: Russians are gonna Russian. I come from a family of Soviet refugees, and am no apologist for Russian human rights abuses, or the corruption in the Russian state. We can do things to try to down-regulate Russian bad activity, but what we really should be focusing on doing is increasing our resilience.

Russia is a failing, desperate, flailing state: it has increasing mortality, it has decreasing productivity, it is in chaos.

JA: And think about that country being our neighbour!

CD: Right? So if this disintegrating basket-case of a petrostate, presided over by a guy who’s frankly not that smart and who manages to run the country by either terrorizing people or impressing them by wrestling bears, can have such an out-sized impact on the rest of the world, we should worry about that guy, but we should also worry about what it says about our lack of resilience. What did 10 years of austerity in the Eurozone do to people’s trust in their institutions? People believe in terrible things in concert with the terrible things going on in their lives, and people stop believing in institutions when those institutions fail them.

JA: You talked about this distinction between true and false – that is one aspect of disinformation. In my experience, the most harmful forms of information warfare are actual crimes: illegal threats, libel, inciting hatred against minorities. That’s what’s in the toolbox of the Russian security services. I understand that for social media giants it’s really difficult to start judging whether they’ll allow people to spread conspiracy theories, but please, can they stop the crimes?

CD: Patrick Ball, from the Human Rights Data Analysis Group, worked closely with the Yazidi during the genocidal attacks, and one of the things the Yazidi did was upload first-person accounts of the attacks and video evidence. These were taken down as extremist violent content. This raises the central question of how we get justice for those most brutally victimized by this kind of criminal act. If we’re worried about hate speech inciting violence, then we also need to worry about the violence itself.

JA: On defining hate speech, and the difficulty of defining it, I agree to a certain point. It can be really difficult, even for highly educated moderators of traditional media comment sections, because language is beautiful and eloquent, and you can do so many wonderful, and horrible, things with it. But part of what Facebook and other companies are letting slip through the cracks is not exactly rocket science. Part of it is groups that harass, attack, and stalk individuals, and fantasize about their deaths. Users have tried to report these groups, and I went to Facebook’s Silicon Valley headquarters to report them personally, and they still exist.

CD: I don’t use Facebook, WhatsApp, or Instagram. I think they’re a force for evil in the world.

What you’ve identified is the impossibility of responsibly operating a platform at the scale that Facebook operates at, or that Google or Twitter operate at. If you have 2.5 billion users and you’re Mark Zuckerberg, you have to handle 2,500 one-in-a-million user cases a day. You’re not going to get that right.

I think we should just break them up. I think we should just shatter these companies. With something like GDPR, we said “if you want to collect all this data, then you have to spend all this money to create a compliance regime to ensure that you’re handling it responsibly” and a year later there are no European ad tech companies, only American ones, because they’re the ones who can afford it. Instead, we should just say “Don’t collect data. No collecting data. Illegal. And if we catch you doing it, we break your company up, we fine you so much that you go out of business, we jail your executives.”

JA: I completely agree. Thank you for bringing that up. I’ve been wondering why Facebook and other companies haven’t been sued by more people, by more nations.

In Finland, everyone is quite well educated. We have free university, too. That might be one reason we’re often referred to as a really resilient nation. It’s a great philosophy and policy. Of course it’s not possible everywhere, but that’s really where the best resilience comes from. It comes from being able to critically read the media, to critically read the internet, to recognize fake news, to recognize argumentation mistakes – the real basics of what’s being taught in Finnish classrooms. Regulatory options are also needed, not just from the European Union, but also in North America, where there is the most power to really regulate these companies.

Then of course there are all kinds of voluntary efforts – journalists are training kids for free, going around to schools. But what I’ve really been wondering is why there isn’t a massive citizen-led voluntary action against these social media giants – against the tyranny, basically. I don’t know why so many consumers are accepting this as the new norm and the new reality. Robert Mueller’s investigation into Russian interference in the U.S. election revealed that 126 million Americans had been reached by Russian trolls before the election. As a government, wouldn’t you want to protect your citizens?

There are different roles for governments, tech platforms, and citizens to play in addressing, mitigating, and building resilience to disinformation. Read about the ICC’s Citizen Resilience Project here, and check out interviews with Swedish philosopher Åsa Wikforss, and members of the global online counterspeak movement.
