Social Media, COVID & Ukraine: Fighting Disinformation


Organised disinformation about the COVID-19 crisis has degraded public understanding and threatened the reputation of credible vaccines and health policy.

This talk looks at the broad structures and recent history of computational propaganda - the use of algorithms, automation and human curation to distribute misleading information over social media. 

Dr Howard reviews the latest evidence about social media use during our current health crisis, and reports on the very latest themes in Russian information operations about its invasion of Ukraine. He discusses the opportunities for using social media to deepen democracy and 'build back better'.


Social Media, COVID, and Ukraine: Fighting Disinformation

Professor Philip Howard

11th April 2022


Do we have a right to the truth?

Misinformation has derailed our response to the COVID-19 pandemic. Misinformation is all around us, ranging from the political, economic, and health spheres to all aspects of modern life. In offering some answers to this big question, I will present some of the latest research on global misinformation trends, show what we know about the dynamics and impact of COVID misinformation, and offer a briefing on the important efforts to curtail misinformation flows.

One of the big ideas out there for addressing the global problem of misinformation is the possibility of creating something like an international panel on the information environment. This would help coordinate researchers around the world who can identify information operations at work on social media platforms. And what is critical here is to realise that, to an important degree, this is not about evaluating content for truth, but about identifying the ways in which algorithms can be manipulated by foreign governments, by lobbyists, and by regular PR firms.

Our lab at Oxford University found the Russian information operations against voters in the US in 2016. We found them using open-source data; we know the security services had access to the closed material, but we found them with open sources. Over the last few years we have been looking at the kinds of operations that have grown out of this toolkit, that have continued, and that have spread globally.

I have brought a few copies of my latest book, Lie Machines, which tells the story of how these kinds of operations are produced, disseminated, and marketed. There is a stack at the back for the in-person audience. Today I want to give you a taste of the latest research on the global problem of misinformation. Each year our Democracy and Technology Programme at Oxford does an inventory of the countries around the world with organised misinformation campaigns. These are not lone-wolf hackers, solitary individuals, launching information operations. These are organised in the sociological sense: they have secretaries, hiring plans, performance bonuses, and retirement plans; they have office space and telephones. In 2017, the first year we did the study, there were just over thirty countries with active information operations at work. Last year, we counted 85 countries with active information operations.

This year, we are thinking of not doing the study, because every country we look at reveals some organised information operation. But one of the real surprises over time has been that, in the early years, these information operations were run by military units that had been retasked to doing social media. Now, most of the operations are run by regular PR and comms firms based in Toronto, in London, in New York. It is a standard part of the communications toolkit. Many of the big operations we know about hit the news, and there is a different one every week; they are coordinated and facilitated by regular comms professionals. It is not simply about government operations.

The other chunk of research that we do on a regular basis now is a global poll of Internet users about their fears. Last year, we covered 150 countries, with well over 150,000 respondents. We asked about fear of credit card fraud, fear of having your identity stolen, and about sexual harassment online. And we asked about the fear of being misled by misinformation. Globally, the fear of misinformation, the fear of being misled, is the number one concern, though there is plenty of regional variation. In China and across Southeast Asia, this is not perceived as a threat. In Latin America, particularly amongst women, the primary fear is sexual harassment online, not misinformation. There is certainly regional variety. But overall this is the number one trust issue for internet users. This should be concerning for anyone whose business model relies on an internet that is trustworthy and stable; this is a threat to those business models.

We have also been working on COVID misinformation. There are some standard themes to the COVID misinformation that originates from Russia Today, the Russian state-backed media agency, or CGTN, the PRC-backed media agency.

The first broad theme is that your elected leaders cannot help you: democratically elected leaders cannot make tough decisions, they do not act in time, they give out mixed messages on COVID. The second message is that China and Russia are leading on the research, that they will find the cures, that their vaccines are the best. And the third, of course, is that Russia and China are leading on humanitarian aid, that they are the ones sending equipment around the world to the countries having trouble dealing with the COVID crisis. These standard themes permeate all the messaging around COVID misinformation that originates from these state media agencies.

This fall, my career hit a new low. Since we have been spending so much time identifying these campaigns, we found a campaign to blame COVID on Maine lobster. The message was that a shipment of lobster from Maine had arrived in Wuhan just before the COVID outbreak, and that it was the lobster fishing fleet in Maine that had caused the world's COVID epidemic. Now, the reason I call this a career low is that it took us weeks to study the network. There must have been hundreds of thousands of fake accounts across multiple platforms, and the number of personnel hours that would have gone into backing up a ridiculous story like this was significant. It was at that point that I realised I did not want to chase misinformation stories anymore; it is not a rewarding thing to do as a researcher.

One of the ways we think we can get on top of the problem globally is by trying to coordinate the world's research community, the same way the Intergovernmental Panel on Climate Change works on consensus around climate issues. They may not have acted fast enough for those of us interested in climate, and they have had their own methodology issues, but they do avoid policy: they do not really issue statements on climate policy. What they will tell you is how many degrees of change in the global temperature relates to how many feet New York City will be underwater. They can measure those things. I think the world's researchers and data scientists can do the same kind of thing. It is possible, with or without the firms' consent, to study how misinformation operations work at scale. It is possible to identify, through algorithmic audits, whether an algorithm is being unfair or discriminatory in some sense.

If we were to try to build something like an international panel on the information environment, it might not make sense to have it deeply tied to nation-states or UN agencies. If there were ever any industry funding for it, we would want it administratively firewalled and financially behind a blind trust, so that there was no influence over the priorities of the organisation. But a civic movement is needed here, because the solutions to misinformation are complex; the solutions must be nuanced. Fortunately, there is evidence that some things work at corralling misinformation.

Misinformation is the one existential threat that prevents us from addressing all the other existential threats. There is not going to be action on climate, and the world's great conflicts are going to get worse, in an environment in which misinformation flows. We are not going to be able to solve many of the other great problems: gender inequality and racism get perpetuated through misinformation. And you all have an election again in 2022, and again in 2024. India, the world's biggest democracy, also votes in 2024. There are some milestones ahead of us by which we want to have something in place to tackle the world's misinformation problems.


Do we have a right to the truth?

Rights create obligations. However, we do not need a Ministry of Truth to enforce that right. Evaluating content for truth is simply too hard to do at scale. The firms themselves cannot do it at scale, and people outside the firms, outside the engineering teams, are not going to be able to evaluate content over entire platforms.

I think what we can detect are big shifts in patterns, when communities get a huge swath of misleading information on a particular issue. It is not the rhetorical form or content of the issue that is easy to detect, but the sudden bloom of fake accounts, or the straightforward evidence that something is gaming a search algorithm or taking advantage of a loophole in Facebook policy: that is something a data scientist can pick out. Evaluating the veracity of claims through judges and courts is a slow-moving process. But revealing whether there is network manipulation, or whether an algorithm has been written in such a way as to target a particular part of society, is something you can spot. Securing a right to truth will require building that mechanism.


© Professor Howard 2022



Professor Philip Howard

Philip N. Howard is a professor and writer. He has written numerous empirical research articles, and published in several disciplines, on the use of digital media for both civic engagement and social control in countries around the world.
