Investigating Disinformation and Media Manipulation
Written by: Craig Silverman
Craig Silverman is the media editor of BuzzFeed News, where he leads a global beat covering platforms, online misinformation and media manipulation. He previously edited the “Verification Handbook” and the “Verification Handbook for Investigative Reporting,” and is the author of “Lies, Damn Lies, and Viral Content: How News Websites Spread (and Debunk) Online Rumors, Unverified Claims and Misinformation.”
In December 2019, Twitter user @NickCiarelli shared a video he said showed a dance routine being adopted by supporters of Michael Bloomberg’s presidential campaign. The dancers’ lackluster enthusiasm and choreography immediately helped the video rack up retweets and likes, mostly from people who delighted in mocking it. It eventually attracted more than 5 million views on Twitter.
Ciarelli’s Twitter bio said he was an intern for the Bloomberg campaign, and his subsequent tweets included proof points such as a screenshot of an email from an alleged Bloomberg campaign staffer approving a budget for the video.
But a quick Google search of Ciarelli’s name showed he’s a comedian who has created humor videos in the past. And that email from a Bloomberg staffer? It was sent by Ciarelli’s frequent comedic partner, Brad Evans. That information was also just a Google search away.
But in the first minutes and hours, some believed the cringeworthy video was an official Bloomberg production.
Maggie Haberman, a prominent New York Times political reporter, tweeted that journalists who covered Bloomberg’s previous mayoral campaigns had reason not to dismiss it right away.
Knowledge can take many forms, and in this new digital environment, journalists have to be wary of relying too much on any given source of information — even if it’s their own firsthand experience.
Apparently, some reporters who knew Bloomberg and his style of campaigning felt the video could be real. At the same time, journalists who knew nothing about Bloomberg and chose to judge the video by its source could have found the correct answer immediately — in this case, simply Googling the name of the man who shared it.
The point isn’t that experience covering Bloomberg is bad. It’s that at any given moment we can be led astray by what we think we know. And in some cases our base of knowledge and experience can even be a negative. We can also be fooled by digital signals such as retweets and views, or by efforts to manipulate them.
As the Bloomberg video showed, it takes little effort to create misleading signals like a Twitter bio or a screenshot of an email that appears to back up the content and claim. These in turn help content go viral. And the more retweets and likes it racks up, the more those signals will convince some that the video could be real.
Of course, there are far more devious examples than this one. Unlike Ciarelli, the people behind information operations and disinformation campaigns rarely reveal the ruse. But this case study shows how confusing and frustrating it is for everyone, journalists included, to navigate an information environment filled with easily manipulated signals of quality and trust.
Trust is the foundation of society. It informs and lubricates all transactions and is key to human connection and relationships. But it’s dangerous to operate with default trust in our digital environment.
If your default is to trust that the Twitter accounts retweeting a video are all amplifying it organically, you will get gamed. If you trust that the reviews on a product are all from real customers, you’ll waste your money. If you trust that every news article in your news feed represents an unbiased collection of what you most need to see, you will end up misinformed.
This reality is important for every person to recognize, but it’s essential for journalists. We are being targeted by coordinated and well-funded campaigns to capture our attention, trick us into amplifying messages, and bend us to the will of states and other powerful forces.
The good news is this creates an opportunity — and imperative — for investigation.
This handbook draws on the knowledge and experience of top journalists and researchers to provide guidance on how to execute investigations of digital media manipulation, disinformation and information operations.
We are operating in a complex and rapidly evolving information ecosystem. It requires an equally evolving approach built on testing our assumptions, tracking and anticipating adversaries, and applying the best of open-source investigation and traditional reporting techniques. The vulnerabilities in our digital and data-driven world require journalists to question and scrutinize every aspect of it and apply our skills to help guide the public to accurate, trustworthy information. It also requires journalists to think about how we can unwittingly give oxygen to bad actors and campaigns designed to exploit us, or rush to point fingers at state actors when the evidence does not support it.
The goal of this handbook is to equip journalists with the skills and techniques needed to do this work effectively and responsibly. It also offers basic grounding in the theory, context and mindset that enable journalists to deliver work of high quality that informs the public, exposes bad actors, and helps improve our information environment. But the first thing to understand is that hands-on knowledge and tools are useless unless you approach this work with the right mindset.
This means understanding that everything in the digital environment can be gamed and manipulated, and to recognize the wide variety of people and entities with incentive to do so. The beauty of this environment is that there is often, though not always, a trail of data, interactions, connections and other digital breadcrumbs to follow. And much of it can be publicly available if you know where and how to look.
Investigating the digital environment means taking nothing at face value. It means understanding that things which appear to be quantifiable and data-driven — likes, shares, retweets, traffic, product reviews, advertising clicks — are easily and often manipulated. It means recognizing that journalists are a key focus of media manipulation and information operations, both in terms of being targeted and attacked, and in terms of being seen as a key channel to spread mis- and disinformation. And it means equipping yourself and your colleagues with the mindset, techniques and tools necessary to ensure that you’re offering trusted, accurate information — and not amplifying falsehoods, manipulated content or troll campaigns.
At the core of the mindset is the digital investigation paradox: By trusting nothing at first, we can engage in work that reveals what we should and should not trust. And it enables us to produce work that the communities we serve are willing and able to trust.
Along with that, there are some fundamentals that you will see emphasized repeatedly in chapters and case studies:
- Think like an adversary. Each new feature of a platform or digital service can be exploited in some way. It’s critical to put yourself in the shoes of someone looking to manipulate the environment for ideological, political, financial or other reasons. When you look at digital content and messages, you should consider the motivations driving their creation and propagation. It’s also essential to stay abreast of the latest techniques being used by bad actors, digital marketers and others whose livelihood relies on finding new ways to gain attention and earn revenue in the digital environment.
- Focus on actors, content, behavior and networks. The goal is to analyze the actors, content and behavior in question, and to document how they might be working in unison as a network. By comparing and contrasting these four things with each other, you can begin to understand what you’re seeing. As you’ll see in multiple chapters and case studies, a fundamental approach is to start with one piece of content or an entity such as a website and pivot on it to identify a larger network through behavior and other connections; a brief sketch of this kind of pivot follows this list. This can involve examining the flow of content and actors across platforms, and occasionally into different languages.
- Monitor and collect. The best way to identify media manipulation and disinformation is to look for it all the time. Ongoing monitoring and tracking of known actors, topics and communities of interest is essential. Keep and organize what you find, whether in spreadsheets, folders of screenshots or by using paid tools like Hunchly; a simple collection-log sketch also follows this list.
- Be careful with attribution. It’s sometimes impossible to say exactly who’s behind a particular account, piece of content, or a larger information operation. One reason is that actors with different motives can behave in similar ways, and produce or amplify the same kind of content. Even the platforms themselves — which have far better access to data and more resources — make attribution mistakes. And attribution is becoming even more difficult as state actors and others evolve and find new ways to hide their fingerprints. The most successful and compelling evidence usually combines digital proof with information from inside sources — an ideal mix of online and traditional investigative work. Getting attribution wrong will undermine all of the careful work that led up to it.
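To make that pivoting idea concrete, here is a minimal sketch in Python. The share records, account names and domains below are invented placeholders rather than data from any real investigation, and in practice the rows would come from your own monitoring exports; the point is only to show how a single seed entity can surface the accounts amplifying it and the other domains those accounts have in common.

```python
from collections import defaultdict

# Each record pairs an account with a domain it linked to. All values here
# are invented placeholders; in practice they would come from a spreadsheet
# or tool export built up during ongoing monitoring.
shares = [
    ("account_a", "suspicious-site.example"),
    ("account_b", "suspicious-site.example"),
    ("account_a", "other-site.example"),
    ("account_b", "other-site.example"),
    ("account_c", "unrelated.example"),
]

accounts_by_domain = defaultdict(set)
domains_by_account = defaultdict(set)
for account, domain in shares:
    accounts_by_domain[domain].add(account)
    domains_by_account[account].add(domain)

# Pivot step 1: start from one seed domain and find the accounts pushing it.
seed = "suspicious-site.example"
amplifiers = accounts_by_domain[seed]
print("Accounts sharing the seed domain:", sorted(amplifiers))

# Pivot step 2: look for other domains those same accounts share. Overlap is
# a lead pointing toward possible coordination, not proof of it on its own.
common = set.intersection(*(domains_by_account[a] for a in amplifiers))
print("Domains shared by all of those accounts:", sorted(common))
```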
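And a minimal sketch of the “keep and organize” habit, assuming a plain timestamped CSV is enough for your purposes. The file name, column names and example values are hypothetical; a shared spreadsheet or a tool such as Hunchly serves the same role.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_PATH = Path("collection_log.csv")  # hypothetical file name
FIELDS = ["collected_at", "platform", "account", "url", "notes"]

def log_item(platform: str, account: str, url: str, notes: str = "") -> None:
    """Append one collected item to the CSV log with a UTC timestamp."""
    write_header = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()  # write the column names only on first use
        writer.writerow({
            "collected_at": datetime.now(timezone.utc).isoformat(),
            "platform": platform,
            "account": account,
            "url": url,
            "notes": notes,
        })

# Example usage with placeholder values:
log_item("twitter", "@example_account", "https://example.com/post/123",
         "claims to show an official campaign video")
```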
Finally, a note on the two handbooks that preceded this edition. This work builds on the foundations of the first edition of the Verification Handbook and the Verification Handbook for Investigative Reporting. Each offers fundamental skills for monitoring social media, verifying images, video and social media accounts, and using search engines to identify people, companies and other entities.
Many of the chapters and case studies in this handbook are written with the assumption that readers possess the basic knowledge laid out in these previous publications, particularly the first handbook. If you are struggling to follow along, I encourage you to start with the first handbook.
Now, let’s get to work.