On January 6, some of us watched the storming of the Capitol with horror and surprise. Others of us watched with horror and resignation—an awful feeling that this was an inevitable outcome of the past four years of increasing right-wing extremism and surging disinformation incited by President Donald Trump.
In a BuzzFeed article titled “In 2020, Disinformation Broke the US,” reporter Jane Lytvynenko recapped a perfect storm of disinformation that led to this point. Conspiracy theories around a “plandemic” shadowed scientific research about the novel coronavirus and how it spread across the globe. Racist lies about antifa-led violence marred the beauty of mass global protests for Black lives. Trump and his GOP loyalists’ attacks on the integrity of the elections eclipsed conversation on record voter turnout. Together, these streams of disinformation have undermined trust in public-serving institutions and even our democracy as a whole.
Disinformation like this has been effective in part because it preys on the raw emotion of fear. In moments of heightened uncertainty, disinformation offers easy scapegoats and appeals to a primal “us versus them” mentality. Disinformation also depends on old and often racialized narratives to gain traction in people’s minds and in the public debate. For example, false claims of voter fraud piggyback off of old narratives about government corruption and Black and brown criminality. “Plandemic” disinformation relies upon anti-Asian and anti-communist narratives. Because of this, we need to combat disinformation not only at the level of social media posts, news articles, and communications platforms, but also at the broader level of narrative strategy.
Over the past four years, we have seen facts take a beating from a number of abusers. But of course, disinformation did not begin in 2020, or even in 2016. As Steven Poole writes in the Guardian, there has never been “a golden age of perfect transparency.” Misinformation, disinformation, propaganda, and hoaxes can be traced back to ancient Rome.
Today, we are in a particularly evolved (or devolved) era where we’re plagued with what Claire Wardle of First Draft calls “Information Disorder.” The US as a nation has contributed a great deal to this global state of affairs. It has also contributed much of the technology through which disinformation propagates while remaining relatively buffered from its effects, until now. Renowned Filipino journalist Maria Ressa has characterized the current American confrontation with Orwellian disinformation as blowback, saying “Silicon Valley’s sins have come home to roost.”
The disinformation we’re facing today is no less than a technology-assisted form of soft power and social control. Dr. Joan Donovan, research director of the Shorenstein Center at Harvard, defines disinformation as “the creation and distribution of intentionally false information for political ends.” Bad actors seed false information online by manipulating algorithms and relying on unwitting actors to spread it, creating cascades and echo chambers where the misinformation is reinforced. The result is a spectrum of harmful impacts, from general confusion to vaccine rejection to the radicalization of white nationalists. All the while, harmful narratives of scarcity, competition, and survival of the fittest become more deeply entrenched.
There’s an old adage that says, “A lie can travel around the world while the truth is putting on its shoes,” and these days, those analog lies have the power of billions of bots and digitally connected humans behind them. The pandemic has put more and more people across the globe online for more hours in the day and has limited our access to trusted community sources of information that relied on in-person connections, such as church gatherings and neighborhood meetings. Disinformation now travels at the speed of the internet and has been shown to spread faster than the truth. In this context, disinformation is becoming more effective at generating chaos and seeding doubt in reality.
But we can fight back. And as mission-driven institutions committed to uplifting unifying values, the nonprofit sector has an important role to play.
To that end, we offer six action steps for nonprofits to combat disinformation, defend democracy, and build narrative power for progressive change:
1. Train staff and stakeholders in disinformation literacy.
Much like a virus, disinformation can only spread through susceptible hosts. We can help our staff and stakeholders inoculate themselves and their communities by training them to recognize misinformation and disinformation, and to resist the urge to share it.
There is a wealth of existing tools for nonprofits to draw on to build disinformation literacy in our organizations. Donovan and her colleagues created the excellent Media Manipulation Casebook with examples of disinformation campaigns and how they have spread. ReFrame and PEN America created a Disinfo Defense Toolkit with election-specific as well as general tools for building disinformation literacy.
In Minnesota, ISAIAH Communications Director JaNaé Bates says they first and foremost train staff and members to use their own “Spidey senses” and deeply held values to detect disinformation designed to harm their communities. Specifically, they use the Race/Class Narrative curriculum to train organizers, influencers, and member leaders to help them recognize and respond to racist dog-whistles.
Bates also started a disinformation alert newsletter with Faith in Minnesota and statewide partners. The newsletter, Repugnant, features a pug dog who calls out disinformation and racially coded dog whistles. One of the issues was titled “Don’t use the F word”; it advised readers to avoid repeating the word “fraud” at all costs when talking about voting—even when trying to debunk claims of voter fraud. This is because repetition of words like “fraud” directly contributes to disinformation around voter fraud, both by increasing the volume of conversation around fraud, and by reinforcing the cognitive frame of fraud.
This is one key mechanism by which disinformation spreads—through humans more than bots, and sometimes these humans are actually trying to debunk the disinformation by sharing it. If nothing else, nonprofits must train our stakeholders to not feed disinformation to the algorithms, and to share vetted and engaging stories that advance our larger narratives instead.
2. Listen for misinformation in your communities.
Oftentimes, full-blown disinformation streams begin as murmurs within our own communities. Nonprofits can add listening for misinformation to the feedback and communication loops they already have with the communities they serve. ReFrame has created a START [Strategic Threat Analysis and Response] tool to help nonprofits document this process.
For example, organizers with Florida for All created a Slack channel that allows volunteers to record misinformation they hear from community members they call and text. Other methods include creating a misinformation tip form on your website or putting out a call for direct messages about misinformation through your social media accounts. If you have a communications person or team, they might devote a half hour every day to scanning social media channels for misinformation shared by followers and allies.
3. Integrate real-time narrative research into your program work.
Since disinformation can go from low chatter to trending topic in an internet minute, it’s critical for nonprofits to have access to real-time research on these trends. To this end, nonprofits can develop partnerships with institutions that conduct research on how conversations spread. This research can help keep your organizational communications from amplifying brewing disinformation and can indicate areas of political education or training necessary to inoculate stakeholders against new trends. This research can also inform new areas of work like the platform accountability campaigns run by MediaJustice and Kairos and the disinformation-specific program work of The Leadership Conference and United We Dream.
Potential partnerships abound: Research institutions like First Draft News specialize in daily and weekly research on disinformation trends. The Shorenstein Center conducts research on how disinformation spreads through various corners of the internet and among various types of actors. ReFrame and its sister c4, This Is Signals, conduct research on narrative weather trends that include disinformation as well as trends in broader stories and conversation.
ReFrame and This Is Signals’ approach, adapted from Upwell, combines machine intelligence with human intelligence to monitor the “narrative weather” and to track conversations over time. The tools used for machine intelligence scrape data from different platforms (YouTube, Twitter, Reddit, news sites, etc.) to yield broad trends such as spikes in conversation on topics like “police” or “socialism.” Then, researchers apply human intelligence to home in on the content of these conversations among specific audiences (for example: what Black elders 65–80 years old were saying about police after George Floyd was murdered, or what Venezuelans on the right versus the left were saying about socialism in the month before the presidential election). Taken together, these methods allow researchers to aggregate what people are saying and where they are saying it to identify what is resonating and what isn’t with different audiences in moments across time.
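To make the “spike detection” half of this approach concrete, here is a minimal sketch in Python of how a monitoring tool might flag days when mention volume for a topic jumps well above its recent baseline. This is an illustration only, not ReFrame or This Is Signals’ actual tooling: the function name, parameters, and input format are all hypothetical, and a real pipeline would first aggregate daily counts from platform data before a step like this.

```python
from statistics import mean, stdev

def find_spikes(daily_counts, window=7, threshold=3.0):
    """Flag days where mention volume jumps far above the recent baseline.

    daily_counts: list of (date_label, count) tuples in chronological order.
    window: number of prior days that form the rolling baseline.
    threshold: how many standard deviations above baseline counts as a spike.
    """
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = [count for _, count in daily_counts[i - window:i]]
        mu, sigma = mean(baseline), stdev(baseline)
        label, count = daily_counts[i]
        # Floor sigma at 1.0 so a perfectly flat baseline doesn't divide-by-zero
        # the logic or flag trivial fluctuations.
        if count > mu + threshold * max(sigma, 1.0):
            spikes.append(label)
    return spikes

# A week of steady chatter, then a sudden surge on day 8:
counts = [(f"day{d}", 100) for d in range(1, 8)] + [("day8", 500)]
print(find_spikes(counts))  # -> ['day8']
```

The flagged days are where human researchers would then step in, reading the underlying posts to understand who is talking, what narratives are moving, and whether the surge is organic or manipulated.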
Groups in Florida partnered with ReFrame and This Is Signals during the election to apply this research. Natalia Jaramillo and Jonathan Alingu of Florida for All both identified the pairing of narrative research and constituent-based communications as best practices.
“Yes, let’s have our content banks and messaging guides,” says Alingu. “And we need the ingredients to adapt and tailor messages in real time to different constituencies.”
“We tried to feed the research into spokesperson prep and media appearances,” says Jaramillo. “We have to invest in infrastructure that allows us to be more spot on and respond to the emerging conversations, and that doesn’t treat communities as a monolith.”
4. Tell stories that engage feelings.
We also need to up the emotional content of our storytelling. While we can’t just fight disinformation with content, no matter how constituency-specific it may be, we can make sure that the content we do create has more impact.
Disinformation travels faster than factual information in part because of sensationalism, which activates people to share out of deep emotional impulses like fear and excitement. Disinformation streams give new emotional urgency to old narratives and thrive in voids of clear, factual, and equally emotional information. Therefore, our content must engage feelings, but rather than prey on fear, our content can focus on movement-building emotions like joy, rage, humor, and pride. We can do this without giving in to sensationalism because there is so much authentic emotion in our work. Examples include the Movement for Black Lives’ GOTV content and victory video.
The secret lies in not being afraid to focus on individual characters and relationships that represent larger communities and issues. In work with the Disinfo Defense League, Donovan has used the example of sharing accurate information about voting by explaining how your grandmother is going to vote, rather than just sharing the dry facts.
5. Fill the voids, and plan ahead to prevent the spread.
Here is how Alingu is thinking about future integration with nonprofits in Florida:
We need to incorporate disinformation research into opposition planning and support members in critical thinking. We also need to look at information voids and make sure we’re communicating with people to fill those voids, because otherwise what fills those voids is disinformation.
We know that once disinformation is amplified, it’s difficult to erase its impacts; once the genie is out of the bottle, it’s hard to put it back. So we have to interrupt disinformation as early as possible in the chain of amplification, and provide accurate information to spread in its stead. Accomplishing this requires planning.
Nonprofits can incorporate disinformation defense into various levels of planning to inoculate communities against disinformation for the long term. All it takes is knowing what makes the communities you serve vulnerable, and proactively moving narratives that are both explanatory and values-based to create a foundation of inspired understanding that leaves no room for disinformation to creep in.
For example, in Florida, when Alingu talks about information voids, one of the voids they identified was a lack of information reaching eligible Black voters that both acknowledged historical conditions of voter suppression and offered detailed information to help people overcome these obstacles. What thrived in that void was disinformation about rigged elections that ultimately discouraged some from coming out and voting at all. Alingu says he will apply this lesson to planning for future electoral campaigns and for their upcoming legislative sessions.
6. Collaborate across organizations.
When we asked Donovan about the role of community organizers and nonprofits in combating disinformation, she replied, “While I know the pandemic will end, or at least we will manage it through treatment and vaccines, I do not know how misinformation-at-scale will be slowed without a similar whole-of-society approach.”
One hub in this approach is the Disinfo Defense League. The League was started last year by the Media and Democracy Action Fund to fill a void in the larger disinformation field and to focus specifically on disinformation targeting communities of color heading into the 2020 election. This is an important formation for nonprofits to connect with, contribute to, and learn from.
Organizations interested in collaborating in the fight against disinformation can develop specific partnerships to share research, collaborate on communications, co-create narrative strategies, and train overlapping constituencies. Whether we are focused on slowing the spread of misinformation specifically or on shifting the narrative terrain to make it more hostile to the manipulation of facts, it will take a whole ecosystem response to seed new trust in our institutions and our democracy.
Unchecked disinformation poses an existential threat to our society as a whole. But the same technology that allows for the spread of disinformation also allows for the spread of beauty, connection, and collaborative creation on a scale completely unfathomable to our ancestors.
Similar to responding to pandemics, we cannot rely solely on the efforts of a few good people or a few good organizations to beat back disinformation. Neither can we solely rely on one network of organizations, nor the self-regulation of social media giants. We can take steps to curb the rising influence of disinformation, and we also need to challenge and overturn old narratives that give disinformation a foothold in the public imagination. In their place, we can seed new narratives that reflect the aspirational values of a vibrant multiracial democracy.
Jen Soriano, Co-Founder of ReFrame and MediaJustice, is a writer and nonprofit consultant who has spent twenty years doing cultural and political work to shift narratives toward justice.
Hermelinda Cortés (she/they), Program Director at ReFrame and This Is Signals, is a strategist working at the crossroads of politics, culture, and narrative to build powerful movements towards the liberated world we and future generations deserve.
Joseph Phelan, Co-Founder and Executive Director of ReFrame, is a creative strategist grounded in social movements working towards liberation for all people and the planet.