
The horrific assassination of Minnesota State Representative Melissa Hortman and her husband in June, allegedly perpetrated by a self-styled anti-abortion crusader, was a stark example of rising political violence in the United States, deepening fears among democracy watchdog groups that US civil society itself is buckling under a barrage of threats from the political right.
But the tragedy also represented a kind of feeding frenzy for right-wing disinformation.
Within hours of the Hortmans’ deaths, a right-wing disinformation “machine” had been activated, spreading a combination of hateful rhetoric and outright falsehoods.
As Patrick Coolican, editor in chief of the Minnesota Reformer, wrote:
Just hours after Minnesotans learned that Democratic House leader Melissa Hortman had been assassinated, right-wing influencer Collin Rugg, who has 1.8 million followers on X, posted a “report” that hinted that she’d been killed because of a recent vote on ending undocumented adults’ ability to enroll in MinnesotaCare, a subsidized health insurance for the working poor.
Mike Cernovich, another right-wing influencer who has 1.4 million followers on X, took Rugg’s post and amped it up, but in the “just asking questions” style of many conspiracy theories: “Did Tim Walz have her executed to send a message?”
This was just the beginning. In short order, Elon Musk, a former operative of the Trump administration and CEO of X—a platform that has actively dismantled protections against disinformation and hate speech—took up the call, amplifying X posts that claimed the assassination was the work of the “far left.”
Meanwhile, US Senator Mike Lee (R-Utah) joined in amplifying the baseless claims. As reported by the New York Times:
“This is what happens When Marxists don’t get their way,” Mr. Lee wrote on Sunday on his personal X account, a message accompanied by photographs of the suspect released by law enforcement officials.
An hour later, in a second post showing the suspect, Mr. Lee wrote: “Nightmare on Waltz Street,” in an apparent reference to the Democratic governor of the state, Tim Walz.
The speed with which these baseless claims spread is a function of a broken media and information ecosystem, according to Joan Donovan, founder of the Critical Internet Studies Institute (CISI) and assistant professor of journalism at Boston University.
CISI focuses on understanding when online rhetoric escalates to real-world violence. “We’re mainly concerned with what happens when the wires meet the weeds. That is, how does conversation and rhetoric online heat up to a point where then people are going out in the streets potentially armed and ready for conflict,” Donovan explained in an interview with NPQ.
The institute looks for “instances of network hate, network incitement, and network harassment” and studies how disinformation convinces people to not only adopt extreme positions but sometimes take extreme action based on fictional narratives.
“Breaking news is probably the most important variable here,” Donovan said, explaining that in breaking-news situations, people spout off disinformation for different reasons. “Whether it’s because they want to be first and they want the attention, they might insinuate that they know something more about the story than other people do; depending upon how that person is situated, whether they’re a public figure or a celebrity or a politician, that’s going to matter more because the press will quote those people.”
Following the assassinations in Minnesota, “we saw an incredible increase in accusations that this person was politically motivated and from the left wing,” Donovan said. “People as popular as Elon Musk, the most connected man in the world, pushing this idea that the left is, quote-unquote, murderously violent. And so, in those situations, you can really cause a lot of confusion.”
The consequences, and often the purposes, of such disinformation go beyond merely muddying the waters for low-information consumers of media, Donovan says, noting that disinformation often causes legitimate media to begin covering the disinformation itself as news, obscuring the underlying reality.
“Disinformation…prevents the true information from really rising to the top,” Donovan said, applying the point to the Minnesota case. “Instead of focusing on the issues and on the motives here, we’re focused on trying to understand why some of these top politicians took this as an opportunity to cause more confusion.”
Social Media Deregulation and Political Acceptance
The spread of disinformation in breaking news situations has been exacerbated by the recent rollback of content moderation efforts by social media companies.
“There was a serious effort in 2020 by many of the major platform companies to at least moderate some of the more dangerous and prolific misinformation, especially about the pandemic and then also about the election that was going on,” Donovan noted. “And what’s happened since then, though, is there’s been a very big political pushback from the Republicans who do not want social media companies to be exercising their power to moderate their platforms.”
The real-world consequences of allowing disinformation to spread are serious and, potentially, as in this case, deadly.
“Misinformation can be a very powerful mobilizer,” said Donovan. “It can get people to leave their houses and go fight for something they think is happening.”
Meanwhile, the lack of overt condemnation of political violence from political leaders is fuel on the fire.
“If all political figures are not condemning the violence, there is this tacit acceptance of it,” Donovan warned. She contrasted the response to this event with other incidents of political violence, noting that “we need everyone across the different political aisles to weigh in on this and say that this is not how we should settle differences in the United States.”
The targeting of researchers and the dismantling of disinformation studies has made the problem worse. Donovan contends that she herself was pushed out of Harvard in an effort to stifle her criticism of the social media company Meta, and, she related, “there’s another lab out at Stanford called the Internet Observatory that was shut down as well after a years-long legal battle. So, the field of disinformation became very fractured due to the pressure from the tech companies as well as the way in which Republicans have come to see content moderation as an incredibly important issue for their political campaign.”