In the aftermath of this year’s Hurricane Henri, a tweet posted Aug. 22 featuring a picture of a shark swimming in floodwaters made its rounds on Twitter.

That shark certainly gets around: the same picture has been shared following numerous hurricanes, all the way back to Hurricane Irene in 2011, according to University of Washington professor Kate Starbird. Starbird gave a presentation on how easily misinformation and disinformation spread through social media during the September Initiative on Community Engagement presentation, hosted by the Queen Anne Community Council.

Starbird, an associate professor in the Department of Human Centered Design and Engineering, specializes in how social media and communication technologies are used in crisis events. Her most recent work focuses on the direction and spread of online rumors through misinformation and disinformation in the context of crisis events.

Why do people share information that is false, such as the tweet with the hardest-working shark in the ocean?

“Well, often it’s to get engagements, to get retweets, to get likes, to get attention, to grow their audiences,” Starbird said.

When the same fake photo was shared in 2017 following Hurricane Harvey, the tweet got 150,000 engagements.

Misinformation and disinformation are pervasive in online spaces, Starbird said.

Starbird said when she started her work in 2008 or 2009, misinformation was a small part of what she and her team saw during crisis events. Now, misinformation has become a larger piece of crisis events and the discourse surrounding those events. It has gotten especially bad over the past 18 months, as the country has experienced the long-term crisis of the pandemic.

“We’ve seen it continue to escalate just in terms of the amount of content that’s out there that’s false or misleading, and even in some cases intentionally misleading, including content with the potential to do much more harm than a fake shark in the flood waters,” Starbird said, referencing false claims about COVID treatments and conspiracy theories about the pandemic and the disease’s origins.

She said one of the interesting things about COVID-19 in online spaces is that “misinformation gets more engagement than factual information.” According to a research study released this month, Starbird said, misinformation about COVID gets six times more engagement, which includes liking, commenting on or sharing a post.

“That’s the stuff that kind of makes the whole thing churn,” she said. “And so, a lot of us are addicted to these platforms, and it’s hard to get off of them, and when we do go to them, we’re more likely to engage with, spend more time with and pass along things that aren’t true than things that are true, and that is part of why there’s toxicities there.”

Starbird clarified that misinformation is information that is false, but not necessarily intentionally false, whereas disinformation is false or misleading information “purposefully seeded and/or spread for a specific objective.” One of the problems in identifying disinformation is that it’s often built around a “true or plausible core and then layered with distortions and exaggerations intending to shape how others perceive reality,” Starbird said.

Fact-checking isn’t always a solution either, Starbird said, because in many cases disinformation is resistant to it: the false claims are sent out as part of a campaign using multiple pieces of content.

Another problem with disinformation campaigns is that while they intentionally mislead, many participants are “unwitting agents.”

“They’re unaware that the information they’re spreading is false or of their role in the larger campaign,” Starbird said.

While disinformation is difficult to pinpoint, and many people may accidentally contribute to its spread by engaging with or sharing social media posts, Starbird said disinformation campaigns are dangerous because they are often designed to confuse people rather than convince them. Disinformation also erodes trust in information, in institutions such as media, science and government, and in each other. It destabilizes the common ground that citizens in democratic countries need in order to govern themselves, such as faith that election results are accurate.

Starbird said that, while disinformation is a serious problem and difficult to eliminate altogether, people can take some steps to recognize it and stop spreading it. People should first slow down when filtering through information and think critically about the message and the source before automatically sharing. Starbird also recommends people tune into their emotions before sharing information online.

“Misinformation manipulates us through our emotion,” Starbird said. “So just, when you really feel self-righteous about something, that’s the best time to go double check or even triple check and maybe even wait a couple hours for that to resolve before passing it along.”

Starbird said people should also take responsibility when spreading information, correct themselves when necessary and correct other people with empathy rather than animosity. She also said people should try to be patient with people who disagree with them.

“I’m not going to tell everybody that they have to keep the connections, but I do think if we want to move forward as a country, we have to somehow keep some of the connections to have something to build back on because we’re in a bad place with some of this,” Starbird said.

To listen to the whole presentation, including Starbird’s research on election disinformation, go to https://youtu.be/esalocr66he.