The plague of misinformation about the novel coronavirus — and pretty much any subject that garners public attention these days — can seem overwhelming. But it’s possible to inoculate yourself against being sucked in by falsehoods and attempts to manipulate you, says Mike Caulfield, an expert in digital literacy at Washington State University.


Caulfield has spent several years determining the simplest, most effective ways to teach students to spot bogus information and help shut down its spread. He’s boiled it down to a four-step process that takes less than a minute once you get the hang of it.

It’s called SIFT, which stands for:

1. Stop.

2. Investigate the source.

3. Find better coverage.

4. Trace quotes, claims and media to the original context.

The approach is based on the work of a team at Stanford University that found that even digital-savvy young people are easy to fool online, with many unable to distinguish ads from news stories or trace the sources of information. The researchers were able to improve the students’ skills by teaching them a technique used by professional fact checkers called “lateral reading,” which involves quickly turning to other sources to verify or debunk claims.

“What people need are quick techniques and rules of thumb, so you don’t have to get bogged down in a 30-minute investigation,” Caulfield said.

He created a website, infodemic.blog, that walks through the method with detailed examples.

Step number one — stopping before you share — might be the most important, particularly with content that causes a strong emotional response or taps into your political identity and views. People trying to manipulate others or sow division know the most effective way to spread their message is by invoking fear, anxiety, anger, or outrage, said Kate Starbird, of the University of Washington’s Center for an Informed Public (CIP).


“It’s a good idea to reflect on your own emotional response before you share something,” she said.

Investigating the source, step two, can be as simple as hovering over the Twitter bio of the sender to see if the person is credible or has relevant expertise. If it’s still not clear, another quick step is a Wikipedia search of the web address linked in the tweet or Facebook post. Is it a reputable news organization, a news aggregator that does no fact-checking, or a well-known conspiracy theorist?

It’s also not uncommon for fake accounts to masquerade as legitimate through clever misuse of names. Before it was suspended, the “@Breaking9ll” account tried to mimic “@Breaking911” by substituting lowercase “Ls” for the “1s.” @Breaking911 is a news aggregator that itself sometimes shares false rumors and misleading information during crises, Starbird has found.

Finding better coverage — step three — means checking to see if other sources are confirming, or debunking, the information. One example cited on Caulfield’s website is a tweet from a questionable sender early in the epidemic that claimed money was being disinfected because it might harbor the virus on its surface. Simply highlighting the words “money being disinfected” in the tweet and right-clicking, which can initiate a Google search, showed that several reputable news organizations were reporting that the Chinese government was, in fact, cleaning bank notes.

But another tweet, claiming “eight chopped pieces of garlic and seven cups of water” would cure the virus, was easily proved false when the same type of search showed several warnings from medical experts.

When questions still remain, the final step — tracing to the original context — can usually provide answers. Sometimes a simple date check is all that’s needed to show that the information is being presented in a misleading context or is out of date. A tweet with the headline “New flu ‘unstoppable’ ” circulated in early February and seemed to be solid because Reuters, a reputable news organization, was quoting the World Health Organization, a reputable source. But it turned out to be a story from 2009 about the H1N1 flu.


Another trick used to propagate disinformation is falsely framing a story, which can be easily foiled by clicking through to the original source. Caulfield cites the example of a tweet that said Harvard’s head coronavirus researcher was charged with lying about his ties to the Chinese government. But the story linked in the tweet says nothing about coronavirus. In fact, the charges were related to alleged Chinese attempts to steal U.S. scientific and technological advances.

Jevin West, CIP director and co-creator of the UW’s popular “Calling Bullshit” class, suggests that once you’ve mastered the SIFT method, you might consider adjusting your “information diet” to include only sources you are convinced are reliable and feel comfortable sharing.

“I always say: Share less. Think more,” he said.

When possible, it’s also worthwhile to — gently — correct misinformation shared by your friends and family, he said. But first, be sure you’re right — and stay humble.

“We’re all wrong sometimes,” West said. “If you call BS on yourself, then it makes you a little more careful when you go to question someone else’s argument or question their sources.”
