Six tips for inoculating yourself against disinformation


Katya Vogt



Manipulative content—disinformation, misinformation, clickbait—thrives in today’s digital environment. Like a virus designed to infect as many people as possible, it has evolved to be insidiously effective. In the era of “informational abundance,” social media algorithms, smartphones, and 24-hour news services are designed to exploit weaknesses in our cognitive processes to get us to consume more content, more frequently—regardless of quality. These conditions are ideal for the rise of the disinformation epidemic, resistant to common sense and traditional education methods.

“Inoculating” oneself from this virus is not easy. It requires awareness and effort to take charge of your emotions, mind, and body. But the ability to differentiate between fact and fiction, resist manipulation, and decrease the amount of digital nonsense that impacts your well-being is worth the effort. Exercise the Daily Defense Against Digital Disinformation (4D) practices and you will be on the path to building healthy habits for engaging with information.

1. Be Mindful of Your Information Space: “What’s Your Digital Diet?”

The first step is to recognize your role in today’s information environment. Take stock of how much media and information you engage with and whether it is by choice. How do you compare to an average American, who picks up their phone about 58 times a day, watches 5 hours of TV, and spends 2.5 hours on social media? Ask yourself: Where do you invest your attention, and do you actually want to spend that much time consuming information? Do you trust what you see, and are you influenced by it?


2. Ditch Your Digital Addiction: “Turn Off the Autopilot”

The digital information economy borrows tactics from the gambling industry. Clickbait uses enticing linguistic tricks to get you to click, but never quite delivers the reward you crave, like that “one weird trick” to boost your credit score or reduce belly fat. Mindless scrolling and clicking in anticipation of something that will finally pay off is an addiction. It diverts us from productive activity, and (spoiler alert) there is no “weird trick.” Smartphone notifications stimulate the reward centers of the brain. YouTube serves up recommended videos algorithmically chosen to keep you hooked, which can lead straight to conspiracy theories if you don’t turn off the autopilot. One of the best things you can do to reject manipulative content is to limit your exposure. Turn off autoplay on YouTube and take charge of what you want to watch next, if anything. Remove addictive apps from your phone. Try a search engine like DuckDuckGo that doesn’t rank results based on what it thinks you will engage with. Resist clickbait; there is never a reward at the end that’s worth your valuable time.


3. Take Back Your Brain: “Name It to Tame It”

Notice your own emotions when watching the news or scrolling through your feed and reading the updates. If you catch yourself experiencing strong negative feelings of anger, panic, or despair, especially in combination with a desire to act on them, pause and consider who benefits from your negative reaction. Highly emotional narratives, language, or graphic images often signal an attempt to manipulate. Take a deep breath, name the emotion you are feeling, then decide whether the story is important enough to look up in other sources to get the full picture.


4. Check Your Biases: “Don’t Tell Me What I Want to Hear”

Information and news that confirm our preexisting beliefs—that make us feel smart, right, and good about our choices—are very attractive. When you engage with news and information this way, digital platforms will show you more of the same, restricting the diversity of thought and opinion you see. Your feed will become more one-sided to keep you clicking. Before long, you are stuck in an information bubble of news that confirms what you want to hear. Manage your feed by checking the “Why am I seeing this post?” option on Facebook, which will show you how your data is being used to target what you see. Consider “snoozing” some of the story streams that feed you the same information all the time. Review these settings across all your social media profiles.


5. Consult Multiple Sources: “Verify Then Trust”

Even if you avoid emotionally exploitative content, diversify your intake of information to guard against slickly produced false stories. Deliberate propaganda and disinformation are often well financed and sophisticated, making them harder to spot than common clickbait. When you encounter a source you are not sure can be trusted, practice “lateral reading”: check multiple sources to verify credibility and accuracy. Enter the subject of the story in any search engine and see who else is talking about it. Is it a recent event or a recycled story? Is the author an expert on the subject, and do they cite credible sources and evidence?


6. Be Part of the Solution: “Care Before You Share”

Today, the lines between producers and consumers of information are blurred. What we post, “like,” and “share” can influence how others respond to events, treat minorities, address conflict, or vote. Because of our human need to fit in and be accepted by our peers, we both favor and trust information that comes from our circles and share information we expect our peers will agree with. Sometimes, in this fast-paced world, we do it without verifying first—and contribute to the distribution of bad information. We all need to take responsibility for our own information consumption and for the role we play in what our friends, family, and coworkers consume. Don’t share any story you have not verified to be true, and encourage others in your circle to do the same.
