Media Literacy: An Important Factor in Fighting Misinformation

If the world is going to stop deliberate or unintentional misinformation and its insidious effects, we need to radically expand and accelerate our counterattacks, particularly human-centered solutions focused on improving people’s media and information literacy.


The deliberate or unintentional spread of misinformation, despite capturing widespread public attention, remains as rampant as ever, showing up recently in the form of false claims about COVID-19 vaccines, the Capitol riot, and many other topics. This “infodemic” is polarizing politics, endangering communities, weakening institutions, and leaving people unsure what to believe or whom to trust. It threatens the foundations of democratic governance, social cohesion, national security, and public health.

Misinformation is a long-term problem that demands long-term, sustainable solutions as well as short-term interventions. We’ve seen a number of quicker, technological fixes that improve the social media platforms that supply information. Companies like Facebook and Twitter, for example, have adjusted their algorithms or called out problematic content. We’ve also seen slower, human-centered approaches that make people smarter consumers of the information they demand and access online. Evidence-driven educational programs, for instance, have made people better at discerning the reliability of information sources, distinguishing facts from opinions, resisting emotional manipulation, and being good digital citizens.

It hasn’t been enough. If we’re to stop misinformation and its insidious effects, we need to radically expand and accelerate our counterattacks. It will take all sectors of society: business, nonprofits, advocacy organizations, philanthropists, researchers, governments, and more. We also need to balance our efforts. For too long, too many resources and debates have focused on changing the technology, not educating people. This emphasis on the supply side of the problem without a similar investment in the demand side may be a less effective use of time and energy.

While technology-centered, self-policing solutions—filtering software, artificial intelligence, modified algorithms, and content labeling—can make changes quickly and at scale, they face significant ethical, financial, logistical, and legal constraints.

For one, social media business models thrive on engagement, which incentivizes emotionally charged and freely flowing content. Tech leaders like Facebook’s founder, Mark Zuckerberg, hesitate to take action over concerns about free speech and have tried to avoid political debates until pressed. When they do take action, they face scrutiny for an inconsistent approach. Additionally, research shows that some of the most commonly employed methods for combating misinformation on social media—such as banners that display fact-checks—do little to reduce people’s belief in deliberately misleading news, and some even backfire. And because people often have a deeply held desire to share what they know with others—particularly information that seems threatening or exciting—tech companies can only go so far to regulate content. There is also the challenge of volume. Tech platforms struggle to keep pace with the many forms and producers of disinformation. Stopping them resembles a high-stakes, never-ending game of Whac-A-Mole.

Given these challenges, we need to invest more in human-centered solutions focused on improving people’s media and information literacy. These solutions not only demonstrate a much deeper and longer-lasting impact, but they may also be easier and cheaper to implement than commonly believed.

Research from the RAND Corporation and others shows media and information literacy improves critical thinking, awareness of media bias, and the desire to consume quality news—all of which help beat back misinformation. Even brief exposure to training can improve competencies in media literacy, including a better understanding of news credibility and a more robust ability to evaluate biases. Media literacy has a stronger impact than political knowledge on people’s ability to evaluate the accuracy of political messages, regardless of their political opinions. Digital media literacy training has been shown to reduce the perceived accuracy of false news, and it remains effective when delivered in different ways and by different groups.

Media literacy training has lasting impact. A year and a half after adults went through a program from IREX (a nonprofit where the authors work), they continued to be 25 percent more likely to check multiple news sources and 13 percent more likely to distinguish disinformation from objective reporting. In Jordan and Serbia, participants in IREX’s training also improved their media literacy skills by up to 97 percent.

Media literacy programs can also be affordably and extensively delivered through schools. Finland and Sweden incorporated media literacy into their education systems decades ago with positive results, and Ukraine is beginning to do the same. In Britain, young people who received training in schools showed improved media literacy skills.

Critics may say that media literacy programs and other human-centered solutions are resource-intensive and will not address the problem quickly enough or at sufficient scale. These are real challenges, but the long-term efficacy of such programs is exactly what is needed in the never-ending battle with misinformation. We need to invest more in them while continuing to pursue technology solutions, or we may never create and sustain the accurately informed citizenry that healthy democracies demand.

The effort will require all sectors of society across the globe to collaborate to fully understand and solve the problem. We need nonprofits and advocacy organizations to raise the alarm with the people they serve. We need philanthropists to step up with funding to scale solutions. We need more researchers to provide evidence-based answers about the full scope of the problem and the efficacy of fixes. We need governments to integrate media literacy standards into schools and incentivize training. We need tech companies to do more than tweak their platforms—they need to invest in educating the people who use them, too.

The tools to blunt the power of misinformation are in our hands, but we have to work smarter and faster or risk losing an ever-intensifying fight. Much learning, coalition-building, scaling, and communication remains to be done to “emerge from information bankruptcy.” Solutions are complex but within our reach. And the consequences of inaction are dire: the increasingly severe and invasive destabilization of our societies and daily lives as lies trample the truth.
