HEATED CONVERSATIONS, COLD REALITIES

How Disinformation About Climate Change is Rooted in Conspiracy Theories

NIKI KAMPSEMBALIS

In 2017, about the time the United States announced its withdrawal from the Paris Agreement on climate change mitigation, chatter on social media platforms began heating up.

From chemtrails to flat-earth hypotheses to melting glaciers, climate-change deniers volleyed conspiracy theories back and forth across cyberspace, mounting an increasingly polarized discussion about a topic that has long generated debate among the general public, if not among scientists.

“When you’re doing disinformation work, it’s helpful to look at conspiracy theories. They’re at the extreme end of disinformation, so if you find those, you often find other forms of disinformation.”
— Kathleen M. Carley, professor in ISR and director of the Center for Computational Analysis of Social and Organization Systems

Such significant disagreement discourages public policy that could more effectively address climate change, reasoned Aman Tyagi, then a doctoral student in engineering and public policy who graduated in 2021. But by better understanding the mindset of climate-change deniers, Tyagi believes, scientists and others who recognize climate change as real could design better strategies for communicating with doubters and possibly change their minds.

Together with Kathleen M. Carley, a professor in the Institute for Software Research and director of the Center for Computational Analysis of Social and Organization Systems (CASOS), Tyagi tackled conspiracy theories in a 100-week sojourn through Twitter. They collected 38 million unique tweets for analysis from more than 7 million unique users by searching for the terms “climate change,” “#ActOnClimate” and “#ClimateChange.”

They published the paper “Climate Change Conspiracies on Social Media” as part of the working paper series at the 2021 International Conference on Social Computing, Behavioral-Cultural Modeling & Prediction and Behavior Representation in Modeling and Simulation. A forthcoming paper, “Heated Conversations in a Warming World: Affective Polarization in Online Climate Change Discourse,” uses similar methods to analyze the widening divide between factions.

The idea is to root out disinformation and trace its journey as it connects different, yet related ideologies, Carley said. By understanding how and where these individual threads join together, researchers gain a fuller perspective on how they compose a wider tapestry of belief systems.

“When you’re doing disinformation work, it’s helpful to look at conspiracy theories,” she said. “They’re at the extreme end of disinformation, so if you find those, you often find other forms of disinformation.”

Sorting the authors of those tweets into climate change believers and nonbelievers was a herculean task made feasible through the use of ORA-PRO, a commercial network analytic tool that grew from software developed by CASOS. The tool’s algorithm used hashtags and URL links in the tweets to classify 3.1 million users as disbelievers and 3.9 million as believers.
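The article doesn’t detail ORA-PRO’s classification algorithm, but the idea of labeling users by the stance signals in their hashtags can be sketched as a simple majority vote. The hashtag seed sets and the `classify_user` helper below are hypothetical illustrations, not the study’s actual method:

```python
from collections import Counter

# Hypothetical stance-indicating hashtag sets; the real ORA-PRO
# algorithm and its seed lists are not described in the article.
BELIEVER_TAGS = {"#actonclimate", "#climateaction"}
DENIER_TAGS = {"#climatehoax", "#climatescam"}

def classify_user(tweets):
    """Label a user 'believer' or 'disbeliever' by whichever stance's
    hashtags appear more often in their tweets; None if no signal."""
    score = Counter()
    for text in tweets:
        for token in text.lower().split():
            if token in BELIEVER_TAGS:
                score["believer"] += 1
            elif token in DENIER_TAGS:
                score["disbeliever"] += 1
    if not score:
        return None
    return score.most_common(1)[0][0]
```

In practice a tool like ORA-PRO would combine many more signals (shared URLs, retweet networks) than this hashtag count, but the vote-and-threshold shape is the same.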

Carley and Tyagi then identified links between the tweets and popular conspiracy theories, searching for keywords and terms such as “deep state,” “illuminati,” “flat earth” and “chemtrail.” They also identified bots, or automated accounts that pose as actual people and contribute to the spread of disinformation, using CMU’s Bot-Hunter, another CASOS-developed tool.
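The keyword search described above amounts to scanning each tweet for a fixed list of conspiracy-theory terms. A minimal sketch, using only the four example terms the article names (the study’s full term list was presumably longer):

```python
import re

# Illustrative term list taken from the article's examples; the
# study's actual keyword set is not given in full here.
CONSPIRACY_TERMS = ["deep state", "illuminati", "flat earth", "chemtrail"]
PATTERN = re.compile(
    "|".join(re.escape(term) for term in CONSPIRACY_TERMS),
    re.IGNORECASE,
)

def conspiracy_hits(tweet_text):
    """Return the conspiracy-theory terms found in one tweet,
    lowercased, in order of appearance."""
    return [m.group(0).lower() for m in PATTERN.finditer(tweet_text)]
```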

Tyagi and Carley found that most climate change deniers subscribed to two particular conspiracy theories: the chemtrails theory, which claims that plumes left in the sky by airplanes contain toxic chemicals purposely added to poison the public, and the geoengineering theory, which claims government experiments cause climate change.

“Social media platforms should curtail sharing of disinformation.”
— AMAN TYAGI (ENG 2015, 2018, 2021) AI/NLP Research Data Scientist. Procter & Gamble

The variations on disinformation are a form of modern myth-making, said Carley, in that they represent an attempt to explain phenomena the person doesn’t understand. “They become a way of weaving together a lot of kinds of disinformation; it becomes a way of substantiating conspiracy theories.”

Low-credibility websites posing as news outlets seem to bolster the disinformation by reinforcing false claims, Carley added. In reality, the content comes from marketing firms, or people looking to support an agenda — even from the state-sponsored media of U.S. adversaries.

“You see lots of variety, but there is that common core,” said Carley.

Key takeaways from the research focus on recommendations for combating false information. Tyagi and Carley suggest using the findings to better target messages about climate change, taking into account the existing beliefs and tendencies of nonbelievers.

For example, policymakers should work to debunk chemtrail and geoengineering theories, knowing how closely they align with climate change doubters. But beyond that, social media platforms should curtail sharing of disinformation, said Tyagi.

Platforms appear to be listening. In September 2021, YouTube announced that it would remove videos making false claims about COVID-19 vaccines. Such work is critical in an era when “everyone has their own loudspeaker, so much is intertwined, and people are finding information based on algorithms of social media,” said Tyagi. “There’s no accountability for ‘news’ that has been reported by pseudo-media.”

Compounding the problem, thanks to the increased cost of producing traditionally sourced news, more legitimate sources are drying up, leaving “news deserts” in their wake, said Carley. Filling the gap are “pink slime” sites, which pose as local news but actually originate in other countries. Named after the paste used as a filler in processed meat, pink slime sites tend to be hyperpartisan, often using automated story writing. And while fact-checking sites also have proliferated, some people view them as biased, Carley noted.

The solution, Carley believes, lies in teaching people to think more critically in a digital world, and for companies, universities and schools to foster trust by employing social cybersecurity experts who maintain a “clean air space” of credible information — perhaps including some kind of objective, such as a Good Housekeeping-style seal of approval that identifies a site as legitimate.

“Research on disinformation is very valuable,” Carley added. “It helps give awareness of just how much disinformation is out there and how people are being played. It’s become this way of keeping people from even participating in the democratic process.”