Misinformation about climate change has been around for decades, mostly in the form of climate denialism. Today, climate misinformation is focused on seeding doubt about climate science and the measures that are taken to mitigate climate change. Examples include: suggesting that the consequences of global warming may not be as bad as scientists claim, arguing that climate change policies are bad for the economy or national security, describing clean energy as unreliable, or claiming that no action will be able to halt climate change.
These varying manifestations of climate misinformation all have the same outcome: delaying climate action.
Earlier this year, the IPCC drew attention to the impacts of climate misinformation for the first time:
Vested interests have generated rhetoric and misinformation that undermines climate science and disregards risk and urgency. Resultant public misperception of climate risks and polarized public support for climate action is delaying urgent adaptation planning and implementation.
Social media brings both opportunities and risks to the climate science dialogue. Scientific information related to climate change is accessible to larger populations through social media platforms—including real-life experiences of affected populations. On the other hand, social media can significantly undermine climate science by allowing for the rapid and widespread sharing of misinformation through user-generated content and online advertising. Social media platforms are, moreover, just one part of a broader information ecosystem that also includes news media and professionally created entertainment.
At a point when delays in climate action may lead to catastrophic and irreversible harm, companies must address climate misinformation urgently and decisively.
In 2021, Ford Foundation, the Ariadne Network, and Mozilla Foundation commissioned a research project to explore grantmaking strategies that can address issues at the intersection of environmental justice and digital rights. As part of this project, BSR wrote an issue brief on the role of social media companies in creating, shaping, and maintaining a high-quality climate science information environment.
The brief explores the specific challenges of moderating climate misinformation. We describe some of these challenges below:
Climate Misinformation Is Happening in “Subtler” Ways and Is Increasingly Intersectional.
While outright climate denialism is easy to refute, it is more difficult to identify subtle ways of spreading climate misinformation—such as claims that green policies are too costly. The response to climate change is a topic of political debate, making climate misinformation closely tied to politics, elections, and the larger civic space. These intersectional ties grow the reach of misinformation and extend it into arenas that can be difficult to anticipate.
Existing Content Moderation Frameworks Are Not Sufficient in Addressing Climate Misinformation.
Most of the content moderation principles and frameworks used by social media platforms today were written to address immediate harms related to hate speech, incitement to violence, and other objectionable content. They are less applicable to scientific misinformation, which may be associated with broader, longer-term harms.
Content Removal Alone May Not Be Effective in Fighting Scientific Misinformation.
While removing content is effective against harmful material such as hate speech, scientific misinformation may require different approaches. Rather than relying on removal alone, platforms should also reduce the visibility of misinformation and surface high-quality information to inform users.
Climate Misinformation Is Political and Is Backed by Institutions.
Since the 1980s, climate disinformation campaigns have been largely driven by the fossil fuel industry’s intentional efforts to undermine climate science. Today, climate misinformation can still typically be traced to fossil fuel interests. In addressing climate misinformation, it is important to consider the material incentives of the producers of such content.
In our brief, we make recommendations for social media companies, civil society actors, and funders. These include adding climate misinformation to content policies, applying content moderation frameworks to climate misinformation, strengthening fact-checking capabilities, investing in user resiliency, and increasing scrutiny of advertising by oil and gas companies. We envision a high-quality climate science information environment that supports informed public debate, ambitious business action, and science-based policy making.
Social media companies have made significant commitments to reduce the climate impacts of their businesses (i.e., reducing GHG emissions), but they also have a responsibility to address the potential harms that they may be connected to through climate misinformation on their platforms.
Civil society groups and funders have an essential role to play in holding companies accountable for their actions or omissions in addressing climate misinformation and in keeping this topic on the agenda. Among these actors, it is our observation that environmental groups are less familiar with the practical challenges, complexities, and nuance of misinformation, while the content governance community is less familiar with how climate misinformation can adversely impact our collective efforts to address the climate crisis. These communities would benefit from increased collaboration and knowledge sharing.
Fostering a deeper understanding of this topic across sectors will not only help remove one of the biggest barriers to climate action; it will also broaden our understanding of scientific information and of how human rights may be impacted online.
BSR will continue to work with social media companies, civil society groups, and funders on this topic. Please reach out if you’re interested in connecting with us.