In chapter 4 of Climate of Contempt I summarize the academic literature showing how (1) online platform algorithms mislead and anger us, and (2) political persuaders use that anger to push us toward ideological extremes and increasingly intense partisan animosity. These are the dynamics that drive the centrifugal forces inside the online political hurricane. The first dynamic is mostly passive or unintentional; the second is active and intentional. It is the second that Steve Bannon famously described as a plan to “flood the zone with sh*t.”

In the heat of partisan political battle no one wants to hear that Russian bots or other malicious online agents tried to aid their cause. And so in today’s gatekeeper-free online information environment, people spin the facts of those disinformation efforts, disputing their practical effectiveness and accusing the other side of spreading fake news.

By now it is no secret that a segment of MAGA Republicans admires Vladimir Putin, that Putin probably would welcome a MAGA Republican victory in the 2024 presidential election, and that neither Putin nor Trump favors the energy transition. But Russian disinformation efforts in 2016 were not about any of that. Instead, they were about Russia’s pursuit of a traditional Russian foreign policy objective: namely, destabilizing an adversary.

The axiom that war is “politics by other means” is usually credited to Carl von Clausewitz, though variants of it have also been attributed to Lenin and Henry Kissinger. Whatever its exact provenance, political-economic rivalry in the global north has included the constant threat of war, interrupted only by actual wars, for most of modern history. Only relatively recently has the nuclear age of the Cold War broken that violent pattern by raising the stakes of armed conflict. New international institutions (the UN, the EU) arose concurrently to try to defuse and prevent war. But nations have found other ways to fight.

Vladimir Putin is only the latest in a long line of Russian leaders who see western liberal democracy and the eastward expansion of NATO and the EU as a political and cultural threat, and as an encroachment on what Russia regards as its rightful sphere of influence. Consistent with that view, Russian troll farms did try to influence the 2016 election by inciting racial hatred, angering evangelicals and veterans, and promoting populist candidates on the right (Trump) and the left (Jill Stein and Bernie Sanders).

Those disclosures provoked fights within both parties over whether bots and trolls actually influenced electoral outcomes, fights that continue today. Subsequently, MAGA Republicans adopted a friendlier attitude toward Putin’s Russia, some embracing the Russian narrative on the Ukraine war; Reagan conservatives, for their part, retained their traditional wariness toward Russia. A recent article in the Columbia Journalism Review summarizes much of this post-2016 history.

But dissecting the effects of 2016 disinformation is a less pressing problem than identifying and muting the effects of ongoing, organized, well-disguised, intentional disinformation. Efforts to mislead and anger us continue unabated. A recent BBC investigation looked into the workings of a Moscow-based troll farm run by an American ex-policeman:

Another fake [news story] which went viral earlier this year was more directly aimed at American politics. It was published on a website called The Houston Post – one of dozens of sites with American-sounding names which are in reality run from Moscow – and alleged that the FBI illegally wiretapped Donald Trump’s Florida resort.

It played neatly into Trump’s allegations that the legal system is unfairly stacked against him, that there is a conspiracy to thwart his campaign, and that his opponents are using dirty tricks to undermine him. Mr Trump himself has accused the FBI of snooping on his conversations.

Experts say that the operation is just one part of a much larger ongoing effort, led from Moscow, to spread disinformation during the US election campaign. While no hard evidence has emerged that these particular fake news websites are run by the Russian state, researchers say the scale and sophistication of the operation is broadly similar to previous Kremlin-backed efforts to spread disinformation in the West.

The volume of this sort of information is enormous, though the magnitude of its effects on voting behavior is disputed. During last summer’s elections in the U.K., bot accounts on Twitter were viewed more than 150 million times. And today there are “truth bots” designed to counter disinformation bots.

Chapters 5 and 6 of my book address the myriad ways these dynamics distort the politics of the energy transition. Counteracting their effects is about more than simply ignoring climate disinformation; it is about understanding political disinformation generally, because the energy transition has become (in 2024, at least) a partisan issue. (An analysis from Central European University teases out a variety of Russian disinformation strategies around climate and environmental policy.)

Overconfidence bias makes the task very difficult, and impossible if you are in a FOMO hurry to gather as many headlines and topline conclusions as you can. I have linked resources on this website that address the problem of identifying and counteracting misinformation. The best remedy is to find ways to gather political information from traditional journalistic sources and to discuss it with people offline. But if you can’t do that, consider browsing these sites as a second-best way to become a more savvy learner. – David Spence