Monday, July 17, 2023

The Economist Magazine Discusses The Movie Oppenheimer

Shashank Joshi
Defence editor

The job of defence editor has many perks: flying in a Gripen fighter jet, prowling the Black Sea on an American destroyer and roaming the corridors of the Pentagon. The latest is getting to watch “Oppenheimer” on its release day. On Friday I’ll participate in a panel discussion on nuclear weapons after a special screening of Christopher Nolan’s keenly awaited biopic of the nuclear scientist who played a key role in the creation of the atomic bomb.

To prepare, I have been re-reading my worn copies of Richard Rhodes’s “The Making of the Atomic Bomb” and Kai Bird and Martin Sherwin’s biography “American Prometheus”. Oppenheimer, the director of the Los Alamos laboratory during the second world war, became a prominent campaigner for a ban on nuclear weapons and against the development of a hydrogen bomb.

These themes have obvious contemporary echoes. Just as many nuclear scientists, such as Andrei Sakharov, a Soviet physicist, turned against the bomb, so too are pioneers of artificial intelligence expressing concerns over the safety of the technology they have developed. Both subjects—nukes and AI—raise the question of existential risk and how we measure it. My colleague Arjun Ramani has reported on a fascinating new study which asks why “superforecasters”—those with a track record of accurate predictions about the future—typically express less concern than subject-matter experts over the prospect of an apocalypse caused by nuclear weapons, AI or pathogens. 

Oppenheimer himself sought international controls on nuclear weapons, expressing sympathy with the idea of a global government. “The basic idea of security through international co-operative development has proven its extraordinary and profound vitality,” he wrote in an otherwise gloomy essay in Foreign Affairs in 1948. Leaders in AI have long found inspiration in nuclear analogies and the role of the International Atomic Energy Agency (IAEA). Could, for instance, an AI agency monitor computer-processing power in the same way that the IAEA scrutinises fissile material? 

Yet the analogies are not especially encouraging. North Korea launched another nuclear-capable missile on July 12th. The war in Ukraine has not been good for the nuclear order. In recent weeks, prominent Russian political scientists such as Sergey Karaganov have urged the use of nukes against America (though others have pushed back). Arms control between America and Russia, shaky even before the war, is breaking down ever faster.

And, as Oliver Carroll explained in his excellent dispatch from southern Ukraine, there are also worries over the safety of the Zaporizhia nuclear power plant. It is occupied by Russia but is in the path of a Ukrainian counter-offensive, which is being slowed down by minefields. The plant is more secure in its construction than Chernobyl, but Ukraine is worried that Russia might manufacture a disaster. At the NATO summit last week, I heard Ben Wallace, Britain’s defence secretary, compare the threat to a “dirty bomb”.

Finally, no discussion of “Oppenheimer” would be complete without a mention of “Barbie”, whose release on the same day has unleashed a flurry of “Barbenheimer” memes. The two movies offer a stark choice, as we explore in our Culture section: realism or escapism? But I confess I never thought that the movie about the pink doll would prove more contentious than the one about nuclear weaponry. Vietnam has banned “Barbie” in the (mistaken) belief that a map shown on screen depicts the “nine-dash line”, which demarcates China’s expansive claims in the South China Sea. Republican senators are now involved; this morning even a law professor weighed in on the debate. I stand ready to chair any post-screening panels on this vital subject.

Thank you for reading. We are keen to hear your thoughts on “Barbenheimer” and any other feedback. You can reach us at: thewarroom@economist.com.
