Doomsday Clock remains at 90 seconds to midnight, scientists warn
23 January 2024
(The Hill) — The world remains 90 seconds from global catastrophe on the Doomsday Clock, a dire warning that is unchanged from last year, according to the annual update from the Bulletin of the Atomic Scientists.
Global instability is being driven by Russia’s nearly two-year war in Ukraine, Israel’s war against Hamas following the Oct. 7 attack, proxy battles across the Middle East, stalled arms control talks among nuclear powers, insufficient progress on combating climate change, and the growing risks of artificial intelligence and other emerging technologies.
“The risks of last year continue with unabated ferocity and continue to shape this year,” said Rachel Bronson, the CEO of the Bulletin of the Atomic Scientists.
“Today, we once again set the Doomsday Clock to express a continuing and unprecedented level of risk.”
The Doomsday Clock, created in 1947, serves as an annual warning and urgent call to action by a consortium of scientists to world leaders and society to address threats to humanity.
From left, Bulletin of the Atomic Scientists members Asha George and Herb Lin, science educator Bill Nye, Bulletin president and CEO Rachel Bronson, and Bulletin members Alexander Glaser and Daniel Holz pose with the Doomsday Clock shortly before the Bulletin announced that the clock will remain set to 90 seconds to midnight, Tuesday, Jan. 23, 2024, at the National Press Club Broadcast Center in Washington. (AP Photo/Jacquelyn Martin)
It was created in the aftermath of the U.S. atomic bombings of Japan at the end of World War II, when the fathers of nuclear energy and weapons — Albert Einstein, J. Robert Oppenheimer, and Manhattan Project scientists and engineers — founded the Bulletin of the Atomic Scientists to educate the public and warn about the risks of unchecked nuclear power.
The Doomsday Clock has the potential to move backward — it was at 17 minutes to midnight following the end of the Cold War and commitments to nuclear arms control — but has moved into the seconds over the past few years given a wider range of threats and increased risk of nuclear war.
“I firmly believe that the risks of blundering into a nuclear war, given that we can launch them so quickly, are much, much higher,” said Alex Glaser, associate professor in the Princeton School of Public and International Affairs and the Department of Mechanical and Aerospace Engineering at Princeton University, and a member of the Science and Security Board at the Bulletin of the Atomic Scientists.
On climate change, while 2023 was the hottest year on record, there are positive developments, said Ambuj Sagar, professor of policy studies at the Indian Institute of Technology Delhi and member of the Bulletin of the Atomic Scientists.
“We are moving in the right direction, even if not as fast as one would like,” Sagar said, even as he warned that individual countries’ policies and pledges do not put the world on track to meet the Paris Climate Agreement goal of cutting greenhouse gas emissions enough to keep the global temperature rise below 2 degrees Celsius.
“Renewables are dominating new energy deployment. Last year, there was $1.7 trillion invested in clean energy,” he said, pointing to countries at the COP28 climate summit, held in November and December, that pledged to triple renewable energy capacity and double the rate of energy efficiency improvements by 2030.
“All good news, moving in the right direction,” Sagar said. “The International Energy Agency believes that fossil fuels will peak by 2030, so on balance, as I said earlier, we’re moving in the right direction, but not as fast or as deeply as one would like.”
The potential benefits of new technologies such as artificial intelligence are countered by grave risks, the scientists further warned, including the spread of disinformation and the undermining of efforts on arms control and climate change.
“AI has lots of potential for magnifying corruption in the information environment and making the disinformation problem worse,” said Herb Lin, senior research scholar for cyber policy and security at the Center for International Security and Cooperation and member of the Bulletin of the Atomic Scientists.
“And that’s really bad because the threat multiplier effect means that we’re not going to be able to solve other hard problems like nuclear war and climate change. And so that’s really tough.”
Lin pointed out that while governments have recognized the need to lay down rules governing the responsible use of AI, they are balancing that with their desire to use the technology to gain military, security and economic advantages over other countries.
“Transforming this recognition that there is a need for governance into something real, something actionable, it’s going to be really tough,” he said.