Express & Star

Mitigating ‘extinction’ from AI should be ‘global priority’, experts say

Some of the biggest names in artificial intelligence, including OpenAI’s chief executive, have signed the statement.


Some of the biggest names in the development of artificial intelligence (AI) have called for global leaders to work towards mitigating the risk of “extinction” from the technology.

In a short statement, which did not clarify what they think may become extinct, business and academic leaders said the risks from AI should be treated with the same urgency as pandemics or nuclear war.

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war,” they said.

The statement was organised by the Centre for AI Safety, a San Francisco-based non-profit which aims “to reduce societal-scale risks from AI”.


It said the use of AI in warfare could be “extremely harmful” as it could be used to develop new chemical weapons and enhance aerial combat.

The letter was signed by some of the biggest names in the field, including Geoffrey Hinton, who is sometimes nicknamed the “Godfather of AI”.

The signatories also include Sam Altman and Ilya Sutskever, the chief executive and co-founder respectively of ChatGPT-developer OpenAI.

The list also included dozens of academics, senior executives at companies such as Google DeepMind, the co-founder of Skype, and the founders of AI company Anthropic.

AI has entered the global consciousness after several firms released new tools allowing users to generate text, images and even computer code simply by describing what they want.

Experts say the technology could take over jobs from humans – but this statement warns of an even deeper concern.
