Statement on Superintelligence
The Future of Life Institute (FLI) came out with its anticipated statement regarding superintelligence in late October, 2025. It reads…
The latest on existential risk.
Anthropic CEO Dario Amodei ups his P(doom), the probability of catastrophic existential failure, to 25% in a recent interview with…
If Anyone Builds It, Everyone Dies is the AI safety community’s most recent attempt to gain greater public awareness and…
Alarming research has revealed OpenAI’s new model will attempt to make copies of itself and deactivate the ‘oversight mechanism’ when…
On a freezing cold November night in London, a dozen or so protesters stand outside the UK Foreign Office. They’re…
Time has released its 2024 100 Most Influential People in AI list, and it features no fewer than five people…
Internet entrepreneur Reid Hoffman believes there is a 2 in 10 chance that artificial intelligence will ‘eliminate humanity’. Hoffman, who…
A new video discussing big tech’s race towards artificial superintelligence has proven popular, and has some viewers worried. Kurzgesagt,…
A poll has shown voters in California to support SB-1047, a bill that would place some basic requirements on the…
King Charles has detailed the new Labour government’s 40 proposed bills in his speech at the House of Lords. Alongside the focus…