“I mean, sometimes you get these like, late civilization vibes,” said Elon Musk, Tesla's chief executive officer, at a recent event for the Cybertruck, his piece of absurdist automotive art. “The apocalypse could come along at any moment. And here at Tesla, we have the finest in apocalypse technology.” There's a lot of this end-of-days talk around right now. Even before the Covid-19 pandemic, there were stories about Silicon Valley billionaires prepping for Armageddon by purchasing bunkers in New Zealand. But this year I've been hearing and reading more and more of it, especially linked to artificial intelligence.
I find it more fascinating than troubling, because I see eschatological obsessions as social phenomena, not rational analyses of where we're headed. Still, such thinking can be dangerous when exploited by political opportunists. At the very least, it's a wasteful distraction from addressable problems right in front of us. The real question we should be asking is why cataclysmic prophets sometimes attract large followings. Understanding this can help us avoid the paths they may lead us down.
The big doomsday theme this year has been the existential risk posed by rapidly evolving AI technology. In 2023, everyone seemed to be experimenting with ChatGPT and other sophisticated large language models, feeding anxiety not only about how these tools might destroy jobs, but also about how AI was inching toward sentience and might someday kill us all.
Many of the venture capitalists and engineers behind this technology are adherents of the effective altruism movement and overlapping philosophies concerned about the future of humanity. They're not all apocalyptically inclined, but many are: One famous effective