What's the most important century in human history?
Some might argue it's a period of extensive military campaigning, like Alexander the Great's in the 300s BCE, which reshaped political and cultural borders. Others might cite the emergence of a major religion, such as Islam in the 7th century, which codified and spread values across such borders. Or perhaps it's the Industrial Revolution of the 1700s, which transformed global commerce and redefined humanity's relationship with labor. Whatever the answer, it seems like any century vying for that top spot is a moment of great change, when the actions of our ancestors shifted humanity's trajectory for centuries to come.
So if this is our metric, is it possible that right now, this century, is the most important one yet?

The 21st century has already proven to be a period of rapid technological growth. Phones and computers have accelerated the pace of life, and we're likely on the cusp of developing new transformative technologies, like advanced artificial intelligence, that could entirely change the way people live. Meanwhile, many technologies we already have contribute to humanity's unprecedented levels of existential risk. That's the risk of our species going extinct, or experiencing some kind of disaster that permanently limits humanity's ability to grow and thrive. The invention of the atomic bomb marked a major rise in existential risk, and since then we've only increased the odds against us. It's profoundly difficult to estimate the odds of an existential collapse occurring this century. Very rough guesses put the risk of existential catastrophe due to nuclear winter and climate change at around 0.1%, with the odds of a pandemic causing the same kind of collapse at a frightening 3%. Given that any of these disasters could mean the end of life as we know it, these aren't exactly small figures. And it's possible this century could see the rise of new technologies that introduce more existential risks.

AI experts have a wide range of estimates regarding when artificial general intelligence will emerge, but according to some surveys, many believe it could happen this century. Currently, we have relatively narrow forms of artificial intelligence, which are designed to do specific tasks like playing chess or recognizing faces. Even narrow AIs that do creative work are limited to their singular specialty. But artificial general intelligence, or AGI, would be able to adapt to and perform any number of tasks, quickly outpacing its human counterparts. There are a huge variety of guesses about what AGI could look like, and what it would mean for humanity to share the Earth with another sentient entity. AGIs might help us achieve our goals, they might regard us as inconsequential, or they might see us as an obstacle to swiftly remove. So in terms of existential risk, it's imperative that the values of this new technology align with our own. This is an incredibly difficult philosophical and engineering challenge that will require a lot of delicate, thoughtful work. Yet even if we succeed, AGI could still lead to another complicated outcome. Let's imagine an AGI emerges with deep respect for human life and a desire to solve all of humanity's troubles.
But to avoid becoming misaligned, it's been developed to be incredibly rigid about its beliefs. If these machines became the dominant power on Earth, their strict values might become hegemonic, locking humanity into one ideology that would be incredibly resistant to change. History has taught us that no matter how enlightened a civilization thinks it is, it is rarely up to the moral standards of later generations. This kind of value lock-in could permanently distort or constrain humanity's moral growth. There's a ton of uncertainty around AGI, and it's profoundly difficult to predict how these existential risks will play out over the next century. It's also possible that new, more pressing concerns might render these risks moot. But even if we can't definitively say that ours is the most important century, it still seems like the decisions we make might have a major impact on humanity's future. So maybe we should all live like the future depends on us, because actually, it just might.