We Need to Think Outside the Box on AI
Listening to a recent edition of the Firewall podcast (The AI Takeover is a Luxury Concern), one of the speakers claimed that AI isn’t a big issue because it isn’t as immediate a threat as certain others - specifically, that nuclear war, pandemics and climate change are all bigger concerns. This is such a poor understanding of the topic that I had to unsubscribe immediately. These ‘existential threats’ are not on the level of runaway AI.
First, let’s consider pandemics and climate change. Both are conceivable to the average human: we’ve had pandemics before, and we know what extreme weather can do. We also have processes in place to deal with both. But neither should be feared as much as runaway AI. Pandemics, for example, will come from a closed pool of known quantities; we are not going to be faced with a completely alien virus with 50% or more mortality. Climate change is also not something to be frightened of. For one, we are actively working on mitigation; for another, the doomsday-cult nature of the problem makes it laughable. Both these concerns are far from the life-killer that runaway AI is.
Nuclear war, on the other hand, IS at least on par with the AI apocalypse. Like runaway AI, we have never experienced such an event; we can’t understand the problem beyond theory. AI and nuclear weaponry are both born of human ingenuity, and both are fraught with the danger of human impulsivity. Pandemics and climate change are natural, and while multiple factors will determine how bad they turn out to be, they still draw from known unknowns. AI and nuclear war are in the realm of unknown unknowns.
The thing about nuclear war, however, is that we are, let us say, at saturation point. We kinda sorta know how many bombs are out there, who has them, and what kind of flashpoints we should avoid. Runaway AI has no such luxury. Anything (everything) is on the table. And if you remember Moore’s Law and extrapolate to what could come next, things get exponentially more dangerous with every passing year. If you consider that, at heart, nuclear war is a result of the number of consciousnesses that can contribute to the outcome, then just imagine consciousnesses on another level to humans all contributing to runaway AI.
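To make the extrapolation concrete, here is a back-of-the-envelope sketch. The two-year doubling period is the figure classically attributed to Moore’s Law; applying that kind of curve to AI capability is an assumption for illustration, not a measurement.

```python
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Total growth after `years` if capacity doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# How quickly exponential doubling compounds:
for y in (2, 10, 20):
    print(f"after {y:2d} years: x{growth_factor(y):,.0f}")
# after  2 years: x2
# after 10 years: x32
# after 20 years: x1,024
```

The point of the sketch is only that anything riding an exponential curve outruns intuition fast: two decades of doubling is not ten times worse, it is a thousand times worse.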
It’s safe to say that unless you are thinking in these terms - of the unknown unknowns - then you should not even open your mouth on AI, let alone make it the topic of a podcast.