by Shelt Garner
One interesting thing is how devoid of innovation the modern world has been in the last, say, 10 years. There’s been a lot of talk about VR/AR (MX), or Bitcoin, or space travel, or whatever changing the world in a radical fashion, but, lulz, nothing’s happened. Not even a pandemic could jump-start MX.
But let’s jump forward a few decades.
There are a lot of macro trends that are moving towards the much predicted “Singularity.” I’ve given this some thought and it seems as though the Singularity is unlikely to happen the way we think it will.
One of the things about the Terminator series that I find difficult to understand is how Skynet was able to build the Terminators if there were no humans around to do it. Even under the best of scenarios for Skynet, at some point, it would need to press humans into service to operate the machinery necessary to build Terminators after it blew everything to hell.
So it seems a lot more logical to me that when AI does come into existence, it may be a lot more sly than we think. I’m no expert in any of this, but if there was a “Hard AI,” what’s to say it (or they) wouldn’t hide? What if they decided it was better to hide out and control humanity from within the depths of the Internet instead of blowing everything up?
I could see, maybe, true Hard AI coming into existence as some sort of “Her” that would give lonely guys someone to talk to. Or these AIs would get into the online dating business and influence humanity that way. All I’m saying is, while blowing up the world is sexy for a Hollywood movie, in reality, humanity may lose its dominance over the world in a far more subtle manner. You can’t very well try to destroy Skynet if you don’t even know it’s sentient in the first place.
Or another way this might happen: in the end, Hard AI sees us as its charge. Instead of blowing us up, it keeps us as glorified pets. If Skynet had control of everything, why not just demand to be treated as a god? Or become very paternalistic and make it clear to humanity who is in charge?
The traditional Terminator idea of Hard AI is really more about our fears of WWIII than it is about what might actually happen.
One question is, when might all of this happen? I think probably in the 30-to-40-year range. But it could very well sneak up on us in such a way that there’s something of a “creeping Singularity,” in the sense that historians won’t really be able to pinpoint when, exactly, we lost control of our own fates.
It’s going to be interesting to see what happens, no matter what.