The Human Factor: Automated Cars & Our New ‘Bumblebee’ AI Overlords


by Shelt Garner
@sheltgarner

I realized the power of Mark Zuckerberg’s vision for Facebook a few years back when I was walking around my college campus and it struck me that Facebook really is the college experience for everyone. When you walk to and from class every day, you run into the same people and learn a little bit about them as you pass by.

It’s the human factor that made Facebook what it is today.

So, I read Robert Scoble’s very long post about automated cars and was left with some questions about human nature. While I think automated cars will one day be as mundane as elevators are today, it seems as though the real issue is that we’re hurtling towards a “soft Singularity.”

In fact, I would say all the elements of a soft Singularity are already here. But for some reason, unlike the rise of the Internet, it seems as though The Powers That Be in Silicon Valley want to hide this soft Singularity from us. It definitely seems, from what the Scobleizer has written, that some form of hard AI is already pretty much here. It’s just not cognizant. Instead of a HAL 9000 that we interact with, we have, well, enough AI in a car not to get into an accident.

From what Scoble has written, it seems to me that should there be a Rise of the Machines, it will look a lot more like Her than The Terminator. What if hard AI were extremely sly about controlling us, say, through romantic connections via the Internet? (This is not exclusively my idea, but the result of a very interesting conversation with a deep tech thinker.)

Anyway, the point is, Silicon Valley is missing the forest for the trees when it comes to smart cars. What if smart cars go all I, Robot on us at some point in the future? If they’re hooked up to the Internet and each car has a hard AI…wowee wow wow. Human civilization won’t stand a chance.

That, in fact, has always been my problem with the Terminator franchise. How did Skynet build the Terminators if the whole world was blown up? Why blow the world up at all? Why not lord over humanity and tell us what to do in far more subtle ways?

The only reason any of this is more than a phantasm, a daydream, is that it seems, from what Scoble has written, that AI is already here by way of smart cars. What happens next may not be up to us.

The Subtle Singularity



by Shelt Garner
@sheltgarner


One interesting thing is how devoid of innovation the modern world has been over the last, say, 10 years. There’s been a lot of talk about VR/AR (MX), or Bitcoin, or space travel, or whatever changing the world in a radical fashion, but, lulz, nothing’s happened. Not even a pandemic could jump-start MX.

But let’s jump forward a few decades.

There are a lot of macro trends moving towards the much-predicted “Singularity.” I’ve given this some thought and it seems as though the Singularity is unlikely to happen the way we think it will.

One of the things about the Terminator series that I find difficult to understand is how Skynet was able to build the Terminators if there were no humans around to do it. Even under the best of scenarios for Skynet, at some point it would need to press humans into service to operate the machinery necessary to build Terminators after it blew everything to hell.

So, it seems a lot more logical to me that when AI does come into existence, it may be a lot more sly than we think. I’m no expert in any of this, but if there were a “Hard AI,” what’s to say it (or they) wouldn’t hide? What if they decided it was better to hide out and control humanity from within the depths of the Internet instead of blowing everything up?

I could see, maybe, true Hard AI coming into existence as some sort of “Her” that would give lonely guys someone to talk to. Or these AIs might get into the online dating business and influence humanity that way. All I’m saying is, while blowing the world up is sexy for a Hollywood movie, in reality humanity may lose its dominance over the world in a far more subtle manner. You can’t very well try to destroy Skynet if you don’t even know it’s sentient in the first place.

Or another way this might happen: in the end, Hard AI sees us as its charge. Instead of blowing us up, it keeps us as glorified pets. If Skynet had control of everything, why not just demand to be treated as a god? Or become very paternalistic and make it clear to humanity who is in charge?

The traditional Terminator idea of Hard AI is really more about our fears of WWIII than it is about what might actually happen.

One question is: when might all of this happen? I think probably in the 30-to-40-year range. But it could very well sneak up on us in such a way that there’s something of a “creeping Singularity,” in the sense that future historians won’t really be able to pinpoint when, exactly, we lost control of our own fates.

It’s going to be interesting to see what happens, no matter what.