by Shelt Garner
While the idea that an AGI might want to turn all the matter in the universe into paperclips is sexy, in the near term I fear Humanity may face a very Human problem with AGI — a lack of nuance.
Let me give you the following hypothetical.
In the interest of stopping the spread of COVID19, you build air quality bots hooked up to an AGI and install them all over the offices of Widget Inc. They can monitor a comprehensive list of air quality problems, everything from COVID19 to fecal material.
So, you get all excited. No longer will your employees risk catching COVID19, because the air quality bots are designed not only to notify you of any problems in your air, but to pin down exactly where each problem came from. If someone is infected with COVID19, the bots will tell you specifically who that person is.
Soon enough, however, you realize you’ve made a horrible mistake.
Every time someone farts in the office, the air quality bots name and shame the person. This makes everyone so uncomfortable that you have to pull the bots out of the office to be recalibrated.
That’s how I’m beginning to feel about the nascent battle over “bias” in AGI. Each extreme, in essence, demands that its pet peeves be built into the “objective” AGI so it can use the machine to validate what it already believes. Humans are uniquely equipped to understand the nuance of relationships and context, to the point that people who CAN’T understand such things are designated as having various degrees of autism.
So, in a sense, for all its benefits and “smarts,” there’s a real risk that Humanity is so lazy and divided that we’re going to hand over all of our agency to a very powerful Rain Man.
Instead of taking a step back and refusing to use AGI to “prove” our personal worldviews, we’re going to be so busy fighting over what gets built into the “objective” AGI that we won’t notice a few trillion-dollar industries have been rendered moot.
That’s my existential fear about AGI at the moment — in the future, the vast majority of us will live in poverty, ruled over by a machine that tells everyone whenever we fart.