I completely agree. We are nowhere near what we fantasize about as true AI; it is all just machine learning at this point. It will be a long time before we get Rosie Jetson being sarcastic while she cleans our house.

That said, I'm one of the hard-liners on the definition of AI, and I currently work with machine learning. I still argue that "AI" does not exist, and that the "AI" referred to in, say, The Matrix, represents a paradigm shift that hasn't happened and isn't even close to happening. But it probably will some day.
I don't think there is any stopping technology. We are going to push things as far as we can; research and experimentation are in our nature. But just because something exists does not mean we should use it.
Kind of like what was said in the article, we are already moving toward too much dependency on machine learning. It is hard not to make the leap from the facial recognition some police departments are currently testing to spot criminals to the precrime division in Minority Report. I would rather head that off now than let it get that far and then try to close the gate after the horse has bolted.
A little closer to reality than AI controlling nukes would be the military using swarm bots. DARPA was talking about this while I was in college in the early 2000s, which means they had been working on it even longer. Is one person controlling a single drone while AI controls the other drones, armed with missiles and guns, too far? How much autonomy do we give those AIs to make decisions on a battlefield? Do we make sure they complete the mission even if they lose the single pilot or lose comms with home base?

That said, we can do a few practical things, like not giving AI control over nukes, so we'll at least hopefully not have a Judgement Day.