I just watched a 2-minute video post on LinkedIn from an acquaintance, Jason English, who has posted a number of videos with Ameca, a humanoid robot.
In some of them he tries to trick or confuse "her" to see how conversationally agile she is. In one, he "fakes" taking a selfie without actually holding a phone in his hand. In another, he asks whether she would engage with wild lions in Kruger National Park, and why not.
Below are my immediate thoughts:
It left me with kind of an eerie feeling, like I would not want to piss her off. It's weird to think about how powerful these things may become in the future, and whether they might "learn" emotional retribution and make decisions based on it.
In fact, if they learn from humans, it seems almost inevitable that they will acquire a range of emotions and, by replicating tense conversations, come to treat seeking retribution as a matter of logic and normalcy.
Would they be able to "understand" the value of your credit score, for example, and how to "hurt" you by publishing information that would damage it, simply because of an unfriendly conversation?
#artificialintelligence #machinelearning #robots #emotions