The way we train robots may reflect the dark side of human nature
It’s winter. You’re walking through the forest with your robot friend by your side. It slips, but regains its balance. Stumbling over uneven terrain is a central part of its training. As is being violently attacked by an engineer with a hockey stick…
Our robot friends
The field of advanced robotics has come on in leaps and bounds in recent years. Humanoid, bipedal robots are capable of walking on their own and are performing ever more complex tasks. Boston Dynamics’ Atlas robot demonstrated a perfectly executed backflip – an extraordinary feat even by human standards. But a look at a previous Atlas training video provides a worrying glimpse into how we treat robots, and where this could lead.
The motivation for pushing robots to their limits – and for physically pushing them around – is clear. Training and testing a robot in all kinds of ways is integral to its development, especially when that robot is designed to work in difficult real-world conditions. Robots learn by repetitive action, just like humans. If they fall over, they need to be able to get up again. But here’s the catch. Unlike humans, if they’re pushed, the robots won’t complain. You can push them over and over again, for days on end if you so desire, and they’ll simply keep on standing up. Treating an uncomplaining, submissive entity in this way could start to look a lot like abuse. Now that robots can emulate the movements and behaviours of living things, it is difficult to watch a video of a robot being physically attacked without feeling that this is in some way wrong.
Intelligent life deserves protection
Feeling empathy for a robot could be an overreaction. We wouldn’t protest on moral grounds if someone smashed up a computer or a smartphone, so why should a robot be any different? They are machines, after all. Setting aside our tendency to anthropomorphise anything that looks human, wanting to protect robots largely depends on whether or not we consider them to be intelligent. Just as we wouldn’t mistreat another person, an animal, or any other kind of sentient being, if robots are deemed to have intelligence then they deserve our protection. The question is, how are we to judge this?
Back in 2014, claims were made that Eugene Goostman, a chatbot designed to simulate a 13-year-old boy, had passed the Turing test of machine intelligence. Although this was disputed at the time, as technology develops it is becoming more and more difficult to distinguish humans from computers. Robots may not have recognisably human intelligence, but most people now accept that they are intelligent in some way. The problem is that, much like our own brains, the neural networks behind AI work in ways we don’t fully understand. We simply don’t know what is going on in their robot minds.
What are they learning from us?
Even if robots aren’t intelligent in a way that would turn their mistreatment into abuse, the way that we train them could pose further problems. When we push them around, robots that learn by imitation – by copying what we do – are learning that it is acceptable to behave in this way towards others. This could have disastrous consequences in the future, when the autonomy of robots develops and we live alongside them in society. What’s more, even if the intelligence of machines never reaches humanlike levels, the mistreatment of robots by people could spill over into how we treat each other. Consider the well-known argument against violent video games: that they might be affecting our actions in the real world. If we become accustomed to treating human-looking machines badly, we might find this attitude seeping into the way we interact with real living beings. One thing’s for certain: our attitudes towards robots will shape the moral landscape of the future.
Do we have an obligation to treat robots well? Is the development of robots a new kind of slavery? Is our use of technology already affecting the way we behave towards each other? Comment below with your thoughts.