Has the human race come one final step closer to its end? Autonomous weapon systems, also referred to as killer robots, could be the next chapter in the development of weapons, but they could also be its last. They mark the third revolution in warfare, after gunpowder and nuclear bombs. What are they, and why do they pose such a threat to humanity?
It looks like the lethally armed robots we have watched in movies and read about might already be here. Autonomous weapon systems are designed to operate independently, identifying targets through sensor processing rather than human judgement. Excluding humans from the decision-making process, and with them empathy, emotion, morality and ethics, can easily lead to tragic mistakes with unanticipated and unprecedented consequences.
As the saying goes, the road to hell is paved with good intentions. Most, if not all, innovations start with good intentions, but we have already witnessed how some of them, in the wrong hands or triggered by a series of events, end up causing more harm than good. Often we cannot even imagine the impact an innovation will have or the path it will take. This time, the dark side of innovation poses a real threat to our existence. That is why many are concerned about the further development of autonomous weapon systems and AI.
The silver lining of these new technologies is that fewer human soldiers are deployed, and therefore more lives are saved. They not only enable faster and more efficient action, but also missions that would otherwise be too risky, too costly or simply impossible. Fighting from afar would replace face-to-face combat and save the lives of one's own troops, but it would also erode compassion and empathy. It would be as if one were playing a video game, and the decision to go to war would become a no-brainer.
The first obvious reason for concern is the lack of human judgement. Although a machine is more precise and reacts faster than a human, it lacks everything that makes us human and struggles to understand context. Technologies such as voice and facial recognition have already proven to have high failure rates for certain groups, such as women, people of colour and people with disabilities. How would a robot distinguish between a hostile target and a child playing with a toy gun?
Humans make mistakes all the time, but given the scale, speed and scope at which autonomous weapon systems would be deployed, the consequences of their errors could be devastating. Even with safeguards, these systems remain vulnerable to hacking. And when a mistake is made, who is to blame and who is held accountable: the machine, the operator or the commander? The legal and moral questions of responsibility are hard to answer. Human rights and humanitarian organizations have fought hard over the years to save as many lives as possible by pushing to regulate the development of such weapons; landmines and nuclear weapons, which have caused hundreds of thousands of deaths, have been banned by humanitarian disarmament treaties.
The second obvious reason for concern is the risk that such technology might fall into the wrong hands. If we consider how these systems could be combined with chemical, biological, radiological, nuclear and other weapons, the outcomes are horrifying: they could easily result in weapons of mass destruction and our extermination in the blink of an eye. Asymmetric warfare already exists; these disruptive autonomous technologies would only widen the gap.
Drones, drone tanks and autonomous armed vehicles are already in use and could be the first steps towards fully autonomous warfare. The question is no longer whether such technology will become reality, but what its outcomes will be. The once-innovative technologies that made airplanes and submarines possible raised many questions in their day, yet they pose no threat from our point of view today. Will it be the same with these technologies, or do we have a real reason to fear?