Former Google Executive Says ‘Killer Robots’ Will Cause ‘Mass Atrocities’


A former top Google software engineer fears that robots could cause mass atrocities in the future. The engineer, Laura Nolan, resigned from her position at Google after being assigned to work on the United States military's drone program. She is now pushing for a ban on AI-driven weapons capable of killing.

Specifically, Nolan is concerned about robots that can operate without human oversight. She believes such autonomy opens the door to machines making decisions that could lead to mass killings, and she argues these weapons should be governed by an international treaty, much as treaties govern the conduct of war.

Nolan stopped short of condemning drones, which she says still require human control. Her worry is killer robots that might do “calamitous things that they were not originally programmed for”. In other words, much as science fiction has long warned, the machines might turn on their creators.

To be clear, Google has no known plans to produce or develop killer robots, at least so far. The concern stems from a UN panel's finding that Google was helping create AI-driven weapons.

Nolan, who now works with the Campaign to Stop Killer Robots, has spoken before the UN on several occasions, pleading for action.

As she told The Guardian:

“The likelihood of a disaster is in proportion to how many of these machines will be in a particular area at once. What you are looking at are possible atrocities and unlawful killings even under laws of warfare, especially if hundreds or thousands of these machines are deployed.

“There could be large-scale accidents because these things will start to behave in unexpected ways. Which is why any advanced weapons systems should be subject to meaningful human control, otherwise they have to be banned because they are far too unpredictable and dangerous.”

Nolan graduated from Trinity College Dublin and was quickly scooped up by Google, where she was eventually assigned to a Pentagon project known as Project Maven. But Nolan says her ethics compelled her to resign. At the time of her resignation, she was considered one of Google’s most elite software engineers. And she was not the only Google employee to disapprove of Project Maven: over 3,000 of her colleagues signed on to protest the company’s military work.

Nolan believes the dangers of drone warfare pale in comparison to those posed by killer robots.

“You could have a scenario where autonomous weapons that have been sent out to do a job confront unexpected radar signals in an area they are searching; there could be weather that was not factored into its software or they come across a group of armed men who appear to be insurgent enemies but in fact are out with guns hunting for food. The machine doesn’t have the discernment or common sense that the human touch has.

“The other scary thing about these autonomous war systems is that you can only really test them by deploying them in a real combat zone. Maybe that’s happening with the Russians at present in Syria, who knows? What we do know is that at the UN Russia has opposed any treaty let alone ban on these weapons by the way.

“If you are testing a machine that is making its own decisions about the world around it then it has to be in real-time. Besides, how do you train a system that runs solely on software how to detect subtle human behavior or discern the difference between hunters and insurgents? How does the killing machine out there on its own flying about distinguish between the 18-year-old combatant and the 18-year-old who is hunting for rabbits?”

If you think killer robots is an overdramatic term coined to elicit fear, think again. The US Navy’s AN-2 Anaconda gunboat requires little to no human control and can carry out its job autonomously.

Russia, meanwhile, has the T-14 Armata, a tank built with an unmanned turret and designed for increasing autonomy.

In other words, this is happening, and Nolan is speaking out to help stop it. But the fact is, any treaty will have a hard time halting killer robot development, because no country wants to fall behind in a warfare sector where falling behind could put it in harm’s way. As with nuclear and chemical weapons, countries will always find ways to stay ahead of, or at least keep up with, their neighbors.

Author: Jim Satney
