Looking for a Thanksgiving dinner table conversation that isn’t politics or professional sports? Okay, let’s talk about killer robots. It’s an idea that has long since graduated from the pages of science fiction to reality, depending on how you define “robot.” Military drones abandoned Asimov’s First Law of Robotics — “A robot may not injure a human being or, through inaction, allow a human being to come to harm” — decades ago.
The issue has heated up again of late due to the growing possibility of killer robots in domestic law enforcement. One of the era’s best-known robot makers, Boston Dynamics, raised some public policy red flags on our stage in 2019, when it showed footage of its Spot robot being deployed as part of a Massachusetts State Police training exercise.
The robots were not armed and instead were part of an exercise designed to determine how they could help keep officers out of harm’s way during a hostage or terrorist situation. But the prospect of deploying robots in situations where human life is in immediate danger was enough to draw scrutiny from the ACLU, which told TechCrunch:
We urgently need more transparency from government agencies, who should be upfront with the public about plans to test and deploy new technologies. We need statewide regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence.
Last year, meanwhile, the NYPD cut short a deal with Boston Dynamics following a strong public backlash, after images surfaced of Spot being deployed in response to a home invasion in the Bronx.
For its part, Boston Dynamics has been vocal in its opposition to the weaponization of its robots. Last month, it signed an open letter, along with fellow industry leaders Agility, ANYbotics, Clearpath Robotics and Open Robotics, condemning the practice. It notes:
We believe that adding arms to robots that are remotely or autonomously operated, widely available to the public, and capable of navigating previously inaccessible places where people live and work raises new risks of harm and serious ethical issues. Weaponized applications of these newly-enabled robots will undermine public confidence in the technology to the extent that they undermine the enormous benefits to society.
The letter was widely believed to be a response to Ghost Robotics’ work with the US military. After a photo of one of its robotic dogs sporting a rifle circulated on Twitter, the Philadelphia firm told TechCrunch that it takes an agnostic stance on how its military partners employ the systems:
We do not create payloads. Are we going to do any promotion and advertising of this weapon system? Probably not. That’s a tough one to answer. Since we are selling to the military, we don’t know what they do with them. We are not going to dictate to our government customers how they use robots.
We don’t draw the line where they’re selling. We only sell to US and allied governments. We don’t even sell our robots to enterprise customers in adversarial markets. We get a lot of inquiries about our robots in Russia and China. We do not ship there, even to our enterprise customers.
Boston Dynamics and Ghost Robotics are currently embroiled in a lawsuit involving several patents.
This week, local news site Mission Local raised new concerns about killer robots — this time in San Francisco. The site notes that a policy proposal up for review by the city’s Board of Supervisors next week includes language on the subject. The “Law Enforcement Equipment Policy” begins with an inventory of the robots currently in the San Francisco Police Department’s possession.
There are 17 in total, 12 of which are operational. They are largely designed for bomb detection and disposal — which is to say that none were designed specifically to kill.
“Robots listed in this section shall not be used outside of training and simulation, criminal apprehension, serious incidents, emergency situations, executing a warrant or evaluating a suspicious device,” the policy notes. It then adds, more alarmingly, “Robots shall only be used as a deadly force option when the risk of loss of life to the public or officers is imminent and outweighs any other force option available to the SFPD.”
Effectively, according to the language, robots can be used to kill in order to potentially save the lives of officers or members of the public. That may seem innocent enough in context; at the very least, it appears to fall within the legal definition of “justified” deadly force. But the policy raises new concerns on several fronts.
To begin with, the use of bomb disposal robots to kill suspects is not unprecedented. In July 2016, Dallas police officers did just that in what was believed to be the first time in US history. “We saw no other option but to use our bomb robot and place a device on its extension to detonate where the suspect was,” Police Chief David Brown said at the time.
Second, it is easy to see how the new precedent could be invoked to cover the department after the fact when a robot is used in this manner, whether intentionally or accidentally. Third, and perhaps most alarmingly, one can imagine the language applying to the acquisition of future robotic systems that were not designed solely for explosives detection and disposal.
Mission Local adds that Aaron Peskin, chair of the Board of Supervisors Rules Committee, attempted to insert a more Asimov-friendly line: “Robots shall not be used as a use of force against any person.” The SFPD apparently struck Peskin’s change and replaced it with the current language.
The renewed conversation about killer robots in California comes, in part, because of Assembly Bill 481. Signed into law by Governor Gavin Newsom in September of last year, the law is designed to make police actions more transparent. Among other things, it requires law enforcement agencies to inventory the military-grade equipment they use.
The 17 robots included in San Francisco’s document are part of a long list that also includes Lenco BearCat armored vehicles, flash-bangs and 15 submachine guns.
Last month, Oakland police said it would not seek approval for armed remote robots. The department said in a statement:
The Oakland Police Department (OPD) is not adding armed remote vehicles to the department. OPD participated in ad hoc committee discussions with the Oakland Police Commission and community members to explore all possible uses for the vehicle. However, after further discussions with the Chief and the Executive Team, the department decided it no longer wished to explore that particular option.
The statement followed a public backlash.
The toothpaste is already out of the tube on Asimov’s First Law. Killer robots are here. As for the Second Law — “A robot must obey the orders given it by human beings” — that one remains mostly within our grasp. It’s up to society to decide how its robots behave.