Five officers were dead. Seven more were wounded. The Dallas PD had the shooter cornered at the end of a hallway in El Centro College, but knew that any direct attempt to attack his position would mean more police hurt, and maybe more police killed.
So they tried something new: They fired up the bomb squad’s Remotec Andros Mark V-A1, stuck a bomb on its gripper arm and slowly piloted it toward the shooter. There was a loud bang; the shooter was killed by the blast. And thus, at the end of a bloody night, America got its first-ever robot-involved bombing.
The days since have seen a stream of worried articles and op-eds looking for precedent in past police bombings, questioning whether police should have access to military technology, asking about the ethics of deadly robots, and trying to figure out if this is just a taste of what’s to come.
The tone, in general, has not been optimistic. Those calling for police reform, or even police abolition, see it as another step towards total militarization, a way for police to be more violent without putting themselves in harm’s way. Those calling for police accountability see it as another way for police to deny culpability after killing someone. Those afraid of killer robots see it as a killer robot, step one on the path to the Age of Ultron.
In short: This can’t be good.
But what if robot police could be better than human police? Not faster or stronger, but actually better at being the kind of police we want in our communities?
For the past four years, undergraduates at Florida International University have been building the Telebot, a RoboCop on wheels. It’s six feet tall, weighs about 80 pounds, and has fully articulated arms and hands, cameras for eyes, and a loudspeaker for a mouth. It’s mounted on three wheels for now, though the team plans to upgrade to a Segway-style self-stabilizing platform soon. At hip level, it has a big hinge; if you hit it with a bat, it can bend with the impact, then pop back up.
Unlike the remote-controlled Remotec Andros Mark V-A1 used in Dallas or the palpably more terrifying BigDog military robots, which run themselves using onboard AI, the Telebot is controlled by a person hooked up to a motion-capture and virtual-reality setup. When the operator turns their head, moves their arms and wiggles their fingers, the Telebot moves in the exact same way. The operator sees what the Telebot’s camera eyes capture, and when the operator speaks, their voice comes out of the Telebot’s mouth.
“It’s an avatar of that person,” says Nagarajan Prabakar, the current project director and an associate professor of computer science at FIU. “It’s a very creative, cost-effective approach, and more human oriented in the sense that any adult can operate the robot, and that person is responsible for making the decisions.”
The project began when a Navy Reservist named Jeremy Robins approached Jong-Hoon Kim, an FIU professor and Prabakar’s predecessor as project director, with the idea to build a telepresence robot — something like a beefier version of the iPad-on-a-stick that Edward Snowden uses to virtually attend conferences — so that disabled military and police veterans could still be active in law enforcement. Robins also donated some money to get the project going. But in the years since, Prabakar and his team have come up with several situations in which the Telebot would be useful for disabled and able-bodied officers alike.
First, there’s advance surveillance. If the police want to see what’s waiting inside a building or around a corner, any officer could suit up and send the Telebot in first, using its intuitive interface to open doors, look around and talk with someone inside.
And since Telebots can be booted up remotely, they could also be stationed, dormant, across cities or within schools or office buildings, cutting down response time: as soon as police get the call that there’s an active shooter, an officer back at the station could jump into a nearby Telebot and start figuring out what’s going on while the rest of the first responders are en route.
On a more mundane level, Prabakar points out that a Telebot could take the place of that unlucky cop stuck out under a broken stoplight, directing traffic in the rain. “An officer could sit in a nearby car instead, and even automate some of those routine traffic-directing processes.”
But most importantly, the Telebot might be able to change how the police use violence. Reason one? No gun.
“We want to make sure it can have pepper spray or a taser on a hip or elbow belt,” Prabakar says, “but my opinion is that it shouldn’t have a gun.” Another important feature: Instead of body cameras that can be turned off at will (or fall off, as in the case of the Louisiana officers who shot Alton Sterling last week), we’d have a complete recording of the Telebot’s sensory field to refer to when officer accountability comes into question.
The real change, though, might come from a more basic shift: the Telebot transforms life-and-death situations into life-and-broken-robot situations.
Again and again, officers justify killing civilians, armed or unarmed, by saying that they feared for their lives. Mike Brown supposedly terrified Darren Wilson to the point that the officer said the teenager looked like a “demon,” “bulking up to run through the shots” when he opened fire. Last year, LAPD officers killed Charly “Africa” Keunang in Downtown L.A. when they believed he was reaching for one of the officers’ guns. According to the video livestreamed by his girlfriend, Diamond Reynolds, Philando Castile was killed last week because police believed he was reaching for his own gun.
Since the robot is operated directly by a real police officer, using the Telebot won’t have any effect on the racism, poor judgment, and structural injustice inherent in our criminal justice system. But if you take away the blue-uniformed self that police so forcefully preserve, you take away the primary reason cops claim they kill: the fear of death.
From the perspective of the civilian, a robot known to be unarmed (and, honestly, a little goofy-looking) could help de-escalate violent encounters, too. Prabakar says that the team is considering adding a screen to the Telebot’s chest, which would allow an officer to show their own face while speaking — one way to cut down on the strangeness of having a conversation with a 6-foot robot on wheels.
(A screen would also be useful in less violent situations: “If a pedestrian needs directions, an officer could project a map onto the display, or other relevant information.”)
“If a person can see that the robot doesn’t have a gun pointing at them,” Prabakar says, “there’s a higher chance for peaceable dialogue. The person may or may not respond to the voice of the officer coming from the robot, but they might at least listen.”
And the officer plugged into the Telebot, removed from the brutal logic of killing or being killed, might just keep the conversation alive.