
Will Driverless Cars Mean We’ll Be Able to Drink and Drive?

Every day, 28 people die in an alcohol-related car crash in the U.S. And while it’s impossible to say that every single one of those deaths could be prevented, this much is true: Drunk driving is avoidable.

Obviously, though, people do it anyway. Geography is a big reason why: Not everywhere has reliable public transportation. Cost plays a part, too; not everyone can afford a cab or an Uber back home. The same goes for stubbornness and ego — I cannot tell you how many blacked-out people I’ve argued with over their ability to get behind the wheel.

But since human behavior is nearly impossible to rewire and America writ large isn’t going to put down the bottle (we all know how well Prohibition turned out), could technology actually be the thing that eradicates drunk driving once and for all? Namely, could the introduction of self-driving cars save those 28 lives that are lost daily because someone thought they were sober enough to drive?

Theoretically at least, a self-driving car would eliminate the need for a DUI policy — after all, an operating system can’t get drunk, rendering anyone in the car a mere passenger and a threat to no one but themselves.

This is the current debate in Australia. Per a recent paper by Australia’s National Transport Commission, a legislative body that oversees all things transportation-related Down Under, it’s time “to develop legislative reform options to clarify the application of current driver and driving laws to automated vehicles, and to establish legal obligations for automated driving system entities.” And whether or not a person can “drive” under the influence in a self-driving car features prominently in its report:

“The application of an exemption is clear-cut for dedicated automated vehicles, which are not designed for a human driver. The occupants will always be passengers. The situation is analogous to a person instructing a taxi driver where to go.”

That said, not all self-driving cars are created equal. There are currently several levels of self-driving vehicles, and not all of them are fully automated. In fact, back in the U.S., the National Highway Traffic Safety Administration designates six levels of automation in all:

  • Level 0 — No Automation: The human driver does all the driving.
  • Level 1 — Driver Assistance: An advanced driver assistance system (ADAS) on the vehicle can sometimes assist the human driver with either steering or braking/accelerating, but not both simultaneously.
  • Level 2 — Partial Automation: An ADAS can control both steering and braking/accelerating simultaneously under some circumstances. The human driver must continue to pay full attention (“monitor the driving environment”) at all times and perform the rest of the driving task.
  • Level 3 — Conditional Automation: An ADAS can perform all aspects of the driving task under some circumstances. But in these circumstances, the human driver must be ready to take back control at any time when the ADAS requests them to do so. In all other circumstances, the human driver performs the driving task.
  • Level 4 — High Automation: An ADAS can perform all driving tasks and monitor the driving environment — essentially, doing all the driving — in certain circumstances. The human need not pay attention in those circumstances.
  • Level 5 — Full Automation: An ADAS can do all the driving in all circumstances. The human occupants are just passengers and need never be involved in driving.

Everything but Level 5 (Full Automation) still requires a human driver — the so-called “fallback-ready user” — to be capable of taking the wheel if the need arises. This, of course, is tough to do if you’re bombed. In January, for instance, a man driving a Tesla Model S, an electric car with advanced autopilot capabilities that would fall under Level 2 (Partial Automation), was found passed out behind the wheel on the Bay Bridge with a blood-alcohol content twice the legal limit. The driver allegedly told the California Highway Patrol officers who arrested him that everything was fine: The car was on autopilot.

He must’ve failed to read the owner’s manual, which makes very clear, “Autosteer is not designed to, and will not, steer around objects partially or completely in the driving lane.” The car also requires drivers to have both hands on the wheel and continue to pay attention to the road even while engaging Autosteer. If the car detects that your hands aren’t on the wheel, it will beep at you, much like most cars today will beep if you or your passengers aren’t wearing seatbelts. If you still don’t touch the wheel, the car will put on its flashers and slowly come to a stop.

In other words, it won’t drive your drunk ass home.

Level 5 vehicles (Full Automation) likely won’t be available to the public for some time — 2025 is probably being generous — but steps toward putting automated vehicles on the roads are being taken rapidly. Case in point: At the end of February, the Department of Motor Vehicles in California passed regulations that will allow autonomous vehicles with remote human drivers to operate in the state next month. “We think we have the ultimate backup system — which is a human,” Elliot Katz, co-founder of Phantom Auto, “a remote-control-as-a-service safety solution for autonomous vehicles,” recently told Reuters.

Unlike in-car backup drivers, remote human operators would be able to control self-driving cars from several miles away. In this use case, if human intervention is required, remote operators would override the self-driving car’s controls, eliminating the need for the person in the car to be paying attention to the road. The remote-control capabilities could even, one day, mean that everyone with a credit card could be picked up by a friendly driverless car and driven safely home — even if home is in the middle of nowhere.

With a capable, competent (and sober) human holed up in an office somewhere (last month at CES, Katz reported that autonomous vehicles operating in Las Vegas were successfully controlled from more than 500 miles away in California), legal responsibility for the vehicle and its travels falls on the controlling company whenever the ADAS is engaged, according to the autonomous vehicles section of the California DMV Vehicle Code — leaving you, legally speaking, just a drunk passenger.

So as long as you don’t fuck with the ADAS and, you know, try to drive, you’re theoretically good to go. Just make sure you can remember your address. We’re a long way from telepathic vehicles.