Wednesday, June 18, 2014

2030 May Be The Year They'll Take Your Driver's License Away



I was stuck in traffic yesterday, which I didn't really mind because I have a fun little yellow convertible, and I was thinking about Uber ($17 billion! - that's the company's valuation, not the price of a ride) and Google's driverless cars (development cost unknown), and I decided it was time to connect the dots: once a car learns to drive, there's no need to own it and there's no need for a driver. That's because the car can come when called, take you to your destination, then go off and pick up someone else.


That sounds great and I'm hardly the first to connect those particular dots, but there's a corollary that seems to have gone largely (though not entirely) unnoticed: when driving oneself becomes unnecessary, it will eventually become more expensive, less convenient and - ultimately - unlawful, because the cars will do it a lot better than we can. Traditional cars will find themselves in a death spiral, and they'll be gone in less than - well, not less than 60 seconds, but sooner than you think.


In other words, Google is engineering all of us right out of the driver's seat. If they manage to get self-driving cars on the market by 2020 - as they've said they hope to - then I'd give human drivers another ten years before we all get our licenses pulled and registrations revoked.


Welcome to the Jetsons era of driverless Cars as a Service, or d-CaaS if you will.


Improbable? Not really. Consider how the laws changed around smoking on aircraft. Until 1973, you could smoke anywhere on an airplane. Next came smoking in designated sections and then, in 1988 - just 15 years later - an outright ban on most flights. It was all motivated by health concerns. Today, with safety issues, shoe bombers and explosive underwear added to the mix, no sane person would touch a cigarette while aloft. What once was commonplace is now understood to be reckless, and punishable accordingly.


Or consider horses. Once common on city streets such as LA's Sunset Blvd., they disappeared in the twinkling of an eye. (Well, not completely.) Who at the time could have imagined?


Already, of course, the elderly lose their right to drive when they can't do so safely enough. But my point is that the definition of "safely enough" will shift: when self-driving cars are safer than human drivers, at some point none of us humans will be deemed safe enough to be worthy of a license.


Let's back up and look at where we are now. Google's fleet has already logged nearly 700,000 autonomous miles. How smart are the cars? Consider this, from a 2013 New Yorker article:



[Google lead programmer Dmitri Dolgov] was riding through a wooded area one night when the car suddenly slowed to a crawl. "I was thinking, What the hell? It must be a bug," he told me. "Then we noticed the deer walking along the shoulder."



The deer wasn't even on the road, but was alongside it. In the dark, in a fraction of a second (one presumes), the car had detected an object, recognized it as a deer, inferred that it could leap onto the roadway unpredictably, released the gas and applied the brakes. Now that's a smart car, and the night vision is just icing on the cake.
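For the technically inclined, here's a toy sketch of the kind of sense-classify-predict-act loop the deer anecdote implies. The names and thresholds are entirely my own invention - Google hasn't published its control code - so treat this as an illustration of the idea, not the real thing:

```python
# A toy sense-classify-predict-act loop, loosely modeled on the deer
# anecdote. All names and thresholds are hypothetical; Google has not
# published its control code.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str               # e.g. "deer", "pedestrian", "traffic_cone"
    lateral_offset_m: float  # distance from the edge of our lane, in meters
    speed_mps: float         # the object's own speed

UNPREDICTABLE = {"deer", "pedestrian", "cyclist"}

def plan_speed(current_mps, objects):
    """Return a target speed given one frame of classified detections."""
    target = current_mps
    for obj in objects:
        # Even an object *off* the road slows us down if it's the kind
        # of thing that might enter the roadway unpredictably.
        if obj.label in UNPREDICTABLE and obj.lateral_offset_m < 5.0:
            target = min(target, 2.0)  # slow to a crawl until it's past
    return target

# One simulated frame: a deer on the shoulder, 2 m from the lane edge.
frame = [DetectedObject("deer", lateral_offset_m=2.0, speed_mps=0.5)]
print(plan_speed(15.0, frame))  # 2.0 -- slow to a crawl
```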


Maybe the deer was the only thing out there that night. Perhaps the car itself turns into a deer in the headlights when faced with complexity? No. Said Google's project director Chris Urmson in a recent blog post, "our software ... can detect hundreds of distinct objects simultaneously -  pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn."


Now, there is some "trickery" involved: the car is preloaded with extremely high-def maps of the areas it drives in. Far more detailed than Google Maps, these maps are models of the physical world, with information like the height of traffic signals, the position of curbs and more. Those models make it easier for the computer to process sensor inputs, because it knows some of what to expect.
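Here's a simplified illustration of why the prior map helps - not Google's actual pipeline, just the general idea: anything the map already explains (curbs, poles, signal heads) can be subtracted from the sensor returns, so only the surprises need real-time classification.

```python
# A sketch of map-prior subtraction. Purely illustrative; the real
# systems are far more sophisticated than point matching.
class PriorMap:
    """A toy stand-in for the preloaded high-definition map."""
    def __init__(self, known_points):
        self.known = known_points  # static geometry: curbs, poles, signals

    def explains(self, point, tol):
        # A sensor return is "expected" if it sits near known geometry.
        return any(abs(point[0] - k[0]) < tol and abs(point[1] - k[1]) < tol
                   for k in self.known)

def unexpected_returns(sensor_points, prior_map, tolerance_m=0.3):
    """Keep only the returns the preloaded map doesn't already predict."""
    return [p for p in sensor_points if not prior_map.explains(p, tolerance_m)]

# The map knows about a curb; the scan also contains one surprise.
m = PriorMap(known_points=[(0.0, 1.0), (0.0, 2.0)])
scan = [(0.0, 1.0), (0.0, 2.0), (3.2, 4.8)]
print(unexpected_returns(scan, m))  # [(3.2, 4.8)] -- classify this one
```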


Of course, you and I use the same trick when we drive on familiar roads. And Google is hardly going to be deterred by the need to map the world's roadways.


That deer story is already more than six months old. The software has probably moved on to recognizing cats and raccoons by now. Meanwhile, on the hardware front, the latest Google cars don't have steering wheels or pedals. The driver is just another passenger along for the ride. And guess what: in most states, that's probably legal already. Anyway, those cars are intentionally limited to 25 mph. That may sound kind of limiting, but it's actually the same as the proposed New York City speed limit.


The Google team is made up of very clever folks, and they're not the only ones working on these kinds of vehicles, though they seem to be the furthest ahead. Readers who hate government should note that much of this was spurred by a U.S. government-sponsored competition; indeed, Google's Urmson, then at Carnegie Mellon University, was the technology leader of the team that won the 2007 DARPA Urban Challenge.


In any case, it's just a matter of time and determined engineering before autonomous cars become better drivers than people are. (By some measures, says Google, they already are.) That will lead to commercial availability. When that happens, the conversation will begin to shift from "is it ok to let cars drive themselves?" to "is it ok to let people drive themselves?"


And the answer we'll arrive at, soon enough, will be no.


That's because (as the New Yorker article points out) people are terrible drivers. They're always distractible, and often tired, preoccupied, drunk, drugged, on the phone or sending a text. They're emotional, error-prone, drive too fast and react too slowly. They have blind spots, can't see in the dark and usually can't be trusted to use both feet at once, meaning that in an unexpected crisis even the best driver loses seconds shifting his right foot from the accelerator to the brake. A well-programmed computer, kitted out with cameras, lasers and radar, will be able to do much better.


When that happens, the social cost of human driving will no longer be simply unfathomable - in the U.S. alone, 33,000 deaths, almost 2.5 million injuries, $277 billion in economic losses and $871 billion in total social harm per year - it will become unacceptable. Over 95% of that cost is attributable to driver error (pdf; see pp. 24-25).


Divide those dollar figures (2010 data, the latest available) by the adult population and you'll see that human driving costs the nation the equivalent of $1200 to $3500 per adult per year. Some of that is accounted for by auto insurance rates, but too many people are uninsured or underinsured, and the insurance system arguably doesn't do a good job of reflecting the true cost of accidents.
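Here's the back-of-envelope arithmetic, assuming a 2010 U.S. adult population of roughly 235 million - my assumption; the post's source doesn't dictate the denominator:

```python
# Back-of-envelope check on the per-adult figures. The denominator is
# my assumption: the 2010 U.S. adult (18+) population was roughly
# 235 million.
ECONOMIC_LOSS = 277e9  # dollars per year, per the NHTSA study cited above
TOTAL_HARM    = 871e9  # dollars per year
ADULTS        = 235e6  # assumed 2010 adult population

print(f"economic loss per adult: ${ECONOMIC_LOSS / ADULTS:,.0f}")  # ~$1,179
print(f"total harm per adult:    ${TOTAL_HARM / ADULTS:,.0f}")     # ~$3,706
```

With a denominator closer to 250 million, the upper figure lands near $3,500; either way, the order of magnitude is the point.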


So money is probably where the regulation will start. Somewhere between perhaps 2025 and 2030, you might have to pay an annual "human driver" fee if you want to keep driving the old-fashioned way. It could be a hefty add-on to present-day license fees - probably not as much as the true social cost at first, but it might increase over time. Or it might take the form of an increase in car insurance rates for those who insist on driving themselves.


Of course, self-driving cars are likely to be more expensive than the old-fashioned kind, at least at first. Would the poor and middle class suffer, forced into an unpalatable choice between paying more for a license and insurance or paying more for a driverless car or a retrofit kit? Not at all. They're likely to do neither, and instead use d-CaaS - driverless cars that appear when summoned, ready to whisk you to work, play, restaurants or one of the diminishing number of retail outlets not rendered superfluous by Amazon (with or without flying drones).


And who will provide those cars? Why, Uber, of course (there's a reason Google invested a quarter-billion dollars in the company), and Lyft, Sidecar, ZipCar (owned by Avis), Car2Go, RelayRides, Zimride (owned by Enterprise), and Enterprise, Hertz, Avis, Dollar, Hailo, Taxi Magic, Flywheel, local taxi and limo companies, and, gosh, have I left anybody out? They'll all be in competition with each other. Some will own fleets, while others will provide financing or monetization for individuals who do buy autonomous cars or retrofit their existing vehicles.


But really, why buy? Americans spend roughly 3 hours a day in their cars. That means that about 21 hours a day, the vehicle sits idle. What a waste of money to buy a self-driving car when instead you can use one when you need it and the car can go about its business when you don't. Do you really want to be in the business of owning a self-driving car, always keeping up with hardware and software updates, cleaning up spilled soda from the backseat and restocking the candy dish? Uber is not going to be about human drivers forever.
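Taking those figures at face value, the utilization arithmetic looks like this - a naive upper bound, since real demand peaks at rush hour and the actual replacement ratio would be lower:

```python
# The idle-time arithmetic, taking the 3-hours-a-day figure at face value.
HOURS_IN_USE = 3
HOURS_IDLE = 24 - HOURS_IN_USE
utilization = HOURS_IN_USE / 24

print(f"idle hours per day: {HOURS_IDLE}")    # 21
print(f"utilization: {utilization:.1%}")      # 12.5%

# Naive upper bound: if demand were spread evenly (it isn't -- rush
# hour dominates), one shared car could do the work of 1/utilization
# privately owned ones.
print(f"owned cars replaced per shared car, ideally: {1 / utilization:.0f}")  # 8
```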


So, a far smaller number of self-driving cars could fill people's needs than the number of cars owned today. During surge periods - primarily, rush hours - commuting in autonomous cars could involve ride-sharing (at a reduced cost per person) and/or multimodal trips (the car takes you to the train or subway station). Prefer your privacy? That will be available as well, but at a cost in both money and time, since you won't get to use the commuter lane on the freeway.


Fewer cars and less time sitting idle means less need for garages, lots and on-street spaces, and little or no time spent looking for parking. Fewer cars, tighter clustering and greater efficiency translate to less congestion. There will be more mobility for the elderly and disabled. And driverless cars will probably all be electric so that they can just sidle up to a charging station and rejuice without human intervention. So, less pollution.


What happens to all those existing traditional cars? I think the government will get them, and your license too.


That may make you think of gun rights - "they'll pry my steering wheel from my cold, dead hands" -  but there's no Second Amendment for cars, and even today, driving is considered a privilege, not a right.


Anyway, the government probably wouldn't take your car from you at first. Instead, they'd buy it. The advantages of getting traditional cars off the road will be so great that some states might decide to pay owners to scrap and recycle them - just as California today pays owners of cars that fail smog check $1000 to $1500 to have their cars dismantled.


And, of course, the other thing they can do is regulate your car's value away, by pursuing policies that favor d-CaaS. As driving oneself becomes more and more a pastime of the rich, the price of licenses, registration, insurance and gasoline will likely increase, further narrowing the user base - and political constituency - for traditional cars. As demand lessens, gas stations will close up shop, and owning a non-autonomous gasoline vehicle will become unfeasible. That will drive another nail in the traditional car's coffin - or in its tire. You'll wish you'd sold it to the state when you had the chance.


Meanwhile, densely packed cities like New York, Boston and San Francisco might outlaw traditional cars altogether, or levy a heavy use fee, just as London and a few other cities impose congestion pricing today, with plans afoot in New York.


Mothers Against Drunk Driving might rebrand as "Mothers Against Drivers Driving" and maybe Car & Driver magazine will become Car & Operating System, if it doesn't fold altogether.


Increasingly, those who choose to drive themselves will bear more of the consequences of their risky behavior. The damages awarded in the event of an accident may skyrocket, not just because the average driver will be wealthy, but also because the accident will be seen as easily avoidable. When the police show up at an accident scene, they might charge a fee, just as ambulances do. Ultimately, the decision to drive oneself might be reclassified as what the law calls "ultrahazardous," making liability stricter and penalties more severe.


Of course, to take off the rose-colored Google Glass for a moment, it will be a weird and different world when nobody drives, and it's bound to bring a loss of privacy. There have already been fights over driving data; there will be more. Inside the car, you'll be in a public space: even now, the Google cars have cameras inside as well as out, and we can expect more of that, not less, as time goes on. And talk about a captive audience: at present, there are no ads in Google's self-driving cars, but that won't last.


There will no doubt be a loss of autonomy as well. Chinese cars will probably be programmed to avoid sensitive places like ongoing demonstrations, or Tiananmen Square on the anniversary of the massacre. Call it the Great Traffic Cone of China. (Yet countries like China and India might leapfrog the U.S. in the application of d-CaaS technology.) Even in the U.S., people on probation or subject to protective orders will have their mobility reduced. Teenagers might find that self-driving cars refuse to stop at liquor stores and head shops, or won't stay out after midnight ("Chad," the car will text an errant teen, "I'm leaving in five minutes."). Any one of us might be offered a discount on our ride by Google if we let the car take us to a restaurant or store the company favors. Today, our attention is bought and sold in the form of advertising; tomorrow, our physical presence will be for sale as well. We might even have to agree to stay at the store for at least 15 minutes in order to get that ride discount.
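To make the curfew scenario concrete, here's a hypothetical rider-policy check. No vendor has documented anything like this; the names and rules are invented for illustration:

```python
# A hypothetical parental-control-style policy check for a requested
# stop. Entirely speculative; nothing like this is documented anywhere.
from datetime import time

BLOCKED_CATEGORIES = {"liquor_store", "head_shop"}
CURFEW_START, CURFEW_END = time(0, 0), time(5, 0)  # midnight to 5 a.m.

def may_stop(rider_age: int, destination_category: str, now: time) -> bool:
    """Decide whether the car will agree to make a requested stop."""
    if rider_age < 21 and destination_category in BLOCKED_CATEGORIES:
        return False
    if rider_age < 18 and CURFEW_START <= now < CURFEW_END:
        return False
    return True

print(may_stop(17, "liquor_store", time(23, 30)))  # False -- blocked stop
print(may_stop(17, "diner", time(0, 30)))          # False -- past curfew
print(may_stop(30, "liquor_store", time(23, 30)))  # True
```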


Then there are the technical challenges. Yes, Google engineers are smart - but how smart? My Android-powered Galaxy S4, which initially worked so well, now tends to freeze, overheat and run down the battery. And that's not to mention the prospect of it being hacked. Programming is still a dicey business: we've been designing bridges for 3300 years but the very term "software engineering" is less than 50 years old. As a rigorous discipline, it's even younger - and it's subject to little regulation.


One kind of program code that is regulated is medical device software, and in that context, the FDA makes a point that applies equally well to driverless cars: "Because of its complexity, the development process for software should be even more tightly controlled than for hardware, in order to prevent problems that cannot be easily detected later in the development process."


Soon, another exception may be mapping apps: the National Highway Traffic Safety Administration wants the authority to regulate those for safety reasons.


So the road ahead for self-driving cars is likely to be studded with potholes. Will we need outside software auditors to reduce error and guard against the sort of cover-up that seems to have afflicted the GM ignition switch team and that company's management? Probably so. Google itself, after all, already operates under the scrutiny of outside privacy audits imposed by a 2011 consent decree, and the company later paid a record $22.5 million fine for violating that consent decree.


Can we trust Google or car companies to protect our privacy, ensure our safety, guard against bugs, prevent cars from being weaponized into self-driving human-free suicide bombs, make ethical decisions when crashes are unavoidable, and all the rest? Not without oversight. Federal regulators are still figuring out the issues (see pdf of May 2013 NHTSA policy paper), but thanks to Google's lobbying, California is moving ahead on driverless cars even as it struggles with Uber. Whoever brings us smart cars - whether Detroit, foreign car companies, Google or some combination - will be doing the world (and themselves) a great service, but will also need to be regulated with more technological precision than we usually apply to software. Self-driving cars, after all, are robots with human payloads, and are far more dangerous than Roombas.


Even with those caveats, self-driving cars are on their way and human drivers on their way out. And even with all the efficiencies gained, I'll miss my yellow convertible, I will. But I expect that by 2030 I'll have more than a Roomba to console me. I'll be holding out for a robocat.


Jonathan Handel (jhandel.com) is an entertainment/technology attorney at TroyGould in Los Angeles and a contributing editor at The Hollywood Reporter.
