Town Square

Do self-driving cars need a licensed driver?

Original post made on Feb 5, 2016

If there's one priority for self-driving cars upon which everyone can agree, it's safety. Google and other manufacturers are quick to tout the autonomous vehicles as a way to prevent around 33,000 U.S. deaths and many more injuries from auto-related crashes each year. But for all the triumphs of autonomous cars, regulators and consumer advocates warn there are too many risks to give these vehicles free rein on public roads.

Read the full story here Web Link posted Friday, February 5, 2016, 11:20 AM


Posted by yep
a resident of Old Mountain View
on Feb 5, 2016 at 12:03 pm

"DMV officials noted that it was premature to allow computer-guided cars free rein on public streets,"

"…but the agency hinted that those rules could be loosened down the road."

OK, we're done here.

Posted by LM
a resident of Another Mountain View Neighborhood
on Feb 5, 2016 at 2:23 pm

Sad that this is going so slowly. It will allow the Japanese and Germans to overtake us in a market we invented. The regulators need to have more vision as to how CA could be the epicenter of the next-generation auto industry. Now it will move elsewhere.

Posted by John
a resident of Monta Loma
on Feb 5, 2016 at 2:26 pm

They have lots of testing in MV & Austin. Have they tested in the winter at Tahoe? If not, then these statewide restrictions make 100% sense.

Posted by Jerry
a resident of North Whisman
on Feb 5, 2016 at 2:50 pm

We need to distinguish between policy issues and empirical issues. The safety of self-driving cars is an empirical question. We do the research and learn from the testing. And the reference point is not some idealized, absolutely safe environment; the reference point is the human-driven cars that routinely run into each other every day. It's time to get paranoia and righteousness out of the equation. How do self-driving cars compare to human-driven cars? Do they fare better in all kinds of weather? All kinds of terrain? All kinds of traffic? Let's find out rather than just guess.

Posted by Jeff
a resident of another community
on Feb 5, 2016 at 3:14 pm

Aren't self-driving cars limited to 25 mph? If they never go over 25, how valid is the testing for turning them loose on freeways?

Does anyone drive slower than 25? They only have to avoid stopped cars and stay in their own lane to have a good testing record(?)

Imagine every road in Silicon Valley with even 20% of the cars never going over 25 ...

Posted by Of COURSE they do
a resident of Old Mountain View
on Feb 5, 2016 at 3:33 pm

And always will; asking if a "self-driving" car needs a licensed driver is like asking if we still needed to teach arithmetic once pocket calculators were common.

Licensing just verifies two basic areas of knowledge: road rules and at least rudimentary driving skills. If for no other reason than to _recognize_ when a self-driving car is doing something unexpectedly illegal or dangerous (even if the cause is a malfunction), there will be strong public consensus to still require a responsible and demonstrably competent human on board.

Posted by not always dude
a resident of Old Mountain View
on Feb 5, 2016 at 6:20 pm

Re: @ Of COURSE:
That's silly; it's like saying everyone on an airplane needs to know how to fly in case the pilot violates FAA regulations.

Once these things are provably safe enough, it will be fine to remove the restrictions. I can't wait for that day to happen! But we're not there yet.

Posted by Jason
a resident of Monta Loma
on Feb 5, 2016 at 6:58 pm

Once we've had a few years with tens of thousands of human-supervised self-driving cars on the road, we can complain about the DMV "holding up progress" or whatever. In the meantime it's still an experimental technology that's not even available for sale/lease yet.

Posted by Of COURSE they do
a resident of Old Mountain View
on Feb 5, 2016 at 7:56 pm

To "not always:" If you think about it, the accurate version of your analogy is to require that competent _pilots_ be on board, even if a plane claims to be so high-tech it can take off and land by itself (they can already navigate, more or less, and pilots relying too much on that feature have actually been implicated as causative in some crashes). Your argument would be equivalent to disputing someone's claim that every passenger on a bus or train should be able to run it -- a claim no one ever made.

Anyway, my point was not (at all) that every reader of this website is personally convinced of the need for licensed drivers, but rather that the general public will continue to feel that way for a long time and pressure the DMV accordingly, regardless of how safe the cars are "proven" to be.

Posted by Then again...
a resident of Another Mountain View Neighborhood
on Feb 6, 2016 at 6:37 am

You could look at it from this question: do limo passengers need to be licensed? The primary drive controller (the limo driver instead of the computer) could have medical or emotional problems, and the passenger would need to know how to react.
I guess as time goes by and people become less fearful of automated drivers, we'll begin to see that in some instances we might not need a driver. This is just beginning to evolve, and the laws made today will need to be amended quickly and regularly to keep up with the pace of this technology.
Nothing is decided, it's simply too early. I think those so vehemently opposed to things now would rather the whole thing go away, but reality politely steps around them as the world progresses.
Back in the day, horse owners fought tooth and nail to keep cars off roads as well.

Posted by Of COURSE they do
a resident of Old Mountain View
on Feb 6, 2016 at 9:16 am

These continued analogy attempts relentlessly evade the real issue. Thus for a limo, the question isn't whether licensing is necessary for the passenger(s), but for a DRIVER in charge.

To think otherwise is to implicitly equate a computer system exactly to a human driver. THAT's the real question hiding here. I tend to think that human limo drivers -- with self-awareness of personal responsibility, professionalism, preparedness for the unexpected, consciousness that simply "stopping" abruptly in traffic when you encounter a problem can create worse problems, etc. -- will maintain distinct ethical and legal status.

At least until completely autonomous and self-aware mechanical humans exist -- and science fiction has warned us direly about that situation, since Asimov.

Also, the software world sowed the seeds and now stands to reap the harvest of public distrust, justly or not, after decades of buggy software, rushed to market inadequately tested, and created without fundamental reliability disciplines (such as variable initialization and bounds checking), or as one expert put it Web Link:

"I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."

Posted by nnnope
a resident of Another Mountain View Neighborhood
on Feb 6, 2016 at 12:22 pm

No. Since computers have far fewer issues than humans do on the road, it only makes sense to have everyone in a car be licensed no matter what. To disagree with that but not for computer-driven cars is pure folly. Yes, folly I say.

If we insist on safety as a priority in all vehicles, obviously we need to look at human drivers first. The computers have no emotional, medical or intoxication issues like many drivers do. I don't think people have thought critically or deeply enough about this. Not from what I have read.
I think the near future is scary to some. I find it exciting...can't wait!

Posted by More money for Google
a resident of another community
on Feb 6, 2016 at 5:31 pm

Safety. Customers. Forgot that. All that matters is that Google makes more money.

Posted by Doug
a resident of another community
on Feb 7, 2016 at 9:55 am

Paint them lime green. Put flashing lights on top. Require "black box" recording. Outlaw harassment by human-driven vehicles. Allow any and every person or company to drive their autonomous version as they see fit, while obeying the traffic laws.
Then set up special courts to handle the extra load......

Posted by PA Resident
a resident of another community
on Feb 7, 2016 at 10:04 am

This is a very odd idea because it covers an odd situation.

Of course we experience Google cars around here and know that they are basically doing OK, particularly when we see other drivers doing much more dangerous and odd things. The next obvious step is to give them special licenses so that people can use them. I suggest a special type of license so that, say, blind people (I have a blind friend who would love to do this) can use them within, say, a 30-mile radius of home without going on roads with a speed limit higher than 35. I say this because I can see the immediate future is not going to allow these cars to drive to Tahoe in snow. This special license should be just that, a special license with lots of restrictions.

And, if it doesn't get done here, then London is waiting in the wings. Web Link

Oh, and for those who feel that they need a special sign saying that they are driverless: in a state that doesn't require student drivers to display a sign warning other drivers that a beginner is behind the wheel, let's get real here. I would much rather come upon a driverless car than a first-time behind-the-wheel driver. At least I can use my observation with the former, but am completely unaware of the latter.

Posted by BD
a resident of Cuesta Park
on Feb 8, 2016 at 9:59 am

I think we should all be worried about the alternative, unregulated approach which Tesla has taken, releasing a software update that has led to behavior like this (Web Link).

Posted by The computer
a resident of Monta Loma
on Feb 9, 2016 at 3:45 pm

The car computer needs to be licensed. I'm ok with that. Maybe have the computer go in for a test so that DMV can make some money.

Posted by Darin
a resident of another community
on Feb 9, 2016 at 4:14 pm

Darin is a registered user.


Google's custom-designed self-driving bubble cars are limited to 25mph, because they are registered as Neighborhood Electric Vehicles (NEV). But Google has been testing converted self-driving cars for a while, and those are freeway capable, and have logged a lot of freeway miles.

From what I've read, 25mph city driving is a harder problem than 65mph freeway driving.

Posted by Not the case
a resident of Another Mountain View Neighborhood
on Feb 9, 2016 at 4:32 pm

Most everyone who has a gripe seems to think that the technology and performance they see right now is the final product.

Posted by the_punnisher
a resident of Whisman Station
on Feb 9, 2016 at 6:10 pm

the_punnisher is a registered user.

Having helped design, build, test and deliver AUTOMATED GUIDED VEHICLES, I can say we now have the computer power to operate these types of vehicles safely.
These TRANSPORTATION ROBOTS will have to have several computers, one of which will be the driver interface; that computer will have to get inputs (or the lack of same) and warn a driver that control is lost. You will need a separate computer to handle sensory input and another for motor control... you get the picture.

AGVs were limited to walking speed, as their safety sensors were limited. These included "kick sensor" strips and a special front sensor. These shut down the AGV's motion completely.

To shut down a system going 25 MPH is not going to work.

The HARDWARE might be there, the SOFTWARE is not! Part of such software must have Artificial Intelligence to handle minor software problems before they become dangerous.

For us and our customers, a big red button was mounted on the AGV case. Slam that and all power is shut off. At 25+ MPH, that would have an undesirable outcome.

To make the software feasible, you will need BILLIONS of lines of code and 100s of external sensors, including video pickups. All this must have duplication to create a no-fail system. A driver override with a "limp home" mode.

Changeable modules to facilitate repairs. A BIGGIE: NO INTERNET COMMUNICATION. A communication system redesigned from the ground up.

The bottom line: the DMV is right. Prove your concept works. We had to!

Posted by Not a know-it-all
a resident of Another Mountain View Neighborhood
on Feb 10, 2016 at 10:37 am

I'm reserving comment until the final product is ready. To make judgments or demands now would be a joke, unless you're on the development team, I guess.

Posted by James Thurber
a resident of Cuesta Park
on Feb 10, 2016 at 2:36 pm

This is a simple question that deserves a simple answer. If the vehicle can be controlled, at all, by a person riding in the vehicle then that person needs a driver's license. If the passengers cannot control the vehicle in any way, shape or form - no license is required. Voila.

Posted by Al Varnell
a resident of Waverly Park
on Feb 11, 2016 at 12:01 am

Sounds like the decision has been made.

"U.S. vehicle safety regulators have said the artificial intelligence system piloting a self-driving Google car could be considered the driver under federal law, a major step toward ultimately winning approval for autonomous vehicles on the roads."

Web Link

Posted by SRB
a resident of St. Francis Acres
on Feb 12, 2016 at 4:30 pm

Not doubting the prowess of Silicon Valley engineers in building very safe vehicles, but technology can have bugs and can be hacked (to bypass speed limits...).

Who'll be liable if a self-driving car blows through a red light, gets into an accident, or hits and runs...?

Posted by Uh, I've seen these "Licensed drivers"
a resident of Another Mountain View Neighborhood
on Feb 12, 2016 at 4:48 pm

Considering how many licensed drivers kill people, I would say no; that particular control unit is faulty and too prone to regular malfunction.
I thought we were trying to solve some of the problems that licensed drivers cause. Why throw them back into the sauce?