
Self-driving cars are already a regular sight in Mountain View and nearby cities, but the technology will be kicking into high gear starting this week.

Google’s autonomous car offshoot, Waymo, announced Tuesday it had received permission from California regulators to begin piloting autonomous vehicles with no human in the driver’s seat. The company can now send out up to 36 driverless vehicles onto public roads in Mountain View, Palo Alto, Sunnyvale, Los Altos and Los Altos Hills.

The new authorization allows self-driving cars to operate without drivers at all hours of the day, along city streets, rural roads and highways. Waymo officials say they will begin testing within a limited geofenced area and gradually expand it.

Waymo’s permit, which it applied for in April, is the first of its kind granted by the California Department of Motor Vehicles, although driverless testing has already been approved in Arizona. The California permit takes effect on Tuesday, Oct. 30.

Self-driving cars have been navigating Mountain View streets for years, but they have been required to have someone sitting in the driver’s seat to take control if the autopilot system were to fail. The new testing phase allows the autonomous cars to be unoccupied, but companies are still required to have human monitors tracking the systems, who may take remote control in the event of a problem.

In the buildup to driverless testing, Waymo officials have been hosting community meetings in some of the affected cities. A meeting in Mountain View is tentatively planned for December, according to city officials.

Under the new permit, Waymo is obligated to work with local authorities, including developing a protocol for how emergency officials can stop and interact with driverless cars, especially if the vehicles threaten public safety.

In the early days of this testing phase, Waymo officials will be giving rides to their own employees. But eventually, the company intends to begin offering the public free rides in its driverless cars. Waymo is expected to launch an early rider program in the coming days.

Comments

  1. Can we please have some identifier on these cars to show that they are unoccupied? We would not want someone to risk their life trying to rescue an occupant from a car in flames, or one that has fallen off a bridge, only to discover it is unoccupied.

  2. These Waymo vehicles are serious road boulders. They ooze down our street at well below the posted 25 MPH limit, they clog traffic on the larger arteries, and they block intersections and merges because they are programmed to be cautious to the point of incompetence. At one merge from Shoreline onto Central Expressway recently, a Waymo test vehicle had 6-7 cars bottled up despite light and well-spaced traffic on Central Expressway, with some of the delayed vehicles blowing their horns.
    We already have clogged traffic. This will only make things worse.

  3. @John Joss
    36 extra cars will clog traffic that consists of thousands and thousands of cars? Are you serious? I think there is a clear benefit from developing self-driving cars and not much of a downside.

  4. Haha. I saw one of our local road ragers who didn’t want to drive the speed limit tailgating a WAYMO. He didn’t understand it was 100% fruitless. LOL.
    I enjoy knowing the self-drivers opt for safety above all else, especially in school zones.
    People get delayed all the time by someone who’s afraid to merge, or someone driving at the speed limit when they want to go faster, but when it happens with a WAYMO, it’s apparently inexcusable.
    People are funny that way, even if they know the technology will only get better and better.

  5. Be afraid. Be very afraid.
    Especially if you are riding a bike. I’ve had them pull over into the bike lane in front of me and stop for no reason.
    I second the request to have an identifier for unoccupied cars – that way we know to watch out for them.

  6. I, for one (and my wife as well), am really looking forward to the day when the streets are full of nothing but autonomous vehicles (AVs). The number of accidents would be reduced to nearly zero.

    Most of the problems with AVs today stem from having to interact with humans, most of whom think they’re better drivers than the AVs. I’d rather trust my life to an AV than to drivers who run red lights, treat STOP signs as optional, speed, and change lanes recklessly without signaling.

    Count me in when the early rider program comes to town!

  7. This is pretty neat. Computers think in ones and zeros. Humans think in A, B, C and D – which makes it incredibly easy for us (humans) to look at a squirrel, a pedestrian, a biker, et al., and instantly figure out how we’re going to handle it.

    Google / Alphabet / Waymo has had to figure out how a simple thinking system (zeros and ones) can do something similar.

    I hope this works. It would be awesome for many reasons (heavy drinking and driverless taxi cabs being one).

  8. So we should be afraid because a computer-driven car does exactly the same as human drivers? At least the computer-driven ones do not speed, do not run red lights/stop signs, signal when turning, and do not drift across lanes.

  9. The driverless cars do hit their brakes about twice per second and fully stop on major streets even though there is nothing in front of them. With that kind of erratic driving, no wonder there are so many accident reports of them getting hit in the rear.

  10. Probably the best thing about driverless cars is that they will improve over time – when a new car hits the streets, it will have technology benefiting from previous generations, whereas new drivers have to learn the hard way. It’s obvious that there will be an adjustment period, and there will be people freaking out over driverless car accidents while ignoring the fact that there are countless accidents from clueless drivers – especially in the Bay Area. On the other hand, I do feel a bit as though the public didn’t get a vote in this process (maybe I missed it) and, living in London, I’m really, really afraid of the challenges driverless cars will face with the streets and traffic around here.

  11. The driverless cars do hit their brakes about twice per second and fully stop on major streets even though there is nothing in front of them.
    (No, they don’t)
    With that kind of erratic driving, no wonder there are so many accident reports of them getting hit in the rear.
    (No, there aren’t)

  12. I went and checked the accident reports that SG provided the link for. All the ones that I read were ‘vehicles driving in manual mode’ or ‘vehicles driving in conventional mode’. I didn’t find one crash where a driverless car was actually operating in autonomous mode, and they are not reports of ‘autonomous cars being hit in the rear’. So one must conclude that the description of the DMV reports as ‘dozens of accident reports of autonomous cars getting hit in the rear’ is incorrect.

  13. The autonomous vehicles can behave erratically. When an accident occurs and the autonomous vehicle is in manual mode, the questions are: did the vehicle’s computer system get confused and kick into manual mode suddenly? Or did the driver take over suddenly, for safety reasons, because the vehicle did not sense the conditions correctly?

  14. @mvresident2008 To be fair, I checked a couple of the latest Waymo accidents, and they were in autonomous mode, but the drivers behind them did not pay attention:
    A Waymo Autonomous Vehicle (“Waymo AV”) was stopped in autonomous mode at a traffic light at the intersection of Rengstorff Ave. and Garcia Ave. in Mountain View when a box truck made contact with the rear bumper of the Waymo AV.
    A Waymo autonomous vehicle (“Waymo AV”) in autonomous mode was rear-ended after entering the parking lot at 100 Mayfield Ave. The Waymo AV was yielding for an oncoming vehicle while preparing to make a left turn inside the parking lot.
    A Waymo autonomous vehicle (“Waymo AV”) in autonomous mode traveling southbound on San Antonio in Palo Alto, CA was stopped to yield to cross traffic when it was rear-ended by a vehicle traveling at approximately 5 mph at the time of the collision.

  15. “The driverless cars do hit their brakes [too often] and fully stop on major streets even though there is nothing in front of them. With that kind of erratic driving, no wonder there are so many accident reports of them getting hit in the rear.”

    Many of us have witnessed these and other erratic, “unreasonable” behaviors by these cars (see also the bicyclist complaint above). Driving is a joint activity, demanding not just alertness but awareness of one’s role in a group. The successful group requires individual behaviors within limits: neither too fast nor too slow, neither too daring nor too hesitant. This is the notorious failing of self-driving technology so far. Just as some human drivers impulsively, neurotically brake often and needlessly, ignoring that traffic is behind as well as in front of them, so the self-driving vehicles tend to act like lone players, ignoring their role in a group. Hence, traffic “bottlenecks” as they drive slower or merge far, far more hesitantly than a human driver knows is safe.

    Further, Elaine Herzberg’s tragic death in Arizona disclosed the hubris of the self-driving-car employee community there. First, test drivers “touted the [self-driving cars’] sensor technology [as] effective at anticipating jaywalkers, especially in the darkness” just months before the jaywalker was killed. Then, “The investigation concluded that … [Rafaela Vasquez, test driver in the fatal car] was streaming The Voice over Hulu at the time of the collision, and the driver-facing camera in the Volvo showed ‘her face appears to react and show a smirk or laugh at various points during the time she is looking down'” thus she was “distracted from her primary job of monitoring road and vehicle conditions. Tempe police concluded the crash was ‘entirely avoidable’ and faulted Vasquez for her ‘disregard for assigned job function to intervene in a hazardous situation.’ ” [Investigation summaries from Wikipedia]

    Commenters here dismissing demonstrably reasonable concerns as “false narrative,” or gushing about distant utopias where all cars self-drive (implicitly MANY decades off, and ignoring that some people who buy regular cars keep them for 30 years), may be revealing hubris too — and who knows how many of those commenters work for the firms involved, and thus bring hidden agendas, since NOT ONE such comment (as of the time I write this) was from a Registered User on this site?

  16. I, and many others I imagine, recognize what these cars look like and treat the Waymo cars differently when we encounter them, something like a Student Driver in a Drivers Ed car. We expect them to behave unpredictably and are a little more wary. This means that at times they do not get the right treatment to improve their algorithms.

    Anyway, I expect it will make us all better road users, and probably do the same when we are on bikes or walking and crossing the street. We will have to teach our children that they had better pay more attention when they are walking or bike riding. A good thing all round.

  17. Would it help if other motorists considered the LIDAR sensor on the top of self-driving vehicles to be a big “Student Driver” sign?

    Yes, I’ve seen Waymo/Google cars behave in ways that were a little different from most human drivers (which is why I noticed the behavior in the first place). And yes, what stands out is that they tend to be more cautious than most human drivers. But their behavior has been more like a human student driver than a distracted/drunk human driver, which is a good thing.

    And Waymo’s program seems to be much more mature than Uber’s program, so attributing Uber’s well-publicized failures to Waymo or to self-driving systems in general is a bit unfair.

  18. Your point is taken, Darin, re not confusing Uber with Waymo. I remind everyone, though, that the evidence of Uber-employee arrogance quoted above became widely known only *after* someone was killed. The example is not encouraging.

    And anecdotally today, from a discussion on a public-policy mailing list:

    “Learned that Google has permission to have cars without any humans inside. The day before, the driverless Waymo car ahead of me, which was stopped at a red light on Alma, began to accelerate through the red. Whether the person in the car or the computer stopped it is unknown, but the brakes were hit so hard the car rocked back and forth. This obviously colors my opinion of the technology being ready to go without humans.” (Presumably the human occupants that time were watching the road, not shows on Hulu.)

  19. I’ve been hit 3 times in my life by people turning right into me without checking their mirrors.
    I’ve also been in the same situation with WAYMO cars over the past 3 years, except the WAYMO actually saw me and stopped, letting me ride past on the right before making the turn… as expectation and the LAW state it should go.
    I ride with a lot of other cyclists and we share the same stories.
    The WAYMOs are safer on the road. They don’t rush, they don’t try to figure out if they can squeeze by, they don’t pull ego trips and try to block your right-hand turn, and they defer to SAFETY above ego rather than assuming risk by shooting through a neighborhood or school zone. That’s a fact.
    This road user is 100% in favor of the self-drivers because of their superior safety and an absence of “Get out of my way.”

  20. Fascinating:
    “Whether the person in the car or the computer stopped it is unknown but the brakes were hit so hard the car rocked back and forth. This obviously colors my opinion of the technology being ready to go without humans.”

    Interesting. So without any understanding of how it happened, it has tainted your view of the technology? Even if the driver was testing the system by trying to go through the light to see if the car would stop them? It sounds like the car stopped urgently, which is what would be called for in a case of unintended acceleration.
    A lot of driving through storefronts would be avoided with technology like this.

  21. I recently had my little old Prius destroyed (totaled) by a driver rear-ending me on a merge onto Foothill Expressway in Los Altos. Not by an AV, but by a human driver. First time.

    But @Common Sense made a point that I saw play out recently at a four-way-stop corner that needs a lot of human negotiation. The corner of Cuesta and Springer needs this type of negotiated behavior, and a Waymo car could not figure out how to work within a group of humans. The AV started into the intersection (having correctly negotiated with 3 humans) and then got spooked and just stopped. Not a nice slow stop; I think all the humans nearby were just surprised at the abruptness. Eventually the intersection cleared out.

    I really appreciate the cyclist’s right-turn-plus-bike-lane perspective. A little low-speed fender-bender at an intersection (car to car) would not be nearly as bad as a vehicle-to-biker or vehicle-to-pedestrian accident! That would be 10X worse! So, like Asimov’s First Law of Robotics:
    A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    If only all humans were as well-mannered and “thoughtful”.

  22. I was stopped in the merge lane – behind another car – waiting for a safe opening in the very heavy rush-hour traffic in the right lane of Foothill. Any intelligent being, or any AV program in good operating condition, would recognize it as a case to slow down and carefully stop. Mistakes were made – just not in coding.

  23. I am not a fan of driverless cars, but for a moment I hope to be an advocate for those who may not otherwise have a voice. If driverless cars become successful, just think what it would mean to those who, for medical reasons or impairments, lose their independence and become homebound. I know how difficult it was for my late husband, who was quite the outdoorsman, to stay home day in and day out, not being able to go see people, or go to the ocean, or just drive out to Shoreline, or get himself to doctor appointments or therapy.
    If you have ever lost your independence and had to depend on others when you really did not want to, then I am sure you can understand how much this advancement would mean to them, and to all of us eventually.

  24. The nice thing about the WAYMOs is that they have video cameras and radar constantly monitoring the road, the driver and even the speed of other vehicles. They have verifiable data records to back things up. From the news:

    “As the Waymo car — all of its cars are white Chrysler Pacifica minivans — drove at 21 mph in the middle of three lanes, a car in the left lane began to merge into that middle lane. The test driver “took manual control of the AV (autonomous vehicle) out of an abundance of caution, disengaged from self-driving mode, and began changing lanes into Lane 3” (the right-hand lane), the report said.

    A motorcycle was in the right-hand lane traveling at 28 mph and beginning to overtake the Waymo car. Waymo’s car and the motorcycle collided at the car’s right rear bumper. The injured motorcyclist was transported to a hospital, the report said.”

    It would be nice if all cars had this. There would be no more guessing as to who was at fault in an accident or how fast a car was going when it hit a pedestrian.

  25. They ran the model to see how the car would have reacted under auto-control. Braking would have occurred and the accident avoided.
    Once again, an accident was caused by human error.

  26. Human error is by far the biggest factor in crashes. Autonomous vehicles cannot be perfect, but compared to the average human driver they could end up looking pretty close to it.

  27. A Google employee, on a Google-assigned job, made an ill-advised and reckless decision to disengage the safety precautions that were designed to avoid the accident: direct actions taken by a Google employee during his job caused the injury of a citizen who was simply going about his day.

    The Google employee purposely and willfully disengaged the SAFETY precautions which Google confirms would have avoided the accident; thus the Google employee’s willful actions REMOVED all the safety mechanisms we depend on to keep us safe as “guinea pigs” in their experiment. The Google employee’s behavior, without question, directly caused the accident. Google confirms this in its statement.

    Google needs to pay ALL costs involved as well as ALL claimed damages.

  28. It would be nice if the residents of the city also had a say. When the self-driving cars first started appearing on Mountain View streets many years ago, I always wondered why no one asked if it was OK to experiment on all of us. After all, it could have gone either way.

  29. “it could have gone either way”

    Not really. I mean, theoretically, yes, but it wasn’t like it was a coin flip.
    There was so much data showing their unparalleled safety even before the decision was made that everyone was pretty sure it was going to go the way it has gone.

    I’d rather have a say in which human drivers in MV get to stay on the road 😉
