Google Self-Driving Car Accident Reports

We are the Borg.
User avatar
Rob Lister
Posts: 21039
Joined: Sun Jul 18, 2004 7:15 pm
Title: Incipient toppler
Location: Swimming in Lake Ed

Re: Google Self-Driving Car Accident Reports

Post by Rob Lister » Mon Apr 02, 2018 11:45 am

I must have missed that part. Maybe because they didn't show it.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Mon Apr 02, 2018 11:59 am

The highway forks but the car didn't take either the right lane or the left lane. Instead it mistook the no-man's land in between for a lane.
A fool thinks himself to be wise, but a wise man knows himself to be a fool.
William Shakespeare

User avatar
Grammatron
Posts: 34573
Joined: Tue Jun 08, 2004 1:21 am
Location: Los Angeles, CA

Re: Google Self-Driving Car Accident Reports

Post by Grammatron » Mon Apr 02, 2018 8:29 pm

Decades away IMO.

User avatar
Rob Lister
Posts: 21039
Joined: Sun Jul 18, 2004 7:15 pm
Title: Incipient toppler
Location: Swimming in Lake Ed

Re: Google Self-Driving Car Accident Reports

Post by Rob Lister » Mon Apr 02, 2018 8:36 pm

Grammatron wrote:Decades away IMO.
Maybe the 's' is too much, but I generally agree. I think it's less than twenty years; within ten, some self-driving will be common.

I'm not sure how it will handle the really weird stuff where even humans are not well equipped. Some sort of 24/7 support system is going to be required. That's why God gave us India.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Tue Apr 03, 2018 3:19 am

Anaxagoras wrote:The highway forks but the car didn't take either the right lane or the left lane. Instead it mistook the no-man's land in between for a lane.
Just for reference, the recent fatal Tesla autopilot crash happened at a similar fork in the road:

https://www.mercurynews.com/2018/03/28/ ... tal-crash/

In the video I posted above I can see how the autopilot made the error, because the lane markers were fading but the one on the left was clear. Fooled it into thinking it was in the lane.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Tue Apr 03, 2018 5:27 am

Here's data about the difference between makers:

Leaked data suggests Uber self-driving car program may be way behind Waymo [Updated]
Insiders have long viewed Uber as a laggard in the driverless car race, but internal documents obtained by The New York Times suggest that the company's self-driving car program may be even further behind its rivals than had been publicly known.

The key statistic: prior to last Sunday's fatal crash in Tempe, Arizona, Uber's self-driving cars in Arizona were "struggling" to go 13 miles between interventions by a safety driver—known as a disengagement.

The Times points out that, in 2017, Waymo's self-driving cars in California traveled 5,600 miles between incidents in which a driver had to take over for safety reasons. Cruise, GM's self-driving car subsidiary, had a safety-related disengagement once every 1,250 miles in the state. We don't know either company's statistics in Arizona because Arizona law doesn't require them to be disclosed.

The Times presents the Uber and Waymo paragraphs back to back, suggesting they're directly comparable. But it's not clear if they are. The Waymo and Cruise figures are for safety-related disengagements—situations when the driver has to take over to prevent an accident. The figures don't include situations where the vehicle gets stumped by something tricky like a construction site and needs the safety driver to take over even though there's no immediate danger of a crash.

It's not clear from the Times report whether that 13-miles-per-disengagement figure is for safety-related disengagements—which would be comparable to the California numbers—or for all disengagements—which wouldn't be.

Moreover, as an Uber spokesman pointed out to the Times, the disengagement rate depends on many factors, including the type of roads the car is being tested on, the kinds of tests being performed, and how a car's software is configured. However, Uber is testing its cars in the Phoenix metro area—a region whose wide suburban streets are generally considered among the easiest in the country to navigate. By contrast, Cruise CEO Kyle Vogt has boasted about testing cars on the crowded and chaotic streets of urban San Francisco.

If it is an apples-to-apples comparison, then Uber would have a lot of ground to make up. In 2016, Waymo's (then Google's) cars in California went more than 5,000 miles between disengagements. In 2015, the figure was 1,250 miles per disengagement. So that would mean Uber's cars need human help 100 times as often as Waymo's cars did in 2015.
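The ratios in the quoted excerpt can be sanity-checked with quick arithmetic. A minimal sketch, using only the miles-per-disengagement figures as reported in the article (these come from different states and test conditions, so the comparison is rough at best):

```python
# Miles between disengagements, as reported in the quoted articles.
# Different states, test regimes, and disengagement definitions, so
# this is back-of-the-envelope only.
miles_per_disengagement = {
    "Uber (Arizona, 2018 target)": 13,
    "Waymo (California, 2017)": 5600,
    "Cruise (California, 2017)": 1250,
    "Waymo (California, 2015)": 1250,
}

uber = miles_per_disengagement["Uber (Arizona, 2018 target)"]
for name, miles in miles_per_disengagement.items():
    # How many times less often the car needs human help than Uber's
    print(f"{name}: {miles} mi/disengagement ({miles / uber:.0f}x Uber)")
```

Waymo's 2015 figure of 1,250 miles divided by Uber's 13 comes out to roughly 96, which is where the article's "100 times as often" claim comes from.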
It may be that non-safety disengagements are included in the Uber numbers, but then again, it may be an apples-to-apples comparison. Here's that NY Times story:

Uber’s Self-Driving Cars Were Struggling Before Arizona Crash
SAN FRANCISCO — Uber’s robotic vehicle project was not living up to expectations months before a self-driving car operated by the company struck and killed a woman in Tempe, Ariz.

The cars were having trouble driving through construction zones and next to tall vehicles, like big rigs. And Uber’s human drivers had to intervene far more frequently than the drivers of competing autonomous car projects.

Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.

Yet Uber’s test drivers were being asked to do more — going on solo runs when they had worked in pairs.

And there also was pressure to live up to a goal to offer a driverless car service by the end of the year and to impress top executives. Dara Khosrowshahi, Uber’s chief executive, was expected to visit Arizona in April, and leaders of the company’s development group in the Phoenix area wanted to give him a glitch-free ride in an autonomous car. Mr. Khosrowshahi’s trip was called “Milestone 1: Confidence” in the company documents.
Another story from Wired:
DMV Data Says Waymo and GM Are Leading the Self-Driving Car Race
The Golden State, home to many of the companies leading the robo revolution, has some of the strictest rules for AVs in the country. Operators who run cars on public roads must publicly report any crashes they’re involved in. And at the end of every year, they must hand over data on how many miles they drove and how many times their onboard human safety driver had to take control from the machine—that’s called a disengagement. Combine those, and you have a number approximating how far any company’s self-driving car can go without human help. Something like a grade.

The metric is imperfect, and this data comes with a crate of caveats. But before we get into those, know this: Waymo (formerly known as Google’s self-driving car project) and General Motors appear to be leading the pack and making rapid progress toward the day when human drivers, with all their inattention and distraction and tendency to crash, will be obsolete.
Ifs and Buts

You can read more about the shortcomings of disengagement reports here, but here’s the quick rundown:
  • They’re unscientific, because each company reports its data in a different way, offering various levels of detail and idiosyncratic explanations for what triggered the human takeover.
  • They’re packed with vague language and lack context. Delphi cites “cyclist” as the reason for a bunch of disengagements. Zoox blamed every disengagement on a “planning discrepancy” or “hardware discrepancy.”
  • They’re little use for anyone who wants to compare rival companies, because those companies aren’t running the same tests: Waymo does most of its testing in simple suburbs; GM focuses on the complex city. They’re better for tracking the progress of each outfit, but still not great, because those companies change how and where they test over time.
  • A disengagement does not mean the car was going to crash, only that the human driver wasn't 100 percent confident in how it would behave.
  • They only cover driving on public roads in California. So we don’t know anything about Ford, which focuses its testing around Detroit and Pittsburgh. We don’t see data for Waymo’s increasingly important test program in Phoenix—where its cars are tooling about without anyone inside.
On the other hand, the disengagement reports are the best data we’ve got for evaluating these development efforts. No state but California demands anything like this, and private companies only share such info when the government demands it.

So, let's sprinkle some grains of salt on the numbers and take a look. We broke them down into a pair of two-axis charts. The first looks at Waymo and General Motors. It notes how many miles they drove in 2016 and 2017 (in green) and how many miles they averaged between disengagements (in blue). (By the way, Uber didn't have to file a report, because this data isn't required until your first full calendar year of testing. Uber didn't get its permit to test in California until March of 2017.)

The takeaway here is that Waymo’s software remains excellent, and it’s still doing tons of testing in California. For GM, you can see a huge ramp-up in miles driven, and a steep increase in miles per disengagement. That’s progress, and it's a good thing: GM plans to launch a car without a steering wheel or pedals next year. Keep in mind that GM does nearly all its public street testing in San Francisco, a much more complicated environment than Palo Alto and Mountain View, where Waymo works.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Tue Apr 03, 2018 5:34 am

This is pretty cool, 360-degree video:


User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Wed May 09, 2018 4:17 am

At first watch, this does not appear to have been avoidable:


User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Wed Dec 05, 2018 3:45 pm

Waymo launches self-driving car service Waymo One
Waymo, the former Google self-driving project owned by parent company Alphabet, is launching a commercial robotaxi service in the Phoenix area dubbed Waymo One.

This milestone, for the company and nascent self-driving technology industry, comes with caveats.

The Waymo One self-driving car service, and accompanying app, won’t be available to just anyone. And for now, the company says it will have Waymo-trained test drivers behind the wheel (even though the company already has driverless vehicles on public roads in Phoenix).

Waymo will first invite Phoenix residents who are part of its early rider program, which was designed to give a vetted group of people the ability to use an app to hail a self-driving vehicle. The early rider program, which launched in April 2017, had more than 400 participants the last time Waymo shared figures on the program.
OK, so it's really still in the sandbox phase, not a true "driverless" car service available to the general public, but it's a step forward. A baby step.

Ultimately the technology isn't really mature until "driverless" literally means driverless. How far away is that?

User avatar
Abdul Alhazred
Posts: 77349
Joined: Mon Jun 07, 2004 1:33 pm
Title: Yes, that one.
Location: Chicago

Re: Google Self-Driving Car Accident Reports

Post by Abdul Alhazred » Wed Dec 05, 2018 4:37 pm

Inevitably, one will hit a child.

The only question is before or after general adoption of the technology.
"If I turn in a sicko, will I get a reward?"

"Yes! A BIG REWARD!" ====> Click here to turn in a sicko
The arc of the moral universe bends towards chaos.
People who believe God or History are on their side provide the chaos.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Thu Dec 06, 2018 1:49 am


User avatar
Abdul Alhazred
Posts: 77349
Joined: Mon Jun 07, 2004 1:33 pm
Title: Yes, that one.
Location: Chicago

Re: Google Self-Driving Car Accident Reports

Post by Abdul Alhazred » Fri Feb 01, 2019 4:33 pm

Here's yet another reason why self-driving cars are evil! :evil:

https://www.theregister.co.uk/2019/02/0 ... ving_cars/

Synopsis: They will cause traffic jams by driving around empty to avoid parking fees.
Last edited by Abdul Alhazred on Fri Feb 01, 2019 11:07 pm, edited 1 time in total.

User avatar
Rob Lister
Posts: 21039
Joined: Sun Jul 18, 2004 7:15 pm
Title: Incipient toppler
Location: Swimming in Lake Ed

Re: Google Self-Driving Car Accident Reports

Post by Rob Lister » Fri Feb 01, 2019 5:30 pm

Abdul Alhazred wrote:
Fri Feb 01, 2019 4:33 pm
Here's yet another reason why self-driving cars are evil! :evil:

https://www.theregister.co.uk/2019/02/0 ... ving_cars/

Synopsis: They will cause traffic jams by driving around empty to avoid parking fees.
I admit to not clicking the link but it seems like driving around for an hour is more expensive than parking for an hour.

User avatar
Grammatron
Posts: 34573
Joined: Tue Jun 08, 2004 1:21 am
Location: Los Angeles, CA

Re: Google Self-Driving Car Accident Reports

Post by Grammatron » Fri Feb 01, 2019 9:27 pm

Well they are skewing more toward cities like London or Manhattan where parking is ~$20/hr.

If that gets to be such a huge problem, they can easily circumvent it by charging an unoccupancy fee for vehicles driving on public roads with no passengers.

User avatar
Abdul Alhazred
Posts: 77349
Joined: Mon Jun 07, 2004 1:33 pm
Title: Yes, that one.
Location: Chicago

Re: Google Self-Driving Car Accident Reports

Post by Abdul Alhazred » Thu Mar 07, 2019 12:11 pm

Google killed your baby? What color?

How human bias seeps into autonomous vehicles' AI
Axios
...
  • Researchers divided a large dataset of images that contain pedestrians by skin tone.
  • Then they compared how often the AI models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.
  • Detection of dark-skinned people was 5 percentage points less accurate.
...
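The evaluation the bullets describe boils down to computing detection accuracy separately per skin-tone group and comparing. A minimal sketch of that comparison, with invented toy data (the labels and numbers here are illustrative, not the study's):

```python
# Toy illustration of a per-group detection-accuracy comparison.
# All data below is invented for illustration purposes only.
def accuracy(predictions, labels):
    """Fraction of images where the model's output matches the label."""
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)

# 1 = pedestrian detected; every image actually contains a pedestrian
light_preds = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]   # 9/10 detected
dark_preds  = [1, 1, 0, 1, 0, 1, 1, 1, 1, 1]   # 8/10 detected
truth = [1] * 10

gap = accuracy(light_preds, truth) - accuracy(dark_preds, truth)
print(f"accuracy gap: {gap * 100:.0f} percentage points")
```

The study's reported gap was 5 percentage points; the mechanism of the comparison is the same regardless of the actual numbers.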

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Thu Mar 07, 2019 2:12 pm

Is it really a human bias, though, or just a matter of sharper contrast?

User avatar
Abdul Alhazred
Posts: 77349
Joined: Mon Jun 07, 2004 1:33 pm
Title: Yes, that one.
Location: Chicago

Re: Google Self-Driving Car Accident Reports

Post by Abdul Alhazred » Thu Mar 07, 2019 4:09 pm

Anaxagoras wrote:
Thu Mar 07, 2019 2:12 pm
Is it really a human bias though or just a matter of sharper contrast.
Question:
Would a human, even a racist white human, notice little black children blocking the street?
The law certainly expects them to.

The question becomes whether the alleged benefits of self-driving cars are worth the collateral damage.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Wed Mar 13, 2019 7:46 am

Uber 'not criminally liable' for self-driving death
Prosecutors have ruled that the company is not criminally liable for the death of Elaine Herzberg, 49, who was struck as she crossed a road in Tempe, Arizona.

The car's back-up driver could still face criminal charges.

A police report has previously called the incident "entirely avoidable".

"After a very thorough review of all evidence presented, this office has determined that there is no basis for criminal liability for the Uber corporation," wrote Yavapai County Attorney Sheila Sullivan Polk in a letter.

The crash occurred in March 2018, and involved a Volvo XC90 that Uber had been using to test its self-driving technology.

Just before the crash, Ms Herzberg had been walking with a bicycle across a poorly lit stretch of a multi-lane road.

Dash-cam footage released by police after the incident appeared to show the vehicle's back-up driver, Rafaela Vasquez, taking her eyes off the road moments before the crash.

Further records from the streaming service Hulu suggested that Ms Vasquez had been streaming the TV show, The Voice, on a phone at the time of the crash.

The Yavapai County Attorney's office recommended an expert analysis of the video, and that the Tempe police department collect further evidence on what the back-up driver would have seen on the road.

The office did not explain its reasoning for finding Uber to be not criminally liable.

Uber did not immediately respond to the BBC's request for comment.

The National Transportation Safety Board is also investigating the crash. It released a preliminary report last year that suggested the sensors on the Uber vehicle were working correctly, but that emergency braking manoeuvres may not have been enabled.

Following the crash, authorities in Arizona suspended Uber's ability to test self-driving cars on the state's public roads. Uber subsequently pulled the plug on its autonomous car operation in Arizona, although the company has since resumed tests in Pennsylvania.
Uber death leaves questions about self-driving car liability unanswered
Washington, DC (CNN Business) — A year after the first fatality caused by a fully self-driving car, questions about liability in the event of a death involving the cars are still completely up in the air.
Officials announced earlier this week that Uber won't face criminal charges in the death of a pedestrian struck and killed by one of its self-driving cars nearly a year ago in Tempe, Arizona.
The Yavapai County Attorney's Office said it conducted a thorough review of the evidence and determined there was no basis for criminal liability against Uber. It did not detail how the decision was made and has declined to answer any questions on the case.

The pedestrian was walking a bicycle across a road at night. Uber's self-driving software system initially classified the pedestrian as an unknown object, then as a vehicle, then as a bicycle, but never braked.
However, the Uber employee who was behind the wheel of the SUV could still face criminal charges. Companies working on self-driving cars, such as Uber, have test drivers who are supposed to intervene if the car fails to act properly.
Uber also faced the risk of a civil lawsuit. Edwards said this is why the company moved quickly to settle with the family of Elaine Herzberg, the pedestrian who was killed, shortly after her death. Uber and Herzberg's family settled fewer than two weeks after the crash. Details of the agreement weren't revealed.

User avatar
Anaxagoras
Posts: 25187
Joined: Wed Mar 19, 2008 5:45 am
Location: Yokohama/Tokyo, Japan

Re: Google Self-Driving Car Accident Reports

Post by Anaxagoras » Mon May 20, 2019 3:50 am

Update on the current situation with Waymo (formerly Google):

Hand Gestures And Horses: Waymo’s Self-Driving Service Learns To Woo The Public (Forbes)

There's also a 5-minute video.
On a sunny spring day in suburban Phoenix, a white minivan stops at a crosswalk to let a man pass. He gives the vehicle a wave, signaling it should go ahead — which it does, until the pedestrian suddenly steps off the curb and dashes through the crossing. The van, sporting the green and blue logo of Alphabet’s self-driving car unit, brakes to a halt.

Six months after Waymo started offering a driverless taxi service near Phoenix, the robot vehicles — and the public — are learning to coexist. Technically, the rollout has been a success. The Pacifica Hybrid minivans can make split-second adjustments after reading cues like a hand gesture, a sophisticated step for autonomous cars. They handle tricky turns and brake more smoothly than in previous test rides by Forbes. More than 1,000 people are signed up to use the Waymo One service; tens of thousands are waiting to sign on. Outrage over the too-cautious maneuvering of the programmed vehicles seems to have died down.
The driving algorithm just seems to be getting better. It's no longer too timid (nor too aggressive).
As a commercial endeavor that could ultimately become a source of billions of dollars of income for its digital advertising-dependent parent, however, progress looks almost glacial. Waymo is keeping safety drivers at the wheel for most rides and airport and highway runs aren’t yet an option. It’s also not saying when it will transition to a service without safety drivers and launch in bigger, denser markets — currently, it’s in a 100-square-mile stretch including the Phoenix suburbs of Chandler, Tempe, Mesa and Gilbert.

“We've always had a very conservative approach to making both our users and the public feel safe with this technology and what it is we're doing here," a Waymo spokeswoman said. Ride rates are in line with what Lyft and Uber charge but the company isn't saying how many people it's hauling a day or sharing revenue details. (Forbes’ recent Waymo One trip cost $8.53.)
About those cars without safety drivers:
There’s another piece of the Arizona program that’s closer to Waymo’s long-term plans of full autonomy. A few hundred people are getting rides in Pacificas with no safety driver through its Early Rider program, an earlier test rollout. Unlike Waymo One users, Early Riders have to sign nondisclosure agreements and aren’t allowed to discuss the program.

Early Riders are also a way for the company to observe how people adapt to a robotic service and the options they want. Recently Waymo integrated Google Play music into the Waymo One app to let riders automatically listen to their preferred songs and artists. Video streaming, games and other in-vehicle options that leverage Google’s many services are likely additions, though Waymo won’t verify that.
Although it's still sort of a small beta test phase, not open to the general public, there are actual true autonomous vehicles operating on public roads without a safety driver. (It's also mentioned in the video. The chief of police mentions that there are sometimes vehicles with no drivers.)

As far as the "too timid" criticism, that's something that happens even with human drivers who strictly follow the letter of the law. From a corporate liability standpoint though, I think they have no other choice. If they try to imitate the behavior of human drivers too much, including how they sometimes bend the law by not coming to a complete stop at a stop sign, or going 5 miles over the speed limit, there would be an even bigger backlash to that, I think.

User avatar
Rob Lister
Posts: 21039
Joined: Sun Jul 18, 2004 7:15 pm
Title: Incipient toppler
Location: Swimming in Lake Ed

Re: Google Self-Driving Car Accident Reports

Post by Rob Lister » Mon May 20, 2019 12:40 pm

Anaxagoras wrote:As far as the "too timid" criticism, that's something that happens even with human drivers who strictly follow the letter of the law. From a corporate liability standpoint though, I think they have no other choice. If they try to imitate the behavior of human drivers too much, including how they sometimes bend the law by not coming to a complete stop at a stop sign, or going 5 miles over the speed limit, there would be an even bigger backlash to that, I think.
Here's a great article giving statistics from 2018 on the top 7 reasons for car accidents.
https://www.after-car-accidents.com/car ... auses.html

Summary:

  • speeding
  • drunk driving
  • distracted driving (texting, eating, getting a handjob, etc.)
  • talking on a cell phone
  • bad weather
  • trying to beat the red light
  • falling asleep

Which of those is a problem for autonomous cars?

Yeah, going exactly the speed limit bothers the sleepy drunk driver behind you because he's trying to make the next light.