
Portland’s cautious approach to AVs should prevent what just happened in Tempe

Posted by Jonathan Maus on March 19th, 2018 at 11:48 am

Scary news out of Tempe.

Uber has been testing its new self-driving cars on human subjects since last year and now it appears one of them has killed a person who was walking across a street. The collision happened in Tempe, Arizona late last night. According to a local news report, “Tempe Police says the vehicle was in autonomous mode at the time of the crash and a vehicle operator was also behind the wheel.”

This is the second self-driving Uber (that we know about) that has been involved in a collision. Last month a local news station in Pittsburgh reported that one of them slammed into another car while in self-driving mode.

After last night’s death, Uber announced it will immediately suspend its testing in Tempe and Pittsburgh, as well as in San Francisco and Toronto.

Thankfully, in Portland our local leaders and transportation officials have not allowed a private company to test its deadly product on humans.

Back in April, the Portland Bureau of Transportation launched its Smart Autonomous Vehicle Initiative (SAVI). In doing so, Portland made it clear that it welcomed innovative companies to try new autonomous vehicle (AV) products and services here, but only if safety was the number one priority.

In July, PBOT released a Request for Information aimed at companies like Uber. “If you want to test your technology, we are game to work with you,” PBOT staffer Ann Shikany proclaimed in a webinar with potential respondents. “But before we just start testing, we want to understand what’s out there.” It was PBOT’s attempt to “begin engagement with the private sector in a more intentional way.”


(Graphic: PBOT)

19 companies responded. It doesn’t look like Uber is on the list…

(Source: PBOT)

The RFI itself makes it clear that any company that wants to test AV tech must first prove the technology does not conflict with our Vision Zero Action Plan. PBOT also told prospective companies that no AV testing permits would be granted until all City of Portland regulations were complete.

Here’s one of six goals of the SAVI effort that relates to safety:

Ensure the safety of our residents and businesses by requiring AV providers to align with our Vision Zero goal to eliminate all traffic deaths and serious injuries by 2025. AVs must show that they can and will stop or avoid pedestrians, bicyclists, animals (to include domestic, game and livestock), disabled people, emergency vehicles, red lights, and stop signs.

PBOT Project Manager Peter Hurley emphasized during the webinar that if/when a company is selected to test here they must start in a “controlled situation” that’s not in the public right-of-way. Ann Shikany added, “We’re thinking it might be more of a closed track, obstacle course situation in an industrial area rather than a heavily pedestrian-trafficked residential street.” (Last night’s fatal collision in Tempe occurred in a suburban setting at an intersection of two large arterial roads.)

“We don’t want to be passive respondents to this technological change,” Hurley said at a bicycle advisory committee meeting last August, “We want to have a proactive role.”

So far PBOT has made a valiant effort to protect us from the dangers of irresponsible companies like Uber. We’re grateful for that.

For more on Portland’s Smart Autonomous Vehicle Initiative, check out the city’s website.

— Jonathan Maus: (503) 706-8804, @jonathan_maus on Twitter and jonathan@bikeportland.org


Comments (91):

rick (Guest):

Why not ban Uber like London?

dan (Guest):

It definitely feels qualitatively different to be killed by a robot rather than by an inattentive human driver, but this feels like a tricky question. If we assume that robot drivers will cut fatalities significantly, are we really best-served by a zero-tolerance policy? I mean, if we stick with 100% human drivers, the total traffic fatalities are almost certain to be higher than with autonomous cars, no?

Jim Lee (Guest):

Remember HAL 9000!

Bald One (Guest):

Seems like robot cars are a menace, but then so are regular Uber/Lyft cars – with respect to their frequently observed behaviors. I would be interested to understand how Uber plans to program its robot cars to stop in order to pick up and drop off passengers in urban areas. Perhaps they could be programmed to do this in a way that does not involve stopping in the bike lane, blocking traffic or creating a disruption, parking on sidewalks, failing to use signals, driving erratically and unpredictably, and being generally disrespectful to all others for the convenience of the Uber and its passenger. Their behavior has placed the little blinking dot on the screen above the importance of all else. If they can program both their human drivers and their future robot drivers to pull over in a safe location in dense areas, even if it involves their fare walking a block or two, I might be a little more receptive to their business.

Rain Waters (Guest):

If humans are so scary, then who shall program these vehicles?

John Liu (Guest):

I would not be so quick to jump to conclusions and start calling AVs “deadly products”. I don’t think anyone can make such claims without analyzing accidents per mile driven and investigating the behavior and limitations of the technology. Remember that human drivers are hitting other cars and pedestrians with some frequency. Do we have data showing that the current AVs have a higher accident rate than human drivers?

That said, I support the city’s careful approach. AV testing brings jobs but also brings risks, and that trade off needs to be examined.
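Since the commenter is asking about accidents per mile driven, here is a minimal sketch of what that comparison looks like. The figures below are placeholder assumptions chosen only to show the arithmetic and the sample-size problem, not real data.

```python
# Back-of-the-envelope comparison of fatality rates per mile driven.
# All figures are placeholder assumptions for illustration, not real data.

def deaths_per_100m_miles(deaths: float, miles_driven: float) -> float:
    """Normalize a death count to deaths per 100 million vehicle miles."""
    return deaths / miles_driven * 100_000_000

# Placeholder: roughly national-scale human driving exposure.
human_rate = deaths_per_100m_miles(deaths=37_000, miles_driven=3.2e12)

# Placeholder: a small AV test fleet with a single fatality.
av_rate = deaths_per_100m_miles(deaths=1, miles_driven=3e6)

print(f"Human drivers: {human_rate:.2f} deaths per 100M miles")
print(f"AV test fleet: {av_rate:.2f} deaths per 100M miles")
# With so few AV miles and a single event, the AV estimate carries huge
# statistical uncertainty -- which is exactly why more data is needed
# before calling either one safer.
```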

caesar (Guest):

The author of this editorial has allowed his distaste for Uber, the company and its business plan, to cloud his judgement of autonomous vehicles and their future potential to at least partly ameliorate automotive-induced carnage. This is apparent in his characterizations, such as the one of Uber as “a private company (that tests) their deadly products on humans.”

Clearly there are many obstacles to be overcome and refinements to be made before AV technology attains an acceptable safety profile. It will never be 100% safe, just as no other form of travel is 100% safe or ever will be. Rather than demonize Uber’s AV program in such a melodramatic way, show a little journalistic balance (dare I say integrity?) and also point out how buses, trains, bikes, skateboards, escalators, elevators, roller skates, footwear, and just about every other form and style of human conveyance have resulted in deaths many times over since they were introduced to the public.

Hyperbole of the magnitude employed in this op-ed is rarely conducive to a useful discussion on any subject, and the hypocrisy of singling out Uber when there are dozens of instances of other products and modern conveniences resulting in human disability or death is troubling to me, especially coming from this otherwise progressive blog.

Asher Atkinson (Guest):

And meanwhile, going by statistics for 2015, when human drivers killed over 6,000 pedestrians and bicyclists, at least 15 other people were likely killed in the traditional manner yesterday, and it wasn’t newsworthy. Granted, had the Uber AV not been on the road testing, there may have been one less death yesterday. Or the car behind it may well have caused the death instead. It is one thing to demand safe protocols for testing AVs (the one in this case had a human safety driver, which doesn’t seem to have helped) and another to take a reactionary position and describe AVs as death machines marketed by people and companies with no sincere interest in safety.

Richard Klancer (Guest):

If you like Portland’s approach and want it to remain legal, CALL YOUR REPRESENTATIVE. Seriously. There is an existing Senate bill, “AV START” or S. 1885, that would preempt states and cities from enforcing safety rules on self-driving cars. (Only as-yet nonexistent federal rules would be allowed.)

See https://www.tomudall.senate.gov/news/press-releases/udall-senators-self-driving-car-bill-needs-stronger-safety-measures-consumer-protections

and

http://www.autonews.com/article/20180316/MOBILITY/180319765/lobbying-senate-holdouts-av-start-act

Joel H (Guest):

Unfortunately, our plethora of right-turn auto lanes that cross straight-through bike lanes ensures it will keep happening here. This insane design is at least as much at fault here as the driver.

Andrea Capp (Subscriber):

I would like to know how AVs will respond to all of our intersections without stop signs. What about those (worst idea ever) stop signs with a sign underneath that permits turning right without stopping?

Hazel Gross (Guest):

I actually encountered an AV dump-truck-type construction vehicle backing into a lot on the Going St bike route. It was weird and terrifying. It was pulling forward and backing up (adjusting itself so it would back in correctly). I had looked up to make eye contact with the driver when I realized there wasn’t one. I had no way to know whether it would stop what it was doing or run me over.

Jon (Guest):

It appears from a first look at the video from the vehicle that it is “likely” there was nothing the Uber could have done to avoid the crash.
https://arstechnica.com/cars/2018/03/police-chief-uber-self-driving-car-likely-not-at-fault-in-fatal-crash/

Tim (Guest):

What cutting-edge tech and programming is Uber running?
With infrared tech from 20 years ago I can watch small animals wandering in total darkness… So tell me why the AV didn’t “see” the person???

CargoRider (Guest):

Police are saying the AV was going 40 in a 35 zone. So, just like a human driver…

X (Guest):

I support AVs in principle. But if we’re going to have robots do the driving, we need some Laws. Keep it simple; just a few will do. For example: An AV will not move through the space containing a vulnerable road user. If the future path of a detected vulnerable road user crosses the track of an AV, it must slow down enough to remain under control in all outcomes. If some part of the environment around an AV cannot be scanned by its detection systems, that space must be assumed to contain a vulnerable road user on a crossing path.

That kind of AV I can live with. The problem with Uber is, they are the last bunch of people I’d want to be developing the AV of the future. Kind of like having your military in charge of developing nuclear power, alas.
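As a rough illustration of how rules along those lines might be written down, here is a minimal sketch. Every name, type, and threshold in it is a hypothetical assumption for discussion, not any vendor’s actual system.

```python
# A rough, hypothetical sketch of the three rules above as a speed-limiting
# check. All names, types, and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) position in meters

@dataclass
class RoadUser:
    is_vulnerable: bool          # pedestrian, cyclist, etc.
    position: Point              # where they are now
    predicted_path: List[Point]  # where they may be over the next few seconds

def near(a: Point, b: Point, clearance_m: float = 2.0) -> bool:
    """True if two positions are within the required clearance."""
    return abs(a[0] - b[0]) < clearance_m and abs(a[1] - b[1]) < clearance_m

def allowed_speed(planned_path: List[Point], planned_speed: float,
                  users: List[RoadUser], has_unscanned_area: bool) -> float:
    CAUTION_SPEED = 4.0  # m/s; assumed speed at which the AV stays "under control"
    for user in users:
        if not user.is_vulnerable:
            continue
        # Rule 1: never move through space a vulnerable road user occupies.
        if any(near(p, user.position) for p in planned_path):
            return 0.0
        # Rule 2: if their predicted path crosses ours, slow to a controllable speed.
        if any(near(p, q) for p in planned_path for q in user.predicted_path):
            planned_speed = min(planned_speed, CAUTION_SPEED)
    # Rule 3: an unscanned area is assumed to hide a crossing vulnerable user.
    if has_unscanned_area:
        planned_speed = min(planned_speed, CAUTION_SPEED)
    return planned_speed
```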

bikeninja (Guest):

A good reason to focus on accidents and fatalities caused by AVs at this early stage, and to apply great pressure in the form of laws and liability judgements, is that the designer of an AV has much latitude (or will, once the tech is better understood) in how many sensors to use, what quality they are, and the speed and integration of the processing into the car’s driving system. If the financial and legal penalties for accidents are low, companies will go with the bare minimum to get by, at the price of more accidents in iffy situations. If the penalties are high, then future AVs will have to use the best and most capable components in their designs. A good example of this is comparing Chinese and German CNC machine tools. European workers’-comp-type laws are very strict, so German machines have very high quality, redundant safety equipment that is impossible to bypass, while current Chinese machines have the bare minimum to meet US standards and use low quality safety gear that breaks quickly and is easy to remove. The safety of future AVs will depend on how hard we push now.

Tom (Guest):

How will the federal AV START Act (S. 1885) impact the city’s policy? This bill is apparently likely to pass and is very broad in scope, preventing cities from interfering with AV testing in any way.

Dave (Guest):

Why can’t Uber rent race circuits or the big 3 carmakers’ test tracks? If they’re going to be on public roads, I think it’s legit self-defense for cyclists and pedestrians to vandalize, torch, or otherwise disable AVs.

Todd Boulanger (Guest):

This was sent out yesterday…

NACTO Statement Re: Automated Vehicle Fatality in Tempe, AZ [March 19th 2018]
——————————————————————————-
Linda Bailey, Executive Director of the National Association of City Transportation Officials (NACTO), issued the following statement in response to a pedestrian being killed by an automated vehicle in Tempe, AZ:

Last night, an autonomous vehicle hit and killed a pedestrian in Tempe, AZ. We do not know much yet about this incident—which is believed to be the first time someone walking or biking was killed by a vehicle operating autonomously in the U.S.

NACTO is encouraged that the National Transportation Safety Board is sending a team to provide an in-depth, independent assessment of the tragic crash. However, what is already clear is that the current model for real-life testing of autonomous vehicles does not ensure everyone’s safety. While autonomous vehicles need to be tested in real-life situations, testing should be performed transparently, coordinated with local transportation officials, and have robust oversight by trusted authorities.

In order to be compatible with life on city streets, AV technology must be able to safely interact with people on bikes, on foot, or exiting a parked car on the street, in or out of the crosswalk, at any time of day or night. Cities need vehicles to meet a clear minimum standard for safe operations so the full benefits of this new technology are realized on our complex streets. Responsible companies should support a safety standard and call for others to meet one as well.

We cannot afford for companies’ race-to-market to become a race-to-the-bottom for safety…

https://nacto.org/2018/03/19/statement-on-automated-vehicle-fatality/?mc_cid=1655902416&mc_eid=8f31e4e041

Peter W (Guest):

Uber is not the only one we should be concerned about here.

Google (Waymo) pulled off a great PR bait-and-switch on the American public, introducing driverless tech with their adorable and harmless looking two-seater (the “Firefly”), before killing it in 2016[1] and switching to the strategy of sticking their sensor arrays on the outside of much larger existing vehicles (granted, their minivan is probably half as deadly as the Uber SUV [2]).

Meanwhile, Google and Uber are part of the same industry lobbying group which successfully killed a bill in California which would have required vehicles smart enough to drive themselves to be smart enough to not run on oil. (Not surprising Google would oppose that; among other things, they’re now rolling out driverless semis [3].)

The same industry group is pushing a federal bill which seems designed to preempt such policies in any state (as well as, perhaps, these rules that Portland has developed [4]).

1: https://etfdailynews.com/2016/12/13/the-google-car-is-dead/
2: https://www.newscientist.com/article/dn4462-suvs-double-pedestrians-risk-of-death/
3: https://www.theverge.com/2018/3/9/17100518/waymo-self-driving-truck-google-atlanta
4: https://www.theverge.com/2018/3/16/17130190/av-start-legislation-lobbying-washington-feinstein

Todd Boulanger (Guest):

Looking at the aerial view, the AZDOT landscape architects and engineers who worked on the design of this highway did almost 99% of the work of creating an unmarked “crossing” at this location with high attraction for bikes and pedestrians: nice paved “walking” areas in the shape of an “X” connecting two desire lines from the Canal and Loma trail to area destinations (theatre) and a transit stop (EART) to the west… The state or city staff should be ashamed of the incomplete/ineffective solution they relied on: just installing some small 18x12 R9-3 signs prohibiting the pedestrian crossing…

Fred (Guest):

Everybody needs to read this article from Slate about the collision:

https://slate.com/technology/2018/03/uber-self-driving-cars-fatal-crash-raises-questions-about-our-infrastructures-readiness.html

It contains important details about the collision – a woman walking a bike across a poorly-lit eight-lane road – and echoes a lot of what Jonathan has been saying for years about the need to upgrade our infrastructure to include cyclists and walkers.

John Liu (Guest):

Video of collision, exterior and interior cameras.

https://youtu.be/XtTB8hTgHbM

My take:
1 – An average human driver with average attentiveness, vision and reflexes may well not have avoided the accident. To an average driver, the pedestrian really did come out of nowhere, doing something that was suicidal.
2 – A far above average human driver with extreme attentiveness, terrific night vision, and extremely fast reflexes might have avoided the accident. Think race car driver etc.
3 – An AV with all the sensors and high speed processing we expect them to have . . . I think it should have detected the pedestrian when she was still in the adjacent lane, and avoided the accident. Because #3 should be superior to #2 and far superior to #1, at least in this sort of pretty straightforward situation.
4 – There is a legal conundrum here. If the police and DA assess the Uber AV’s actions according to the standard applicable to #1, they might reasonably find no reason to take legal action. Is that the appropriate legal standard? Should we hold AVs to the legal standard of care applicable to the average human driver? Or to a higher standard? If the latter, what standard? I think it will take some time for the law to figure this out.
5 – The safety driver is obviously not a #2. He or she might well be a #1, though.

John Liu (Guest):

What a human driver should have seen and done isn’t really the issue.

The issue is that the AV sensors (radar, lidar, video with nightvision) are not limited by human vision or attention. They should have been able to detect the pedestrian entering the roadway, regardless of headlight coverage. I’m pretty certain they did. The AV thus should have braked or swerved.

John Liu (Guest):

https://mobile.nytimes.com/2018/03/23/technology/uber-self-driving-cars-arizona.html

Waymo (Google) AV cars are averaging 5,600 miles between safety driver interventions; Uber cars are averaging 13. Details in the article.

Recall that Waymo sued Uber for theft of technology; the settlement required Uber not to use any Waymo hardware or software in Uber’s AV cars. Maybe that set Uber back?

Uber basically needs to be first to AV cars, or the company will be gone. Once AV technology is working and proven, big companies will launch AV taxi fleets and the human driver Uber service won’t be able to compete. Would you take a ride in a Google (or GM, or Amazon) AV taxi that is purpose built (electric, compact, wide doors, seats facing each other, tons of headroom and legroom) for $5, or ride with an Uber driver in his random car for $10? Uber is already losing tons of money and burning through cash. Imagine how much worse it will be if half their business quickly disappears.

So the pressure to catch up with Waymo, who have a multi-year lead in AV, must be huge. Waymo is getting ready to street test AV cars with no safety drivers. Uber is obviously nowhere close. Is Uber getting desperate enough to cut corners?

A thought about safety drivers:

Humans are not capable of sitting passively for hours on end, staring out a windshield, and then suddenly detecting and reacting to an emergency in a couple of seconds. As AVs get better, the safety driver will become even less useful.

Imagine being the safety driver in a Waymo car, you have to sit there for almost 6K miles of driving (250 hours maybe) before something happens that you have to deal with. If that something is the car becoming confused by a situation (construction, accident ahead, etc) and stopping unable to proceed, so that a human is needed to take over and drive through the confusing location, you can do it. If that something is someone appearing in your headlights at night a couple seconds before impact, you won’t be able to do anything.

I expect the next generation of AV test car, that has no safety driver, will have a remote operator who can take over when the car is confused and stopped.

John Liu (Guest):

Maker of the LIDAR sensor used by Uber:

“We are as baffled as anyone else,” Thoma Hall wrote in an email. “Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn’t make the decision to put on the brakes or get out of her way.”

Todd Boulanger (Guest):

More interesting news coming out of Arizona… on Uber’s recent cutbacks to the number of lidar sensors on its newest testing vehicles (the Volvo has 1 vs. the previous Ford’s 7), and on the failure of the AZ Governor’s Self-Driving Oversight Committee, which had only 1 ‘public’ meeting back in 2016 and then dropped off the radar (it should have had at least 6 more quarterly oversight meetings to give guidance on how to manage the on-road testing by DOT, Police, Insurance, AZ University Researchers, etc.):

https://www.abc15.com/news/region-phoenix-metro/central-phoenix/governors-office-arizonas-self-driving-oversight-committee-fulfilled-its-purpose