This week’s Monday Roundup is sponsored by Endurance PDX, a cycling studio and physical therapy provider conveniently located off North Williams Avenue.
Here are the most noteworthy items we came across in the past seven days…
‘Breaking Away’ remake?: The Indiana University newspaper says a group of students want to remake Breaking Away, the classic movie based on the Little 500 cycling race.
The helmet fiasco: What happens when concern-trolling, paternalistic, out-of-touch bureaucrats attempt to address dangerous roads that are killing a growing number of bicycle users? They propose a federal mandatory helmet law. The idea is so bad that the National Association of City Transportation Officials (NACTO) has already issued a statement urging it to be taken off the table.
More federal shenanigans: If it makes you feel any better about how America’s federal agencies don’t give local governments enough authority over their own roads, the problem appears to be just as bad in Canada.
Party culture and its discontents: An outdoor industry news outlet ponders the impact and root causes of ageism when it comes to the work culture and marketing of outdoor sports.
Your bicycle’s journey: The BBC put together a fun look at the history of bicycle production and how modern global supply chains take bicycles from factory to customer.
Mobility justice: In light of Commissioner Eudaly’s focus on racial disparities in the Rose Lane Project, read more about the national movement to put “mobility justice” at the center of transportation reform.
Sending help: Since it doesn’t seem like American officials have been successful implementing lessons learned on study trips abroad, a delegation from The Netherlands is in California this week to help hasten bikeability.
Better curb zones: Columbus has become the second U.S. city to start a pilot using Curbflow, a technology service provider that optimizes curb zones to make them safer and more efficient. (Portland officials have met with company reps and are considering the same move.)
The wonder drug: We all know cycling does wonders for physical and mental health — that’s why medical doctors in some parts of the U.K. are starting to prescribe it to people with long-term illnesses.
Blind spot boulder: An innocent rock in an Omaha parking lot has been repeatedly victimized by reckless people unable to safely operate their large vehicles.
Uber sucks: Not sure what bothers me more: The fact that a private company’s untested and unregulated new product led to the death of an innocent road user, or the fact that most major media outlets referred to the victim as a “jaywalker”.
American in Copenhagen: Always fun to read how U.S. reporters are impacted after visiting one of the most bike-oriented cities in the world.
— Jonathan Maus: (503) 706-8804, @jonathan_maus on Twitter and jonathan@bikeportland.org
— Get our headlines delivered to your inbox.
— Support this independent community media outlet with a one-time contribution or monthly subscription.
Thanks for reading.
BikePortland has served this community with independent community journalism since 2005. We rely on subscriptions from readers like you to survive. Your financial support is vital in keeping this valuable resource alive and well.
Please subscribe today to strengthen and expand our work.
FYI, blind spot boulder link goes to: https://nyc.streetsblog.org/2019/11/05/surprise-federal-panel-seeks-mandatory-helmet-laws/
Thank you again for the delightful writing, as always.
Thanks for the round-up, as always. FYI the “blind spot boulder” link currently points to the helmet law story.
Thanks for the heads up Josh and Eli. I’ve edited the links.
Getting the links switched was an honest mistake. I know the first thing I thought of when I saw all those photos of SUVs that had crashed into that rock was, “This is why road safety efforts need to focus on getting people who bike to wear helmets”.
I can’t help but wonder if Uber’s unleashing a self-driving car that doesn’t recognize jaywalkers could be a symptom of the attitude you see in comments about unwarranted police shootings:
“The person who was shot did not comply, therefore they deserved the consequence.” People seem comfortable with the idea that someone who, say, doesn’t stop walking immediately when told to by a cop deserves death. Did Uber’s engineers have a similar attitude–perhaps at least subconsciously?: “If the person is jaywalking, they are violating the law, and deserve the consequences.” Actually, that attitude is expressed regularly in the comments sections of news stories about pedestrians getting run over, if the pedestrian was outside a crosswalk, wearing dark clothing, intoxicated, distracted, etc.
“Did Uber’s engineers have a similar attitude”
Though I have no knowledge of the particular people involved in Uber’s self-driving vehicle program, I can assure you with absolute certainty that this is not the case.
You left off the second half of the sentence you quoted: “….perhaps at least subconsciously”.
I agree that Uber’s engineers weren’t likely to have sat around the table saying “Anyone jaywalking deserves to be run over.” And I’m also not equating a cop using blatantly excessive force with someone driving at night and not watching for pedestrians crossing against the light. But they’re part of the same continuum (albeit at the extreme ends)–the idea that extreme consequences (deaths) are acceptable when people stray from following the law.
I can easily see engineers failing (as they did) to work diligently enough to ensure their self-driving system adequately protects pedestrians who are jaywalking, because it wasn’t central to their own approach to driving, an approach colored by the idea that drivers don’t have to concern themselves (as much as they should) with the safety of people who behave unsafely. I can see that being true because that approach to driving is so pervasive among drivers.
I’d guess much more attention went into ensuring the self-driving cars avoid drivers who are behaving illegally or unsafely, and I’d guess that’s because those present a danger to the self-driving cars’ occupants much greater than hitting a pedestrian.
Creating software that evaluates whether a given pedestrian is legally or illegally crossing would be quite difficult, so doing it “subconsciously” seems a near impossibility.
The driving software almost certainly has a system (or systems) continuously evaluating the scene identifying objects (people, cars, rocks, balls, bikes, etc.), tracking their motion, predicting their paths, and feeding that data to a higher level function. That system would be doing the same work whether the vehicle is at a legal pedestrian crossing or not, and even if the vehicle is not moving. It obviously failed for reasons that I hope are now well understood by the engineers (further real-world testing without understanding and repairing the fault would be highly negligent).
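To make that pipeline concrete, here is a very rough, purely illustrative sketch of a detect-track-predict loop. Everything here is an assumption for illustration — the constant-velocity prediction model and the names (`TrackedObject`, `predict_path`, `crosses_vehicle_path`) are made up and reflect nothing about Uber’s actual software. The point it demonstrates is the one above: the check runs on every tracked object, whatever its label, whether or not the car is at a legal crossing.

```python
# Illustrative sketch only: a toy detect-track-predict loop.
# Real perception stacks fuse lidar/radar/camera data and use learned
# classifiers; all names and numbers here are hypothetical.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str   # e.g. "pedestrian", "vehicle", "unknown"
    x: float     # lateral offset from the car's path, meters
    y: float     # distance ahead of the car, meters
    vx: float    # lateral velocity, m/s
    vy: float    # longitudinal velocity, m/s

def predict_path(obj: TrackedObject, horizon_s: float, dt: float = 0.1):
    """Constant-velocity prediction of the object's future positions."""
    steps = int(horizon_s / dt)
    return [(obj.x + obj.vx * t * dt, obj.y + obj.vy * t * dt)
            for t in range(1, steps + 1)]

def crosses_vehicle_path(obj: TrackedObject, horizon_s: float = 3.0,
                         lane_half_width: float = 1.5,
                         max_range: float = 30.0) -> bool:
    """Flag any object whose predicted path enters the car's lane.
    Note the label is never consulted: even an "unknown" object
    on a collision course should trigger a response."""
    return any(abs(px) <= lane_half_width and 0.0 <= py <= max_range
               for px, py in predict_path(obj, horizon_s))

# A person crossing from the left, 20 m ahead, walking at 1.4 m/s:
ped = TrackedObject("unknown", x=-4.0, y=20.0, vx=1.4, vy=0.0)
print(crosses_vehicle_path(ped))  # True: predicted path enters the lane
```

The design point is that path prediction, not legal classification, is what decides whether to brake; the sketch has no notion of a crosswalk at all.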
I think the likelihood your theory is correct is vanishingly small. More likely, the object detection/recognition system simply didn’t “see” the pedestrian, and so told the car the way was clear. Maybe it was texting.
“Creating software that evaluates whether a given pedestrian is legally or illegally crossing would be quite difficult, so doing it “subconsciously” seems a near impossibility.”
I never said that the software was evaluating whether what the pedestrian was doing was legal or not. The fact–according to the article–is that the software did not identify people jaywalking as something to be avoided. (I assume it does identify and avoid people who are not jaywalking, otherwise there would have been even more deaths.)
When the flaw is as enormous as this one–designing and putting out into the real world a car that kills pedestrians who are outside crosswalks because it makes no attempt to not hit them (!)–I don’t think it’s at all unreasonable to wonder if that enormous failure could at least in part be explained by the designers having–at least subconsciously–an attitude that pedestrians who are not following the law are deserving of less protection than other users of the roads.
Since a car cannot easily visually distinguish between a jaywalking pedestrian and a non-jaywalking pedestrian, it is logical to conclude that the software had problems detecting pedestrians in general, and it happened to manifest itself with a jaywalking pedestrian. No one who knows about machine vision would be surprised that there could be a set of inputs that to us were clearly a person, but which the computer did not recognize as such. This may be a singular event, or it may happen a dozen times per day across the Uber fleet, and it was only this one instance where the driver didn’t intervene in time.
Thanks for the defense of software engineers. In my experience they are committed and detailed problem solvers who thrive on edge cases. But my generalization, like Q’s, is irrelevant. Just read a thorough analysis of the collision, like I included in another comment. The software did detect ‘something’, but got confused for reasons that are complex and, understandably, perhaps unanticipated. That’s what real world testing, as opposed to white boards and labs, is designed to reveal, and why there was a human behind the wheel. Had that human been paying attention and doing their job, this would likely have been a non-event, and simply provided a data dump later that night of more scenarios to be solved before a self driving car is actually unleashed.
Unfortunately, the people running the project, after creating software that allowed the car to drive right into the pedestrian even though it sensed their presence several seconds before, also hired a driver who for some reason didn’t understand that he shouldn’t be watching a video while the car was driving straight into the pedestrian.
I agree; it is unfortunate that the software did not perform correctly, and that the driver was not paying attention. I was going to say those were independent events, but they probably weren’t. It is probably the case that the software was generally so good that the driver did not feel they needed to pay attention. This is the same problem that some Tesla owners have with their “autopilot” system.
My favorite tweet last week was John Greenfield’s (@Greenfieldjohn) calling Uber/Lyft *private chauffeur* companies.
They aren’t “ride sharing” companies, except in rare cases.
My favorite article on Uber this week:
https://www.businessinsider.com/uber-needs-buy-merge-lyft-survive-make-profit-2019-11
Hey, they’ll be fine! They just need to have a complete monopoly in their marketplace!
One of the many transportation-friendly options here in Slovenia is an app called Prevoz. It’s a people-to-people ride share, with no money going to an Uber or a Lyft: an actual ride share with someone who has room in their car and is going where you want to go anyway, not someone driving around the block waiting to be called. I’ve used it to take my bike to the beach, did some biking and camping for a few days, then got a ride back with my bike. I still think that if there’s public transportation, that’s what you should ultimately support. I’ve put my bike on the buses and trains here too. Living the car-free, bike-friendly dream in Slovenia.
I like good old “taxi.” One of those words like “coffee” that works almost worldwide:
https://www.indifferentlanguages.com/words/taxi
Re: boulder story – https://jalopnik.com/evil-boulder-menace-somehow-manages-to-take-out-a-fistf-1839642818?rev=1572988470711
It’s interesting (though not surprising) that Jalopnik takes the tack of blaming the drivers for not paying attention. An alternative interpretation is that at least a sizeable percentage of adult humans (I would say it’s all adult humans) are incapable of operating motor vehicles (and especially huge SUVs) safely.
It reminds me of the attempts to inspire guilt in everyday people for their carbon footprint. The alternative approach is to note that systemic change, led by government, is what’s necessary to solve the problem, so individual actions don’t much matter either way.
While I agree that there is a solid percent of operators who can’t do so safely…I just lump in attentiveness as one of the criteria for safe operation.
Are you implying that some drivers are seeing the stone but can’t learn what the turning radius of their machine is, so they roll up on to it?
It may be that these folks don’t remember clearly seeing the boulder (moments before), and then presume none to be there–not unlike right-hooking a cyclist.
In defense of Jalopnik, they wrote the following line in a follow-up boulder blog:
“Will we heed the sage boulder’s warning? Or will we silence the rock’s call to a more sensible world, perhaps one where any jackass is not allowed to drive big, stupid cars on public roads just because they feel like it?”
Source: https://jalopnik.com/actually-the-suv-defeating-rock-is-good-1839669833
So it is just through dumb luck that hundreds of millions of people drive billions of miles every year and do not have an accident? I mean, I ride all the time, so I see super dumb stuff constantly, but to argue that there is no one capable of adequately operating a motor vehicle safely is pure hyperbole.
As with all things, it depends on your definition of “safely”.
Please leave Breaking Away alone. No remake will do it justice especially one with a SJW slant.
Re: Uber Sucks
While Uber’s untested and unregulated new product led to (as in ‘contributed to’) the death, most analyses of the facts from the investigation conclude the death was avoidable had the human involved been paying attention. Setting aside whether Uber should have had more than one operator in the car during testing, a human was present and responsible for monitoring the system and taking control as needed. To paraphrase the findings:
“Investigators looked at the speed of the car, braking ability, lighting conditions, and the operator’s half-second reaction time once the victim was seen.
Had the operator been paying attention, they could have stopped the car 42.6 feet before the impact.
The crash would not have happened even with a driver whose reaction time was twice as slow as the operator present, had the driver been watching the road.”
In other words, this death can be seen as just another one of 1,000s of avoidable deaths that will continue as long as humans are the primary operators of large and powerful vehicles.
Forbes has an excellent analysis here: https://www.forbes.com/sites/bradtempleton/2019/11/06/new-ntsb-reports-on-uber-fatality-reveal-major-errors-by-uber/#d28d141781db
It concludes, ‘Each year self driving deployment is delayed will result in the deaths of people at the hands of drivers who did not have the opportunity to give up the wheel and make their trip more safely in a self driving car.’
Uber may suck, and there are certainly lessons to be learned, but the skepticism and feet dragging around safety innovation and progress results in more, not fewer, tragedies.
Just because you want safety innovation doesn’t mean that’s anything Uber has accomplished.
Safety doesn’t need to involve innovation/progress. In fact they might be somewhat opposed. Walking is the safest mode by far and also the least innovative. But when someone innovated to hitch horses to carts, the incidence of grisly injuries ticked upward. When humanity continued to progress innovatively through all the supporting and precursor technologies that eventually culminated in the internal combustion engine, injuries and deaths soared even higher. Progress didn’t stop there, though; we developed computers, and from there, smartphones. So now we have something to pay attention to, rather than the road, and the mind-numbing tedium that is our lives. If you kill someone… too bad, it’s progress!
You wrote, “In other words, this death can be seen as just another one of 1,000s of avoidable deaths that will continue as long as humans are the primary operators of large and powerful vehicles.”
No, this was a controlled test of an unproven product, and Uber–which had 100% control over who was driving–had a responsibility to put a professional driver in it who took his job seriously.
Obviously a hack for some tech company. Very bad logic.
1) Technology failed and ran down an innocent vulnerable road user.
2) human back-up operator for safety failed by inattentiveness
3) Blame the human for the accident
4) Argue for technology to replace humans…
DUMB
Correct answer – fewer cars, more options, safer urban design (like very reasonable distances between safe crossings), and most vehicles using urban roads being mass transit buses driven by professional drivers.