- cross-posted to:
- technews@radiation.party
Cruise robotaxi collides with fire truck in San Francisco, leaving one injured::A crash between a Cruise robotaxi and a San Francisco Fire Department truck occurred last night in the Tenderloin. The incident happened a week after the California Public Utilities Commission (CPUC) approved 24/7 autonomous taxi ride services.
I’ve spent my entire career working in industrial automation, and I see the value AI and automated systems bring to the world.
I do not see the value in allowing private companies to playtest autonomous driving with human life as potential collateral.
The argument keeps getting made — “how many humans make that same mistake daily?” — but the situations aren’t equivalent; if autonomous vehicles cannot reach 100% safety and accuracy, they should not be allowed to risk human lives.
Don’t let perfection be the enemy of good. I’m not suggesting we shouldn’t have a really high bar, but 100% is just unreasonable.
It’s also the only acceptable level.
Let’s say autonomous vehicles were ten times safer than human drivers, but still not 100%. Would you let those people die by denying them the technology?
No, because most drivers are idiots. Comparing against the average is comparing against idiots.
Planes are mostly on autopilot these days, and most accidents are actually due to pilot error. Will you never take a single flight for the rest of your life unless it’s somehow 100% (not 99%, not 99.9%, but 100%) safe?
The only way to get to 100% is to ban cars.
The difference being that autonomous vehicles could reach 100% safety by removing all non-autonomous vehicles from the road and imposing a communication standard between vehicles, so they all know what the other vehicles are doing at all times.
That only applies to regions of the world where there’s no snow because autonomous driving in a snowstorm will probably never be solved.
And making walking illegal
The US has been working toward that for decades.
Yes they have
I guarantee that it would still not be 100%. Maybe 99% or even 99.9%, but not 100%.
The “autonomous vehicle” in the article can’t handle basic stuff like an emergency vehicle approaching. I’ve spent enough years in automotive engineering to know that all of this autonomous-drive bullshit is just ADAS with a few gimmicks and shouldn’t be anywhere near full control of the car. Yet somehow this got out of hand, and this shit is on public roads.
You’re arguing that even if autonomous vehicles are safer drivers than humans, we should choose to make ourselves less safe by disallowing them? Fuck that. Nobody should have to die because AI makes you squeamish.
Unnecessarily hostile comment, too bad that attitude didn’t stay with Reddit.
AI doesn’t make me squeamish at all. Ignoring the context in which I stated my background with automation was a choice, but the rub is using the general public to beta test hazardous equipment. Humans make errors and can be held responsible; corporations putting people at risk with no responsibility is reckless.
This is the best summary I could come up with:
A passenger riding inside the Cruise self-driving vehicle suffered “non-severe injuries” and was transported in an ambulance, according to an official company post on X (formerly Twitter) this morning.
“We are investigating to better understand our AV’s performance, and will be in touch with the City of San Francisco about the event,” Cruise’s post reads.
The incident comes less than a week after the California Public Utilities Commission voted to allow paid 24/7 robotaxi services in San Francisco, handing companies like Cruise and Alphabet-owned Waymo a huge victory.
City officials and residents have pleaded with the state to slow down the efforts, citing incidents in which self-driving cars have interfered with emergency vehicles.
Since Cruise began testing in San Francisco, its vehicles have obstructed traffic on multiple occasions, including a situation where 10 autonomous vehicles halted traffic in a busy intersection during a music festival.
And a cement mason’s worst nightmare occurred on Tuesday when a Cruise vehicle reportedly got stuck in wet concrete.
The original article contains 285 words, the summary contains 164 words. Saved 42%. I’m a bot and I’m open source!
I hate how we can’t just have decent public transit. Self-driving cars can be cool, but they usually end up like this.
Allowing operation of autonomous vehicles in high-risk situations should come with stipulations for major fines and restitution for anyone hurt by a misbehaving vehicle. It’s not the fire truck’s responsibility to give way. Waymo and Cruise had better figure it out; they have smart people on their teams.