Self-driving Uber car kills human.

We are going to see a lot more of this once Skynet goes live.
 
So if the human wasn't in the way, the human would still be alive.
 
Found a picture of the vehicle:

[gif: tumblr_ouniivZG4h1s1v3r1o1_400.gif]
 
I wouldn't be surprised if the person on the bike did something erratic and the car didn't have enough time to adjust. It probably would have happened even with a human behind the wheel.

What may have happened is that the human driver thought the AI was better than it actually is. Steve Wozniak, co-founder of Apple, has talked about Tesla's AI and how you really have to keep an eye on it. It can get tripped up by something as simple as a construction cone.

"Man you have got to be ready — it makes mistakes, it loses track of the lane lines. You have to be on your toes all the time," says Wozniak. "All Tesla did is say, 'It is beta so we are not responsible. It doesn't necessarily work, so you have to be in control.'

https://www.cnbc.com/2018/01/31/app...-believe-anything-elon-musk-or-tesla-say.html


Tesla had not responded to CNBC Make It by the time this story was published, but in 2016, when Tesla announced the upgraded autopilot technology, Musk said calling the system "autopilot" even though it requires some monitoring is fair because, "It does not represent self-driving anymore than autopilot in an aircraft makes it self-flying."
 
If she was on a bike and there was a vehicle operator in the car, then I'm guessing she did something erratic. As noted, bicycles are vehicles, even though bike riders tend to forget that. The worst thing they do is jump from one side of the road to the other to make a turn and expect everyone to stop or slow down as if they were pedestrians.

Either way, unfortunate loss of life.
 
What may have happened is that the human driver thought the AI was better than it actually is. Steve Wozniak, co-founder of Apple, has talked about Tesla's AI and how you really have to keep an eye on it. It can get tripped up by something as simple as a construction cone.

"Man you have got to be ready — it makes mistakes, it loses track of the lane lines. You have to be on your toes all the time," says Wozniak. "All Tesla did is say, 'It is beta so we are not responsible. It doesn't necessarily work, so you have to be in control.'

https://www.cnbc.com/2018/01/31/app...-believe-anything-elon-musk-or-tesla-say.html


Tesla had not responded to CNBC Make It by the time this story was published, but in 2016, when Tesla announced the upgraded autopilot technology, Musk said calling the system "autopilot" even though it requires some monitoring is fair because, "It does not represent self-driving anymore than autopilot in an aircraft makes it self-flying."

Good point. The car's cameras may not have even seen the biker. I've actually thought about this a couple of times when I couldn't see shit out of my backup camera. I'm like, umm, what happens if this car is supposed to drive itself right now?
 
Why is this in the WR? Is there a political aspect to this story I'm missing?

Story feels incomplete without knowing exactly what happened. Was it a failure of the system or just an unpreventable accident that happened because the woman did something erratic? Can't really use this as an argument for or against vehicle autonomy without those details.
 
If she was on a bike and there was a vehicle operator in the car, then I'm guessing she did something erratic. As noted, bicycles are vehicles, even though bike riders tend to forget that. The worst thing they do is jump from one side of the road to the other to make a turn and expect everyone to stop or slow down as if they were pedestrians.

Either way, unfortunate loss of life.

It could also be the driver's fault for overestimating the AI. Maybe he thought he could relax more. You have to keep an eye on these cars. You can't just let them drive by themselves at the current stage. You have to monitor them closely.
 
It could also be the driver's fault for overestimating the AI. Maybe he thought he could relax more. You have to keep an eye on these cars. You can't just let them drive by themselves at the current stage. You have to monitor them closely.

That's true too.
 
Why is this in the WR? Is there a political aspect to this story I'm missing?

Story feels incomplete without knowing exactly what happened. Was it a failure of the system or just an unpreventable accident that happened because the woman did something erratic? Can't really use this as an argument for or against vehicle autonomy without those details.

I see it as a business/current events story, nothing to do with AI really. AI cars are fine in my book and will have kinks. Whatever failed doesn't really matter; the fact that somebody was killed makes Uber look bad. I consider it a relevant story about an emerging company about to go public. Other AI companies must fear this and have had similar incidents.

Other people can discuss the merits of AI though.
 
For a few years now. They already have millions of miles, if not a billion.


Honestly, I didn't know they were roadworthy yet. I might be confused because Google is still working on their car, and maybe that's why I didn't think they were on the roads yet.
 
I see it as a business/current events story, nothing to do with AI really. AI cars are fine in my book and will have kinks. Whatever failed doesn't really matter; the fact that somebody was killed makes Uber look bad. I consider it a relevant story about an emerging company about to go public. Other AI companies must fear this and have had similar incidents.

Other people can discuss the merits of AI though.

Sounds like a Mayberry thread to me.
 
She was jaywalking, so I'll go with the accident being her fault.
 
Massive civil suit incoming.

Knowing how douchey Uber is, I wonder if they rushed the roll-out of these vehicles.
 
What may have happened is that the human driver thought the AI was better than it actually is. Steve Wozniak, co-founder of Apple, has talked about Tesla's AI and how you really have to keep an eye on it. It can get tripped up by something as simple as a construction cone.

"Man you have got to be ready — it makes mistakes, it loses track of the lane lines. You have to be on your toes all the time," says Wozniak. "All Tesla did is say, 'It is beta so we are not responsible. It doesn't necessarily work, so you have to be in control.'

https://www.cnbc.com/2018/01/31/app...-believe-anything-elon-musk-or-tesla-say.html


Tesla had not responded to CNBC Make It by the time this story was published, but in 2016, when Tesla announced the upgraded autopilot technology, Musk said calling the system "autopilot" even though it requires some monitoring is fair because, "It does not represent self-driving anymore than autopilot in an aircraft makes it self-flying."

The Woz also got scammed out of $70k worth of bitcoin. He sold it to a private individual and let the guy pay with a credit card. You can guess what happened next.
 
We need more details regarding how the vehicle reacted. Did it slow down? Swerve? Attempt to brake?

One difference is financial liability for Uber compared to you: if you hit somebody by accident, you more than likely have an insurance policy that will cover it.
Uber is going to get sued to high heaven. They will settle.
 
It will be interesting to have the debates of the future, when car fatalities are down 90% but we still have difficulty accepting them because there is no person to blame.
 
Didn't a Google self-driving car kill someone a couple of years ago?
 