We are going to see a lot more of this once skynet goes live.
I wouldn't be surprised if the person on the bike did something erratic and the car didn't have enough time to adjust. Probably would have happened even with a human behind the wheel.
What may have happened is that the driver thought the AI was better than it actually is. Steve Wozniak, cofounder of Apple, has talked about Tesla's AI and how you really have to keep an eye on it, even for something as simple as a construction cone.
"Man you have got to be ready — it makes mistakes, it loses track of the lane lines. You have to be on your toes all the time," says Wozniak. "All Tesla did is say, 'It is beta so we are not responsible. It doesn't necessarily work, so you have to be in control.'"
https://www.cnbc.com/2018/01/31/app...-believe-anything-elon-musk-or-tesla-say.html
Tesla had not responded to CNBC Make It by the time this story was published, but in 2016, when Tesla announced the upgraded autopilot technology, Musk said calling the system "autopilot" even though it requires some monitoring is fair because, "It does not represent self-driving any more than autopilot in an aircraft makes it self-flying."
If she was on a bike and there was a vehicle operator in the car, then I'm guessing she did something erratic. As noted, bicycles are vehicles, even though bike riders tend to forget that. The worst thing they do is jump from one side of the road to the other to make a turn and expect everyone to stop or slow down as if they were pedestrians.
Either way, unfortunate loss of life.
It could also be the driver's fault for overestimating the AI. Maybe he thought he could relax more. You have to keep an eye on these cars. You can't just let one drive by itself at the current stage; you have to monitor it closely.
Why is this in the WR? Is there a political aspect to this story I'm missing?
Story feels incomplete without knowing exactly what happened. Was it a failure of the system or just an unpreventable accident that happened because the woman did something erratic? Can't really use this as an argument for or against vehicle autonomy without those details.
Wait a second. Self-driving cars are on the roads already? Wtf
For a few years now. They've logged millions, if not a billion, miles already.
I see it as a business/current events story. Nothing to do with AI really. AI cars are fine in my book and will have kinks. Whatever failed doesn't really matter. Just that somebody was killed makes Uber look bad. I consider it a relevant story about an emerging company about to go public. And other AI companies must fear this and have had similar incidents.
Other people can discuss the merits of AI though.