Why do they have to look so shit?
> I also saw there was a "driver" in the uber car that killed that poor woman, but why, what were they for? Given that the legislation and insurance for driverless cars is still in its infancy, I wonder who will get the blame, the non-driver, the programmer or Uber?

The 'driver' is there to provide oversight and intervene if necessary. It's slightly different here because they are there in a paid capacity, but the same exists in other autonomous systems like Tesla Autopilot. I've said this many times: it's total bullshit. Give someone a task where they're mostly redundant and you will inherently get disengagement. You can either have full autonomy or you make the driver perform some continuous activity, no middle ground.
> The 'driver' is there to provide oversight and intervene if necessary. It's slightly different here because they are there in a paid capacity, but the same exists in other autonomous systems like Tesla Autopilot. I've said this many times: it's total bullshit. Give someone a task where they're mostly redundant and you will inherently get disengagement. You can either have full autonomy or you make the driver perform some continuous activity, no middle ground.

This is basic stuff too if you understand how attention works psychologically. If a task doesn't fully engage our attention, the excess capacity latches onto something else. If that engages us, our limited-capacity system diverts to focus on that instead. Overcoming this process is very hard and the subject of its own field of endeavour (ISTR it's called something like psycho-ergonomics, but a search doesn't reveal anything so I must have that wrong). Putting people in a car with nothing to do except in an emergency breaks every rule in that book.
> I also saw there was a "driver" in the uber car that killed that poor woman, but why, what were they for? Given that the legislation and insurance for driverless cars is still in its infancy, I wonder who will get the blame, the non-driver, the programmer or Uber?

A trans woman, apparently. The Daily Mail were trying to make a big issue out of that but they had nothing. She's an ex-con, but thoroughly reformed.
> A few things spring to mind after watching.

Wrong.
A human wouldn't have been able to react.
> The car was driving really fast, I assume legally.

38 in a 35, apparently. The video above is a 50 limit, FWIW, although the actual speed is unknown.
> As mentioned earlier, the "pilot" isn't there for safety. Had no eyes on the road and isn't concentrating. Even if she had, I doubt it'd have made a difference. Can't be blamed but I bet she feels awful.

The driver is meant to be there for safety, but as discussed, it's a flawed premise.
> Why couldn't the car detect the person in the road earlier? If they can't drive better than people what's the point?

Nothing complicated - it should have been able to detect the person. It didn't. No avoidance, no braking.
...
The car also doesn't swerve. Could it have missed the pedestrian but then been involved in another accident? Was there a calculation made by the algorithms to not swerve in case the pilot or more people were hurt? I assume there is something like this built in? What are those calculations?
> The car also doesn't swerve. Could it have missed the pedestrian but then been involved in another accident? Was there a calculation made by the algorithms to not swerve in case the pilot or more people were hurt? I assume there is something like this built in? What are those calculations?

This is a massively overblown fear/discussion point, I think, and one that derives from a possible misunderstanding of how machine learning works (unless it's me that misunderstands how machine learning works). It's an issue raised more by philosophers than programmers. The assumption is that the machine makes decisions in the same way that people make decisions, which involves moral imperatives. But I don't think the machine is capable of working through the consequences of an act more than a very short time period ahead. I don't think it forms a possibility space of downstream consequences and chooses between them. I think it just acts to avoid a developing risk in the way that the data shows is most effective.
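To make that concrete, here's a toy sketch (my own illustration, not anything from Uber's actual stack) of the kind of reactive control I mean. The current state goes in, an action comes out, and nowhere is there a possibility space of downstream consequences:

```python
# Toy illustration only, not Uber's software. A reactive policy maps
# the current risk picture straight to an action; it never enumerates
# future scenarios or weighs one outcome against another.

def control_step(obstacle_distance_m: float, closing_speed_ms: float) -> str:
    """Choose an action from the immediate state alone."""
    if closing_speed_ms <= 0:
        return "cruise"  # obstacle isn't getting closer
    ttc = obstacle_distance_m / closing_speed_ms  # time to collision, seconds
    if ttc < 1.5:
        return "emergency_brake"
    if ttc < 4.0:
        return "brake"
    return "cruise"

# e.g. a pedestrian 20 m ahead while closing at ~17 m/s (about 38 mph):
print(control_step(20.0, 17.0))  # -> emergency_brake
```

A real planner is obviously fancier than two thresholds, but the shape is the same: act on the developing risk. There's no trolley-problem ledger in there.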
> Looking at that video, it seems when she became visible from the shadows, she was only a couple of car lengths in front of the Uber car. As the 'thinking' distance at 40 mph is 3 car lengths/12m, with braking distance another 6 car lengths, I am not convinced that a driver could have changed the outcome, and can understand the police chief saying the car is likely not at fault.

No, for all the reasons already given.
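For reference, the stopping-distance arithmetic in the quote does check out; here's a back-of-envelope sketch, assuming the UK Highway Code figures of roughly 0.7 s thinking time, about 6.6 m/s² of braking, and a 4 m car length:

```python
# Back-of-envelope stopping distances using the UK Highway Code's
# assumptions: thinking distance scales with speed, braking distance
# with its square.

def stopping_distances_m(speed_mph: float) -> tuple[float, float]:
    """Return (thinking, braking) distances in metres."""
    v = speed_mph * 0.447          # mph -> m/s
    thinking = v * 0.67            # ~0.67 s reaction time
    braking = v ** 2 / (2 * 6.57)  # ~6.57 m/s^2 deceleration
    return thinking, braking

car_length_m = 4.0  # rough car length
thinking, braking = stopping_distances_m(40)
print(f"thinking: {thinking:.0f} m (~{thinking / car_length_m:.0f} car lengths)")
print(f"braking: {braking:.0f} m (~{braking / car_length_m:.0f} car lengths)")
# -> thinking: 12 m (~3 car lengths), braking: 24 m (~6 car lengths)
```

All in that's ~36 m, about two seconds of travel at 40 mph, before the car comes to rest.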
> This is a massively overblown fear/discussion point, I think, and one that derives from a possible misunderstanding of how machine learning works (unless it's me that misunderstands how machine learning works). It's an issue raised more by philosophers than programmers. The assumption is that the machine makes decisions in the same way that people make decisions, which involves moral imperatives. But I don't think the machine is capable of working through the consequences of an act more than a very short time period ahead. I don't think it forms a possibility space of downstream consequences and chooses between them. I think it just acts to avoid a developing risk in the way that the data shows is most effective.

Yeah, but also: so does a human driver, if they even do that (target fixation etc). The trolley problem is something confined to philosophers and disaster movies.
>> No, for all the reasons already given.
> Your conclusion is just based on the video, from which it's hard to conclude one way or the other, but 2 seconds is spot on the time it takes a human to normally react, and then you also have the braking time on top of that. The police on the ground clearly have far more to go on in order to judge the situation, so I am more inclined to accept what they have said at this point in time.

Some police spokesperson (the chief?) came out early on and said it was the victim's fault, and then the video came out, and the police department backtracked, claimed they were taken out of context.
> Some police spokesperson (the chief?) came out early on and said it was the victim's fault, and then the video came out, and the police department backtracked, claimed they were taken out of context.

Chief of Police Sylvia Moir told the San Francisco Chronicle on Monday that video footage taken from cameras mounted on the autonomous Volvo SUV potentially shifts the blame to the victim herself, 49-year-old Elaine Herzberg, rather than the vehicle.

“It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway,” Moir told the paper, adding that the incident occurred roughly 100 yards from a crosswalk. “It is dangerous to cross roadways in the evening hour when well-illuminated managed crosswalks are available,” she said.

https://jalopnik.com/video-shows-pedestrian-in-fatal-uber-crash-stepped-in-f-1823922228

The police statement was after seeing the video...
I haven't seen anything about the police backtracking, do you have a link?
> Some police spokesperson (the chief?) came out early on and said it was the victim's fault, and then the video came out, and the police department backtracked, claimed they were taken out of context.
Regardless, have you looked at many dashcam videos? They're all much worse than a human at capturing the information you take in when driving at night. The sensor (the input feeding a purely visible-light visual pipeline) is probably worse than the human eye, and the rendered output is worse still than the sensor data.
Then, the idea of the 'car not being at fault' misses the myriad ways in which it's supposed to be able to detect things better than the human eye.
> This is a massively overblown fear/discussion point, I think, and one that derives from a possible misunderstanding of how machine learning works (unless it's me that misunderstands how machine learning works). It's an issue raised more by philosophers than programmers. The assumption is that the machine makes decisions in the same way that people make decisions, which involves moral imperatives. But I don't think the machine is capable of working through the consequences of an act more than a very short time period ahead. I don't think it forms a possibility space of downstream consequences and chooses between them. I think it just acts to avoid a developing risk in the way that the data shows is most effective.

I don't see what machine learning has to do with it. There will be some basic rules about car handling. The car could have swerved but flipped over, so perhaps this is a rule that it couldn't break. Maybe it should have?
> I don't see what machine learning has to do with it. There will be some basic rules about car handling. The car could have swerved but flipped over, so perhaps this is a rule that it couldn't break. Maybe it should have?

The concept of machines making a calculus of life depends on the idea that the machine is modelling future scenarios and choosing between them. This just isn't the case: it's not how machine learning works.
> The concept of machines making a calculus of life depends on the idea that the machine is modelling future scenarios and choosing between them. This just isn't the case: it's not how machine learning works.

Maybe it's not explicitly a calculus of life, but there is a calculation that was made that meant the car wasn't braking or swerving. Should it have? I imagine there's a bunch of engineers looking at that now...
> Uber's cars have lidar and radar sensors in addition to cameras, and those sensors don't require ambient light to function. So the vehicle should have spotted Herzberg even if the road was pitch black.

That's some serious fuck-up then. So who is responsible?
> The police statement was after seeing the video...
> I haven't seen anything about the police backtracking, do you have a link?

In the States it's almost impossible for a driver to be at fault in a pedestrian fatality, unless you run a red light (red light cameras are unconstitutional). The victim would be to blame even on a sunny day, in this instance. Even if it's where everyone crosses the road because it's where the shops are. Zebra crossings are an amazing invention.