If you think it’s easy, go sit in the passenger seat without talking or using your phone and see how long you can maintain focus.
There are decades of experiments showing that boredom and fatigue are huge problems for this kind of work, where someone is mostly idle except for rare events. It's why the TSA mixes fake images into the stream on X-ray scans, so screeners see things regularly and don't zone out. Anyone smart asking humans to do repetitive work builds in layers of safeguards to keep people engaged – or, in Uber's case, they just hope a bad design lets them shift blame to the person, which has worked so far.
I think the digression into whether the concept of a safety driver is flawed from the start is irrelevant. Sure, maybe having a safety driver as backup is humanly impossible because they'll go road blind and won't be able to react in time. But that didn't happen here. We have no evidence of that ever happening in self-driving tests. Of the millions of self-driving car test miles with safety drivers, not once has a safety driver been watching the road yet unable to react in time because of the monotony of the job.
What we have here is someone who was watching TV on the job. I adamantly refuse to believe that humans are incapable of refraining from watching TV for a stretch of several hours.
If the driver had nodded off, stared out with glazed-over eyes, or done any number of other things consistent with your scenario, then you'd have a point. But that was not the case here.
> I think the digression into whether the concept of a safety driver is flawed from the start is irrelevant. … What we have here is someone who was watching TV on the job
The fact that this particular person dealt with boredom by watching a video makes them easier to blame, but that's just one of many ways people cope with boredom, and it's not the root cause. You can blame the worker if it makes you feel better about yourself, but if your goal is to reduce the number of errors, the system has to be redesigned so it doesn't depend on people acting like robots rather than humans.
> Of the millions of self-driving car test miles with safety drivers, not once has a safety driver been watching the road yet unable to react in time because of the monotony of the job.
Do you have any evidence supporting this claim? In particular, you'd need to show that all self-driving car tests used the same one-person setup (which we know to be incorrect), that every company ignored decades of well-understood risks and likewise gave that person no tasks to perform to stay engaged, and you'd need to know how frequently incidents occur that require driver action to prevent a problem.
It's far more likely that there have been many situations where someone was distracted, or focused somewhere other than where a potential risk was, but the other driver or the self-driving system kept it from turning into an incident that made the news.
Pretty much all of them do. In fact, Uber used to do that before they moved to AZ. Making a single person responsible for both driving and monitoring was a way to save costs and log more miles.
This person's job would be more accurately described as "fall guy" than "driver". Maintaining constant attention on a system that works properly by itself for hours, while being expected to correct its mistakes within seconds, is not a reasonable task for humans. This guy's role is to take a paycheck for doing virtually no work, in exchange for taking the risk of being blamed when things go wrong.
Not watching TV while you’re supposed to be driving is a pretty low bar.
But apparently, humans can’t even be relied on to do this, even when it’s their one single job.