Can a Self-Driving Car Have a Bad Day? (2017)

 

Road Traffic Drivers Beware 

 

Statistics show there are approximately 1,250,000 road traffic deaths globally every year.  There are approximately 35,000 road traffic deaths in the United States every year.  These are not just numbers.  These are friends, neighbors, classmates, colleagues, and family members.  The numbers are scary, and the numbers do not lie.  The obvious, lay explanation is that "We are only human after all" and that humans commit something called "Human Error."  We should therefore move over and let "Self-Driving" and "Intelligent" cars take over.  Because we can repeat it: 1,250,000 road traffic deaths, and the numbers do not lie. 

 

The mature explanation is that numbers also do not tell the truth.  Numbers do not tell anything.  They are "only numbers after all."  But more on that in a moment - that is point number 2, the runner-up.

 

Point number 1 is about getting "Self-Driving" cars on the road.  Because a "Self-Driving" car is infallible, precise, and "intelligent."  That is the lay explanation.  It is not human.  It is a computer.  Therefore it would be a better operator than a flesh-and-blood human. 

 

The mature explanation is that calling a computer intelligent is an insult to intelligence everywhere.  A slime mold would turn its... appendage analog of a nose... up at the computerized notion of intelligence.  But first, some context. 

 

We humans tend to socialize everything.  We are social entities.  We connect with others by inferring intent via our mirroring functions - likely rooted in the inferior frontal gyrus of the frontal cortex.  It helps us better match and complement our friends, family, and opponents. 

 

But this also gives us the tendency to anthropomorphize.  Lightning bolt?  The long-haired, long-bearded God (Goddess?) of Thunder is angry.  Earthquake?  The Land Lord is shifting in his (her?) bed.  The MBTA subway gate took our money but failed to open?  The Gate is having a bad day.  We ascribe human qualia where there is no human.  "Watch out for the speeding Cars..." we say, as if the cars were actually Cars from the Pixar movie of the same name, complete with faces, eyes, noses, and mouths.  The truth is that a car is simply "4 wheels and a seat."  Our neighbor Jimmy is driving it. 

 

There is no such thing as a true self-driving car.

 

When Jimmy (or Navya) decides to drive it by remote control, he still drives it even though his seat is not moving.  When Jimmy's (or Navya's) remote-controlled car crashes, that's still Jimmy.  If Jimmy puts a bobble-head in the front seat with a brick weighing down the accelerator and a pendulum on the steering wheel, how Jimmy placed them and let them go before running off to lunch is still up to Jimmy.  It's all Jimmy all the way down.  

 

If Jimmy tries to point fingers after an accident and say "it was not my fault" or "the other driver should have stopped but did not" or "I had the right of way," it is irrelevant.  We are social creatures.  A collision is a collision.  Right of way is a convention, not a zone of dereliction of responsibility.  A mature driver would know, for example, why speeding is so dangerous.  Hint: it has nothing to do with Jimmy the driver losing control of his own "4 wheels and a seat" and skidding.  Answer: it has to do with signaling and coordinating with the other drivers in a share-the-public-road setup.  Jimmy can speed in an armed tank for all we care - if he is alone on his own private road.  So again, why did the "intelligent" "self-driving" car get into an accident? 

 

If Jimmy the programmer with his bobble-heads and pendulums says:

 

"The [programmed computer driven] shuttle did what it was supposed to do and stopped. Unfortunately the human element, the driver of the truck, didn’t stop.”

 

Then he has perjured himself and completely misunderstood what his job was.  Engineers are human too.  Therefore, all parties involved were human elements.  If the Navya engineers' shuttle stopped, it did so because their sophisticated bobble-head and brick fell off the accelerator.  The business manager told the Navya engineers to have the brick fall off the accelerator.  If the engineers stop at this point - we put in exactly what the manager told us, so it's not my fault, it's the manager's - that speaks volumes about the Navya corporate culture of responsibility, its communication across business verticals, and the general mismatch between its business and technology divisions.

 

If a mature driver who has the right of way - translation: the convention, the onus of responsibility to act first and finish using the way so the next driver can - sees another driver coming into the same space, I doubt that stopping in all cases, without communicating with the other driver in any way, is exactly a wise choice.  The spokesperson's statement that the accident happened at low speed - a misguided attempt to show it was not dangerous - only highlights that there was no excuse for not detecting and tracking the other driver. 

 

No, a self [computer] driving car cannot have a bad day.  The computer computes precisely the formulas that Jimmy writes in.  Jimmy's business manager wrote up the rules in plain English.  Jimmy's manager presumably read the Motor Vehicles instruction manual for a Class D license.  They all, hopefully, had some experience driving.  How well the manual's authors, Jimmy's manager, and Jimmy understood and communicated with each other in this telephone-game chain determines how and when the bobble-head/computer starts or stops the car.  Nothing more.  The car cannot have a bad day.  But Jimmy's team might have.
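
To make the telephone-game point concrete, here is a minimal sketch - hypothetical Python, not Navya's actual software, with every name and threshold invented for illustration - of a rule-driven controller whose only written rule is "stop when something is close."

from dataclasses import dataclass
from typing import Optional

@dataclass
class Obstacle:
    distance_m: float          # how far away the other vehicle is
    closing_speed_mps: float   # positive if it is moving toward us

def decide(obstacle: Optional[Obstacle]) -> str:
    # The one rule that made it down the chain: stop if something is close.
    if obstacle is not None and obstacle.distance_m < 10.0:
        return "stop"
    return "proceed"

# A truck backing toward the already-stopped shuttle keeps closing in, but the
# rule set has no "honk," "reverse," or "signal the other driver" branch, so
# the computer cannot choose one.  Not a bad day - just rules nobody wrote down.
print(decide(Obstacle(distance_m=3.0, closing_speed_mps=1.5)))  # prints "stop"

The computer here decides nothing; it replays, precisely, whatever survived the telephone game and landed in the rules.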

 

Returning to point 2, the proponent of Jimmy's code-driven car reiterates the 1,250,000 road traffic deaths.  Bobble-head or no, it is better at saving lives.  That is definitely possible.  But the death rate per passenger-mile for bus and train travel already puts car travel to shame by more than an order of magnitude.  It would be a far more rational, higher-return, lower-risk way to save lives to invest in more bus and rail service.  And those drivers and conductors are all human.