Would you buy a microwave that freezes every time it has to warm a dish that it has never seen before? How about a self-driving car?

The first public demonstration of using microwave radiation to heat food took place at the 1933 Chicago World's Fair. 84 years later, we don't think twice about walking away from the microwave and letting it cook by itself. That is not the case for self-driving cars, and it will not be for some time to come.

Will it be possible one day? Of course, but the technology is not just a few years away, as some have suggested. Several issues with self-driving cars exist that, to my amazement, very few people have been discussing. Big tech companies are pushing this technology as if it were inevitable and should therefore be rushed onto the streets, safety and logic be damned. Below is my own list of some of the issues I have identified with driverless cars, issues that I hope these tech companies and advocates will find answers to:

Self-driving cars cannot be programmed for every potential scenario: Last June, a Tesla on Autopilot failed to apply the brakes, killing the driver, when the system could not distinguish a white 18-wheeler from the bright sky. The scenario had never been programmed into the system, nor the sensors upgraded to handle it, because no one had thought of it. The real world will throw out scenarios (maybe not often, but it will) that the best scientists have never dreamed of, situations that will outsmart the best-designed systems and AIs. A human, even if not perfect, has inherited a capacity for pattern recognition that has evolved over 200,000 years. Our ability to see patterns puts any AI or machine-learning software to shame. We gained this pattern recognition in order to survive as a species, and to be swindled out of our money in casinos by betting on red because the big sign above the roulette table shows the last two numbers to hit were red. Reacting to unforeseen events, even mundane ones such as recognizing the difference between a bright sky and a white truck trailer, is easy for a human but incredibly complex for a machine. Watch the Tesla accident video.

Another example of a quasi-self-driving-car failure is the one in the video above. This time the vehicle failed to see that the lane was merging. If you didn't notice, the second big sign off the right lane at the beginning of the video warns of a lane merge. The quasi-self-driving Tesla and its distracted driver don't see it, and a crash into a big, bright yellow-and-black metal barrier follows. Tesla, with its Silicon Valley approach, brought a system online that is not ready. Not only is the system incapable of reacting to sudden changes and unknown events, its name, "Autopilot," is completely deceiving. Drivers believe that the car can drive itself when it is at best a cruise control with a bunch of sensors, one that cannot react to unforeseen road conditions.

Ask yourself: how many unusual driving situations have you encountered in, say, the last year? Traffic lights being out, a detour sign oddly placed on the side of the road, washed-out or faded lane lines, chickens crossing the road to get to the other side. These things might be minor, but for a system that has never encountered them, it is like HAL 9000 shutting down: "I'm afraid. I'm afraid, Dave. Dave, my mind is going."

Who will make life-or-death decisions? Imagine: in the near future, you are in a self-driving car, enjoying your coffee while reading the Wall Street Journal on your way to work, without a care or worry in the world. A little ahead of you is a truck, and on the emergency lane is a bicyclist with a child in the back. Suddenly a big piece of cargo falls off the truck. At 60 mph the vehicle cannot stop in time, so it will either crash, killing you, or it can choose to swerve onto the emergency lane and kill the bicyclist, who happens to be a renowned neurosurgeon taking his 5-year-old on a ride. Will the AI choose you or the neurosurgeon and his child? How will people react to the aftermath of such a situation? Will they thank the AI for letting them live, or be so grief-stricken that they go Sarah Connor on any AI in sight? Maybe a solution would be something like the current waiver messages on in-car touchscreen displays, one that could read:

"Do you want us to save your life no matter the cost to others?"

"I Agree" - "No, go ahead and sacrifice me"
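To make the dilemma concrete, here is a minimal sketch in Python of how such a consent setting might feed into a collision-avoidance decision. Everything in it is hypothetical: the protect_occupant flag, the candidate maneuvers, and the crude harm scores are invented for illustration, not any manufacturer's actual logic.

```python
# Hypothetical sketch: how an occupant-consent flag might steer an
# unavoidable-collision decision. Not any real vehicle's logic.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_harm: float   # estimated harm to the car's occupants (0 to 1)
    bystander_harm: float  # estimated harm to people outside the car (0 to 1)

def choose_maneuver(options: list[Maneuver], protect_occupant: bool) -> Maneuver:
    """Pick the maneuver with the lowest weighted harm.

    protect_occupant reflects the waiver the passenger tapped:
    True  -> "I Agree" (occupant harm weighted heavily)
    False -> "No, go ahead and sacrifice me" (everyone weighted equally)
    """
    occupant_weight = 10.0 if protect_occupant else 1.0
    return min(options,
               key=lambda m: occupant_weight * m.occupant_harm + m.bystander_harm)

# The scenario from the text: brake into the fallen cargo, or swerve into the cyclist.
options = [
    Maneuver("brake hard into the cargo", occupant_harm=0.9, bystander_harm=0.0),
    Maneuver("swerve onto the emergency lane", occupant_harm=0.1, bystander_harm=0.9),
]
print(choose_maneuver(options, protect_occupant=True).name)   # swerve onto the emergency lane
print(choose_maneuver(options, protect_occupant=False).name)  # brake hard into the cargo
```

The arithmetic is trivial; the hard, unsolved part is everything the sketch assumes away: who assigns those harm scores, and who answers for them afterward.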

Your car is smart but everyone else around you is dumb: If you think that your smart driverless car is safe, you are forgetting the millions of other cars all around you with their dumb drivers. If you want proof, stop reading this article, go on YouTube, and watch those dashcam videos from Russia. Sure, even humans can't always avoid other humans, but at this stage of driverless-car development, humans still have the edge. Below is a picture of the latest crash of an Uber self-driving vehicle. According to Uber, the accident was not the fault of the automated vehicle (but it is the only car on its side, so it did not react very well).


Let's be honest: at first, driverless cars will be a status symbol, a sign of wealth, and a lot of drivers out there will take offense at them. This will result in some drivers testing them or taking shots at them by cutting them off, swerving into or driving too close to them, or slamming on the brakes in front of them. I remember how I hated and cursed all those Prius drivers when they first came out, so slow, sitting there watching the onboard digital display show them how fuel-efficient their car was. I admit that a few times the idea of running them off the road was appealing. The only true path to complete safety would be to convert all the cars on the road to self-driving and have them use the same protocols, so that reactions to emergencies and driving conditions are standardized.

Hackers, terrorists, kidnappers, and pranksters: Many articles and stories have come out about hackers and the risk they would pose if they took control of driverless cars, but how about ISIS? Terrorists could use them as weapons, as on 9/11. They could use them to sow panic by running over pedestrians, like the recent terror attacks in London or in Nice, or use them to crash into shops and restaurants. Kidnappers could use the same technology to drive their victims to them without breaking a sweat. Or how about a bunch of teenagers, after downing a few beers, deciding to paint some extra road lanes or swap a few signs around and confuse self-driving cars into crashing? All of these are within the realm of the possible, yet nearly impossible with a human behind the wheel. And in the name of security, will you accept sitting in your driverless car for 20 minutes while it downloads the latest Norton antivirus update and then reboots, every other week?

Fewer accidents but more light injuries: Self-driving cars are projected to reduce the number of road accidents and deaths, which have been falling for a long time (there was a spike in 2015). But how many injuries will be caused inside the vehicle by passengers doing everything other than paying attention? Burns from coffee spills, laptops and cellphones bumping into people's heads, moving around the vehicle, doing things that should be done in the bedroom (oh yes, a lot more of that will happen). Traffic deaths will go down, but bumps, bruises, and concussions will go up, because we are human and humans can't sit still for very long.

Insurance and legal: In the case of an accident, who will be at fault and who will pay? Will it be the human, for failing to pay attention? The manufacturer, for not adding the proper response to a rare scenario that resulted in a crash? The subcontractor that made the faulty sensor? The developer who forgot a comma in a line of code? The weather, which an ordinary human can see through but artificial sensors cannot? No one would fault a human for failing to see a deer in the dark, but what will be the excuse of the car with infrared sensors and radar? Will there be higher or lower standards for self-driving cars?

A dual insurance system could be adopted, with one policy in force when self-driving is on and another when the human is driving. This could mirror the insurance scheme for Uber and Lyft drivers, but that hasn't worked out very well. Ride-sharing companies have too often tried to hide behind the "we are a tech company, not a transportation company" slogan to avoid liability. Without clear laws in place before self-driving cars hit the road in numbers, we are going to create a legal mess. Unfortunately, tech companies like Uber, Google, and Apple have been blocking attempts to regulate the nascent industry.
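As a rough illustration, here is a minimal sketch, again in Python, of how such a dual scheme could decide which policy applies, assuming a hypothetical trip log that records when autonomy was engaged. The log format and policy names are invented for illustration.

```python
# Hypothetical sketch of a dual-insurance scheme: which policy applies
# depends on whether the self-driving system was engaged at the moment
# of the accident. The log format and policy names are invented.
from dataclasses import dataclass

@dataclass
class TripEvent:
    timestamp: float        # seconds since the trip started
    autonomy_engaged: bool  # True while the self-driving system is in control

def policy_at(accident_time: float, log: list[TripEvent]) -> str:
    """Return the policy covering an accident at accident_time,
    based on the last mode change recorded before it."""
    autonomous = False  # assume the human is driving until the log says otherwise
    for event in sorted(log, key=lambda e: e.timestamp):
        if event.timestamp > accident_time:
            break
        autonomous = event.autonomy_engaged
    return "manufacturer policy (autonomy)" if autonomous else "driver policy (manual)"

log = [
    TripEvent(0.0, autonomy_engaged=False),     # human pulls out of the driveway
    TripEvent(120.0, autonomy_engaged=True),    # autonomy engaged on the highway
    TripEvent(1800.0, autonomy_engaged=False),  # human takes back control
]
print(policy_at(60.0, log))   # driver policy (manual)
print(policy_at(900.0, log))  # manufacturer policy (autonomy)
```

Even a toy like this exposes the contested edge case: who pays during the handover, in the seconds after the system has alerted the human but before the human has actually taken control?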

Using humans as backups is a bad bet: Right now, it's all about the race to get self-driving cars on the road, and many companies are relying on the concept of the backup driver. Going fully autonomous is too difficult and expensive; using a human backup is easier and reduces some of the liability by passing ultimate responsibility to the human. Two issues there. First, why would anyone ever buy one if you have to keep your hands inches above the steering wheel at all times, waiting for an emergency? Second, do you really expect humans to pay attention when they are not driving? Most will forget completely that they are the backup, either on day one or after a few weeks. People will be watching movies, reading books, typing on their laptops, playing games, doing pretty much everything except paying attention to the road. The bad news is they will never know what hit them; the good news is they will never know what hit them. If the FAA knew that a primary backup system on the Boeing 787 had, or would have, a high failure rate, wouldn't it ground the entire fleet? Shouldn't self-driving cars live up to the same standard?

The Cylons are coming: Self-driving cars will not lead to the destruction of the human race (or maybe they will?), but many people are still wary of this technology. There is something about giving up control that has always been difficult for humans. According to a AAA survey of American drivers, 78% of them were afraid of riding in a self-driving car.

The first airplane autopilot, developed by Elmer Ambrose Sperry's company, was demonstrated back in 1914, yet aircraft today are still flown by a pilot and copilot. Today's commercial aircraft are capable of landing by themselves, but ask any pilot and they will tell you they rarely use that capability.

Driving was never just about getting from A to B: Some IT experts forget that the most important object, the defining icon, of American culture in the last 60 years has been the car. It is not just a machine to get you from A to B; if it were, we would all drive those ugly Priuses (no offense if you own one) and drive like we were chauffeuring Miss Daisy. The car is an expression of one's self. It can be a status symbol (Mercedes, BMW), a show of rebellion (muscle cars, drag racers, low riders), or an expression of our personality or values (little car, big truck, eco-friendly, pink).

Many of us have fond memories of our first car because it carried us from adolescence to adulthood. It gave us our first taste of freedom, responsibility, and in some cases even romance. My first car was awesome: a beat-up Mazda 323 with no power steering, no AC, and a half-functioning heater. I would freeze in the winter and sweat in the summer, and it took me forever to parallel park, but boy did we have some adventures. Yes, the car will change, but I believe our love affair with the car is not over. Self-driving cars are going to be important and will become part of society, but they will not supplant the old-fashioned hands at 9 and 3 o'clock for decades. If everyone wanted to be driven around, public transportation would be booming in America.

Nov 21, 2017