One of the worst-kept secrets of the past few years is that Apple is working on an Apple Car, ostensibly electric, and probably of the self-driving variety. Designing, building, and selling a self-driving car is a big risk for Apple. No, it’s not the money. Apple could dump $10 billion into the whole process, including a new manufacturing plant (perhaps in China), and you’d never find the loss in the annual financial statement.
Yes, Apple is that rich.
The Apple Car must be a big hit that brings both revenue and profit to the bottom line; otherwise, AAPL will be the first self-driving car casualty.
Here’s a running list of problems as I see them. I’m from the Midwest, and I love to drive. Self-driving cars will grow in popularity, and most states in the U.S. will allow them, with perhaps a simple restriction that a licensed driver be in the car. Then we sit back and let our cars do the driving wherever we want to go.
In a few years, our self-driving cars will be able to drop us off at our destination and then pick us up when summoned or according to some pre-set schedule. Self-driving vehicles will become so commonplace that fleets will replace taxicabs (you can see why Uber wants into the electric self-driving car space). A few years after that, as the safety record of self-driving vehicles begins to reduce the number of injuries and deaths in traffic accidents, a growing percentage of the driving populace will have forgotten how to drive.
Wait. Isn’t driving like riding a bicycle?
Perhaps, but humans age, and many senior citizens lament the day they are forced to sell their car because driving poses a greater risk to themselves and others. A self-driving car could allow many of us well beyond retirement age to continue to own a vehicle without actually being required to drive it.
Driverless cars are the stuff of science fiction legend. So is Isaac Asimov’s ‘I, Robot,’ and the Three Laws of Robotics. Essentially, a self-driving vehicle is a moving robot. Will the device adhere to Asimov’s laws?
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
When a potential accident looms at an intersection where school children are present, will an Apple self-driving car be able to choose between two evils? For example, crash the car into a wall, thereby injuring passengers, or save the passengers by driving into a crowd of school children?
At the other end of the scale, as a Chicago native, I worry about whether self-driving cars can handle the slush, sleet, blinding snow, and layers of ice on roadways during a typical Midwest winter.
As cool as a self-driving car might be, there is also a growing number of unanswered questions. Does Apple have the answers?