If I took away your computer and made you use a typewriter, how would you fare? With no cursor or backspace, you would, I suspect, make a lot of typing mistakes. Without spell and grammar checking, your work would likely not be as crisp and clean as usual. After a while, you’d get used to it. Muscle memory would build: without the safety net of the screen, you would type more slowly and with more forethought.
Here in Minnesota we’re approaching the first big snow event of the season. The snowstorm commute is always a spectacle of cars in ditches, fender benders and the like. It’s not that we can’t drive, just that we forget how to drive in the snow: traction is terrible, and visibility, both ahead and down to the roadway, can be atrocious.
What links these two things is uncommon activities that we can (re)learn to do but typically get little practice at. Successfully driving a snow-covered road, especially at night during a storm, takes skill, patience and, critically, substantial insight and inference into the likely behavior of other drivers.
Now consider a level 5 vehicle. Level 5 is defined by the NHTSA as
An automated driving system (ADS) on the vehicle can do all the driving in all circumstances. The human occupants are just passengers and need never be involved in driving.
A true1 level 5 car can drive for us in the winter, regardless of the conditions. Or can it? If the road ahead is unclear, or other traffic is behaving in an unsafe manner, might the AI ‘decide’ it’s too dangerous to make forward progress? Perhaps it will resort to a ‘limp home’ mode? Five miles an hour max until conditions improve?
How would the passenger react in this circumstance? Many are going to want to take over, if they can: “Stupid car, I remember how to do this!” They’ll be assuming control at precisely the worst time to do so. The conditions are difficult; worse, their skills will be rusty. If the level 5 vehicle can safely carry out the vast majority of driving, won’t most passengers let it drive 99.9% of the time?
The aviation world has long wrestled with this automation paradox. Thirty years ago, Warren Vanderburgh coined the term “Children of the Magenta”2 to describe pilots who were too focused on the automation in the cockpit and not enough on the act of flying, leading to unsafe situations when automation failed.
If a highly regulated, professional industry wrestles with this issue, what hope is there for the average driver? One thing is clear: an airliner in flight cannot just stop and refuse to continue. We do have that option in a car. But that decision is not without consequence, whether it is blocking the road, frustrating the passenger, or worse.
What do you think automation will do?
The difficulty of driving in the winter makes me highly suspicious that there will be any time soon when AI can do it for us. It’s hard enough to train a machine to drive on well-lit, clearly marked roads. In the absence of these, how will an AI cope? Is there enough input data to really train a vehicle to drive safely on a snow-covered road? Current AI lacks the higher-level cognition to reason in real time about the conditions. ↩︎
If you’ve never watched his lecture on this, I guarantee it’s worth 25 minutes of your time. ↩︎