The worldwide grounding of all Boeing 737 Max aircraft, I imagine, gives some pause to the proselytizers of a glittery robotic future for all manner of transportation. If it doesn't, it well ought to.
Max pilots around the world have reported issues with a software feature that, during ascent, can pitch the nose of the plane down when it otherwise shouldn't. (In the U.S., those reports came via the pilot-issues-reporting system handled by NASA, away from FAA regulators, to protect the pilots from retribution.) I'm sure you've heard this if you've caught any news on the twin disasters in Indonesia and Ethiopia that killed everyone on board in remarkably similar circumstances, similar at least to the layperson: the planes dove back to earth after an initial partial ascent.
If both disasters are proven to be directly related to pilots' inability to deal with the aforementioned software issues ... boy, does that raise all manner of questions that also happen to be germane to what on-highway pilots are facing today. Many such questions are obvious. Is there an adequacy-of-training issue when it comes to new features on these planes? (Reportedly, yes.) Were regulators too quick to approve the new automated features in the plane's design before it came to market?
If aviation, so often pointed to by the proselytizers of transportation automation as an example of the levels of safety it can achieve, isn't always getting it right, what does that say for the rest of us?
I'm awfully tempted to take the view of psychiatrist Vatsal G. Thakkar, whose March op-ed in the New York Times argued a traditionalist's view of the value of engagement, grounded in his case in knowledge of how the human brain works to ignore things it doesn't absolutely have to pay attention to. The core of his argument? Let's keep shifting our vehicles manually, simply put, in order to stay engaged behind the wheel and resist our brains' instinct to turn us into passengers when we ought to be pilots. (Thakkar outlined his realization of his own growing reliance on back-up-assist tech in a newer family car that supplements his older stick-shift model.)
Thakkar's piece is certainly worth a read. It stands out to me because, in the week following its publication, I sat in conversation at the Mid-America Trucking Show with a motorist who confessed his reliance on his spanking-new four-wheeled vehicle's automatic lane-keeping, cruise and auto-braking features to combat a significant problem staying awake behind the wheel on highway runs. Yikes.
We've seen where such over-reliance can lead in the well-publicized fatalities involving Tesla's so-called "Autopilot" driving mode (named after, what else?), and Thakkar mentions those crashes in his piece, likewise the pedestrian struck and killed by an Uber vehicle testing self-driving tech on the road with a reportedly overly complacent safety driver at the wheel.
There's a valid argument that the stakes are very high in getting automation right when transporting hundreds of people at a time in giant metal tubes with wings, high above the earth, at speeds of hundreds of miles an hour. Those stakes may well be higher than in moving 80,000 lbs. of freight and one human at 70 mph. Yet, given the crowded, go-go reality of the road, the superhighways of today and tomorrow present pretty darn high stakes themselves.
The incremental addition of automated features at highway speeds is ultimately an experiment in how much disengagement the human brain can resist before it falls asleep at the wheel.
Vehicle makers new and old, and end users alike, would do well to recognize the implications and take them very seriously.