Since the well-publicized fatal collision in Florida of an Autopilot-enabled Tesla Model S with the underside of a trucker’s trailer, national attention has focused on the accelerating pace of autonomous vehicle development.
Truckers are worried in part for their own safety in sharing the road with cars such as the Tesla, whose Autopilot function is sold to consumers as having the ability to effectively steer – and stop – the vehicle. The death in the truck crash, however, exposed a clear design flaw: the program did not distinguish between the white side of the trailer and the sky.
Truckers polled this month agreed by and large with much of Consumer Reports’ “Too much autonomy too soon” verdict about the technology. Slightly more than a third of respondents said that most, if not all, of the magazine’s recommendations (detailed in the poll responses) should be taken up before Autopilot gets prime time on U.S. highways.
Some readers took it further, stressing a need for appropriate barriers to operation. License to operate such a unit might be “the proper place for a speed limiter, and an ELD, driver training, riding with an experienced driver, then a special license to operate with an endorsement for passengers, groceries, etc.” So cracked wise a reader posting as Tdktrans under this story about the National Transportation Safety Board’s ongoing investigation of the accident. “Then you can have your little self-driving car to go as you please.”

As is the case in many accidents, Don McGrady pointed out, fault here can’t be placed solely on the Tesla and its driver. Though the car was found to be traveling at 9 mph over the posted speed limit, the truck and the trailer it ran under, McGrady noted, “should not have crossed the road in front of the car if there was not sufficient room to do so without causing the risk of collision.”
Added Tdktrans: “Every driver has been in the situation where they have said ‘Screw it, they’ll stop’ ” when trying to maneuver into a crowded space. “That didn’t work in this case.”
The commenter went on to suggest that, with Autopilot possibly being pulled or disabled in the future, engineers’ inability to fix the problem of effective automatic control might deliver its own benefit: “Maybe we’ll get lucky, and Tesla can’t fix the problem, and this unfortunate accident will never happen again.”