Truckers: No to ‘beta’ testing when lives at stake

August 15, 2016

Since the well-publicized fatal collision in Florida of an Autopilot-enabled Tesla Model S with the underside of a trucker’s trailer, national attention has focused on the accelerating speed of autonomous vehicle development.

Truckers are in part worried for their own safety in sharing the road with cars such as the Tesla, whose Autopilot function is sold to consumers as having the ability to effectively steer – and stop – the vehicle. The death in the truck crash, however, exposed a clear design flaw in that the program did not distinguish between the white side of the trailer and the sky.


The wreckage of the Tesla Model S involved in a crash that was fatal for its driver earlier this year. NTSB noted that, with Tesla’s beta “Autopilot” function enabled at the time of the crash, the vehicle was speeding.

Truckers polled this month agreed by and large with Consumer Reports’ “Too much autonomy too soon” verdict about the technology. Slightly more than a third of respondents said that most or all of the magazine’s recommendations (detailed in the poll responses) should be taken up before Autopilot gets prime time on U.S. highways.

Should Tesla’s ‘Autopilot’ system be modified or disabled in the wake of recent crashes?

Some readers took it further, stressing a need for appropriate barriers to operation. License to operate such a unit might be “the proper place for a speed limiter, and an ELD, driver training, riding with an experienced driver, then a special license to operate with an endorsement for passengers, groceries, etc.” So cracked wise a reader posting as Tdktrans under this story about the National Transportation Safety Board’s ongoing investigation of the accident. “Then you can have your little self-driving car to go as you please.”


As is the case in many accidents, Don McGrady pointed out, fault here can’t be placed solely on the Tesla and its driver. Though the car was found to be traveling at 9 mph over the posted speed limit, the truck and the trailer it ran under, McGrady noted, “should not have crossed the road in front of the car if there was not sufficient room to do so without causing the risk of collision.”

Added Tdktrans: “Every driver has been in the situation where they have said ‘Screw it, they’ll stop’ ” when trying to maneuver into a crowded space. “That didn’t work in this case.”

The commenter went on to suggest that, combined with the possible outcome of Autopilot being pulled or disabled in the future, engineers’ inability to fix the problem of effective automatic control might deliver benefits: “Maybe we’ll get lucky, and Tesla can’t fix the problem, and this unfortunate accident will never happen again.”

