Seeing Machines’ Guardian system

The unfolding dashcam revolution

By Todd Dills

Imagine a future where a dashcam records your head nods, rate of blinking, the length of the blinks and how wide open your eyes are when at rest. Where road-facing cameras note improper lane keeping and following distances. Where a remote computer crunches that and other data, prompting emergency warnings to you and your dispatcher.

It doesn’t require too much imagination to visualize such a system. That future is already here, at least in part and in various forms from purveyors of multi-camera systems and other related technologies around the world. The number of trucks in U.S. operations equipped with variations of such systems is at least in the tens of thousands.

Fleets are benefiting from the collaborative safety programs that more sophisticated video-event-capture technologies are enabling. Some bode well for insurance rates, but could pose complications in scheduling as the systems flag drivers who need to pull over and rest.

The National Transportation Safety Board, an independent federal safety agency, in 2016 included video-platform use on its Most Wanted List of transportation safety improvements. Such systems have grown in market penetration and levels of intrusiveness in recent years.

It’s no surprise that drivers often don’t share carriers’ enthusiasm for camera-based fatigue management. Owner-operators in large numbers have taken up forward-facing cameras, but driver-facing cameras imposed by fleets are seen as an invasion into the relative privacy of a home away from home.

Others view increasingly advanced in-cab technology as either unnecessary or a direct threat to the driver’s livelihood. “Are people becoming so stupid that we really need a monitor to tell us we’re tired?” asked owner-operator Jeff Guyton. “This is all idiot technology.”

“The telemetry is already there on the trucks to monitor what is needed,” says Jeff Wagner, commenting on Overdrive’s YouTube channel. “Stop selling tools that are used to ruin drivers’ careers and harass them when in the wrong hands. Even when these are in the hands of good people, they make some drivers feel intimidated, singled out, create a feeling of social anxiety like they are being ‘watched,’ feel nervous, and more likely to make a mistake.”

Fleets that use such technologies, and some drivers, however, sing a different tune. Fraley & Schilling driver Don Cuddeback has experience with the fleet’s SmartDrive dual-camera system, which among other things triggers the event recorder when following distance is too short. The following-distance monitor, provided by another company, also delivers an audible warning to the driver.

“I think it helps a lot,” Cuddeback says. Reviewing incidents with company personnel, he says, has helped him be more vigilant about following distance. “Though I’ve had 2 million-plus miles without an accident or a ticket, there’s always room for improvement.”

Brian Kohlwes, safety vice president and chief counsel for East Dubuque, Illinois-based refrigerated hauler Hirschbach Motor Lines, calls Lytx DriveCam’s ActiveVision add-on platform a “game-changer” regarding fatigue. With its ability to detect lane departure, track speed relative to traffic and capture video of the driver, “it allows us to be proactive” on detecting unsafe behavior, Kohlwes says. Accidents have been much less frequent at the 950-truck fleet since adopting ActiveVision in December.

Fatigue specialists got a detailed look at one advanced system at the International Managing Fatigue conference in San Diego last March. Mining operations in Chile and Africa are using a system from the Australian company Optalert, said Vice President Christopher Hocking.

Optalert outfits truck drivers with glasses equipped with sensors that calculate the velocity of eyelid opening and closing, as well as the duration of each closure. It rates the driver every minute, 0-10, on the Johns Drowsiness Scale, Hocking said. Drivers rating 5 or above receive an in-cab warning.
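The Johns Drowsiness Scale formula itself is proprietary, but the thresholding behavior Hocking describes — a 0-10 score computed each minute from eyelid-movement measurements, with a warning at 5 or above — can be illustrated with a toy sketch. The scoring function below is entirely invented as a stand-in; only the per-minute cadence and the warning threshold come from the article.

```python
# Hypothetical sketch of the per-minute alerting described above.
# The real Johns Drowsiness Scale formula is proprietary; score_minute()
# is a stand-in that simply scales average eyelid-closure duration.

WARNING_THRESHOLD = 5  # drivers rating 5 or above receive an in-cab warning

def score_minute(closure_durations_ms):
    """Toy 0-10 drowsiness score from one minute of eyelid-closure durations."""
    if not closure_durations_ms:
        return 0.0
    avg_ms = sum(closure_durations_ms) / len(closure_durations_ms)
    # Assumed mapping: ~100 ms closures read as alert, ~600 ms as severely drowsy.
    return max(0.0, min(10.0, (avg_ms - 100) / 50))

def should_warn(closure_durations_ms):
    return score_minute(closure_durations_ms) >= WARNING_THRESHOLD

print(should_warn([110, 120, 105]))  # alert driver -> False
print(should_warn([450, 500, 380]))  # drowsy driver -> True
```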

While no system available in North America uses glasses like Optalert’s, most other new systems offer warnings of fatigue or distraction. The more sophisticated video technologies offer “machine vision” with some degree of “learning” capabilities that enable them to tailor a driver’s rating to his normal driving habits.

Both leading dual-camera (road- and driver-facing) event-recorder systems, SmartDrive and Lytx DriveCam, have moved toward video coverage of side and other views. They both also introduced an “always-on” capability as an option, making retrieval of driving events from any recent period possible for managers looking to improve driver performance or support a defense in accident litigation.

The four platforms detailed below are moving toward, or already offering in part, functionality such as Optalert’s sophisticated fatigue and/or distraction monitoring based on facial reading and other data.


After testing the Guardian camera, shown here on the dash of a truck in the private fleet of Atlanta-area Royal Food Service, Royal outfitted the entire fleet with the system.

Seeing Machines’ Guardian system

Guardian, by the Australia-based Seeing Machines company, tracks the driver’s eyes and head position, as well as road views. It combines that data with “advanced sensors and algorithms” to make a fatigue/distraction determination, says brand manager Melissa Byron. The system then delivers audio alerts and seat vibrations to stimulate a fatigued driver back to the task at hand.

As with North American video-event recorders from SmartDrive and Lytx, any fatigue or distraction event triggers recording of forward- and driver-facing video that provides context for the event. Byron says Guardian’s footage is sent to a 24/7 company office “where a specialist reviews the footage in minutes and calls the driver’s manager if required.”

Guardian uses infrared sensors that “sit on the dashboard so that the driver-facing camera can detect driver fatigue and distraction at any time of day and when a driver is wearing sunglasses,” Byron says.

Other driver-facing camera systems use similar technologies and, like Guardian, allow fleets to customize the parameters triggering video-recording events. Guardian’s settings include speed thresholds; limitations for when to send fatigue and/or distraction events; the configuration of in-cab alarms; speed settings for capturing “overspeed” events; and manual recording settings for drivers.
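The fleet-configurable parameters listed above — speed thresholds, event toggles, alarm configuration, overspeed capture and driver-initiated recording — amount to a settings object consulted each time the system detects something. A rough sketch, with all field names and defaults invented for illustration (none come from Seeing Machines):

```python
# Hypothetical sketch of fleet-configurable trigger settings like those
# described for Guardian. All names and defaults are invented.
from dataclasses import dataclass

@dataclass
class TriggerSettings:
    min_speed_mph: float = 15.0        # ignore fatigue/distraction events below this speed
    fatigue_events_enabled: bool = True
    distraction_events_enabled: bool = True
    in_cab_alarm: str = "audio+seat"   # e.g. "audio", "seat", "audio+seat", "off"
    overspeed_mph: float = 70.0        # capture an "overspeed" event above this
    driver_manual_record: bool = True  # let the driver trigger a recording

def should_record(settings, event_type, speed_mph):
    """Decide whether a detected event should trigger video capture."""
    if event_type == "overspeed":
        return speed_mph >= settings.overspeed_mph
    if speed_mph < settings.min_speed_mph:
        return False
    if event_type == "fatigue":
        return settings.fatigue_events_enabled
    if event_type == "distraction":
        return settings.distraction_events_enabled
    return False

cfg = TriggerSettings()
print(should_record(cfg, "fatigue", 55))     # True
print(should_record(cfg, "overspeed", 72))   # True
print(should_record(cfg, "distraction", 5))  # False: below the speed threshold
```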

The company has 130 fleet customers globally, 10 percent of them in the United States. Many are in the food distribution and motorcoach sectors, Byron says.

One Guardian adopter is Royal Food Service, operating mostly within a 40-mile radius of downtown Atlanta. It’s a refrigerated private fleet of two tractor-trailers and 69 straight trucks that employs about 110 drivers, says Mark McClendon, chief financial officer. Over the last four years, a couple of serious accidents left the company mystified as to their root causes.

“So, we set about figuring out how to know what was happening inside the cabs of our trucks,” McClendon says.

Royal piloted the Guardian system with 15 cameras, then outfitted the entire fleet four months later. McClendon says the system alerts his staff to events each week, mostly fatigue- and distraction-related.

Early one Tuesday morning, one driver had three fatigue-related events. “Each time, the Guardian system vibrated the seat and gave an audible alarm to rouse the driver and keep him awake. Of course, when we got the email notifications, we alerted the driver to pull over and call the office ASAP so we could check on him and assess his fitness to finish the route. As I recall, he did finish the route without incident after getting an energy drink and walking around a little.”

Later, managers learned the driver “was taking a class on Monday nights and didn’t get home until 10:30,” McClendon says. While “the driver didn’t want us to know he was sleepy because he wanted the hours … we immediately told him that we would just take him off Tuesday routes until his class was over. Problem solved.”

Had the company not had the Guardian cameras, McClendon believes, “the only way we would have discovered that this driver was sleepy on Tuesdays was if he hit something or somebody, and even then, we may not have known what actually happened.”

McClendon believes this kind of risk mitigation is its true value. “The only way we can control our insurance costs is to control our risk,” he says.


A test of Netradyne’s interior camera shows how it could read drivers’ bodies to potentially detect fatigue or distraction.

Netradyne’s Driver-i platform

Since news of the relatively new always-on Netradyne video platform began appearing last year, Vice President Adam Kahn has been stressing a driver-rewards focus. It’s part of a strategy to flip the traditional justification for dual-view dashcams — that the video is used mostly for driver coaching and occasionally for evidence in a wreck — on its head.

Kahn, formerly with the better-known SmartDrive dual-camera video platform, says that with Netradyne’s Driver-i system, which harnesses powerful new processing chips, “I can actively turn on the camera and analyze every minute” of driving. With legacy systems recording maybe two to three minutes a day, fleets are “missing a lot,” he says.

And lest drivers bristle at the notion of always-on recording, Kahn emphasizes the majority of their driving is very good. Netradyne’s system of data analytics paired with its smart cameras aims to score drivers for that excellence, offering an “incentive for the drivers and more recognition of what they’re doing well.”

While Driver-i’s machine-vision aspects mostly are road-focused, Netradyne is working on the ability to recognize signs of fatigue and distraction inside the cab. With the current system, high-severity events detected from the data stream within the vehicle or via the road camera can come with a snapshot of the driver via an internal camera if the fleet wants it.

“There’s potential for fatigue detection there,” Kahn says, but only really after the fact. However, his company is headed toward more proactive use of data analytics around fatigue. For example, today, every time the truck is cranked, the internal camera checks for seatbelt usage. If it’s on, the driver retains points in the scoring system.

In-cab warnings via sounds, lights or seat vibrations create “a lot of tech noise that gets ignored” unless they’re delivered at the right time, Kahn says. Rather than emphasizing such alerts, he envisions a focus on fatigue prediction. A system using data analytics to pinpoint times of the day where fatigue is likely for an individual driver could reduce the hours in which the in-cab camera is on.

“Do we start looking at drowsiness [indicators] in the first hour and the seventh and eighth hour of the driving day, or after lunch?” Kahn asks. “Trends of data may tell us when drowsy driving is more prevalent” for any driver.

Netradyne is moving toward reading not only outside of the cab – with lane departure, following distance and other indicators of possible fatigue or distraction – but inside as well.

“As we get more comfortable in the application of facial recognition and eye movement and head movement” that indicate fatigue, Kahn says, “we’ll apply the same type of notification to the fleet” that might be received today for lane departure. “We can somehow notify the driver and not inundate him with beeps and buzzes – alert the driver to change his day, maybe park and take a lap around his truck.”

Around 100 fleets have shown interest, and Netradyne is testing and beginning installations in a small number of them, Kahn says. Its first announced customer, expedited-niche fleet Load One, has tested 15 units in its owner-operators’ vehicles and is planning a bigger rollout.

John Elliott, chief executive officer of the Taylor, Michigan-based company and well-known for his driver support and recognition efforts, hopes to use the system to build incentive programs for his contractors, partly in combination with the StayMetrics predictive analytics service.


The ActiveVision machine-vision-capable ER-SV2 unit from Lytx DriveCam.

Lytx’s ActiveVision add-on to the DriveCam service

Lytx is perhaps the biggest dual-camera-system purveyor. Its basic DriveCam dual-camera platform is in almost 400,000 passenger and commercial vehicles, says Gretchen Griswold, communications director. About 100,000 units are in for-hire trucking and another 70,000 in private fleets.

ActiveVision, which adds machine-vision technology to the road-facing camera to detect lane departure, following distance and more, is in less than 10 percent of those vehicles.

The company’s related Lytx Video Services (formerly “Unisyn”) platform offers an always-on capability for video retrieval. Video Services also offers side-view and other expandable options for a full view around the vehicle. However, ActiveVision and DriveCam remain exception-based in terms of what the cameras capture.

Unlike Seeing Machines’ Guardian system and, to a lesser extent, Netradyne’s Driver-i, however, ActiveVision turns its machine vision only outward to the road, detecting lane markings and reading how the truck is conforming to the driving environment.

“It tracks the driver’s speed relative to traffic,” Griswold says. Audible and visible warnings are delivered in-cab if fatigue or distraction is sensed. The system also captures video of the driver and the road ahead, as does the basic DriveCam system.

Events are reviewed by Lytx at a 24/7-staffed facility, then forwarded to the fleet with notes on what the reviewer sees as coaching-worthy or laudable, says Holly Williamson, Lytx’s senior portfolio manager for ActiveVision.

Three primary trigger events have been covered under ActiveVision since its 2016 introduction: lane departure, inadequate following distance and what the company calls “critical distance,” or when the camera detects there’s “less than 0.6 seconds between you and the vehicle ahead,” Williamson says.

To this trio, Lytx added a fourth last month: the rolling stop. The system has evolved to recognize road signs and will trigger an event when a truck “is traveling through a stop sign between 3 and 20 miles an hour,” Williamson says.
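The four ActiveVision triggers described here are essentially threshold rules over camera-derived measurements. A sketch of that logic, purely for illustration: the 0.6-second critical-distance and 3-20 mph rolling-stop figures come from the article, while the 2.0-second following-distance threshold is an assumed fleet setting and every function name is invented.

```python
# Hypothetical sketch of the four ActiveVision trigger rules described
# above. The 0.6 s "critical distance" and 3-20 mph rolling-stop band
# come from the article; the 2.0 s following-distance threshold is an
# assumed configurable setting.

CRITICAL_HEADWAY_S = 0.6
FOLLOWING_HEADWAY_S = 2.0        # assumed fleet-configurable threshold
ROLLING_STOP_MPH = (3, 20)

def classify_event(headway_s=None, departed_lane=False,
                   at_stop_sign=False, speed_mph=0.0):
    """Return the trigger events one frame of sensor data would raise."""
    events = []
    if departed_lane:
        events.append("lane_departure")
    if headway_s is not None:
        if headway_s < CRITICAL_HEADWAY_S:
            events.append("critical_distance")
        elif headway_s < FOLLOWING_HEADWAY_S:
            events.append("following_distance")
    if at_stop_sign and ROLLING_STOP_MPH[0] <= speed_mph <= ROLLING_STOP_MPH[1]:
        events.append("rolling_stop")
    return events

print(classify_event(headway_s=0.4))                   # ['critical_distance']
print(classify_event(at_stop_sign=True, speed_mph=8))  # ['rolling_stop']
```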

SmartDrive and other camera manufacturers have expanded beyond their initial single or dual (driver- and road-facing) camera setups to cover side views, as illustrated in this video feed. The expanded views add greater context to captured events.

SmartDrive’s SmartIQ data analytics platform

SmartDrive’s multi-camera event recording system is available in configurations ranging from a single road-facing camera to a multi-camera system that includes a driver-facing cam and full 360-degree coverage around the truck. It offers optional “extended recording” of full driving footage accessible at any moment. Its platform appears to have seen a wide uptake in U.S. trucking, though the company declined to share customer numbers.

The system detects fatigue only insofar as signs of it are observed by event reviewers or can be extrapolated from on-highway interactions. Jason Palmer, chief operating officer, says the system combines multiple observations into a data-rich stream that can be analyzed for driver performance. For SmartDrive, that happens through the SmartIQ platform, which scores drivers according to collision risk.

Ten years ago, fleets could detect severe maneuvers via hard-brake recording built into the electronic control module and collected by telematics devices or detected by g-force camera sensors. Now, Palmer says, “You’re bringing multiple sensors together to be more intelligent.”

In the case of fatigue, knowing the time of day or a driver’s place in his duty cycle is key. Such data can be combined with analyzing “particular types of maneuvers,” such as a swerve out of and back into a lane, captured via road-facing cameras.

The SmartDrive system can combine telematics and g-force data with visual in-cab data of the driver at the time of the maneuver and “offload that video in real time and prioritize it for a driving analyst,” Palmer says. “You’re able to see whether that driver is fatigued and [know whether] alerting the driver” with audio-visual warnings is working.

If it’s not, prioritization to the supervisor could be the next step, Palmer says. “That’s where a video-based safety system like ours can really come into play” as another level of protection.

Drivers can get performance feedback via an app or a mobile website. “They can see how they stack up against their peers,” Palmer says.

The driver-facing video aids in fatigue recognition only after the fact, when combined with truck performance data. But the company is investigating real-time fatigue detection and measurement with a variety of third-party systems.

Reliance on one element – eye-tracking data alone, for instance – “produces a lot of false positives” of fatigue, Palmer says. SmartDrive studied hard-brake reports from telematics units and determined that “87 percent were false positives,” where brake use was justified.

Some fleets Palmer and company talk to, he says, “want to be more conservative” than simply working within hours of service limits: “If the driver appears distracted, to signal them to pull over and complete a cognitive test.” Or to do the same “prior to starting a shift. At the end of the day, fleets are looking to make sure the driver is cognitively ready to operate the vehicle.”

