When humans don’t give their full attention to the task at hand, they miss deadlines, fail certifications, and collide with objects or other drivers. In an age of constant digital interruptions, it is no wonder humans have trouble ignoring distractions. The human brain does not possess the ability to pay attention to everything going on in the cab and the environment surrounding heavy-duty mobile machinery; the brain must choose where to focus.
As demands for mobile machine safety and higher levels of equipment autonomy push for more advanced perception systems, there is an increasing emphasis on diversifying sensor sets to include multiple technologies, such as Radar, Camera Monitor Systems (CMS), LiDAR, and GPS, to provide better resolution. However, with the ever-expanding amount of information now readily available in the cab, manufacturers and end-users must work together to balance this flow of data and keep equipment operators’ attention safely on the task at hand.
When operators seem to give in to distractions, how can the latest technologies aid the brain to ensure operations run smoothly? Today, OEMs can leverage this biological limitation as an advantage by using powerful, innovative tools to turn “distractions” into useful information. Through a filtering process, systems can now prioritize the information and messages provided by purpose-built sensor technology to serve operators and end-users better.
Computers on Wheels – If It Can Be Sensed, It Can Be Monitored
You may have heard before that the modern truck is essentially a computer on wheels. Today’s fleets run more than 400 sensors and millions of lines of code, and as the drive toward improved operational safety and maximized efficiency continues, not to mention autonomous vehicles, those numbers are only growing.
Heavy-duty equipment today can monitor everything from vehicle location, to approximate load weight, to the appropriate transmission gear for performance and fuel economy. Essentially, if it can be sensed, it can be monitored.
In order to communicate useful information to a driver or technician, these sensors must work as part of a larger system. Sensors collect data, and that data is then transmitted to the larger system via the CAN bus. However, the CAN bus is only a tool for communication. It operates sort of like a telephone, connecting electronic control units (ECU) and allowing them to communicate, but not providing the “language” for them to talk to each other.
Most of today’s commercial vehicles use the Society of Automotive Engineers (SAE) protocol J1939. J1939 is a standardized language across ECUs, meaning that instead of relying on manufacturer-specific protocols (or languages), all the ECUs in a vehicle communicate using one language. That standardization is important because it streamlines communication and enables in-vehicle connectivity and advanced telematics.
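To make the “one language” idea concrete, the sketch below shows how a J1939 29-bit extended CAN identifier encodes message routing: the Parameter Group Number (PGN) identifies what the message is, and the source address identifies which ECU sent it. This is a minimal illustration based on the J1939-21 identifier layout, not production code.

```python
def decode_j1939_id(can_id: int) -> dict:
    """Split a 29-bit J1939 extended CAN identifier into its fields.

    Layout, most to least significant: 3-bit priority, 1-bit extended
    data page, 1-bit data page, 8-bit PDU Format (PF), 8-bit PDU
    Specific (PS), 8-bit source address.
    """
    priority = (can_id >> 26) & 0x7
    pdu_format = (can_id >> 16) & 0xFF
    source_address = can_id & 0xFF

    # The PGN spans the data-page bits, PF, and (for PDU2 only) PS.
    pgn = (can_id >> 8) & 0x3FFFF
    if pdu_format < 240:
        # PDU1 (destination-specific): PS is a destination address,
        # not part of the PGN, so it is zeroed out.
        pgn &= 0x3FF00
    return {"priority": priority, "pgn": pgn, "source_address": source_address}

# Example: EEC1 (engine speed/torque) frames carry PGN 61444 (0xF004),
# so an ID of 0x0CF00400 decodes to priority 3, PGN 61444, source 0.
```

Because every ECU on the bus agrees on this layout, a telematics unit can identify any message from any manufacturer’s controller by its PGN alone.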
As technology evolves, sensors are being tasked with providing an increasingly complex and accurate perception of the machine and its environment. For full automation, and for the system to understand its own maintenance needs, no single sensor technology is sufficient: combining multiple sensor systems to offset each technology’s deficiencies is key, and sensor fusion – merging data from complementary sensors to improve overall system performance and provide redundancy – is essential.
An important aspect of any technology is the ability to cut through the noise and deliver only information that is meaningful and actionable. Not every audible alarm or flash on the visual display represents an incident avoided; that is why the process of separating useful information from distracting information – false-positive alerts, and even low-priority positive-positives – has to be thoroughly tested, vetted, and proven useful.
Sensor and Display Distractions
Each year the Federal Ministry of Transport and Digital Infrastructure (BMVI) invests 41 million euros in road safety research carried out by the Federal Highway Research Institute (BASt). In 2021, BASt published a long-term study assessing the viability of “Camera-Monitor Systems (CMS) as a Source for Driver Information,” as well as an assessment of assistance safety systems in the context of human-machine interactions (HMI).
To examine the effect of “continuously assisting driving functions” in the context of human-machine interactions (HMI) in particular, BASt developed an evaluation tool. Using this instrument, their scientists have observed and evaluated comprehensive and standardized systems from various manufacturers concerning HMI (i.e., the interaction behavior between the driver and the system in test scenarios).
In the study, researchers noted the “initial accidents involving vehicles with these systems in the USA indicate that drivers are having difficulty understanding the systems fully and behaving correctly. The particular complexity lies in the fact that drivers have to monitor functions instead of performing them themselves. This fundamentally changes the perception of the driver’s role. Permanent monitoring is less successful if drivers are not actively involved” (BASt, HMI Assessment of Safety).
After assessing camera systems compared to or in conjunction with mirrors, the BASt study concluded that a CMS “would currently be suitable for use in road traffic under certain conditions, but these conditions are still too restrictive. This includes, for example, the high light sensitivity, which makes driving in the dark with bright light sources in rear traffic or in low sunlight difficult to impossible.”
Vision systems have become a critical part of heavy equipment safety. Currently, many OEMs offer rear visibility cameras to help companies avoid unnecessary accidents and potential litigation. However, this technology is a passive approach to collision avoidance; a CMS requires the equipment operator’s attention, placing the responsibility for identifying an obstacle or person on the operator.
However, it is essential to note that the BASt study added that CMS has the potential to offer a safe alternative to conventional exterior mirrors by combining “the advantages of an integrated display of the right and left sides of the driver’s own vehicle with a positioning of the system close to the visual axis. The situation can thus be quickly grasped with a few glances.”
Although it’s clear that CMS, and even mirrors, can distract drivers from the task at hand, they have the potential to offer useful information. Alone, cameras can recognize objects, while Radar is handy for determining an object’s distance and relative motion. By fusing vision systems with active radar technology, such as PreView® Radar, information from both technologies is used to identify a person, the distance to that person, and the relative motion of both — which neither can do alone. Therefore, one of the most common sensor fusion developments is combining CMS with radar.
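The division of labor can be sketched in a few lines. The example below is a hypothetical illustration of the fusion idea, not PreView’s actual implementation: a camera detection supplies the object class, a radar track supplies range and radial velocity, and the fused result pairs them by matching bearing angles.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    bearing_deg: float   # direction of the object, from its image position
    label: str           # classification, e.g. "person" or "vehicle"

@dataclass
class RadarTrack:
    bearing_deg: float
    range_m: float       # distance to the object
    velocity_mps: float  # radial velocity; negative means approaching

def fuse(cameras, radars, max_bearing_gap_deg=5.0):
    """Pair each camera detection with the nearest radar track by bearing.

    Returns (label, range_m, velocity_mps) tuples: the classification
    only the camera can provide, plus the distance and relative motion
    only the radar can provide.
    """
    fused = []
    for cam in cameras:
        best = min(radars,
                   key=lambda r: abs(r.bearing_deg - cam.bearing_deg),
                   default=None)
        if best and abs(best.bearing_deg - cam.bearing_deg) <= max_bearing_gap_deg:
            fused.append((cam.label, best.range_m, best.velocity_mps))
    return fused
```

A real system would associate tracks over time and in a common coordinate frame, but even this toy version shows why the combination answers a question neither sensor can answer alone: “a person, 8.5 meters away, moving toward the vehicle.”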
It’s Not Only False-Positives, But Positive-Positives
A conventional sensor-rich system can detect objects, do primary classification, alert the operator to hazardous road conditions, and sometimes move, slow down, or even stop vehicles. If done well, this form of assistance can provide a welcomed sense of convenience and safety, allowing the operator to feel more confident.
However, systems that alert the operator to every object within range of the sensor – generating nuisance alerts, or false positives – have been seen to push even the most experienced operators to eventually turn the system off. This action then inadvertently prevents the operator from receiving and reacting to positive-positives, placing them in danger of colliding with an object.
Often, we have seen systems such as these ultimately turned off or ignored altogether, compounding an already dangerous situation. That is why, in on-road systems, operators typically only want to be alerted to moving objects, especially those moving toward the vehicle.
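One hypothetical way to implement this moving-objects-only behavior is to filter radar detections by their motion over the ground, which radar measures directly via the Doppler effect. The sketch below assumes each detection’s radial velocity has already been compensated for the host vehicle’s own motion, so stationary clutter such as guardrails and parked cars reads near zero:

```python
def should_alert(range_m, ground_velocity_mps,
                 alert_range_m=15.0, approach_threshold_mps=-0.5):
    """Decide whether a radar detection warrants an operator alert.

    Assumes ego-motion compensation has been applied upstream, so a
    stationary object reads near 0 m/s while an object closing on the
    vehicle reads negative. Objects outside the alert zone, or not
    approaching, are suppressed as nuisance detections.
    """
    in_zone = range_m <= alert_range_m
    approaching = ground_velocity_mps <= approach_threshold_mps
    return in_zone and approaching
```

The thresholds here are illustrative placeholders; in practice they would be tuned and validated per application so that suppression never masks a genuine threat.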
Understanding this struggle, our engineers sought to create a solution intelligent enough to tell the difference between a high-level threat and a non-threat. When a heavy machine is in use, the operator needs to be aware of their surroundings, especially where they cannot see. That may seem obvious, but until now engineering has lagged in delivering technology that solves this issue effectively. The importance of differentiating between stationary and moving objects, to avoid nuisance and false-positive alerts, has been the driving force behind the engineering of PreView’s systems.
For example, the PreView Side Defender®II system gates its audible alert with the turn signal. While the vehicle is en route, the system provides a visual alert whenever the radar detects an object in the blind spot; only when the turn signal is active does it add an audible alert to warn the operator that an evasive maneuver may be needed. Because the audible alarm sounds only when the operator intends to move toward the detected object, the system does not give off nuisance alerts.
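That gating logic can be sketched as a simple state function. This is a hypothetical illustration of the behavior described above, not the product’s actual code: the visual indicator tracks the radar detection continuously, while the audible alarm fires only when the turn signal shows intent to move toward the occupied blind spot.

```python
def alert_outputs(object_in_blind_spot: bool, turn_signal_on: bool):
    """Return (visual, audible) alert states for one evaluation cycle.

    The visual alert follows the radar detection at all times; the
    audible alert is added only when the operator signals an intended
    lane change or turn while the blind spot is occupied.
    """
    visual = object_in_blind_spot
    audible = object_in_blind_spot and turn_signal_on
    return visual, audible
```

The design choice is to keep low-urgency information on a quiet channel (the display) and reserve the intrusive channel (sound) for moments when the operator’s next action could cause a collision.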
Issues With Overreliance
As vehicle automation advances, companies look to revolutionize the way we travel and, for the most part, eliminate the highest-risk component: the human operator, who is still in control of most vehicle operating functions. This human component is studied under “Human Factors,” the science dealing with the application of information on physical and psychological characteristics to the design of devices and systems for human use.
This focus on advanced vehicle technology is ultimately about compensating for human limitations, with safety and efficiency in mind. Vehicle systems such as Lane Departure Warning, Automatic Braking, and Blind Spot Detection account for these human limitations and, in some instances, take the human out of the equation. However, as these industries seek to radically change the overall safety of vehicle operation, there are lessons to be learned from the past.
Back in the 1960s and 1970s, the aviation industry began to automate flight systems to substantially reduce the incidence of human error and increase safety and efficiency. This soon came with one major downfall: over-reliance on automation – the expectation that these systems would consistently perform as engineered – sometimes had catastrophic outcomes. Over time, this over-reliance degraded pilot skills and created severe instances of confirmation bias.
One such example occurred on June 1, 2009, when Air France flight 447 departed Rio de Janeiro, Brazil, en route to Paris. While crossing the Atlantic Ocean, the aircraft encountered severe weather, and icing caused the pitot tubes (the airspeed sensors at the front of the plane) to fail, which deactivated the aircraft’s autopilot system. The pilots struggled to diagnose the issue and were unable to fall back on the most fundamental manual flying skills to take control of the aircraft, and the plane crashed into the ocean.
As heavy-duty vehicle engineering approaches new levels of automation and control, it is crucial that engineers learn from other industries in their development and integration of these automated controls, and that operators remain engaged enough to recognize and react appropriately to vehicle warning systems. Solutions must strike a balance, not overloading the operator with incoming data, messages, and alerts, so that operators can make correct decisions.
Operators still gather most of their input through their eyes, and advanced sensor technologies enhance our perception and vision capabilities to make heavy-duty mobile machinery, and those around it, safer. For operators, these systems currently offer obvious benefits alongside situational drawbacks. Even so, they are integral to the future of operating heavy-duty machinery: there will always be a margin of human error for these solutions to counteract, avoiding costly and dangerous collisions.
While ADAS and other sensor technologies continue to develop, work environments have improved with lower margins of error, fewer collisions, and increased reassurance for the operator. However, when considering the differences between the distractions and useful information these advanced technologies provide, safety must remain the true driving force behind automation.
April is Distracted Driving Awareness Month – a designation created in large part due to our technological strides in communication. As far as we’ve come in bringing the future of communication to our front doors, we’ve also taken a big step backward: the actions we take on our devices while behind the wheel are proven to increase crash risk.
You can create awareness in your workplace, your home and community by sharing the distracted driving message. The National Safety Council offers infographics, a poster, fact sheet and several social media-friendly graphics, which you can download.