The move toward full autonomy will see safety systems that combine their data for greater accuracy.
As demands for machine safety and autonomous mobile equipment have increased, perception systems have become more critical than ever to accurately recognize the position and velocity of surrounding vehicles, Vulnerable Road Users (VRU), and other hazards and obstructions.
This has come with increased resolution requirements across an extensive range of heavy-duty operating conditions. A growing emphasis on functional safety has also diversified sensor sets to include multiple technologies such as Radar, Camera Monitoring Systems (CMS), LiDAR, and GPS.
Melding the data from these different sensor types requires high-performance systems with software solutions, including sensor fusion, machine learning, AI, and advanced learning algorithms. By bringing together data sets from different sensors, the machine can provide better information about objects, VRUs, and the equipment’s environment.
Due to this, sensor fusion is one of the most critical topics in autonomous machines and is recognized as the path towards autonomy — the ultimate safety solution.
Today, new heavy-duty mobile equipment — at many different levels of autonomy — uses various sensors to understand its environment, locate itself, and move. But most of those sensors still operate independently.
Sensor Fusion & Its Different Forms
In sensor fusion, different data for the same object are taken into account, and as these systems work together, they make a machine more intelligent. Effective sensor fusion is one of the requirements for delivering the successful autonomous vehicles the market is driving toward.
To effectively combine the data from multiple sensors via sensor fusion, it’s critical that the systems provide accurate, functionally safe information (i.e., meet functional safety requirements and information levels required by autonomy).
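As a concrete, simplified illustration of what combining data for the same object can mean, the sketch below fuses two independent range readings by inverse-variance weighting, a common elementary fusion technique. The sensor names and noise figures are illustrative assumptions, not parameters of any particular product.

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent measurements
    of the same quantity (e.g., range to the same object)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always below either input variance
    return fused, fused_var

# Illustrative numbers: a noisy camera range estimate and a more
# precise Radar range estimate of the same pedestrian.
camera_range, camera_var = 10.0, 1.0    # metres, variance in m^2
radar_range, radar_var = 10.4, 0.04

rng, var = fuse(camera_range, camera_var, radar_range, radar_var)
# The fused estimate lands between the two inputs, weighted toward the
# more precise sensor, and is more certain than either sensor alone.
```

The key property is the last one: the fused variance is smaller than either input's, which is one precise sense in which fused systems provide "accurate, functionally safe information" beyond what any single sensor delivers.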
One of the premises of achieving functional safety is eliminating single points of failure. Every sensor has strengths and weaknesses, and a machine that depends on a single sensor makes that sensor a single point of failure. However, if sensor fusion is introduced and various systems are used simultaneously, the reliability of the overall system increases.
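The reliability gain from eliminating a single point of failure can be shown with a back-of-the-envelope calculation, assuming statistically independent failure modes (which diverse sensor technologies approximate far better than duplicated identical sensors). The failure rates below are purely illustrative:

```python
# Probability that perception is lost entirely, assuming independent
# failures. Rates are illustrative, not measured figures.
p_radar_fail = 0.01   # hypothetical per-mission Radar failure probability
p_cms_fail = 0.02     # hypothetical per-mission CMS failure probability

# Single sensor: that sensor IS the single point of failure.
p_loss_single = p_radar_fail

# Fused pair: perception is lost only if BOTH sensors fail at once.
p_loss_fused = p_radar_fail * p_cms_fail   # 0.0002, 50x better here
```

In practice the independence assumption is only approximate, which is exactly why diverse technologies (e.g., Radar plus CMS, which fail under different conditions) fuse better than two copies of the same sensor.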
Through fusion, collision mitigation technologies such as Radar, CMS, brake assist, LiDAR, and more can provide more advanced, functionally safe solutions than what’s on the market today:
- Camera sensors approximate the operator’s vision, reading the environment by identifying roads, pedestrians, signs, etc.
- LiDAR can accurately estimate positions but is affected by environmental conditions.
- Radar accurately detects objects and their relative position and motion but cannot classify objects.
Each of these sensors has advantages and disadvantages. Sensor fusion aims to use the benefits of each so the machine can precisely understand its environment.
The Levels of Sensor Fusion and Their Important Roles
High-level fusion approaches offer significant advantages in multi-sensor data fusion, and safety fusion systems are no exception. Advanced Driver Assistance Systems (ADAS) and autonomy have become the leading forces behind sensor fusion research, with deep learning playing a large role in recent years. However, object detection continues to pose a critical challenge in designing a robust perception system for heavy-duty machines.
Object detection technology, like CMS, is commonplace within heavy-duty markets, but alone it is not enough to significantly reduce the danger associated with equipment operations. Alone, cameras are capable of object recognition, while Radar is handy for determining an object’s distance and relative motion. By fusing vision tech with active Radar technology, such as PRECO’s PreView® Radar, information from both technologies is used to identify a person, the distance to that person, and the relative motion of both — which neither can do alone. This is why one of the most common sensor fusion developments is combining CMS with Radar.
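A simplified sketch of this kind of camera-plus-Radar fusion is to associate the two sensors' detections by bearing angle. The data structures, values, and gate threshold below are hypothetical illustrations, not the PreView® interface:

```python
# Hypothetical fusion of camera classifications with Radar tracks by
# nearest bearing angle. All structures and numbers are illustrative.
camera_detections = [          # (class label, bearing in degrees)
    ("pedestrian", -10.0),
    ("vehicle", 25.0),
]
radar_targets = [              # (bearing deg, range m, radial velocity m/s)
    (-9.2, 12.5, -1.3),        # closing at 1.3 m/s
    (24.1, 40.0, 0.0),         # stationary
]

GATE_DEG = 3.0                 # max bearing mismatch to accept a pairing

fused = []
for label, cam_bearing in camera_detections:
    # Pick the Radar target closest in bearing to this camera detection.
    bearing, rng, vel = min(radar_targets, key=lambda t: abs(t[0] - cam_bearing))
    if abs(bearing - cam_bearing) <= GATE_DEG:
        fused.append({"class": label, "range_m": rng, "velocity_mps": vel})

# 'fused' now holds what neither sensor provides alone:
# WHAT each object is (camera) plus WHERE it is and HOW it moves (Radar).
```

Real systems use far more robust association (gating in multiple dimensions, track histories), but the principle is the same: each fused object carries the camera's classification and the Radar's range and motion.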
Today, many forward-thinking OEMs are taking the first steps towards fusion by combining HMIs, followed by the fusion of lower-level data, like people-recognition and relative distance. By fusing safety technologies with HMIs, data from various systems can be prioritized in order of importance, like alerting the operator to an imminent threat to avoid an incident, resulting in improved operational safety.
Include Radar in Sensor Fusion
Accurate information on distance and relative motion is of the utmost importance for operator and equipment safety. The top three benefits of Radar (Radio Detection and Ranging) include:
- Radar sensors are good at identifying an object’s relative motion, location, and state (moving or stationary).
- Radar is not affected by environmental conditions the way other sensors, such as LiDAR, are; it remains effective in fog, dust, and other challenging conditions.
- Radar is a mature technology; by comparison, the image recognition in CMS is relatively new and still evolving, and neither CMS nor LiDAR is as mature as Radar.
Unlike other sensors that calculate the difference in position between two measurements, Radar measures the frequency shift of the returned radio wave (the Doppler effect), such as when a nearby vehicle moves towards or away from the Radar.
A strong suit of Radar sensors is their ability to accurately estimate the distance and location of objects. Radar sees the world via three measurements:
- The distance to the object
- The angle of the object
- The velocity of the object toward or away from the sensor
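The third measurement comes from the Doppler effect described above: the frequency shift of the returned wave is proportional to the object's closing speed. A minimal sketch, assuming an illustrative 77 GHz carrier (a common automotive-radar band, used here only as an example):

```python
C = 299_792_458.0        # speed of light, m/s
F_CARRIER = 77e9         # illustrative automotive-radar carrier frequency, Hz

def radial_velocity(doppler_shift_hz):
    """Radial velocity implied by a measured Doppler shift.
    For a monostatic radar, f_d = 2 * v * f_c / c, so v = f_d * c / (2 * f_c).
    A positive shift means the target is closing on the sensor."""
    return doppler_shift_hz * C / (2.0 * F_CARRIER)

# At 77 GHz, a shift of roughly 5.1 kHz corresponds to about 10 m/s
# of closing speed.
v = radial_velocity(5137.0)
```

Note that this gives only the radial component of velocity; motion perpendicular to the beam produces no Doppler shift, which is one reason Radar is typically fused with sensors that observe position over time.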
When combining vision systems with active safety systems, it’s important to know the areas in which each is or isn’t capable. Radar can identify whether an object is moving or stationary, communicating to the operator what the object is doing, but not what it is. On the other hand, CMS can classify objects, like a tree, a sign, or a VRU such as a pedestrian, but it can’t identify an object’s motion (i.e., what the object is doing). LiDAR successfully does both, but it’s expensive and heavily impacted by environmental conditions like dust, fog, or rain.
In the context of autonomy, Radar data is used to identify targets of interest from surrounding clutter. Using fusion, you can combine the strengths of Radar with those of LiDAR to create an accurate image of the environment around the sensor and machine, addressing the environmental weaknesses of LiDAR. Through this fusion, you can paint a more reliable picture of the areas surrounding the equipment, ultimately helping prevent collisions or minimize their impact, whether with other vehicles, pedestrians, or other objects.
How Sensor Fusion Can Improve Industry Safety
The technological advancements we see in the automotive space are often a little ahead of what we see on heavy-duty equipment. But many OEMs and heavy-duty market leaders are hopeful. As more new automotive vehicles come equipped with advanced front and rear CMS, as well as other sensors, decision-makers are recognizing the use-cases of these advanced safety technologies. They understand that their industry’s future lies in the fusion of safety technologies to protect equipment and assets.
Today, the most visible example of sensor fusion in automotive vehicles is rear and forward-facing CMS, which control braking systems for collision avoidance. Automobiles also implement ADAS, which is capable of supporting Adaptive Cruise Control (ACC), Automatic Emergency Braking (AEB), lane departure warning, blind spot monitoring, and self-parking. ADAS fuses raw data from multiple sensors, including cameras, short- and long-range Radar, ultrasonic sensors, and Inertial Measurement Units (IMUs).
Compared to the automotive industry, off-road heavy-duty machines have very different safety requirements — blind spots are a strong example. Regardless of the requirement, fusion brings more reliability to identifying objects in blind spots and offers operators visual support on what’s happening around the vehicle.
There’s no question that sensor fusion provides more comprehensive and dependable information than individual discrete sensors. For example, if one of several sensors surveying the same area is degraded by RF interference, the others offer operators a backup. The more complete the sensor fusion, the better the object recognition, helping operators gauge the threat level and act on it efficiently.
Sensor Fusion Advancements at PRECO — the Integration of Sensors to Create Smarter Systems
PRECO specializes in solutions optimized for heavy-duty mobile machines, purpose-building electrically and environmentally hardened systems that detect and communicate various levels of information to operators. By collecting dense amounts of information, PRECO’s systems combine that data into simplified high-level information, creating object lists and threat analyses to inform simple HMIs and fusion controllers.
Sensor fusion results from combining multiple sensing technologies, so at PRECO, integration partners are paramount. With the help of PRECO’s integration partners, such as Autonomous Solutions (ASI), Xite Solutions, Danfoss Power Solutions, Aspöck, and others, PRECO can offer more advanced solutions to the market.
Each integration partner’s strengths can offset equipment weaknesses, allowing heavy-duty mobile machines to draw on each integrated system’s strengths. These partnerships have helped propel the industry towards the future of fusion, giving heavy-duty mobile equipment access to accurate and reliable safety solutions: an invaluable asset for OEMs looking to thrive with sensor fusion.
As demand for sensor fusion and sensor adoption grows, PRECO’s engineers and software developers have created flexible solutions that seamlessly integrate our safety products with existing systems. These current safety technologies include CMS, ADAS, mobile DVRs, safety alarms, telematics, and GPS systems.
PRECO has a long-standing history of working with the top OEMs in the market, providing products for many brands seen on- and off-road. As an increasing number of market-driven products become deeply involved in what’s next in the heavy-duty space, PRECO has continued to invest in Radar innovations and progressive partnerships to help make heavy-duty mobile machine operations safer than ever.