To meet the need for multi-sensor fusion in automated driving, most vehicle automation and ADAS systems rely on the integration of three types of sensors for environmental sensing: millimeter-wave radar, lidar, and cameras. As sensing and environment-perception technology grows in importance, the industry broadly agrees that sensor fusion is what will deliver a more robust and truly automated system.

   As sensor resolution improves alongside advances in environment-perception technology, sensors have moved beyond simple detection and ranging toward genuine 'visual' perception, including classification and mapping. Fusing data from millimeter-wave radar, cameras, and lidar is essential for global positioning and for understanding the surrounding environment, and it provides the technical foundation required to implement Level 3 to Level 5 automated driving. In environment perception, each sensor has distinct strengths and weaknesses. Millimeter-wave radar measures range, though at low resolution, and is comparatively robust to weather; cameras offer higher resolution and can perceive color, but are sensitive to lighting conditions; lidar provides three-dimensional information and is better at reconstructing the environment. Against this background, only the fusion of several sensors can give the vehicle a sufficiently accurate picture of its environment and allow OEMs to meet safety standards. At present, the production and cost challenges of high-performance lidar remain one of the obstacles both to multi-sensor fusion programs and to automated driving itself.
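To make this complementarity concrete, the following is a minimal, hypothetical sketch of a late-fusion step. The class and field names (CameraDetection, RadarDetection, LidarDetection, fuse) are illustrative assumptions, not part of any production stack described in the article; the sketch simply pairs the camera's class label with the radar's range and speed and the lidar's 3D position for one object.

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CameraDetection:      # high resolution, class/color information, light-sensitive
    label: str              # e.g. "pedestrian", "car"
    bearing_deg: float      # direction of the object relative to the vehicle

@dataclass
class RadarDetection:       # coarse resolution, but robust range and speed
    range_m: float
    radial_speed_mps: float

@dataclass
class LidarDetection:       # dense 3D structure of the scene
    x_m: float
    y_m: float
    z_m: float

@dataclass
class FusedObject:
    label: str
    position_m: Tuple[float, float, float]
    range_m: float
    radial_speed_mps: float

def fuse(cam: CameraDetection,
         radar: RadarDetection,
         lidar: Optional[LidarDetection]) -> FusedObject:
    """Combine complementary attributes: class from the camera, range and
    speed from the radar, 3D position from the lidar when available."""
    if lidar is not None:
        position = (lidar.x_m, lidar.y_m, lidar.z_m)
    else:
        # Fallback: project the radar range along the camera bearing.
        position = (radar.range_m * math.cos(math.radians(cam.bearing_deg)),
                    radar.range_m * math.sin(math.radians(cam.bearing_deg)),
                    0.0)
    return FusedObject(cam.label, position, radar.range_m, radar.radial_speed_mps)

if __name__ == "__main__":
    obj = fuse(CameraDetection("pedestrian", 12.0),
               RadarDetection(34.5, -1.2),
               LidarDetection(33.7, 7.2, 0.4))
    print(obj)
```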
Enhancing vehicle safety
  
   As vehicles improve their ability to process complex environmental data, they will eventually achieve more accurate detection, classification, and localization. Processing that data, however, can delay driving decisions, and minimizing this latency is another key part of enhancing vehicle safety. One solution is to fuse multi-source data: data fusion can reduce the computational resources the system needs to reach a driving decision. Adding 3D data from a lidar simplifies the perception workload compared with relying on cameras and millimeter-wave radar alone. In the future, as traditional sensors become increasingly unable to meet the needs of more advanced environment perception, new sensors will be introduced, especially for higher-speed scenarios. Object detection and classification performance also needs to improve to meet the requirements of more advanced automated driving systems. In addition, pedestrians, cyclists, small road debris, and other obstacles remain problems that current vehicle perception has not fully solved.
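A hedged illustration of why lidar 3D data simplifies the task: a monocular camera must infer distance from a size prior, whereas a lidar return is a direct measurement. The focal length, assumed pedestrian height, and function names below are illustrative assumptions introduced for this sketch only.

```python
import math

# Illustrative pinhole-camera parameters (assumptions, not from the article).
FOCAL_LENGTH_PX = 1400.0
ASSUMED_PEDESTRIAN_HEIGHT_M = 1.7

def range_from_camera_bbox(bbox_height_px: float) -> float:
    """Monocular range estimate: requires a prior on the object's real-world
    size and degrades when that prior is wrong (children, cyclists, debris)."""
    return FOCAL_LENGTH_PX * ASSUMED_PEDESTRIAN_HEIGHT_M / bbox_height_px

def range_from_lidar_point(x_m: float, y_m: float, z_m: float) -> float:
    """Lidar range is a direct Euclidean measurement; no size prior is needed."""
    return math.sqrt(x_m**2 + y_m**2 + z_m**2)

if __name__ == "__main__":
    print(f"camera estimate:   {range_from_camera_bbox(80.0):.1f} m")
    print(f"lidar measurement: {range_from_lidar_point(28.3, 6.1, 0.2):.1f} m")
```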

  Although software schemes can raise detection rates, even pushing missed detections toward zero, coping with real driving scenes remains a problem to be solved. More and more artificial-intelligence algorithms are being introduced into perception schemes, but their robustness and stability are not yet validated because of the 'black box' nature of deep learning. As the resolution of vehicle sensors increases, especially millimeter-wave radar, the concept of 'visual perception' in automated driving systems will broaden further. Lidar and the other sensors on an automated vehicle in fact act as key 'infrastructure' that supports the vision system and the AI algorithms built on top of it: the more capable the underlying devices, the further the software can be developed. Visual perception therefore goes beyond collecting raw data around the vehicle and takes on a broader definition and wider application scenarios, such as acquiring and drawing high-precision 3D map data to support vehicle self-localization within environment perception. Multi-sensor fusion is the only path to more advanced automated driving. Each architecture has its own advantages and disadvantages, but the redundancy provided by a multi-sensor solution is indispensable.
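As one minimal sketch of what such redundancy can mean in practice (the rule and names here are hypothetical, not a policy from the article), a fused system might require agreement from at least two independent modalities before confirming an object, so that a single blinded or failed sensor cannot by itself create or suppress a detection:

```python
from typing import Dict

def confirm_object(detections: Dict[str, bool], min_agreeing: int = 2) -> bool:
    """Illustrative 2-of-3 redundancy rule: accept an object only if at
    least `min_agreeing` sensor modalities report it."""
    return sum(detections.values()) >= min_agreeing

if __name__ == "__main__":
    print(confirm_object({"camera": True, "radar": True, "lidar": False}))   # True
    print(confirm_object({"camera": True, "radar": False, "lidar": False}))  # False
```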