For a self-driving car to perceive its surroundings, the key is integrating information from all of its sensors. Most sensor fusion today happens at the object level rather than the raw-data level: each core sensor processes its measurements on its own chip to produce object-level output (or streams raw data to the main processor, which generates the objects), and only then are the objects from different sensors fused. This approach is called 'late fusion.' But generating objects filters out seemingly irrelevant information, so raw data is lost, and that filtered-out data might have been useful when fused with data from other sensors. Consider scenarios such as direct sunlight blinding a camera, snow covering a radar, or two sensors reporting inconsistent results: in such cases, object-level fusion faces serious challenges.
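The contrast between the two approaches can be sketched in a few lines of NumPy. The detection structures and tensor shapes below are invented for illustration, not DeepScale's actual pipeline:

```python
import numpy as np

# Late fusion: each sensor's own chip emits object-level detections;
# the main processor only merges these already-filtered target lists.
camera_objects = [{"cls": "car", "xy": (12.0, 3.1), "conf": 0.9}]
radar_objects = [{"cls": "car", "xy": (12.4, 3.0), "conf": 0.6}]
late_fused = camera_objects + radar_objects  # raw detail already discarded

# Early (raw-data) fusion: align the raw measurements into one tensor
# and let a single network see everything, including weak evidence that
# a per-sensor pipeline would have filtered out before fusion.
camera_frame = np.random.rand(3, 64, 64)  # RGB image channels
radar_grid = np.random.rand(1, 64, 64)    # radar returns resampled to the image grid
early_fused = np.concatenate([camera_frame, radar_grid], axis=0)

print(early_fused.shape)  # (4, 64, 64): one input for a single fusion DNN
```

In the blinded-camera scenario above, the late-fusion merger has nothing to work with once the camera chip emits an empty object list, whereas an early-fusion network still sees the radar's raw returns alongside the degraded image.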
DeepScale, a startup based in Mountain View, California, has developed a 'perception system' that can be used for ADAS and highly automated vehicles. Its defining feature is that it fuses raw sensor data rather than object data, and it can run on embedded devices.
DeepScale CEO Forrest Iandola says that much deep neural network (DNN) research consists of tweaking existing DNN frameworks. DeepScale instead developed its own DNNs that consume raw data, not only from image sensors but also from radar and lidar. Well-developed, pre-trained DNN frameworks already exist for computer vision, thanks to the many early entrants in the automated-driving field, but DNN training had not been applied to other perceptual data such as radar and lidar returns. DeepScale wants to fill that gap. At the University of California, Berkeley, Iandola and his team developed a compact deep neural network model called SqueezeNet, and some of the team later joined his startup. SqueezeNet was not designed to solve autonomous driving; the team's original goal was to 'make the model as small as possible while maintaining reasonable accuracy on computer vision datasets.' Iandola was also involved in developing a related framework, FireCaffe, which accelerates training and can target embedded devices. In a paper, Iandola and his team showed that, running on a GPU cluster, FireCaffe could train large neural networks at scale.
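The size savings behind SqueezeNet come largely from its 'Fire module', which squeezes the channel count with 1x1 convolutions before a mix of 1x1 and 3x3 'expand' filters. Simple parameter arithmetic shows the effect; the layer sizes below are illustrative, in the spirit of the paper's early modules rather than its exact configuration:

```python
def fire_module_params(c_in, s1x1, e1x1, e3x3):
    """Weight count (biases ignored) of a SqueezeNet-style Fire module:
    a 1x1 'squeeze' layer feeding parallel 1x1 and 3x3 'expand' layers."""
    squeeze = c_in * s1x1 * 1 * 1
    expand = s1x1 * e1x1 * 1 * 1 + s1x1 * e3x3 * 3 * 3
    return squeeze + expand

def plain_conv_params(c_in, c_out, k=3):
    """Weight count of an ordinary convolution layer for comparison."""
    return c_in * c_out * k * k

fire = fire_module_params(96, s1x1=16, e1x1=64, e3x3=64)
conv = plain_conv_params(96, 128)  # same 128 output channels, all 3x3
print(fire, conv, round(conv / fire, 1))  # 11776 110592 9.4
```

For these illustrative sizes, the Fire module needs roughly 9x fewer weights than a plain 3x3 layer with the same output width, which is what makes such models plausible on embedded automotive hardware.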
DeepScale has partnered with a number of radar and lidar vendors, both veteran suppliers and newer technology companies, to develop pre-trained algorithms for OEMs. 'DeepScale's goal is to develop DNNs that can ingest data from different sensors without per-customer customization.' The system shares information across multiple sensors, maximizing perception accuracy and reducing uncertainty. Labeled training data can be reused across different sensors, with only minimal calibration required on the sensor side. DeepScale's solution is independent of the particular sensor configuration, and its DNNs can run on different processor platforms. The technology lets OEMs and Tier 1 suppliers build an AI-based environment-modeling solution without training their own neural networks or writing their own algorithms.
DeepScale believes the advantages of its approach are that it is independent of both the sensor configuration and the processor and, more importantly, that it runs efficiently on the processor with low power consumption. Raw data from four cameras and one radar can be processed on a smartphone-class processor (such as a Qualcomm Snapdragon), and raw data from twelve sensors can be processed on an NVIDIA GPU. In designing a highly automated vehicle architecture, data-level sensor fusion is not the only way, but DeepScale argues it is the smartest way. Collecting data from many sensors and fusing it at the raw-data level in real time has drawn mixed reactions in the automotive industry. Perry believes that, given OEMs' incremental development history, this design philosophy will meet some opposition. Few companies can provide AI software with pre-trained algorithms for raw-data sensor fusion that covers the entire environment model. He added, 'AImotive and Mobileye have similar work, but it requires a dedicated, specific main processor.'
DeepScale's focus is on the first two parts of the autonomy stack, perception and positioning & planning, which support the entire environment model, including object recognition, occupancy grids, lane segmentation, object tracking, and self-localization. Iandola does not believe DeepScale must wait for fully automated driving to arrive before promoting the technology.
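The environment-model outputs listed above can be pictured as one shared data structure handed from perception to planning. The container below is a hypothetical sketch; the field names, shapes, and units are illustrative assumptions, not DeepScale's actual interface:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class EnvironmentModel:
    """Hypothetical bundle of the perception outputs named in the article."""
    detections: list            # object recognition: class, box, confidence
    occupancy_grid: np.ndarray  # occupancy grid around the ego vehicle
    lane_mask: np.ndarray       # per-pixel lane segmentation
    tracks: list                # object tracks with IDs and velocities
    ego_pose: tuple             # self-localization: (x, y, heading)


model = EnvironmentModel(
    detections=[{"cls": "car", "box": (10, 4, 2, 2), "conf": 0.93}],
    occupancy_grid=np.zeros((200, 200)),        # e.g. 0.25 m cells, 50 m x 50 m
    lane_mask=np.zeros((128, 256), dtype=bool),
    tracks=[{"id": 7, "vel": (8.3, 0.1)}],
    ego_pose=(512.0, 128.5, 0.02),
)
print(len(model.detections), model.occupancy_grid.shape)
```

Bundling the outputs this way reflects the article's point: one raw-data fusion network feeds every downstream consumer, rather than each sensor delivering its own partial object list.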