Sensor and data fusion combines the benefits of different sensors and measuring principles as effectively as possible, providing data that individual sensors working independently cannot generate.
Fusing data from multiple sensors increases measurement reliability, range, and accuracy. The different measuring principles can also be used to cross-confirm detected objects.
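The accuracy gain can be made concrete with a minimal sketch: two independent, unbiased measurements of the same quantity combined by inverse-variance weighting. All numbers below are illustrative assumptions, not data from any particular sensor pair.

```python
# Minimal sketch (hypothetical values): inverse-variance weighting of two
# independent, unbiased measurements of the same quantity. The fused
# variance is smaller than either sensor's alone, which is the sense in
# which fusion increases accuracy.

def fuse(z1, var1, z2, var2):
    """Inverse-variance weighted average of two scalar measurements."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)
    return z_fused, var_fused

# Illustrative example: sensor A reports 10.2 m (variance 0.5),
# sensor B reports 9.8 m (variance 0.2).
z, var = fuse(10.2, 0.5, 9.8, 0.2)
```

The fused estimate lands between the two readings, weighted toward the more precise sensor, and its variance (1/7 ≈ 0.14) is lower than either input's.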
Sensor and data fusion has a variety of applications, such as GPS/INS, where Global Positioning System and inertial navigation system data are fused using methods such as the extended Kalman filter (EKF), a nonlinear version of the Kalman filter that linearizes about an estimate of the current mean and covariance.
For well-defined transition models, the EKF is considered the de facto standard in nonlinear state estimation, navigation systems, and GPS.
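One EKF predict/update cycle can be sketched on a toy problem. The 2-D position state, the simple motion model, and the nonlinear range measurement below are all illustrative assumptions, not a real GPS/INS pipeline; the point is the linearization step, where the measurement Jacobian is evaluated at the predicted state.

```python
import numpy as np

# Hedged sketch of one EKF predict/update cycle for a toy 2-D position
# state, with a nonlinear range measurement (distance from the origin).
# All numbers and models are illustrative assumptions.

def ekf_step(x, P, u, z, Q, R, dt=1.0):
    # Predict: simple linear motion model x' = x + u*dt (so F = I here).
    F = np.eye(2)
    x_pred = x + u * dt
    P_pred = F @ P @ F.T + Q

    # Measurement model h(x) = sqrt(x1^2 + x2^2) is nonlinear, so the
    # EKF linearizes it with the Jacobian H evaluated at the prediction.
    r = np.linalg.norm(x_pred)
    H = (x_pred / r).reshape(1, 2)

    # Update: standard Kalman equations on the linearized model.
    y = z - r                               # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + (K * y).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([3.0, 4.0])      # prior position estimate
P = np.eye(2)                 # prior covariance
u = np.array([0.1, 0.0])      # velocity from the inertial side
z = 5.2                       # range-like measurement from the GPS side
Q = np.eye(2) * 0.01          # process noise
R = np.array([[0.25]])        # measurement noise
x, P = ekf_step(x, P, u, z, Q, R)
```

After the update the covariance shrinks relative to the prediction, reflecting the information gained by fusing the measurement with the propagated state.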
Sensor and data fusion is also useful in determining the attitude of an aircraft using low-cost sensors.
Additionally, a data fusion approach can determine the traffic state (low traffic, medium flow, traffic jam) using roadside acoustic, image, and sensor data.
Although technically not a dedicated sensor fusion method, modern convolutional neural network (CNN) based methods can simultaneously process many channels of sensor data (such as hyperspectral imaging with hundreds of bands) and fuse the relevant information to produce classification results.
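The channel-fusing building block of such networks can be sketched in isolation: a 1x1 convolution is a per-pixel linear map that compresses many spectral bands into a few feature channels. The shapes and random weights below are made up for illustration; a real hyperspectral classifier would learn the weights and stack further layers.

```python
import numpy as np

# Illustrative sketch: a 1x1 convolution, the building block CNNs use to
# fuse many spectral channels per pixel into fewer feature maps. Shapes
# and weights are illustrative assumptions, not a trained network.

rng = np.random.default_rng(0)
bands, h, w = 200, 8, 8        # e.g. a 200-band hyperspectral patch
features = 16                  # fused feature channels

cube = rng.standard_normal((bands, h, w))         # input data cube
weights = rng.standard_normal((features, bands)) * 0.05

# A 1x1 convolution is a linear map across the channel axis at each pixel:
fused = np.einsum('fb,bhw->fhw', weights, cube)
fused = np.maximum(fused, 0.0)                    # ReLU nonlinearity
```

Each output pixel now carries 16 fused features instead of 200 raw bands, which downstream layers can classify.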
The fusion of radar and multi-purpose camera data is also highly relevant for automated driving. The Bosch road signature makes it possible for automated vehicles to determine their precise position, enabling highly accurate and robust vehicle localization based on road features.
In this context, radar and camera sensor systems, and the fusion of their data, provide important information on objects in the vehicle's surroundings. While the video sensor system detects relevant features in the vehicle's surroundings, such as road lanes and traffic signs, the radar sensor identifies stationary features such as guardrails and traffic sign gantries.
The fusion and processing of this collected data provides a constantly updated picture of the surroundings to help with high-precision localization.
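A minimal sketch of such a localization fusion step: two independent 2-D position fixes, say one from camera feature matching and one from radar feature matching, combined by inverse-covariance (information) weighting. All values are illustrative assumptions; a production localizer would run this inside a filter over time.

```python
import numpy as np

# Hedged sketch: information-weighted fusion of two independent Gaussian
# position fixes (e.g. camera-based vs. radar-based feature matching).
# All numbers are illustrative assumptions.

def fuse_positions(p1, C1, p2, C2):
    """Inverse-covariance weighted combination of two 2-D position fixes."""
    I1, I2 = np.linalg.inv(C1), np.linalg.inv(C2)
    C = np.linalg.inv(I1 + I2)        # fused covariance
    p = C @ (I1 @ p1 + I2 @ p2)       # fused position
    return p, C

p_cam = np.array([12.0, 3.1])         # camera fix: precise laterally
C_cam = np.diag([0.4, 0.1])
p_rad = np.array([11.6, 3.4])         # radar fix: precise longitudinally
C_rad = np.diag([0.1, 0.5])
p, C = fuse_positions(p_cam, C_cam, p_rad, C_rad)
```

Each axis of the fused position is pulled toward the sensor that is more precise on that axis, and the fused covariance is tighter than either input, which is the "constantly updated picture" benefit in miniature.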
Want to know more? Tonex offers the Sensor and Data Fusion Training Bootcamp, a 3-day course covering the technologies, tools, and methods used to automatically filter, aggregate, extract, and fuse multi-sensor data useful to intelligence analysts and war fighters.
Learn about the application of artificial neural network technology to data fusion for target recognition, airborne target recognition, activity-based intelligence, C4ISR, electronic warfare (EW), radar and EO/IR thermal imaging sensors, missile defense, cyber warfare, air, space, and maritime surveillance, net-centric warfare, effects-based operations, process control, proactive maintenance, and industrial automation.
For more information, questions, or comments, contact us.