Sensor fusion


Sensor fusion is the combining of sensory data, or data derived from sensory data, from disparate sources such that the resulting information has less uncertainty than would be possible if these sources were used individually. Uncertainty reduction in this case can mean a more accurate, more complete, or more dependable result, or refer to the result of an emerging view, such as stereoscopic vision (calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints).[1][2]

The data sources for a fusion process are not required to originate from identical sensors. One can distinguish direct fusion, indirect fusion, and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of heterogeneous or homogeneous sensors, soft sensors, and history values of sensor data, while indirect fusion uses information sources like a priori knowledge about the environment and human input.

Sensor fusion is also known as (multi-sensor) data fusion and is a subset of information fusion.

In the context of vision, sensory fusion is defined as the unification of visual excitations from corresponding retinal images into a single visual perception, i.e. a single visual image. Single vision is the hallmark of retinal correspondence; double vision is the hallmark of retinal disparity.

Examples of sensors

Sensors whose outputs are commonly fused include accelerometers, gyroscopes, magnetometers, cameras, radar, lidar, sonar and other acoustic sensors, and GPS receivers.

Sensor fusion algorithms

Sensor fusion is a term that covers a number of methods and algorithms, including:

  • Central limit theorem (as in the first example calculation below)
  • Kalman filter
  • Bayesian networks
  • Dempster–Shafer theory

Example sensor fusion calculations

Two example sensor fusion calculations are illustrated below.

Let $z_1$ and $z_2$ denote two sensor measurements with noise variances $\sigma_1^2$ and $\sigma_2^2$, respectively. One way of obtaining a combined measurement $z_3$ is to apply the central limit theorem, which is also employed within the Fraser–Potter fixed-interval smoother, namely [3][4]

$$z_3 = \sigma_3^2 \left( \sigma_1^{-2} z_1 + \sigma_2^{-2} z_2 \right),$$

where $\sigma_3^2 = \left( \sigma_1^{-2} + \sigma_2^{-2} \right)^{-1}$ is the variance of the combined estimate. It can be seen that the fused result is simply a linear combination of the two measurements weighted by their respective noise variances.
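This combination can be sketched in a few lines of Python; the function name and example values below are illustrative, not from the cited sources:

```python
def fuse_inverse_variance(z1, sigma1_sq, z2, sigma2_sq):
    """Fuse two scalar measurements by inverse-variance weighting.

    Implements z3 = sigma3^2 * (z1/sigma1^2 + z2/sigma2^2) with
    sigma3^2 = 1 / (1/sigma1^2 + 1/sigma2^2).
    """
    w1 = 1.0 / sigma1_sq            # weight of the first measurement
    w2 = 1.0 / sigma2_sq            # weight of the second measurement
    sigma3_sq = 1.0 / (w1 + w2)     # variance of the combined estimate
    z3 = sigma3_sq * (w1 * z1 + w2 * z2)
    return z3, sigma3_sq

# The precise sensor (variance 0.01) dominates the noisy one (variance 1.0),
# and the fused variance is smaller than either input variance.
print(fuse_inverse_variance(z1=10.2, sigma1_sq=0.01, z2=11.5, sigma2_sq=1.0))
```

Note that the fused variance $\sigma_3^2$ is always smaller than either input variance, which is the uncertainty reduction described above.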

Another method to fuse two measurements is to use the optimal Kalman filter. Suppose that the data is generated by a first-order system and let $P_k$ denote the solution of the filter's Riccati equation. By applying Cramer's rule within the gain calculation, it can be found that the filter gain is given by [4]

$$L_k = \begin{bmatrix} \dfrac{\sigma_2^2 P_k}{\sigma_2^2 P_k + \sigma_1^2 P_k + \sigma_1^2 \sigma_2^2} & \dfrac{\sigma_1^2 P_k}{\sigma_2^2 P_k + \sigma_1^2 P_k + \sigma_1^2 \sigma_2^2} \end{bmatrix}.$$

By inspection, when the first measurement is noise free, the filter ignores the second measurement and vice versa. That is, the combined estimate is weighted by the quality of the measurements.
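The gain calculation can be sketched in Python, assuming the scalar Riccati solution and both noise variances are given; names and values are illustrative:

```python
def fused_kalman_gain(p, sigma1_sq, sigma2_sq):
    """Kalman gain for a first-order system observed by two sensors.

    p is the solution of the filter's Riccati equation; sigma1_sq and
    sigma2_sq are the measurement noise variances. Returns the two
    entries of the gain row vector L_k.
    """
    denom = sigma2_sq * p + sigma1_sq * p + sigma1_sq * sigma2_sq
    return (sigma2_sq * p / denom, sigma1_sq * p / denom)

# When the first measurement is noise free (sigma1_sq = 0), the gain on the
# second measurement vanishes and the filter relies on the first alone.
print(fused_kalman_gain(p=1.0, sigma1_sq=0.0, sigma2_sq=0.5))  # (1.0, 0.0)
```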

Centralized versus decentralized

In sensor fusion, centralized versus decentralized refers to where the fusion of the data occurs. In centralized fusion, the clients simply forward all of their data to a central location, where some entity is responsible for correlating and fusing it. In decentralized fusion, the clients take full responsibility for fusing the data. "In this case, every sensor or platform can be viewed as an intelligent asset having some degree of autonomy in decision-making."[5]

Multiple combinations of centralized and decentralized systems exist.
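The Python sketch below contrasts the two topologies using the inverse-variance combination from the example above; for independent measurements both yield the same numbers, and the practical differences lie in communication load, autonomy, and fault tolerance. All names and values are illustrative:

```python
def centralized_fusion(measurements):
    """Clients forward raw (value, variance) pairs; a central node fuses them."""
    w = [1.0 / var for _, var in measurements]     # inverse-variance weights
    fused_var = 1.0 / sum(w)
    fused = fused_var * sum(wi * z for wi, (z, _) in zip(w, measurements))
    return fused, fused_var

def decentralized_fusion(local_estimates):
    """Each client has already fused its own raw data into a local estimate;
    only the (estimate, variance) pairs are exchanged and combined."""
    return centralized_fusion(local_estimates)     # same combination rule,
    # applied to locally pre-fused estimates rather than raw data

# Two clients, each with two raw readings of the same quantity.
client_a = [(10.1, 0.04), (10.3, 0.04)]
client_b = [(9.8, 0.09), (10.0, 0.09)]

# Centralized: all four raw readings go to the central node.
print(centralized_fusion(client_a + client_b))

# Decentralized: each client fuses locally, then the local results are combined.
print(decentralized_fusion([centralized_fusion(client_a),
                            centralized_fusion(client_b)]))
```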

Levels

There are several categories or levels of sensor fusion that are commonly used.[6][7][8][9][10][11]

  • Level 0 – Data alignment
  • Level 1 – Entity assessment (e.g. signal/feature/object).
    • Tracking and object detection/recognition/identification
  • Level 2 – Situation assessment
  • Level 3 – Impact assessment
  • Level 4 – Process refinement (i.e. sensor management)
  • Level 5 – User refinement

Applications

One application of sensor fusion is GPS/INS, in which Global Positioning System and inertial navigation system data are fused using various methods, e.g. the extended Kalman filter. This is useful, for example, in determining the attitude of an aircraft using low-cost sensors.[12] Another example is using a data fusion approach to determine the traffic state (low traffic, traffic jam, medium flow) from roadside-collected acoustic, image, and other sensor data.[13]
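As an illustration only, the following Python sketch fuses a one-dimensional "INS" acceleration with "GPS" position fixes using a linear Kalman filter, a simplified stand-in for the extended Kalman filter used in practice; the model and all noise values are assumptions:

```python
import numpy as np

dt = 0.1                       # time step [s], assumed
F = np.array([[1.0, dt],       # state transition: [position, velocity]
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # control input: measured acceleration (INS)
              [dt]])
H = np.array([[1.0, 0.0]])     # GPS observes position only
Q = 1e-3 * np.eye(2)           # process noise (accelerometer errors), assumed
R = np.array([[4.0]])          # GPS position noise variance [m^2], assumed

x = np.zeros((2, 1))           # initial state estimate
P = np.eye(2)                  # initial covariance

def step(x, P, accel, gps_pos):
    # Predict with the INS acceleration measurement.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Correct with the GPS position fix.
    y = np.array([[gps_pos]]) - H @ x       # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = step(x, P, accel=0.2, gps_pos=0.05)
print(x.ravel())  # fused position and velocity estimate
```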

A practical example of how to combine data from different displacement and position sensors in order to obtain high bandwidth at high resolution can be found in a master's thesis,[14] which applies optimal filtering (in the sense of minimizing, e.g., an energy norm) and a MIMO Kalman filter.
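One classical way to obtain high bandwidth at high resolution is a complementary filter: low-pass the accurate but slow sensor, high-pass the fast but drifting one, and sum the results, since the two first-order filters below add up to unity. This Python sketch is a generic illustration of that idea, not the specific method of the cited thesis; the sample rate and crossover frequency are assumptions:

```python
import numpy as np

fs = 1000.0                  # sample rate [Hz], assumed
fc = 5.0                     # crossover frequency [Hz], assumed
alpha = 1.0 / (1.0 + 2.0 * np.pi * fc / fs)   # one-pole filter coefficient

def complementary_filter(slow_accurate, fast_noisy):
    """Fuse two equally long NumPy sample streams of the same quantity."""
    fused = np.empty_like(slow_accurate)
    lp = slow_accurate[0]            # low-pass filter state
    hp = 0.0                         # high-pass filter state
    prev_fast = fast_noisy[0]
    for i, (s, f) in enumerate(zip(slow_accurate, fast_noisy)):
        lp = alpha * lp + (1.0 - alpha) * s   # keep slow sensor's DC accuracy
        hp = alpha * (hp + f - prev_fast)     # keep fast sensor's dynamics
        prev_fast = f
        fused[i] = lp + hp
    return fused
```

Because the low-pass and high-pass transfer functions sum exactly to one, a signal common to both sensors passes through undistorted while each sensor contributes only the frequency band where it is trustworthy.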

See also

References

  1. ^ Elmenreich, W. (2002). Sensor Fusion in Time-Triggered Systems, PhD Thesis (PDF). Vienna, Austria: Vienna University of Technology. p. 173.
  2. ^ Haghighat, M. B. A., Aghagolzadeh, A., & Seyedarabi, H. (2011). Multi-focus image fusion for visual sensor networks in DCT domain. Computers & Electrical Engineering, 37(5), 789-797.
  3. ^ Maybeck, P. S. (1982). Stochastic Models, Estimation, and Control. River Edge, NJ: Academic Press.
  4. ^ a b Einicke, G.A. (2012). Smoothing, Filtering and Prediction: Estimating the Past, Present and Future. Rijeka, Croatia: Intech. ISBN 978-953-307-752-9.
  5. ^ N. Xiong; P. Svensson (2002). "Multi-sensor management for information fusion: issues and approaches". Information Fusion. 3 (2): 163–186.
  6. ^ Rethinking JDL Data Fusion Levels
  7. ^ Blasch, E., Plano, S. (2003) “Level 5: User Refinement to aid the Fusion Process”, Proceedings of the SPIE, Vol. 5099.
  8. ^ J. Llinas; C. Bowman; G. Rogova; A. Steinberg; E. Waltz; F. White (2004). "Revisiting the JDL data fusion model II". International Conference on Information Fusion. CiteSeerX 10.1.1.58.2996.
  9. ^ Blasch, E. (2006) "Sensor, user, mission (SUM) resource management and their interaction with level 2/3 fusion" International Conference on Information Fusion.
  10. ^ http://defensesystems.com/articles/2009/09/02/c4isr1-sensor-fusion.aspx
  11. ^ Blasch, E., Steinberg, A., Das, S., Llinas, J., Chong, C.-Y., Kessler, O., Waltz, E., White, F. (2013) "Revisiting the JDL model for information Exploitation," International Conference on Information Fusion.
  12. ^ Gross, Jason; Yu Gu; Matthew Rhudy; Srikanth Gururajan; Marcello Napolitano (July 2012). "Flight Test Evaluation of Sensor Fusion Algorithms for Attitude Estimation". IEEE Transactions on Aerospace and Electronic Systems. 48 (3): 2128–2139. doi:10.1109/TAES.2012.6237583.
  13. ^ Joshi, V.; Rajamani, N.; Takayuki, K.; Prathapaneni, N.; Subramaniam, L. V. (2013). Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence.
  14. ^ Piri, Daniel (2014). "Sensor Fusion in Nanopositioning". Vienna, Austria: Vienna University of Technology. p. 140.