Sensor fusion is the aggregation of data from multiple sensors to gain a more accurate picture of the sensors' subject or environment than can be determined by any one sensor alone.
Software that synthesizes results from multiple sources yields insights faster and enables more sophisticated analysis than was possible when data from each sensor had to be processed separately. The processing power and algorithms needed to fuse the combined data are commonly found in mobile devices such as tablets, exercise and health monitors, and smartphones. For example, a smartphone's e-compass uses data from the device's gyroscope, magnetometer and accelerometer to provide more stable and accurate directional readings than any one of those sensors could produce alone.
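The statistical idea behind combining sensors can be illustrated with an inverse-variance weighted average: given independent readings of the same quantity, the fused estimate has lower variance than any single input. The sketch below is a simplified illustration of that principle, not the actual algorithm used by any particular e-compass; the sensor values and variances are made up for the example.

```python
def fuse(readings_and_variances):
    """Inverse-variance weighted average of independent sensor readings.

    Each input is a (reading, variance) pair. Weighting each reading by
    the inverse of its variance yields a fused estimate whose variance
    is lower than that of any single sensor.
    """
    weights = [1.0 / var for _, var in readings_and_variances]
    total = sum(weights)
    estimate = sum(w * r for (r, _), w in
                   zip(readings_and_variances, weights)) / total
    fused_variance = 1.0 / total
    return estimate, fused_variance

# Two hypothetical heading readings in degrees: a noisy one (variance 4.0)
# and a steadier one (variance 1.0).
est, var = fuse([(90.8, 4.0), (89.6, 1.0)])
print(round(est, 2), var)  # fused variance 0.8 is below both inputs
```

The fused variance here is 0.8, smaller than either sensor's own variance, which is the quantitative sense in which fusion gives "a more accurate picture" than any one sensor.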
Fusing data from similar, complementary sensors can also reduce errors. For example, an accelerometer's data can help isolate the offset drift that occasionally appears in a gyroscope and would otherwise skew the gyroscope's collected data. Even identical sensors can provide new data when paired: 3D cameras, for example, use dual cameras to gather depth information and mimic stereoscopic vision.
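A common textbook way to realize this accelerometer-corrects-gyroscope pairing is a complementary filter, which blends the gyroscope's smooth but drift-prone integrated angle with the accelerometer's noisy but drift-free angle. The sketch below assumes a stationary sensor with a constant 0.5 deg/s gyroscope bias, chosen arbitrarily for illustration:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer-derived angle.

    The gyroscope integrates smoothly but drifts over time; the
    accelerometer is noisy but has no long-term drift. Weighting the
    integrated gyro estimate by alpha and the accelerometer angle by
    (1 - alpha) bounds the drift without losing responsiveness.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate a stationary device: the true angle is 0 degrees.
# The gyro reports a constant 0.5 deg/s offset drift; the
# accelerometer reports the correct 0 degrees.
angle = 0.0
for _ in range(1000):  # 10 s of samples at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=0.0, dt=0.01)

# Integrating the biased gyro alone would drift to 5 degrees;
# the accelerometer term holds the error to about 0.25 degrees.
print(round(angle, 3))  # → 0.245
```

Production systems often use a Kalman filter for the same job, which additionally tracks the estimate's uncertainty, but the complementary filter shows the core idea in a few lines.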
Microsoft has integrated sensor fusion into Windows 8 and later systems, and Google's Android also supports applications that use multiple sensors. Sensor fusion is commonly used in the military for intelligent processing of remote sensing imagery. Other use cases include automotive and transportation systems, healthcare, public safety and weather forecasting.