What is Sensor Fusion?
Sensor fusion plays a central role in any device that produces estimates of measurable quantities. It is the process of combining inputs from multiple sensors into a single model whose result is more accurate than that of any individual input alone. There are three fundamental methods of sensor fusion:
- Redundant Sensors: All sensors give the same information about the environment
- Complementary Sensors: The sensors provide independent, disjoint pieces of information about the environment
- Coordinated Sensors: The sensors collect information about the environment sequentially
From there, the information is communicated in one of three ways. In a centralized setup, all sensors provide information to a common central node. In a decentralized configuration, no information is communicated between the sensor nodes. In a distributed organization, the nodes exchange sensor information at a given rate.
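As a simple illustration of the centralized case, here is a minimal sketch (with hypothetical sensor values, not Inertial Labs code) of a central node fusing redundant readings of the same quantity by weighting each sensor with the inverse of its noise variance:

```python
# Minimal sketch: centralized fusion of redundant sensors.
# Each sensor reports the same quantity (e.g., a range in meters) with its own
# noise variance; the central node combines them with inverse-variance weights.

def fuse_redundant(readings, variances):
    """Inverse-variance weighted average of redundant measurements."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, readings)) / total
    fused_variance = 1.0 / total  # fused estimate is more certain than any single input
    return estimate, fused_variance

# Example: three range sensors observing the same target (hypothetical numbers)
est, var = fuse_redundant([10.2, 9.8, 10.5], [0.04, 0.09, 0.25])
print(f"fused range: {est:.2f} m, variance: {var:.4f}")
```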
Sensor Fusion Implemented in Kalman Filters
The Kalman filter was developed in 1960 by Rudolf E. Kalman and was quickly adopted for aerospace navigation, most famously in the Apollo program.
These filters have been integral to the navigation of US Navy nuclear ballistic missile submarines, cruise missiles, and reusable launch vehicles, as well as to the attitude control of spacecraft.
In a general sense, a Kalman filter operates on a series of measurements observed over time. These measurements naturally contain statistical noise and other inaccuracies that, left uncorrected, would skew the results.
The Kalman filter's job is to produce estimates of the underlying unknown variables, and these estimates tend to be more accurate than the data recorded by any sensor alone. At each step, the filter predicts the current state of the variables along with its uncertainty, then combines that prediction with the new measurement in a weighted average. The weighting corresponds to the level of certainty in each value: the more certain the filter is about a piece of data, the more heavily that data is weighted.
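A minimal one-dimensional sketch of this predict-and-weight cycle is shown below; the process and measurement noise variances are hypothetical and chosen only to show how the Kalman gain weights each new measurement by its certainty.

```python
# Minimal 1-D Kalman filter sketch: estimate a slowly varying value from
# noisy measurements. Q and R below are hypothetical noise variances.

def kalman_1d(measurements, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    x, p = x0, p0                    # state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict: the state is assumed constant, so only the uncertainty grows.
        p = p + q
        # Update: the Kalman gain weights the measurement by its certainty.
        k = p / (p + r)              # high certainty in z -> gain closer to 1
        x = x + k * (z - x)          # weighted average of prediction and measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([0.9, 1.1, 1.0, 1.2, 0.95]))
```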
At Inertial Labs, we use sensor fusion so that the estimated data is more accurate than the raw data provided by any individual sensor.
Sensor Fusion in the OptoAHRS-II

The OptoAHRS-II is an optically enhanced Attitude and Heading Reference System that is small, consumes little power, and provides precise North-finding and North-keeping. With a reliable 3DOF orientation module, it can operate in almost any environment. It works through the use of reference images: pictures of the horizon or of a nearby object in a given direction. Within each reference image the system identifies a constellation of visual features, and the heading of any subsequent image is determined by comparing its features back to the appropriate reference image. This optical orientation data is incorporated into the sensor fusion solution, which makes the device resilient to changes in magnetic interference in the environment. These technologies form a safety net for each other, with each sensor compensating for the others' errors, allowing for very accurate measurements.
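To illustrate the idea (this is a conceptual sketch, not the OptoAHRS-II algorithm), heading can be propagated with a rate gyro between optical fixes and nudged toward the optical heading whenever a reference-image match is available:

```python
# Conceptual sketch: dead-reckon heading with the gyro, then correct it whenever
# an optical heading fix from a reference image is available, weighting the fix
# by an assumed confidence. All values and weights here are hypothetical.

def propagate_and_correct(heading_deg, gyro_rate_dps, dt,
                          optical_heading_deg=None, optical_weight=0.3):
    # Propagate with the rate gyro between optical fixes.
    heading_deg = (heading_deg + gyro_rate_dps * dt) % 360.0
    if optical_heading_deg is not None:
        # Blend toward the optical fix along the shortest angular path.
        error = (optical_heading_deg - heading_deg + 540.0) % 360.0 - 180.0
        heading_deg = (heading_deg + optical_weight * error) % 360.0
    return heading_deg

h = 90.0
h = propagate_and_correct(h, gyro_rate_dps=0.5, dt=0.1)                # gyro-only step
h = propagate_and_correct(h, 0.5, 0.1, optical_heading_deg=91.0)       # step with an optical fix
print(h)
```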
Applications for OptoAHRS-II
Indirect Fire Control
The OptoAHRS-II is an effective solution for those who are required to perform training simulations under realistic field conditions. It has proven to be a useful training solution for 60mm, 81mm, 105mm, and 120mm mortars. Our sensor fusion filtering algorithm lets the user trust the optical data even in a dynamic environment: the filter adjusts the certainty weighting of each sensor based on the motion the IMU detects.
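A sketch of that idea is below; the thresholds and scaling are assumptions for illustration, not the shipped filter. The measurement noise variance R for the aiding sensor is inflated as the IMU reports stronger dynamics, so the filter trusts that sensor less while the platform is maneuvering.

```python
# Illustrative sketch (assumed thresholds): inflate the measurement noise
# variance R for an aiding sensor when the IMU reports high dynamics, so the
# filter weights that sensor less while the platform is moving hard.

def adaptive_measurement_variance(base_r, gyro_rate_dps, accel_g,
                                  rate_limit=30.0, accel_limit=1.5):
    """Scale R up as the detected dynamics approach the assumed limits."""
    dynamics = max(abs(gyro_rate_dps) / rate_limit, abs(accel_g - 1.0) / accel_limit)
    scale = 1.0 + 9.0 * min(dynamics, 1.0)   # R grows up to 10x under heavy motion
    return base_r * scale

print(adaptive_measurement_variance(0.05, gyro_rate_dps=5.0, accel_g=1.02))   # near-static
print(adaptive_measurement_variance(0.05, gyro_rate_dps=45.0, accel_g=2.0))   # dynamic
```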
Antenna Pointing
The OptoAHRS-II stabilizes heading without any GNSS input. In GNSS-denied environments, it is a strong solution for antennas that require high pointing accuracy with low latency. The sensor fusion algorithm provides accurate heading, pitch, and roll of the antenna so that it can achieve logon with the satellite regardless of the antenna's location. With Inertial Labs' experience in the field and a wide variety of AHRS models, we are confident in our ability to provide solutions for antenna pointing.
Sensor Fusion in the INS-P

When using the INS-P, sensor fusion is applied to determine the orientation of a system in 3D space. We use magnetometers, MEMS gyroscopes, and accelerometers, all of which are prone to error, and a Kalman filter estimates the orientation of the object from the information presented by these sensors.
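As a rough illustration of the measurement side of such a filter (a textbook formulation with assumed body axes, not the INS-P implementation), roll and pitch can be derived from the accelerometer's gravity vector and a tilt-compensated heading from the magnetometer; in the full filter these angles correct the orientation propagated from the gyroscopes.

```python
# Sketch of the measurement side only: roll/pitch from gravity and a
# tilt-compensated heading from the magnetometer. Assumes x-forward, y-right,
# z-down body axes; angles are returned in radians.
import math

def accel_mag_orientation(ax, ay, az, mx, my, mz):
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetic field vector into the horizontal plane.
    mxh = (mx * math.cos(pitch)
           + my * math.sin(pitch) * math.sin(roll)
           + mz * math.sin(pitch) * math.cos(roll))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    heading = math.atan2(-myh, mxh)
    return roll, pitch, heading

# Example with hypothetical sensor readings (accelerometer in g, magnetometer in uT)
print(accel_mag_orientation(0.02, -0.01, 0.99, 20.0, 1.5, 43.0))
```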


Applications of INS-P
Remote Sensing
LiDARs use pulses of infrared light to determine the distance to an object. A rotating system makes it possible to send out light pulses and measure the time it takes for each pulse to return to the sensor.
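The range calculation itself is straightforward; a small sketch of the time-of-flight relationship is below (the example pulse timing is hypothetical).

```python
# Time-of-flight range sketch: the emitted pulse travels to the target and back,
# so the one-way distance is half the round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s):
    return C * round_trip_s / 2.0

print(tof_range_m(667e-9))  # roughly 100 m for a 667 ns round trip
```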


Inertial Labs: The Reliable Source
For the past decade, Inertial Labs has continued to produce units with the accuracy and precision that our customers have come to count on. The OptoAHRS-II and INS-P are no exception to this expectation: whether through GNSS availability or the implementation of powerful magnetometers, both devices deliver highly accurate orientation information through the use of their robust Kalman filters. Here at Inertial Labs we pride ourselves on producing reliable, accurate products at the most cost-effective price point possible. Our sensor fusion algorithm is one of the many ways we get the best performance out of our solutions.
