Particle Filter Data Fusion Enhancements for MEMS-IMU/GPS
Yafei Ren, Xizhen Ke
DOI: 10.4236/iim.2010.27051

Abstract

This research aims at enhancing the accuracy of navigation systems by integrating GPS and Micro-Electro-Mechanical-System (MEMS) based inertial measurement units (IMU). Because of the many restrictive assumptions a conventional Extended Kalman Filter (EKF) places on empirical data, its applicability to integrated MEMS-IMU/GPS navigation systems is limited. To handle the non-linear, non-Gaussian dynamic models of the inertial sensors, the method relies on a particle-cloud representation of the filtering distribution that evolves through time using importance sampling and resampling. Particle Filtering (PF) can then accurately fuse the inertial information with real-time GPS position and velocity updates. Experiments show that PF is more effective than EKF at improving the data-fusion accuracy of a MEMS-IMU/GPS navigation system.

Share and Cite:

Ren, Y. and Ke, X. (2010) Particle Filter Data Fusion Enhancements for MEMS-IMU/GPS. Intelligent Information Management, 2, 417-421. doi: 10.4236/iim.2010.27051.


1. Introduction

Inertial sensors are widely used for navigation systems [1]. Compared to GPS tracking results, inertial tracking offers attractive complementary features. Given perfect sensors, no external information other than an initial pose estimate is required. Lang et al. [1,2] show that inertial sensors can provide a good signal-to-noise ratio, especially during rapid directional changes (acceleration/deceleration) and at high rotational speeds. However, since inertial sensors only measure variation rates or accelerations, their output signals have to be integrated to obtain position and orientation data. As a result, longer integration times produce significant accumulated drift because of noise and bias.

In MEMS-IMU/GPS integration, there are nonlinear models that must be properly handled, for example: 1) nonlinear state equations describing the MEMS-IMU error propagation; 2) nonlinear measurement equations related to the pseudoranges, carrier phases and Doppler shifts measured in the GPS receiver [3]. In recent years, to overcome the problems with this nonlinearity, other nonlinear filters have also been considered for MEMS-IMU/GPS integration, for example: 1) the Particle Filter (PF); 2) the Unscented Kalman Filter (UKF); 3) the SIR Particle Filter (SPF) [4,5]. It is reported [5,6] that integrated systems with these nonlinear filters show similar performance, producing almost the same accuracies in horizontal position and velocity, while the accuracy of the heading angle can be improved.

This paper combines GPS and inertial sensors with a two-channel complementary PF, which takes advantage of the low-frequency stability of GPS sensors and the high-frequency tracking of gyro sensors.

2. System Overview

Hybrid solutions attempt to overcome the drawbacks of any single sensing solution by combining the measurements of at least two tracking methods. The fusion of complementary sensors should be used to build better tracking systems. Synergies can be exploited to gain robustness, tracking speed and accuracy, and to reduce jitter and noise. Nowadays, a hybrid tracking system is the best solution to achieve a better object pose estimation and is widely applied in recent research works.

In this work, a Particle Filter (PF) is used to estimate motion by fusing inertial and GPS data; the Extended Kalman Filter (EKF), commonly used for such data fusion and error compensation, serves as a baseline.

Most of the hybrid approaches use the EKF to estimate the object state by fusing inertial and GPS data. However, the EKF algorithm provides only an approximation to optimal nonlinear estimation. This can introduce large errors in the estimated posterior mean and covariance of the transformed Gaussian random variables, which may lead to suboptimal performance and sometimes divergence of the filter.

Our GPS system consists of a receiver providing a robust pose estimate and an Inertial Measurement Unit (IMU) providing 3-D linear acceleration and 3-D rate of turn (rate gyro) [7]. Moreover, the Particle Filtering algorithm is proposed and assessed to model the variations of the MEMS sensors' performance characteristics. Initial results show the efficiency and precision of the proposed PF modeling algorithm.

3. Three-Dimensional Error Estimation

3.1. Motion Model and System Dynamics

Any navigation-system tracking approach requires some kind of motion model, even if it only assumes constant motion. Since we have no a priori knowledge about the forces changing the motion of the GPS system or the objects, we assume no forces (accelerations) and hence constant velocities. Augmented reality (AR) systems [1] have the goal of enhancing a person's perception of the surrounding world. We use the motion model proposed in Ref. [7], where the object's motion is represented by a 15 × 1 state vector:

x = [θ, ω, p, v, a]ᵀ    (1)

where θ is the orientation of the GPS with respect to the world (we use Z–Y–X Euler angles), ω is the angular velocity, and p, v and a are the position, velocity and acceleration of the GPS with respect to the world, each a 3 × 1 vector. With these states, the discretized system dynamics are given by:

θ_{k+1} = θ_k + Δt·J(θ_k)·ω_k
ω_{k+1} = ω_k
p_{k+1} = p_k + Δt·v_k + (Δt²/2)·a_k
v_{k+1} = v_k + Δt·a_k
a_{k+1} = a_k + n_k    (2)

Figure 1. System dataflow.

where Δt is the sampling period, J(θ) is the Jacobian matrix that relates the absolute rotation angles to the angular rates, and n_k is the system random-distribution noise. At each time step, the 3-D GPS pose is given by the values of the θ and p parameters.
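As an illustration, the constant-motion propagation above can be sketched in Python. The 15-state layout [θ, ω, p, v, a] follows (1); the particular Z–Y–X Euler-rate Jacobian below is a textbook form and an assumption on our part, not necessarily the authors' exact convention:

```python
import numpy as np

def euler_rate_jacobian(theta):
    """Map body angular rate to Z-Y-X Euler-angle rates (roll, pitch, yaw order).
    Illustrative textbook form; the paper's exact J depends on its convention."""
    roll, pitch, _yaw = theta
    sr, cr = np.sin(roll), np.cos(roll)
    tp, cp = np.tan(pitch), np.cos(pitch)
    return np.array([[1.0, sr * tp, cr * tp],
                     [0.0, cr,      -sr],
                     [0.0, sr / cp, cr / cp]])

def propagate(x, dt):
    """One step of the discretized dynamics (2): constant angular rate and
    constant acceleration between samples; process noise omitted here."""
    theta, omega, p, v, a = np.split(x, 5)
    theta = theta + dt * euler_rate_jacobian(theta) @ omega
    p = p + dt * v + 0.5 * dt**2 * a
    v = v + dt * a
    return np.concatenate([theta, omega, p, v, a])
```

At zero attitude the Jacobian reduces to the identity, so small-angle propagation behaves as simple integration of the angular rate.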

 

3.2. Inertial Sensor Measurements

Our inertial sensor consists of three orthogonal rate gyroscopes, which sense angular rates of rotation about three perpendicular axes, and three accelerometers, which produce three acceleration measurements. GPS orientation changes are reported by the inertial sensor, so the transformation between the GPS frame {G} and the inertial frame {I} is needed to relate inertial and GPS motion. The rotational relationship between the two coordinate frames is given by:

ω_G = R_{GI} ω_I    (3)

where ω_G and ω_I denote the angular velocity relative to the GPS coordinate frame and the inertial coordinate frame, respectively. To determine the transformation matrix R_{GI}, we developed an efficient calibration method, described in more detail in Ref. [8]. The accelerometers produce three acceleration measurements, one for each axis (units: mm/s²), expressed in their own coordinate frame {I}. However, the acceleration term in our state vector is the linear acceleration of the {C} frame with respect to the world frame {W}. The function relating the state vector to the linear acceleration measurements is given by:

h_a(x) = R_{WI}(a − g) + ω × (ω × p_{CI})    (4)

where R_{WI} is the rotation from {W} to {I}, p_{CI} is the position of the origin of the {I} frame with respect to the {C} frame, and g is the gravity vector. R_{WI} and p_{CI} are determined by the calibration procedure.

 

3.3. Fusion Filter

The goal of the fusion filtering is to estimate the object pose parameters of (1) from the measurements of the GPS and inertial sensors [9]. The basis of our fusion algorithm is a SIR particle filter [8,10]. In this section we explain how to use such a filter to estimate the object pose. For more details on particle filter theory, see Refs. [11-13]. The motion tracking can be considered as a single-target non-linear discrete-time system whose process model can be expressed as follows:

x_k = f(x_{k−1}) + n_{k−1}    (5)

where x_k denotes the state vector of the system at time k, as defined by (1), and f represents the deterministic process model given by (2). The measurement model is given by:

z_k = h(x_k) + v_k    (6)

where v_k represents the measurement noise and the nonlinear function h relates the state vector to the measurements. In our design, the input measurements to the fusion filter come from three different sensors, i.e., GPS, gyroscope and accelerometers, each with its own output equation and an uncertainty in the output space:

3.3.1. For the Gyroscope

The gyroscope produces three angular velocity measurements, one for each axis (units: rad/s). This information will be associated only with the angular velocity term in the state vector. The gyroscope measurement model is then given by:

z_g = h_g(x) + v_g    (7)

where h_g relates the state vector to the measurement vector through an identity block, so:

H_g = [0_{3×3}  I_{3×3}  0_{3×3}  0_{3×3}  0_{3×3}]    (8)

where 0_{3×3} represents a 3 × 3 zero matrix and I_{3×3} a 3 × 3 identity matrix.
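Concretely, the block structure of H_g can be written down directly; the 15-state ordering [θ, ω, p, v, a] from (1) is assumed:

```python
import numpy as np

Z3 = np.zeros((3, 3))
I3 = np.eye(3)

# H_g picks out the angular-velocity block (states 4-6) of the 15-state vector
H_g = np.hstack([Z3, I3, Z3, Z3, Z3])

x = np.arange(15.0)      # stand-in state [theta, omega, p, v, a]
z_gyro = H_g @ x         # selects only the omega components
```

Because H_g is a pure selection matrix, the gyroscope likelihood in the filter depends only on the angular-velocity part of each particle.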

3.3.2. For the Accelerometers

The accelerometers produce three acceleration measurements, one for each axis (units: mm/s2). The accelerometer measurement model is defined by:

z_a = h_a(x) + v_a    (9)

where h_a is given by (4) and v_a is the accelerometer measurement noise.

3.3.3. Particle Filtering

After having formulated the process and measurement models for both the inertial and GPS sensors, we can now give an iteration of the SIR particle filter:

1) Hypotheses.

Let N be the number of particles.

Let p(x_0) be the prior distribution (at k = 0).

2) Initialization.

For i = 1, …, N: generate the particles x_0^{(i)} from the initial distribution p(x_0) and initialize the weights to w_0^{(i)} = 1/N.

3) Evolution.

Predict new particles using different noise realizations and the process model:

x_k^{(i)} = f(x_{k−1}^{(i)}) + n_{k−1}^{(i)}    (10)

4) Weighting.

In our application, the measurement noises are considered zero-mean Gaussian, with error covariance matrices R_G, R_g and R_a for the GPS, gyroscope and accelerometer channels, respectively. In this case, the weights are computed as follows:

w_k^{(i)} ∝ exp[ −(1/2)·(z_k − h(x_k^{(i)}))ᵀ R⁻¹ (z_k − h(x_k^{(i)})) ],  with Σ_{i=1}^{N} w_k^{(i)} = 1    (11)

where z_k is the observed measurement from the GPS or inertial sensors and h is the measurement model, corresponding to the GPS, gyroscope or accelerometer output equation according to the availability of the measurement data; R is the covariance of the active channel.
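The Gaussian weighting in (11) amounts to evaluating the measurement likelihood at each particle's predicted output and normalizing. A minimal sketch (the function and variable names are ours, not the paper's):

```python
import numpy as np

def particle_weight(z, z_pred, R):
    """Unnormalized weight: zero-mean Gaussian likelihood of the residual
    z - z_pred with covariance R (the normalizing constant of the Gaussian
    cancels once the weights are normalized)."""
    r = z - z_pred
    return float(np.exp(-0.5 * r @ np.linalg.solve(R, r)))

# Example: weights for two predicted gyro outputs, normalized to sum to 1
z = np.array([0.1, 0.0, -0.2])
preds = [np.array([0.1, 0.0, -0.2]), np.array([1.0, 1.0, 1.0])]
w = np.array([particle_weight(z, zp, 0.01 * np.eye(3)) for zp in preds])
w = w / w.sum()
```

A particle whose prediction matches the measurement exactly receives the maximum unnormalized weight of 1.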

5) Estimation.

Compute the output of the SIR filter by:

x̂_k = Σ_{i=1}^{N} w_k^{(i)} x_k^{(i)}    (12)

Increase k and iterate from step 3).

Since GPS data are obtained at a slower rate than the inertial sensor data, the filter performs object pose estimation whenever gyroscope data, accelerometer data or GPS data become available. Thus, we implement a complementary filter as shown in Figure 2. There are two particle-weighting channels sharing a common estimation module: one for GPS measurements and the other for inertial sensor measurements. Independent channel processing handles incomplete measurements. For example, when no GPS measurement is available (e.g., due to occlusions), the overall system maintains object pose tracking by using only the inertial weighting channel; vice versa, when a GPS measurement is available, only the GPS weighting channel is used to estimate the object pose, overcoming the inertial sensor drift caused by long integration times.
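One iteration of the SIR scheme, with the weighting channel chosen by data availability, can be sketched as follows. This is an illustrative toy with a one-dimensional state and a random-walk process model, not the authors' code or the full 15-state dynamics of (2):

```python
import numpy as np

def sir_step(particles, z, h, R, rng, q=0.1):
    """One SIR iteration: evolve (10), weight (11), estimate (12), resample."""
    n = len(particles)
    # 3) Evolution: random-walk process model with noise std q
    particles = particles + rng.normal(0.0, q, particles.shape)
    # 4) Weighting: Gaussian likelihood of the active channel's measurement
    resid = z - h(particles)
    w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', resid, np.linalg.inv(R), resid))
    w = w / w.sum()
    # 5) Estimation, then multinomial resampling (SIR resamples every step)
    estimate = w @ particles
    idx = rng.choice(n, size=n, p=w)
    return particles[idx], estimate

rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 10.0, size=(500, 1))
for _ in range(10):
    # here the "GPS channel" is active: h is the identity, z is the GPS reading
    particles, est = sir_step(particles, np.array([5.0]), lambda p: p, np.eye(1), rng)
```

Switching channels only means swapping the (h, R, z) triple passed to the step; the propagation, estimation and resampling stages are shared, which is exactly the structure of the two-channel complementary filter described above.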

Figure 2. Block diagram of the fusion filter.

4. Experiments

This experiment evaluates the accuracy and robustness of the proposed fusion method. We propose to use the particle-filter framework for the sensor-fusion system on MEMS-IMU/GPS. Particle filters are sequential Monte Carlo methods based upon a point-mass (or 'particle') representation of probability densities; they can be applied to any state-space model and generalize the traditional Kalman filtering methods. We tested our algorithm to evaluate its performance and compared the results obtained by the particle filter with those given by a classical extended Kalman filter.

Experimental data are presented in Figure 3.

First, the experiments were carried out in Matlab simulation software. The raw data for the pilot study consist of the inertial output of the inertial measurement unit and the real-time GPS position and velocity, which are fed to the particle-filtering data-fusion algorithm. The first result is an error analysis of the one-dimensional position, shown in Figure 3. Then the Kalman filter, the traditional extended Kalman filter, the unscented Kalman filter and the particle filter were each applied in the data-fusion experiment. The first three filter algorithms are not introduced here; the particle filter follows the formulas derived in the previous section. Experimental results are presented in Figure 4.

Finally, a new technique that augments the PF predictor with the traditional KF to improve the performance of the integrated MEMS-IMU/GPS system is presented. Initial test results show the significance of the proposed PF augmentation in reducing position and velocity drifts during GPS outages.

5. Conclusions

In this paper we presented a hybrid AR approach which uses a particle filter to combine GPS and inertial technologies in order to improve the stability and the accuracy of the registration between the virtual and real world objects when enhancing the user perception.

An overview of the developed navigation system was given, and experiments demonstrated the feasibility and reliability of the system under various situations. Furthermore, we implemented a SIR particle filter to fuse inertial and GPS data and thus estimate the object poses. We used RMSE analysis to describe the performance of the filter. The results are very satisfactory compared to those of classical AR techniques; they show that the fusion method using the particle filter achieves high tracking accuracy, stability and robustness.

Figure 3. Simulated data of the experiment.

Figure 4. Result of the fusion filter.

Table 1. The RMSE of the fusion filter.

It is possible to apply more than three distributions to the PF, and there are more error states to be investigated in real situations. Therefore, the relation between the number of distributions and the filter performance will be investigated in future work.


Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] P. Lang, A. Kusej, A. Pinz and G. Brasseur, “Inertial Tracking for Mobile Augmented Reality,” IEEE Instrumentation and Measurement Technology Conference, Anchorage, AK, USA, 2002, pp. 1583-1587.
[2] E. Marchand, P. Bouthemy and F. Chaumette, “A 2D-3D Model-Based Approach to Real-Time Visual Tracking,” Image Vision Computer, Vol. 19, No. 13, 2001, pp. 941-955.
[3] M. S. Grewal, L. R. Weill and A. P. Andrews, “Global Positioning Systems, Inertial Navigation, and Integration,” 2nd Edition, John Wiley & Sons, Hoboken, 2007.
[4] Y. Yi and D. A. Grejner-Brzezinska, “Nonlinear Bayesian Filter: Alternative to the Extended Kalman Filter in the GPS/INS Fusion Systems,” Proceedings of the Institute of Navigation, ION GNSS 2005, Long Beach, CA, 2005, pp. 1391-1400.
[5] Y. Kubo, T. Sato and S. Sugimoto, “Modified Gaussian Sum Filtering Methods for INS/GPS Integration,” Journal of Global Positioning Systems, Vol. 6, No. 1, 2007, pp. 65-73.
[6] M. Nishiyama, S. Fujioka, Y. Kubo, T. Sato and S. Sugimoto, “Performance Studies of Nonlinear Filtering Methods in INS/GPS In-Motion Alignment,” Proceedings of the Institute of Navigation, ION GNSS 2006, Fort Worth, TX, 2006, pp. 2733-2742.
[7] L. Chai, W. Hoff and T. Vincent, “Three-Dimensional Motion and Structure Estimation Using Inertial Sensors and Computer Vision for Augmented Reality,” Presence: Teleoperators and Virtual Environments, Vol. 11, No. 5, 2002, pp. 474-492.
[8] M. Maidi, F. Ababsa and M. Mallem, “Vision-Inertial System Calibration for Tracking in Augmented Reality,” Proceedings of the 2nd International Conference on Informatics in Control, Automation and Robotics, Barcelona, 2005, pp. 156-162.
[9] S. You and U. Neumann, “Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration,” Proceedings of IEEE Conference on Virtual Reality, Washington, DC, 2001, pp. 71-77.
[10] A. Doucet, N. J. Gordon and V. Krishnamurthy, “Particle Filters for State Estimation of Jump Markov Linear Systems,” IEEE Transactions on Signal Processing, Vol. 49, No. 3, 2001, pp. 613-624.
[11] N. Bergman and A. Doucet, “Markov Chain Monte Carlo data Association for Target Tracking,” Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing, Istanbul, 2000, pp. 705-708.
[12] N. J. Gordon, “A Hybrid Bootstrap Filter for Target Tracking in Clutter,” IEEE Transactions on Aerospace and Electronic Systems, Vol. 33, No. 1, 1997, pp. 353- 358.
[13] A. Doucet, N. de Freitas and N. Gordon, “An Introduction to Sequential Monte Carlo Methods,” In: A. Doucet, N. de Freitas and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, New York, 2001, pp. 3-14.

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.