
GPS & IMU Sensor Fusion for Automotive Dead Reckoning
Picture a car weaving through Boston's urban canyons, where skyscrapers scatter GPS signals and the IMU's magnetometer swings with every passing steel structure. This project tamed that chaos by fusing the IMU's split-second reflexes (100 Hz accelerometer/gyroscope data) with GPS's steady anchor (1 Hz fixes). Using MATLAB and ROS, I transformed raw sensor data into a precision navigation engine.
Why does it matter? Dead reckoning is autonomy's backbone, proving sensors can keep a vehicle on course in cities where GPS falters. Skills showcased: sensor fusion, real-time filtering, and geospatial analytics.
Navigating a vehicle through dense urban environments—where GPS signals flicker between skyscrapers and IMU sensors drift like a compass in a storm—is a formidable challenge. This project engineered a sensor fusion pipeline to merge the IMU’s rapid motion tracking with GPS’s steady geospatial anchors, delivering a robust solution for autonomous navigation in signal-degraded areas.
Technical Approach
Sensor Setup & Calibration:
VN-100 IMU: Captured 100 Hz accelerometer, gyroscope, and magnetometer data, mounted on the vehicle dashboard.
BU-353S4 GPS: Provided 1 Hz latitude/longitude fixes via NMEA GPGGA strings, roof-mounted for optimal sky visibility.
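Decoding those GPGGA strings amounts to splitting the comma-separated sentence and converting NMEA's ddmm.mmmm angle encoding to decimal degrees. The project did this in MATLAB/ROS; the sketch below is a minimal Python equivalent (the sentence contents are illustrative, not logged data):

```python
def parse_gpgga(sentence):
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA sentence.

    Assumes a well-formed sentence; returns None when the fix fields are empty.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[2] == "":
        return None

    def dm_to_deg(dm, hemisphere):
        # NMEA packs angles as ddmm.mmmm (lat) / dddmm.mmmm (lon):
        # everything before the last two integer digits is whole degrees.
        deg_len = len(dm.split(".")[0]) - 2
        degrees = float(dm[:deg_len]) + float(dm[deg_len:]) / 60.0
        return -degrees if hemisphere in ("S", "W") else degrees

    lat = dm_to_deg(fields[2], fields[3])
    lon = dm_to_deg(fields[4], fields[5])
    return lat, lon

# Example sentence (synthetic): 42 deg 17' N, 71 deg 05' W
lat, lon = parse_gpgga(
    "$GPGGA,123519,4217.0000,N,07105.0000,W,1,08,0.9,10.0,M,,M,,*47")
```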
Magnetometer Calibration:
Hard Iron Correction: Removed static magnetic offsets caused by the vehicle’s metal frame, recentering raw data to the origin.
Soft Iron Correction: Normalized elliptical distortions in the magnetic field using axis scaling, achieving a near-perfect unit-circle distribution.
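The two corrections above reduce to recentering and rescaling the raw x/y field samples. A minimal Python sketch (the project implemented this in MATLAB; this version uses a simple min/max fit rather than a full ellipse fit):

```python
import numpy as np

def calibrate_magnetometer(mag_xy):
    """Hard- and soft-iron correction for 2-D magnetometer samples.

    mag_xy: (N, 2) array of raw x/y readings collected while driving in
    circles. Hard iron appears as a constant offset (recentred to the
    origin); soft iron appears as an ellipse, undone here by per-axis
    scaling back toward a circle.
    """
    # Hard-iron: remove the static offset from the vehicle's metal frame
    offset = (mag_xy.max(axis=0) + mag_xy.min(axis=0)) / 2.0
    centred = mag_xy - offset
    # Soft-iron: rescale each axis so the elliptical locus becomes circular
    radii = (mag_xy.max(axis=0) - mag_xy.min(axis=0)) / 2.0
    scale = radii.mean() / radii
    return centred * scale
```

A full soft-iron fit would estimate an ellipse orientation as well; axis scaling suffices when the distortion is roughly axis-aligned.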
Yaw Estimation via Complementary Filtering:
Magnetometer Yaw: Derived from tilt-compensated magnetic field measurements, stable but prone to urban electromagnetic interference.
Gyroscope Yaw: Integrated angular velocity for smooth short-term tracking but susceptible to drift.
Fusion Strategy: A complementary filter blended the two:
Low-Pass Filter (0.15 Hz): Smoothed magnetometer noise (e.g., power lines, steel structures).
High-Pass Filter (2 Hz): Suppressed gyroscope drift, preserving agility during sharp turns.
Outcome: 42% reduction in yaw error compared to raw gyroscope integration.
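In discrete time, the low-pass/high-pass pair above collapses into a single blending step per sample. The sketch below shows that standard form in Python; the blend weight alpha (related to the crossover frequency by alpha = tau / (tau + dt)) stands in for the project's explicit 0.15 Hz / 2 Hz MATLAB filters, and angle wrap-around is ignored for brevity:

```python
import numpy as np

def fuse_yaw(yaw_mag, gyro_z, dt, alpha=0.98):
    """Complementary filter: high-pass the gyro, low-pass the magnetometer.

    yaw_mag: magnetometer-derived yaw per sample (rad).
    gyro_z:  yaw rate (rad/s) at the same 100 Hz timestamps.
    alpha near 1 trusts gyro integration over short horizons, while the
    (1 - alpha) magnetometer term slowly bleeds off the accumulated drift.
    """
    yaw = np.empty_like(yaw_mag)
    yaw[0] = yaw_mag[0]
    for k in range(1, len(yaw_mag)):
        gyro_step = yaw[k - 1] + gyro_z[k] * dt      # short-term integration
        yaw[k] = alpha * gyro_step + (1 - alpha) * yaw_mag[k]
    return yaw
```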
Velocity Alignment & Trajectory Reconstruction:
IMU Velocity: Bias-corrected accelerometer data was integrated to estimate speed, with high-pass filtering (0.1 Hz) to suppress drift.
GPS Validation: Velocity derived from Haversine-calculated distances between consecutive GPS fixes served as a ground-truth reference, including zero-velocity checks during stops.
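The Haversine step is standard: the great-circle distance between consecutive fixes, divided by the 1 s fix interval, gives the reference speed. A self-contained Python sketch (the project computed this in MATLAB):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes (degrees)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def gps_speed(lat1, lon1, lat2, lon2, dt=1.0):
    """Speed (m/s) between consecutive fixes, dt = 1 s at 1 Hz."""
    return haversine_m(lat1, lon1, lat2, lon2) / dt
```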
Trajectory Estimation:
Decomposed IMU velocity into Easting/Northing components using fused yaw.
Integrated velocities over time to reconstruct the vehicle’s path, aligning it with GPS coordinates via initial heading rotation.
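The decomposition-and-integration step above can be sketched in a few lines of Python. The yaw convention here (counter-clockwise from East) and the rectangular integration are assumptions; the project's MATLAB pipeline may differ in both:

```python
import numpy as np

def dead_reckon(speed, yaw, dt, e0=0.0, n0=0.0):
    """Integrate forward speed along the fused yaw into Easting/Northing.

    speed: (N,) forward speed in m/s from the bias-corrected IMU.
    yaw:   (N,) fused heading in rad, measured counter-clockwise from East.
    Returns an (N, 2) trajectory anchored at the first GPS fix (e0, n0).
    """
    ve = speed * np.cos(yaw)            # Easting velocity component
    vn = speed * np.sin(yaw)            # Northing velocity component
    east = e0 + np.cumsum(ve) * dt      # rectangular integration at 100 Hz
    north = n0 + np.cumsum(vn) * dt
    return np.column_stack([east, north])
```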
Key Results
Heading Stability: Complementary filter reduced yaw drift by 42%, critical for lane-keeping in urban turns.
Velocity Consistency: IMU-GPS speed estimates aligned within 0.48 m/s RMSE post-correction.
Trajectory Precision: IMU dead reckoning maintained <2 m deviation from GPS for 90 seconds—long enough to navigate short urban blocks or tunnels.
Why It Matters
This project bridges the gap between high-frequency sensor agility and low-frequency geospatial reliability, proving that autonomous systems can thrive even when GPS falters. By mastering sensor fusion, calibration, and real-time filtering, the system:
Mitigated urban magnetic distortions.
Delivered velocity estimates within 0.48 m/s RMSE of GPS.
Extended reliable navigation during GPS outages.
Skills Demonstrated
Sensor Fusion: Harmonized IMU and GPS data streams using complementary filtering.
Algorithm Design: Implemented calibration, filtering, and integration workflows in MATLAB.
ROS Integration: Synchronized multi-sensor data streams recorded as rosbags for cohesive analysis.
Geospatial Analytics: Transformed raw GPS coordinates into actionable trajectories.
Impact & Future Vision
This work is a blueprint for urban-ready autonomy, addressing real-world challenges like signal blockages and sensor drift. Future iterations could integrate Kalman filters for adaptive noise handling or fuse LiDAR data for SLAM-based navigation.
Project Gallery



