Workshop 4 Preview: Understanding yDx.M - What Yaanendriya Adds

Part 4 of 4: From DIY VIO to Production-Grade IMU Solutions

Tags: ros2, imu, perception, workshop, roscon-india, yaanendriya, ydxm
Author: Rajesh

Published: December 17, 2025

The Journey Complete

Over the past three posts, we experienced the full problem domain with real measurements:

Part     Experience     Key Measurement
──────   ────────────   ────────────────────────────────────────────────────
Part 1   IMU Alone      12 cm position drift in 5 s; yaw drifts 2-5°/min
Part 2   Vision Alone   100% of frames lost on fast motion and textureless surfaces
Part 3   VIO Fusion     Only 15% of frames lost; auto-recovery works!

Now we understand WHY sophisticated IMU solutions matter - we measured the problems ourselves.
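The 12 cm figure above is not mysterious; it falls straight out of double integration. A minimal sketch (the bias value is hypothetical, but representative of a MEMS accelerometer):

```python
# Why position from a raw IMU "explodes": a constant accelerometer bias b
# integrates twice into a position error that grows quadratically in time:
#   p_err(t) = 0.5 * b * t**2
def position_drift(bias_mps2: float, t_s: float) -> float:
    """Position error (m) from double-integrating a constant accel bias."""
    return 0.5 * bias_mps2 * t_s ** 2

# A bias of ~0.01 m/s² (well within MEMS spec) after 5 seconds:
print(round(position_drift(0.0096, 5.0), 3))  # 0.12 m, i.e. the ~12 cm we measured
```

The quadratic growth is why raw-IMU dead reckoning is unusable after seconds, not minutes.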

This final post introduces Yaanendriya and their yDx.M module - the professional solution to problems we experienced firsthand.


About Yaanendriya Pvt. Ltd.

Company Profile

┌─────────────────────────────────────────────────────────────────────────┐
│                        YAANENDRIYA PVT. LTD.                            │
│                   Indigenizing India's Sensor Ecosystem                 │
│                                                                         │
│    Founded:     2023                                                   │
│    Location:    Bengaluru, Karnataka, India                            │
│    Incubation:  ARTPARK (AI & Robotics Technology Park)                │
│                 at Indian Institute of Science (IISc) Bangalore        │
│                                                                         │
│    Support:     Ministry of Heavy Industries                           │
│                 Department of Science & Technology                     │
│                                                                         │
│    Mission:     Building indigenous sensor systems for                 │
│                 autonomous vehicles, drones, and robots                │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘

ARTPARK Connection

ARTPARK (AI & Robotics Technology Park) is a unique initiative at IISc Bangalore that:

  • Translates deep-tech research into commercial products
  • Incubates startups focused on robotics and AI
  • Connects academia with industry
  • Receives government and industry funding

Yaanendriya emerged from this ecosystem, bringing academic research to production-grade products.

Their Product Line

Product        Description                       Use Case
────────────   ───────────────────────────────   ─────────────────────────
yDx.M          Inertial Sensing Module (AHRS)    Primary motion estimation
Syncore Nano   Navigation stack controller       Drone/robot autopilot
YPS M9N        GNSS module (u-blox M9)           Outdoor positioning

yDx.M: The Workshop’s Star

What is yDx.M?

yDx.M is an Attitude and Heading Reference System (AHRS) - a calibrated, integrated IMU solution that outputs orientation directly.

┌─────────────────────────────────────────────────────────────────────────┐
│                          yDx.M ARCHITECTURE                             │
│                                                                         │
│    ┌─────────────────────────────────────────────────────────────────┐ │
│    │                    SENSOR ARRAY                                  │ │
│    │   ┌──────────────┐ ┌──────────────┐ ┌──────────────┐            │ │
│    │   │ Accelerometer│ │  Gyroscope   │ │ Magnetometer │            │ │
│    │   │ ±2/4/8/16G  │ │ 450-2000°/s │ │  (optional)  │            │ │
│    │   └──────┬───────┘ └──────┬───────┘ └──────┬───────┘            │ │
│    └──────────┼────────────────┼────────────────┼───────────────────┘ │
│               │                │                │                      │
│               └────────────────┼────────────────┘                      │
│                                ▼                                       │
│    ┌─────────────────────────────────────────────────────────────────┐ │
│    │                  ONBOARD PROCESSING                              │ │
│    │   ┌──────────────────────────────────────────────────────────┐  │ │
│    │   │ Factory Calibration + Sensor Fusion Algorithm            │  │ │
│    │   │ (Runs on integrated MCU)                                 │  │ │
│    │   └──────────────────────────────────────────────────────────┘  │ │
│    └─────────────────────────────────────────────────────────────────┘ │
│                                │                                       │
│                                ▼                                       │
│    ┌─────────────────────────────────────────────────────────────────┐ │
│    │                       OUTPUT                                     │ │
│    │   • Calibrated acceleration (m/s²)                              │ │
│    │   • Calibrated angular velocity (rad/s)                         │ │
│    │   • Computed orientation (quaternion/Euler)  ← Key difference! │ │
│    │   • Temperature compensated                                     │ │
│    └─────────────────────────────────────────────────────────────────┘ │
│                                                                         │
│    Interface: I2C / UART / SPI                                        │
│    DOF Options: 6-DOF, 9-DOF, 10-DOF                                  │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
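The "computed orientation" output is the key difference from a raw IMU: you receive a quaternion instead of raw rates to integrate yourself. Extracting yaw from it is a one-liner; a sketch assuming the standard ZYX Euler convention used by `sensor_msgs/Imu` consumers:

```python
import math

def quat_to_yaw(x: float, y: float, z: float, w: float) -> float:
    """Yaw (rad) about Z from a unit quaternion (ZYX Euler convention)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# A pure 90° rotation about Z:
q = (0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(round(math.degrees(quat_to_yaw(*q)), 1))  # 90.0
```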

Specifications

Feature         yDx.M (6-DOF)   yDx.M (9-DOF)   yDx.M (10-DOF)
─────────────   ─────────────   ─────────────   ──────────────
Accelerometer   ±2/4/8/16G      ±2/4/8/16G      ±2/4/8/16G
Gyroscope       450-2000°/s     450-2000°/s     450-2000°/s
Magnetometer    ❌              ✅              ✅
Barometer       ❌              ❌              ✅
Yaw Reference   Relative        Absolute (N)    Absolute (N)
Altitude        ❌              ❌              Barometric
Calibration     Factory         Factory         Factory

Unique Feature: Distributed Sensing

┌─────────────────────────────────────────────────────────────────────────┐
│                   DISTRIBUTED SENSOR NETWORK                            │
│                                                                         │
│    Traditional Approach:           yDx.M Approach:                     │
│    ──────────────────────          ─────────────────                   │
│    ┌───────┐                       ┌───────┐ ┌───────┐ ┌───────┐      │
│    │ Single│                       │ yDx.M │ │ yDx.M │ │ yDx.M │      │
│    │  IMU  │                       │  #1   │ │  #2   │ │  #3   │      │
│    └───┬───┘                       └───┬───┘ └───┬───┘ └───┬───┘      │
│        │                               │         │         │           │
│        ▼                               └─────────┼─────────┘           │
│    Single point                                  ▼                     │
│    of failure                          ┌─────────────────┐             │
│                                        │ Fusion & voting │             │
│                                        │   (redundancy)  │             │
│                                        └─────────────────┘             │
│                                                                         │
│    Benefits:                                                           │
│    • Fault tolerance (sensor failure doesn't crash system)             │
│    • Improved accuracy (multiple measurements averaged)                │
│    • Vibration rejection (distributed sampling)                        │
│    • Flexible placement (sensors where needed)                         │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
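The "fusion & voting" step can be illustrated with a simple median vote. Yaanendriya's actual multi-module fusion algorithm is their own; this sketch just shows why redundancy buys fault tolerance:

```python
from statistics import median

def vote(readings: list[float]) -> float:
    """Median across redundant IMU readings: with three or more modules,
    a single faulty sensor cannot drag the fused value off the truth."""
    return median(readings)

# Three gyro z-rates (rad/s); module #3 has failed and outputs garbage:
print(vote([0.101, 0.099, 57.3]))  # 0.101, the outlier is ignored
```

A mean would be corrupted by the failed sensor (the average here is ~19 rad/s); the median simply discards it.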

D435i DIY vs yDx.M Production

What We Built (D435i)

# Our DIY VIO stack
realsense2_camera          # Raw IMU data
        ↓
imu_filter_madgwick        # Compute orientation
        ↓
rtabmap_ros                # VIO + SLAM
        ↓
robot_localization         # EKF fusion (optional)
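What imu_filter_madgwick contributes in that middle step can be caricatured with a complementary filter: trust the gyro over short horizons, the accelerometer's gravity vector over long ones. A toy sketch, not the real Madgwick algorithm (which is a gradient-descent quaternion filter); `alpha` is a typical but hypothetical blend factor:

```python
def complementary_pitch(pitch: float, gyro_rate: float,
                        accel_pitch: float, dt: float,
                        alpha: float = 0.98) -> float:
    """One update step: blend the integrated gyro estimate (fast but
    drifting) with the accelerometer's gravity-derived pitch (noisy
    but drift-free)."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Held still, tilted 0.1 rad: the estimate converges toward the accel value.
p = 0.0
for _ in range(200):
    p = complementary_pitch(p, gyro_rate=0.0, accel_pitch=0.1, dt=0.01)
print(round(p, 3))  # approaches 0.1
```

Note there is no magnetometer term here, which is exactly why yaw (rotation about gravity) drifts in this scheme and in our D435i setup.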

What yDx.M Provides

# yDx.M integrated solution
yDx.M hardware             # Calibrated sensors + onboard AHRS
        ↓
yDx.M ROS 2 driver         # Direct orientation output
        ↓
Your VIO/SLAM stack        # Integration point

Feature Comparison

Feature                    D435i DIY              yDx.M Production
────────────────────────   ────────────────────   ──────────────────────
Raw IMU                    ✅                     ✅
Orientation Output         ❌ Need Madgwick       ✅ Built-in
Magnetometer               ❌                     ✅ (9/10-DOF)
Barometer                  ❌                     ✅ (10-DOF)
Factory Calibration        ❌ Manual              ✅ Factory
Distributed Sensing        ❌                     ✅ Network of modules
Temperature Compensation   ❌                     ✅
Form Factor                Camera module          Standalone board
Interface                  USB (via RealSense)    I2C/UART/SPI
Price Point                ~$250 (whole camera)   TBD

When to Use Which

┌─────────────────────────────────────────────────────────────────────────┐
│                    CHOOSING YOUR IMU SOLUTION                           │
│                                                                         │
│    D435i IMU (Our DIY Approach)                                        │
│    ───────────────────────────────                                     │
│    Best for:                                                           │
│    • Learning and prototyping                                          │
│    • Projects where camera is primary sensor                           │
│    • Cost-sensitive applications                                       │
│    • Non-critical navigation                                           │
│                                                                         │
│    Limitations:                                                        │
│    • Manual calibration required                                       │
│    • No absolute heading (no magnetometer)                             │
│    • Single point of failure                                           │
│    • USB interface only                                                │
│                                                                         │
│    ─────────────────────────────────────────────────────────────────── │
│                                                                         │
│    yDx.M (Production Approach)                                         │
│    ───────────────────────────────                                     │
│    Best for:                                                           │
│    • Production robotics                                               │
│    • Drones and UAVs                                                   │
│    • Applications requiring absolute heading                           │
│    • Safety-critical systems (distributed sensing)                     │
│    • Indian manufacturing ecosystem                                    │
│                                                                         │
│    Advantages:                                                         │
│    • Factory calibrated (deploy immediately)                           │
│    • Magnetometer for absolute yaw                                     │
│    • Multiple sensor support                                           │
│    • Low-level interface flexibility                                   │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘

Workshop Exercise Mapping

How Our Preparation Maps to Workshop

Workshop Exercise         Our Preparation                What Workshop Adds
───────────────────────   ────────────────────────────   ───────────────────────────
1. IMU Data Acquisition   ✅ Part 1: Experiments 1-3     yDx.M driver specifics
2. IMU Filtering          ✅ Part 1: Experiments 4-7     yDx.M built-in AHRS
3. Sensor Fusion          ✅ Part 3: Experiments 11-12   yDx.M fusion features
4. Inertial-Aided SLAM    ✅ Part 3: Experiments 13-14   yDx.M + SLAM integration
5. End-to-End Pipeline    ✅ Complete VIO pipeline       yDx.M production deployment

What We Know vs What We’ll Learn

┌─────────────────────────────────────────────────────────────────────────┐
│                    KNOWLEDGE FRAMEWORK                                  │
│                                                                         │
│    What We Now Understand (From Parts 1-3):                            │
│    ───────────────────────────────────────────                         │
│    ✅ Why raw IMU has no orientation                                   │
│    ✅ How Madgwick filter works                                        │
│    ✅ Why yaw drifts without magnetometer                              │
│    ✅ How position integration fails                                   │
│    ✅ Why vision fails on fast motion                                  │
│    ✅ Why vision fails on textureless surfaces                         │
│    ✅ How VIO combines IMU + Vision                                    │
│    ✅ EKF fusion configuration                                         │
│                                                                         │
│    What Workshop Will Teach:                                           │
│    ───────────────────────────────────────────                         │
│    🎯 yDx.M ROS 2 driver API                                          │
│    🎯 Their specific fusion algorithms                                 │
│    🎯 Distributed sensor configuration                                 │
│    🎯 Production calibration procedures                                │
│    🎯 Best practices for deployment                                    │
│    🎯 Purchasing and support channels                                  │
│                                                                         │
│    Our Advantage: We understand the "why" before learning the "how"!  │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘

Questions to Ask at Workshop

Technical Questions

  1. Driver & Integration
    • “What’s the ROS 2 message type for yDx.M output? Standard sensor_msgs/Imu or custom?”
    • “Is the ROS 2 driver open-source? Where can we find documentation?”
    • “What’s the recommended update rate for different applications?”
  2. Calibration & Accuracy
    • “How does factory calibration compare to on-site calibration?”
    • “What’s the typical yaw accuracy with the 9-DOF magnetometer?”
    • “How do you handle magnetic interference in production robots?”
  3. Distributed Sensing
    • “How do multiple yDx.M modules communicate in a sensor network?”
    • “What’s the failure mode if one sensor in the network fails?”
    • “Is there a ROS 2 node that handles multi-IMU fusion?”
  4. SLAM Integration
    • “Which SLAM packages have you tested with yDx.M?”
    • “Any specific recommendations for IMU preintegration parameters?”
    • “How does yDx.M compare to Intel T265 (discontinued) for VIO?”

Business Questions

  1. Availability
    • “How can workshop participants purchase yDx.M modules?”
    • “What’s the typical lead time for orders?”
    • “Are there educational/maker discounts?”
  2. Support
    • “What documentation and tutorials are available?”
    • “Is there a community forum or Discord?”
    • “What’s the warranty and support policy?”

Workshop Day Preparation

Hardware Checklist

Software Verification

# Before workshop, verify everything works:

# 1. Container launches
./scripts/launch-container.sh 4
# Should see: workshop4-imu container running

# 2. RealSense with IMU
ros2 launch realsense2_camera rs_launch.py \
    enable_gyro:=true enable_accel:=true unite_imu_method:=2
# Should see: Camera topics publishing

# 3. IMU filter
ros2 run imu_filter_madgwick imu_filter_madgwick_node --ros-args \
    -r imu/data_raw:=/camera/camera/imu
# Should see: /imu/data with orientation

# 4. RTAB-Map VIO
ros2 launch rtabmap_launch rtabmap.launch.py \
    visual_odometry:=true imu_topic:=/imu/data
# Should see: Odometry output

Mental Model

Going into the workshop with this understanding:

┌─────────────────────────────────────────────────────────────────────────┐
│                    YOUR PREPARATION ADVANTAGE                           │
│                                                                         │
│    Most Attendees:                You (After Parts 1-4):               │
│    ───────────────────            ─────────────────────                │
│    "IMU gives orientation"        "Raw IMU needs filtering"            │
│    "Just plug and play"           "Calibration matters"                │
│    "Why is it drifting?"          "Integration error, need fusion"     │
│    "Vision should work"           "Fast motion/textureless fails"      │
│    "What's VIO?"                  "IMU + Vision = robust poses"        │
│                                                                         │
│    You'll ask insightful questions and absorb advanced concepts!       │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘

The Big Picture: Why This Matters

For Indian Robotics Ecosystem

Yaanendriya represents something important:

  1. Indigenous manufacturing - Not dependent on imports
  2. Local support - Same timezone, same language
  3. Cost optimization - Rupee pricing, no forex
  4. Government backing - Ministry support means stability

For Your Projects

Understanding IMU perception deeply means:

  1. Better debugging - Know where problems originate
  2. Architecture decisions - Choose right sensors for your needs
  3. Integration skills - Connect any IMU to ROS 2 pipelines
  4. Appreciation - Value what professional solutions provide

Series Conclusion

What We Actually Measured

┌─────────────────────────────────────────────────────────────────────────┐
│                    4-PART JOURNEY - REAL RESULTS                        │
│                                                                         │
│    Part 1: IMU Alone (7 Experiments)                                   │
│    ──────────────────────────────────                                  │
│    • Raw IMU: No orientation (pitch/roll = 0.0000)                     │
│    • After Madgwick: Tilt detection works!                             │
│    • Yaw drift: 2-5° per minute without magnetometer                   │
│    • Position: 12cm drift in 5 seconds → Unusable                      │
│                                                                         │
│    Part 2: Vision Alone (3 Experiments)                                │
│    ─────────────────────────────────────                               │
│    • Baseline: ~900 features, ~120 inliers → OK                        │
│    • Fast shake: 100% frames LOST                                      │
│    • Textureless: 100% LOST, no recovery                               │
│                                                                         │
│    Part 3: VIO Fusion (3 Experiments)                                  │
│    ──────────────────────────────────                                  │
│    • Fast shake: Only 15% frames lost (6.7x better!)                   │
│    • Textureless: Lost, but AUTO-RECOVERED!                            │
│    • Combined stress: 40% lost, still usable                           │
│                                                                         │
│    KEY FINDING: VIO transforms UNUSABLE → USABLE                       │
│                                                                         │
│    ═══════════════════════════════════════════════════════════════    │
│    Ready for: "Perception in ROS 2 — IMU-Centric Hands-On Deep Dive"  │
│    ═══════════════════════════════════════════════════════════════    │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
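The 2-5°/min yaw figure from Part 1 also has a simple model: without a magnetometer there is no absolute heading reference, so a constant gyro bias integrates linearly into heading error. The bias values below are hypothetical but representative MEMS figures:

```python
def yaw_drift_deg(bias_dps: float, minutes: float) -> float:
    """Heading error (deg) from integrating a constant gyro bias (deg/s)."""
    return bias_dps * minutes * 60.0

# Biases of roughly 0.03-0.08 °/s reproduce the 2-5°/min we measured:
print(yaw_drift_deg(0.05, 1.0))  # 3.0 degrees after one minute
```

This is exactly the failure mode the 9-DOF yDx.M closes with its magnetometer-referenced absolute yaw.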

The Learning Philosophy

We didn’t just read about IMU problems - we experienced them:

  • Watched orientation stay at zero
  • Felt yaw drift away from true heading
  • Saw position explode to meters in seconds
  • Broke visual odometry with rapid motion
  • Stared at a white wall and lost tracking
  • Then fixed everything with fusion

This experiential learning means the workshop concepts will click immediately.


Final Checklist

Before Workshop Day

During Workshop

After Workshop


Resources

Yaanendriya

  • Company website (search: Yaanendriya Pvt Ltd)
  • ARTPARK incubator information

ROS 2 IMU Ecosystem

Background Reading


See You at ROSCon India 2025!

Workshop 4: Perception in ROS 2 — IMU-Centric Hands-On Deep Dive
Presenter: Yaanendriya Pvt. Ltd.
Date: December 18-20, 2025
Venue: COEP Tech, Pune

May your IMU never drift, and your features always match!