// 001

Robot
Racer

Date
January 2026
Type
Autonomous Navigation
Stack
C++ · ROS2 · Docker · LiDAR

Developed autonomous navigation logic for a LiDAR-equipped robot capable of navigating to any user-specified coordinate. The system was deployed and managed using ROS2 and Docker, and validated in Foxglove for visual debugging.

The core navigation stack implements A* pathfinding for global path planning and a Pure Pursuit controller for local trajectory following, all written in C++.

Navigation Stack

Global path planning uses A* search over the occupancy grid; local trajectory following uses a Pure Pursuit controller, both written in C++. tf2 handles the coordinate-frame transforms needed for navigation.
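As a rough illustration of the global planner, a minimal grid A* can be sketched as below. This is illustrative code, not the project's implementation; it assumes a 4-connected grid with a Manhattan heuristic and returns the path length in cells:

```cpp
#include <array>
#include <climits>
#include <cstdlib>
#include <functional>
#include <queue>
#include <vector>

struct Cell { int x, y; };

// Minimal A* sketch: 0 = free cell, non-zero = occupied.
// Returns the shortest path length in steps, or -1 if unreachable.
int astar_path_length(const std::vector<std::vector<int>>& grid,
                      Cell start, Cell goal) {
    const int h = static_cast<int>(grid.size());
    const int w = static_cast<int>(grid[0].size());
    auto heuristic = [&](int x, int y) {
        return std::abs(x - goal.x) + std::abs(y - goal.y);
    };
    using Node = std::array<int, 3>;  // {f, x, y}, ordered by f = g + h
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    std::vector<std::vector<int>> cost(h, std::vector<int>(w, INT_MAX));
    cost[start.y][start.x] = 0;
    open.push({heuristic(start.x, start.y), start.x, start.y});
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        Node cur = open.top(); open.pop();
        int x = cur[1], y = cur[2];
        if (x == goal.x && y == goal.y) return cost[y][x];
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            if (grid[ny][nx] != 0) continue;  // occupied cell
            if (cost[y][x] + 1 < cost[ny][nx]) {
                cost[ny][nx] = cost[y][x] + 1;
                open.push({cost[ny][nx] + heuristic(nx, ny), nx, ny});
            }
        }
    }
    return -1;  // goal unreachable
}
```

The priority queue orders cells by f = g + h; stale duplicate entries are tolerated and simply do no work when a cheaper cost has already been recorded.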

Perception Stack

Reads raw scan packets from the LiDAR sensor and builds a costmap from them. The costmap is updated in real time through ROS2, producing a continuously refreshed occupancy grid that feeds the navigation stack.
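The scan-to-costmap step can be sketched roughly as follows. The `Costmap` type and grid parameters here are illustrative assumptions, not the project's code; each LiDAR return (range, bearing) is projected into the grid frame and its endpoint cell marked occupied:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative costmap sketch: a flat row-major grid of cells.
struct Costmap {
    int width, height;          // grid size in cells
    double resolution;          // metres per cell
    std::vector<int8_t> cells;  // 0 = free, 100 = occupied

    Costmap(int w, int h, double res)
        : width(w), height(h), resolution(res), cells(w * h, 0) {}

    // Mark the endpoint cell of each LiDAR return as occupied.
    void mark_scan(const std::vector<double>& ranges,
                   double angle_min, double angle_increment,
                   double robot_x, double robot_y) {
        for (std::size_t i = 0; i < ranges.size(); ++i) {
            double angle = angle_min + i * angle_increment;
            double hx = robot_x + ranges[i] * std::cos(angle);
            double hy = robot_y + ranges[i] * std::sin(angle);
            int cx = static_cast<int>(hx / resolution);
            int cy = static_cast<int>(hy / resolution);
            if (cx < 0 || cy < 0 || cx >= width || cy >= height) continue;
            cells[cy * width + cx] = 100;  // obstacle hit here
        }
    }
};
```

A production costmap would also ray-trace the cells between the sensor and each endpoint to mark them free; this sketch only marks the hit cells.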

ROS2 Architecture

Nodes were structured as a pub/sub system with separate packages for perception, planning, and control. Docker was used to containerise the full stack for reproducible deployment.
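One node in that layout might look like the following sketch. Node, topic, and message names are assumptions, and building it requires a ROS2 workspace with rclcpp, sensor_msgs, and nav_msgs:

```cpp
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/laser_scan.hpp>
#include <nav_msgs/msg/occupancy_grid.hpp>

// Illustrative perception node: subscribes to raw scans, publishes
// an occupancy grid for the planning package to consume.
class PerceptionNode : public rclcpp::Node {
public:
  PerceptionNode() : Node("perception_node") {
    grid_pub_ = create_publisher<nav_msgs::msg::OccupancyGrid>("/costmap", 10);
    scan_sub_ = create_subscription<sensor_msgs::msg::LaserScan>(
        "/scan", rclcpp::SensorDataQoS(),
        [this](sensor_msgs::msg::LaserScan::SharedPtr scan) {
          (void)scan;  // ... update the costmap from the scan here ...
          grid_pub_->publish(nav_msgs::msg::OccupancyGrid());
        });
  }
private:
  rclcpp::Publisher<nav_msgs::msg::OccupancyGrid>::SharedPtr grid_pub_;
  rclcpp::Subscription<sensor_msgs::msg::LaserScan>::SharedPtr scan_sub_;
};

int main(int argc, char** argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<PerceptionNode>());
  rclcpp::shutdown();
  return 0;
}
```

Keeping perception, planning, and control in separate packages means each node can be restarted, replaced, or run in its own container without touching the others.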

Simulation

The system was validated in Foxglove, as seen in the video, allowing visual debugging of LiDAR point clouds and planned paths in real time.

Pure Pursuit Tuning

Getting the lookahead distance right took far longer than expected — too short and the robot oscillated, too long and it cut corners. Learned to treat control parameters as experiments, not guesses.
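The lookahead trade-off falls directly out of the Pure Pursuit geometry. In a minimal sketch (illustrative, not the project code): given the lookahead point (x, y) in the robot frame, the commanded arc curvature is 2y / (x^2 + y^2), so moving the lookahead point further out along the same bearing shrinks the curvature:

```cpp
#include <cmath>

// Illustrative Pure Pursuit steering law. The lookahead point is
// expressed in the robot frame (x forward, y left); the controller
// commands the circular arc passing through it:
//   curvature = 2 * y / (x^2 + y^2)
// Short lookahead -> large curvature (twitchy, oscillation-prone);
// long lookahead -> small curvature (smooth, but cuts corners).
double pure_pursuit_curvature(double x, double y) {
    double d2 = x * x + y * y;   // squared lookahead distance
    if (d2 < 1e-9) return 0.0;   // already at the target point
    return 2.0 * y / d2;
}
```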

Docker Networking

ROS2 node discovery across containers required careful configuration of DDS settings. First real exposure to containerised robotics workflows.
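A typical fix, sketched here as an illustrative compose fragment (service names and domain ID are assumptions): run the containers on the host network so DDS multicast discovery can cross the container boundary, and pin a shared ROS_DOMAIN_ID to isolate the robot from other machines on the LAN:

```yaml
# Illustrative docker-compose fragment, not the project's actual file.
services:
  perception:
    image: robot-racer:latest
    network_mode: host          # let DDS multicast reach the host network
    environment:
      - ROS_DOMAIN_ID=42        # shared domain across all containers
  planning:
    image: robot-racer:latest
    network_mode: host
    environment:
      - ROS_DOMAIN_ID=42
```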

Perception Math

First time working on LiDAR perception. Worked through numerous guides, but turning raw scans into a usable costmap was still the hardest part of the project.

C++ · ROS2 · Docker · LiDAR · A* Pathfinding · Pure Pursuit · Foxglove