Intelligent Ground Vehicle Robot - In Progress

I am currently a Software Team member for the IGVC autonomous ground vehicle project at California State University, Chico. The project focuses on developing a fully autonomous outdoor robot capable of navigating complex environments using computer vision, perception pipelines, and real-time decision-making algorithms.

My primary contributions are within the perception and autonomy stack, where I work on machine learning integration, lane segmentation, and path planning algorithms.

Software & Embedded Systems

A major component of the system involves training and deploying a YOLO-based computer vision model to detect obstacles such as potholes and traffic cones. I have been involved in:

  • Dataset preparation and annotation using Roboflow

  • Model training and validation

  • Performance evaluation under varying lighting and terrain conditions

  • Integration of the trained model into the robot’s real-time perception pipeline

The goal is to ensure reliable object detection with minimal latency during live navigation.
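Below is a minimal sketch of how a trained detector might be loaded and run on a single camera frame. It assumes the Ultralytics YOLO API; the weights file name, confidence threshold, and camera index are illustrative placeholders rather than the project's actual configuration.

```python
# Minimal sketch: running a trained YOLO model on one camera frame.
# "obstacle_detector.pt", conf=0.5, and the camera index are placeholders.
import cv2
from ultralytics import YOLO

model = YOLO("obstacle_detector.pt")  # hypothetical trained weights

def detect_obstacles(frame):
    """Return a list of (class_name, confidence, xyxy box) for one frame."""
    results = model.predict(frame, conf=0.5, verbose=False)
    detections = []
    for box in results[0].boxes:
        cls_id = int(box.cls[0])
        detections.append(
            (model.names[cls_id], float(box.conf[0]), box.xyxy[0].tolist())
        )
    return detections

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # illustrative camera source
    ok, frame = cap.read()
    if ok:
        for name, conf, xyxy in detect_obstacles(frame):
            print(f"{name}: {conf:.2f} at {xyxy}")
    cap.release()
```

Keeping the per-frame call this thin is what makes low-latency integration into the live perception pipeline practical.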

This project is being completed under the American Institute of Mechatronic Engineers (AIME) at California State University, Chico.

Lane Segmentation & Environment Understanding

I am primarily working on lane segmentation algorithms that allow the robot to identify and follow navigable boundaries. This involves:

  • Processing camera input to detect lane markings

  • Extracting region-of-interest features

  • Filtering noise and environmental inconsistencies

  • Producing a usable binary/segmented representation for downstream planning

The segmentation output feeds directly into the robot’s navigation and control logic.
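The sketch below illustrates one classical way such a pipeline can be structured: threshold for bright lane paint, restrict attention to a region of interest in front of the robot, and filter noise into a clean binary mask. The color thresholds and ROI polygon are illustrative assumptions, not the project's tuned values.

```python
# Minimal sketch of a classical lane-segmentation pass: threshold for bright
# lane paint, keep a lower region of interest, and clean up noise.
# Thresholds and the ROI polygon are illustrative placeholders.
import cv2
import numpy as np

def segment_lanes(frame):
    """Return a binary mask of candidate lane pixels for one BGR frame."""
    h, w = frame.shape[:2]

    # Threshold on lightness in HLS space to pick up white lane markings.
    hls = cv2.cvtColor(frame, cv2.COLOR_BGR2HLS)
    mask = cv2.inRange(hls, (0, 200, 0), (255, 255, 255))

    # Restrict to a trapezoidal region of interest in front of the robot.
    roi = np.zeros_like(mask)
    polygon = np.array([[(0, h), (w, h), (int(0.75 * w), int(0.5 * h)),
                         (int(0.25 * w), int(0.5 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    mask = cv2.bitwise_and(mask, roi)

    # Morphological opening removes small specks from glare and grass.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask
```

The binary mask produced here is the kind of segmented representation the downstream planner consumes.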

Path Planning & Autonomy

I also contribute to the path planning system, which converts segmented lane data and obstacle detections into safe, executable trajectories. Current work includes:

  • Designing algorithms to compute feasible paths within detected boundaries

  • Avoiding obstacles using detected bounding box data

  • Generating smooth navigation commands for the motion controller

  • Optimizing computational efficiency for real-time performance

This requires coordinating perception outputs with decision-making logic while maintaining deterministic behavior under changing environmental conditions.
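As a simplified illustration of how these pieces can connect, the sketch below turns a lane mask and a set of obstacle bounding boxes into a single bounded steering value: it aims for the lane center at a lookahead row and nudges the target away from any overlapping obstacle. The lookahead fraction, margins, and gain are hypothetical values, not the tuned controller parameters.

```python
# Simplified sketch: lane mask + obstacle boxes -> normalized steering command.
# lookahead_frac, the margin heuristic, and gain are illustrative placeholders.
import numpy as np

def steering_command(lane_mask, obstacle_boxes, lookahead_frac=0.7, gain=2.0):
    """Return a steering value in [-1, 1] (negative = steer left)."""
    h, w = lane_mask.shape[:2]
    row = int(lookahead_frac * h)

    # Lane center at the lookahead row; fall back to image center if no lane.
    cols = np.flatnonzero(lane_mask[row] > 0)
    target_x = (cols.min() + cols.max()) / 2.0 if cols.size else w / 2.0

    # Push the target away from any obstacle box spanning the lookahead row.
    for x1, y1, x2, y2 in obstacle_boxes:
        if y1 <= row <= y2:
            box_center = (x1 + x2) / 2.0
            margin = (x2 - x1) / 2.0 + 0.05 * w
            if abs(target_x - box_center) < margin:
                direction = 1.0 if target_x >= box_center else -1.0
                target_x = box_center + direction * margin

    # Proportional mapping from lateral offset to a bounded steering command.
    offset = (target_x - w / 2.0) / (w / 2.0)
    return float(np.clip(gain * offset, -1.0, 1.0))
```

Keeping this step purely arithmetic, with no model calls inside the loop, is one way to preserve deterministic, real-time behavior as conditions change.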