Autonomous Navigation Robot

Rex Worley

Project Timeline

Jan 2024 - May 2024

OVERVIEW

This project demonstrates the design and implementation of an autonomous robotic system capable of navigating complex environments, identifying objects through sensor fusion, and executing precise manipulation tasks. The robot integrates mechanical design, sensor processing, and a multi-threaded control architecture to autonomously traverse a maze, locate and identify colored blocks, and relocate them to corresponding destinations.

Technical Approach:

The system uses a Raspberry Pi as the central controller, coordinating an Adafruit VL53L4CD Time-of-Flight (ToF) distance sensor for spatial awareness, an APDS-9960 RGB color sensor for object identification, TETRIX DC motors for differential-drive locomotion, and servo-actuated mechanisms for object manipulation. The drive base employs two parallel-mounted TETRIX motors, each powering an independent wheel, to maximize torque efficiency; an omni-wheel mounted at the rear facilitates low-friction turning and balances weight distribution. A servo-mounted ToF sensor provides the equivalent of three fixed sensors by rotating dynamically to measure distances in multiple directions, enabling the robot to detect front and side walls within a 30 cm threshold.

Control Implementation:

The software architecture uses Python threading to run sensor processing and motion control concurrently, allowing continuous ToF distance monitoring without blocking the main control loop. The navigation algorithm employs closed-loop feedback control, continuously comparing sensor data against setpoint values and adjusting motor inputs to minimize positional error and maintain proper maze traversal. Wall-detection logic processes ToF measurements to determine when distance readings fall below the 30 cm threshold, triggering autonomous decision-making for turns and alignment maneuvers.
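The non-blocking sensor loop can be sketched as below. This is a minimal illustration, not the project's actual code: `read_tof_cm()` is a hypothetical stand-in for the VL53L4CD driver call (in practice the `adafruit_vl53l4cd` library would be used), so the example runs without hardware.

```python
import threading
import time

# Hypothetical stand-in for the real VL53L4CD driver read.
def read_tof_cm():
    return 100.0

latest_distance_cm = None
_lock = threading.Lock()

def tof_monitor(stop_event, period_s=0.05):
    """Continuously sample the ToF sensor in a background thread
    so the main control loop never blocks on sensor I/O."""
    global latest_distance_cm
    while not stop_event.is_set():
        d = read_tof_cm()
        with _lock:
            latest_distance_cm = d
        time.sleep(period_s)

def get_distance_cm():
    """Thread-safe read of the most recent distance sample."""
    with _lock:
        return latest_distance_cm

stop = threading.Event()
t = threading.Thread(target=tof_monitor, args=(stop,), daemon=True)
t.start()
time.sleep(0.2)   # let the monitor collect a few samples
stop.set()
t.join()
```

The main control loop would call `get_distance_cm()` each iteration and compare the result against the 30 cm setpoint, while the monitor thread keeps the value fresh.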
The robot performs wall-alignment sequences by reversing into detected side walls to correct its heading before continuing forward navigation.

System Integration & Object Manipulation:

Color identification uses a voting algorithm that samples RGB values five times and selects the most common result; each sample is classified by the maximum of its red, green, and blue channels. A scissor-arm mechanism with an integrated grabbing claw secures blocks upon detection, with servo motors actuating the jaws and a REV Robotics 40:1 motor adjusting the arm angle. The pulley system uses a TETRIX motor and string-tensioned linkages threaded through each scissor-arm joint to extend the mechanism vertically, though full vertical extension was not achieved in the final implementation due to time constraints. The robot successfully demonstrated autonomous maze navigation with multiple turns, block detection and acquisition, color identification, and transportation to the final destination.
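A minimal sketch of the voting scheme, assuming hypothetical APDS-9960 readings represented as plain (r, g, b) tuples:

```python
from collections import Counter

def classify(rgb):
    """Map one (r, g, b) reading to a color name via the dominant channel."""
    r, g, b = rgb
    return ("red", "green", "blue")[max(range(3), key=lambda i: rgb[i])]

def vote_color(samples):
    """Classify every sample and return the most common label."""
    return Counter(classify(s) for s in samples).most_common(1)[0][0]

# Five hypothetical sensor readings; one outlier is outvoted.
readings = [(220, 40, 30), (210, 50, 35), (80, 90, 200),
            (230, 45, 28), (215, 38, 33)]
print(vote_color(readings))  # → red (4 of 5 samples agree)
```

Sampling five times and voting makes a single noisy reading (here the stray "blue" sample) unable to flip the classification.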

HIGHLIGHTS

Technical Achievements:

  • Developed a multi-threaded Python control system enabling simultaneous sensor processing and motion control without blocking operations
  • Implemented closed-loop feedback control for autonomous maze navigation using real-time ToF sensor data and adaptive decision-making algorithms
  • Designed a servo-actuated sensor platform providing 3-axis coverage from a single ToF sensor through dynamic repositioning
  • Created a robust color identification algorithm using statistical voting across multiple RGB samples to ensure reliable block classification

System Integration:

  • Built a complete autonomous robotic system integrating a Raspberry Pi controller, DC motors, servo actuators, ToF distance sensor, RGB color sensor, and custom mechanical assemblies
  • Designed a parallel dual-motor drive base with omni-wheel rear support for maximum torque efficiency and low-friction turning
  • Implemented scissor-arm manipulation system with pulley-driven extension, servo-controlled claw, and motorized angle adjustment
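As an illustration of the servo-controlled claw: a hobby servo's angle maps linearly to PWM pulse width. The sketch below shows that conversion without touching hardware; the 1–2 ms pulse range, 50 Hz frequency, and claw angles are assumptions for illustration, not measured values from the project.

```python
def servo_duty_cycle(angle_deg, freq_hz=50, min_pulse_ms=1.0, max_pulse_ms=2.0):
    """Convert a servo angle (0-180 deg) to a PWM duty-cycle percentage,
    the value RPi.GPIO's ChangeDutyCycle() expects."""
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0
    period_ms = 1000.0 / freq_hz
    return 100.0 * pulse_ms / period_ms

# Hypothetical claw positions for illustration only.
CLAW_OPEN_DEG, CLAW_CLOSED_DEG = 90, 10

print(servo_duty_cycle(CLAW_OPEN_DEG))  # → 7.5 (1.5 ms pulse in a 20 ms period)
```

On the Pi, the returned percentage would be fed to a PWM channel driving the claw servo's signal line.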

Problem-Solving:

  • Resolved ToF sensor oscillation issues (20-120cm fluctuations) by implementing a weighted averaging algorithm for stable distance measurements
  • Discovered and adapted to block-detection orientation requirements through iterative testing and code refinement
  • Overcame wall-versus-block differentiation challenges through threshold-based logic and alignment sequences
  • Applied DRY (Don't Repeat Yourself) methodology by creating reusable functions for each robot action, reducing code redundancy
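The weighted-averaging fix above can be sketched as an exponentially weighted moving average, one common form of weighted averaging; the exact weights used in the project aren't stated, so `alpha` here is an assumed value.

```python
def ewma_filter(readings, alpha=0.3):
    """Exponentially weighted moving average: recent samples count more,
    but a single-sample spike is strongly damped."""
    smoothed = []
    avg = readings[0]
    for r in readings:
        avg = alpha * r + (1 - alpha) * avg
        smoothed.append(avg)
    return smoothed

# Hypothetical oscillating ToF data (cm), spiking between ~25 and ~120.
noisy = [30, 120, 28, 25, 110, 27, 26]
print([round(v) for v in ewma_filter(noisy)])
```

The filtered sequence never reaches the 120 cm spikes, so threshold comparisons against 30 cm stop flip-flopping on single bad readings.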

SKILLS

Multi-threaded programming (Python), Sensor fusion, Autonomous navigation algorithms, Closed-loop control systems, Mechanical design (CAD), Servo and DC motor control, Embedded Linux (Raspberry Pi), System-level integration and debugging

ABOUT THE AUTHOR

Rex Worley

I am a Mechatronics Engineering student at Texas A&M University with hands-on experience in industrial automation and embedded systems. I've engineered PLC control systems, designed IoT dashboards with real-time sensor integration, and programmed autonomous robotic systems. I'm passionate about applying robotics and control systems to solve real-world engineering challenges.
