TROTBOT

A fully autonomous robot in an indoor environment without predefined paths ...

Project status: Under Development

Robotics, RealSense™

Intel Technologies
Other

Overview / Usage

Trotbot is an autonomous robot designed to serve as a delivery robot in indoor environments such as offices or academic buildings. It carries packages inside an onboard secure container. Identification tags placed around the office mark the rooms; the robot uses these tags to find its destination. The robot can be summoned from a mobile app. When it reaches the desired room, it sends a notification, and only then can the locked box on the robot be opened. The sender places the deliverables inside the box and gives the robot a destination through the app; the robot then finds its way to the target, avoiding obstacles along the path.

This is a group project under Sandbox FabLab, BITS Goa and Electronic and Robotics Club, BITS Goa.

Methodology / Approach

Our aim is to solve the existing problem of indoor navigation. The drawbacks of the existing solutions in this field are:

  1. Starship robot - It is used only for outdoor navigation and requires GPS.
  2. TurtleBot navigation stack - It can be used for indoor navigation but requires a pre-built map.
  3. Amazon warehouse robots - They require lines pre-marked on the floor of the area they operate in.

Our technology will overcome all of these problems, and hence we are pursuing a unique approach to solving it.

Once given a destination, the robot needs waypoints to reach it. These waypoints are provided by the Global Planner, which runs Dijkstra's algorithm over the graph of identification tags placed around the office. For obstacle detection, a 2D 360° LIDAR is used. This obstacle data is sent to the Local Planner, which plans a path using the RRT algorithm; an LQR controller is then used to follow this path. Obstacle data is updated continuously, and if an obstacle is noticed in the path, the path is re-planned from the current position. For localization, an optical mouse sensor is used for odometry; it will later be replaced by an Intel RealSense D435 for visual-inertial odometry. The entire software stack is built on the Robot Operating System (ROS).
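As a rough sketch of the global-planning step described above, here is a minimal Dijkstra implementation over a graph of identification tags. The tag names and edge costs below are invented for illustration; the real graph would be built from the tags deployed around the office.

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest waypoint sequence over a weighted graph of identification tags.

    graph: dict mapping tag -> {neighbor_tag: corridor_cost_in_metres}
    Returns the list of tags (waypoints) from start to goal.
    """
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    visited = set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk the predecessor chain back from the goal to recover the path
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Hypothetical office tag graph (edge costs in metres)
office = {
    "lobby":      {"corridor_A": 4.0, "corridor_B": 6.0},
    "corridor_A": {"lobby": 4.0, "room_101": 3.0},
    "corridor_B": {"lobby": 6.0, "room_101": 2.0},
    "room_101":   {"corridor_A": 3.0, "corridor_B": 2.0},
}
print(dijkstra(office, "lobby", "room_101"))  # → ['lobby', 'corridor_A', 'room_101']
```

The planner picks the lobby → corridor_A route (cost 7 m) over corridor_B (cost 8 m); the resulting tag sequence is what would be handed to the Local Planner as waypoints.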

Technologies Used

Hardware Specifications:

Robot Chassis / Platform: Custom-made omni-wheel-drive wooden chassis
Main Computer: Raspberry Pi 3B (to be replaced by an Intel NUC7i7)
Localization: Odometry using an optical mouse sensor (to be replaced by visual-inertial odometry using an Intel RealSense D435)
Obstacle Detection: 2D RP LIDAR; 4 IR sensors, one on each side
Motor: Johnson A Grade Quality 300 RPM
Motor Driver: MDD 10-HAT
Battery: ORANGE lithium-polymer 11.1 V 6200 mAh / 3S 40C
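As a rough illustration of the optical-mouse odometry listed above, the sketch below dead-reckons a 2D pose from mouse displacement counts. The counts-per-metre figure and the single-sensor setup are invented for illustration; a real omni-wheel base needs calibration and a second sensor or gyro to observe rotation.

```python
import math

class MouseOdometry:
    """Dead-reckoning pose estimate from optical-mouse displacement counts.

    counts_per_metre is a hypothetical calibration constant, not a
    measured value for the actual sensor on Trotbot.
    """
    def __init__(self, counts_per_metre=15748.0):
        self.x = self.y = self.theta = 0.0
        self.cpm = counts_per_metre

    def update(self, dx_counts, dy_counts, dtheta=0.0):
        # Convert sensor counts (robot frame) to metres
        dx = dx_counts / self.cpm
        dy = dy_counts / self.cpm
        # Rotate the displacement into the world frame and integrate
        self.x += dx * math.cos(self.theta) - dy * math.sin(self.theta)
        self.y += dx * math.sin(self.theta) + dy * math.cos(self.theta)
        self.theta += dtheta
        return self.x, self.y, self.theta

odo = MouseOdometry()
for _ in range(10):
    odo.update(1574.8, 0.0)  # ten 10 cm forward steps
print(odo.x)  # ≈ 1.0 m travelled forward
```

Because pure dead reckoning accumulates drift, this is only a stopgap; the planned switch to visual-inertial odometry with the RealSense D435 addresses exactly that weakness.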

Software Specifications:

Platform: ROS Kinetic
Global Path Planner: Dijkstra's algorithm
Local Path Planner: RRT algorithm
Path Follower: LQR controller
Simulator: Gazebo
Visualizer: RViz
OS on Raspberry Pi: Ubuntu MATE
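As an illustration of the RRT local planner listed above, here is a minimal 2D sketch. The workspace bounds, circular obstacle, step size, and sampling parameters are all invented for illustration; the real planner operates on live LIDAR obstacle data.

```python
import math
import random

def rrt(start, goal, obstacles, step=0.5, iters=2000, goal_tol=0.5, seed=0):
    """Rapidly-exploring Random Tree in a 10 m x 10 m free space.

    obstacles: list of (cx, cy, radius) circles.
    Returns a list of (x, y) waypoints from start to goal, or None.
    """
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}

    def collides(p):
        return any(math.hypot(p[0] - cx, p[1] - cy) <= r for cx, cy, r in obstacles)

    for _ in range(iters):
        # Occasionally bias the random sample toward the goal
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10), rng.uniform(0, 10))
        # Extend the tree from its nearest node one step toward the sample
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if collides(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            # Walk back up the tree to recover the path
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return list(reversed(path))
    return None

# One circular obstacle of radius 1.5 m at the centre of the room
path = rrt((1.0, 1.0), (9.0, 9.0), obstacles=[(5.0, 5.0, 1.5)])
```

In the real system, re-planning on a new obstacle amounts to calling the planner again with the robot's current pose as `start` and the updated obstacle set.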
