An autonomous stretcher for use in hospitals or other closed enclosures. The stretcher is sent from point A to point B by remote control. Engineered with ROS and simulated in Gazebo.
- What is this?
- Description
- Robot hardware requirements
- Simulation requirements
- Installation guide
- Hardware Scheme
- 3D piece
- Software architecture diagram
- RQT Graph
- ROS modules
- RRT
- Simulation
- Video
- Authors
This is a project created in our third year of Computer Engineering with a major in Computer Science at the Autonomous University of Barcelona for the subject Robotics, Language and Planning.
We had a budget of 100€ to design a robot that doesn't exist yet. Due to the pandemic caused by COVID-19, we were not able to build it physically. Consequently, we created the robot using ROS, Gazebo, and rviz for the simulation part, and Fritzing for the hardware scheme.
Enjoy it!
The program's goal is to enable any stretcher to move autonomously in a hospital or other closed enclosure, avoiding obstacles in real time.
So, just by changing some components, a stretcher becomes autonomous.
First of all, the stretcher will need some modifications. These are the hardware parts needed to make a stretcher autonomous (remember that we had a 100€ budget, so if you want more battery capacity or power you will need to make some changes):
- Shaft Hub 4mm
- Arduino Uno
- 2200mAh 2S 25C Lipo Pack
- L298n Motor Driver
- Breadboard (Generic)
- Adafruit Compass
- Wifi module
- Ultrasound distance sensor
- Camera
- 4 Wheels
In order to simulate the robot, you will need a working version of ROS. We have used and tested it on ROS Kinetic Kame.
- Ubuntu 16.04 Xenial
- ROS Kinetic
- Gazebo 7
- Rviz
To make the installation easier, we have created a step-by-step PDF guide.
If you haven't installed ROS yet, you can use this quick and easy guide from the official ROS web page:
Once ROS is installed, you will need to install the turtlebot3 dependency package.
To import our folder, create a catkin workspace (we named ours `catkin_ws`), create the packages inside `src`, and paste the content of the folders (important: don't copy the files). Now you can run `catkin_make` in your catkin workspace.
Lastly, import the folder `AS_robot` into `~/model_editor_models` and the folder `models` into `~/.gazebo/models`.
To simplify the command line setup, add these commands to your `~/.bashrc` file:
source ~/catkin_ws/devel/setup.bash
export TURTLEBOT3_MODEL=burger
Finally, you should be able to run all the packages as follows:
roslaunch <package> <file>
This is the hardware scheme we planned in Fritzing within the 100€ budget.
Remember that the purpose of the project is to make an existing stretcher autonomous, so this is just the STL model of a stretcher with a structure added at the bottom for the hardware (battery, Arduino board, and actuators).
The software architecture diagram shows that orders are sent to the corresponding stretcher via wifi. This activates the A* global path planner module. (To speed up path planning and obtain an efficient route as quickly as possible, we also developed an RRT planner, but it is not integrated with ROS.) The stretcher then moves following the global plan. If the built-in sensors detect an obstacle, the local path planner module, based on the Dynamic Window Approach, takes over; once the object or person has been bypassed, the stretcher returns to the route indicated by the global path planner until it reaches the destination.
Using ROS rqt_graph, we generated a graph that shows the active nodes and the connections between the publishers and subscribers in charge of moving the base of our stretcher. These nodes are involved in the entire operation of our robot. The graph reflects the sensors, events, and actions of our robot, the events of the ROS simulator itself, and how they are all connected to each other. In green, we have the route that is currently being followed, and therefore the nodes that are being used.
SLAM (Simultaneous Localization and Mapping) is a technique for building a map of an arbitrary space while estimating the robot's current location within it. The path planner needs a 2D map as a base for calculating routes, so we used the Turtlebot3 SLAM system to obtain one.
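The mapping idea can be sketched without the full SLAM stack: each range reading marks the cells along the sensor beam as free and the endpoint as occupied. The Python sketch below illustrates that occupancy-grid update; it is not the Turtlebot3 SLAM code, and the grid layout and helper names are our own.

```python
# Minimal occupancy-grid update: -1 = unknown, 0 = free, 1 = occupied.
# This illustrates the idea behind SLAM mapping, not the Turtlebot3 stack.

def bresenham(x0, y0, x1, y1):
    """Integer grid cells on the line from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def update_grid(grid, robot, hit):
    """Mark cells along the beam as free and the endpoint as occupied."""
    ray = bresenham(robot[0], robot[1], hit[0], hit[1])
    for (x, y) in ray[:-1]:
        grid[y][x] = 0          # free space traversed by the beam
    hx, hy = ray[-1]
    grid[hy][hx] = 1            # obstacle detected at the beam endpoint

# 10x10 grid of unknown cells; one reading from (0, 0) hitting at (5, 3)
grid = [[-1] * 10 for _ in range(10)]
update_grid(grid, (0, 0), (5, 3))
```

Repeating this update for every reading as the robot drives around gradually fills in the 2D map the planner needs.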
Quick video of the mapping process:
This is the result we got:
Due to deficiencies in the map, planning was not perfect: the route planner did not respect walls and sometimes searched for a way outside the premises. We therefore applied the morphological transformations we learned in computer vision to create the following map in MATLAB:
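The kind of morphological cleanup we applied can be illustrated outside MATLAB as well. The sketch below (plain Python, with `dilate`/`erode`/`close` helpers of our own, not our MATLAB script) performs a morphological closing, dilation followed by erosion, which fills small gaps in walls so the planner cannot route through them:

```python
# Morphological closing on a binary map (1 = wall, 0 = free), 3x3 element.
# Illustrative only; our actual cleanup was done in MATLAB.

def dilate(img):
    """Binary dilation: a cell becomes 1 if any 3x3 neighbour is 1."""
    h, w = len(img), len(img[0])
    return [[1 if any(img[j][i]
                      for j in range(max(0, y - 1), min(h, y + 2))
                      for i in range(max(0, x - 1), min(w, x + 2))) else 0
             for x in range(w)] for y in range(h)]

def erode(img):
    """Binary erosion: a cell stays 1 only if every 3x3 neighbour is 1."""
    h, w = len(img), len(img[0])
    return [[1 if all(img[j][i]
                      for j in range(max(0, y - 1), min(h, y + 2))
                      for i in range(max(0, x - 1), min(w, x + 2))) else 0
             for x in range(w)] for y in range(h)]

def close(img):
    """Closing = dilation then erosion; fills one-cell gaps in walls."""
    return erode(dilate(img))

# A wall with a one-cell gap that a planner could otherwise slip through:
occ = [[0] * 7 for _ in range(5)]
occ[2] = [1, 1, 1, 0, 1, 1, 1]
closed = close(occ)
```

After the closing, the gap at `occ[2][3]` is sealed while the free space above and below the wall is preserved.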
The A* algorithm included in the ROS package is used as the global path planner. It builds route plans starting from map coordinates, which it interprets as vertices of a graph, and explores adjacent nodes until the destination node (the destination map coordinate) is reached. This algorithm generally finds very low-cost routes in a very short time.
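As an illustration of the idea (this is not the ROS `global_planner` code; the grid and function are our own example), here is a minimal A* search on a small occupancy grid:

```python
# Minimal A* on a 4-connected grid; 1 = obstacle, 0 = free. Illustrative sketch.
import heapq
import itertools

def astar(grid, start, goal):
    """Return the shortest path from start to goal as a list of (x, y), or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()                 # tie-breaker so the heap never compares nodes
    open_set = [(h(start), next(tie), start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, _, node, parent = heapq.heappop(open_set)
        if node in came_from:               # stale entry, already expanded cheaper
            continue
        came_from[node] = parent
        if node == goal:                    # rebuild the path by walking parents back
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g_cost[node] + 1       # uniform step cost of 1 per cell
                if ng < g_cost.get((nx, ny), float("inf")):
                    g_cost[(nx, ny)] = ng
                    heapq.heappush(open_set, (ng + h((nx, ny)), next(tie), (nx, ny), node))
    return None                             # the goal is unreachable

grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],   # a wall with a single passage at x = 3
    [0, 0, 0, 0],
]
path = astar(grid, (0, 0), (0, 2))
```

The heuristic never overestimates the true cost on this grid, so the route found (around the wall through the passage) is optimal.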
Example from AtsushiSakai of how it works:
The DWA algorithm included in the ROS package is used as the local path planner; it takes into account the dynamic and kinematic limitations of the stretcher. Given the global plan and the cost map, the local path planner generates commands that are sent to the move base. The objective of this planner is to follow the path indicated by the global planner, penalizing deviations from the route so that it can also avoid obstacles.
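A heavily simplified sketch of the DWA idea (not the ROS `dwa_local_planner` implementation; all gains, limits, and sampling resolutions below are made up for illustration): sample velocity pairs inside the window, forward-simulate each one, discard trajectories that collide, and score the rest on goal progress, clearance, and speed.

```python
# Minimal Dynamic Window Approach sketch for a unicycle robot. Illustrative only.
import math

def simulate(state, v, w, dt=0.1, steps=10):
    """Forward-simulate (x, y, theta) under constant linear v and angular w."""
    x, y, th = state
    traj = []
    for _ in range(steps):
        th += w * dt
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        traj.append((x, y))
    return traj

def dwa_step(state, goal, obstacles, v_max=0.5, w_max=1.5):
    """Pick the (v, w) whose simulated trajectory best trades off goal
    distance, obstacle clearance, and forward speed."""
    best, best_score = (0.0, 0.0), -float("inf")
    for i in range(5):                      # sample linear velocities 0 .. v_max
        v = v_max * i / 4
        for j in range(9):                  # sample angular velocities -w_max .. w_max
            w = -w_max + 2 * w_max * j / 8
            traj = simulate(state, v, w)
            clearance = min(
                (math.dist(p, o) for p in traj for o in obstacles),
                default=1.0,
            )
            if clearance < 0.2:             # trajectory passes too close: discard
                continue
            goal_term = -math.dist(traj[-1], goal)   # closer endpoint is better
            score = goal_term + 0.3 * min(clearance, 1.0) + 0.1 * v
            if score > best_score:
                best_score, best = score, (v, w)
    return best

# Robot at the origin facing +x, goal ahead, one obstacle near the straight line:
v, w = dwa_step(state=(0.0, 0.0, 0.0), goal=(2.0, 0.0), obstacles=[(0.5, 0.05)])
```

Because the straight trajectory grazes the obstacle, the selected command keeps a positive speed but curves around it, which is the behaviour the local planner produces around people and objects in the simulation.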
Example from AtsushiSakai of how it works:
We created, but did not integrate, an RRT algorithm in MATLAB to perform global path planning.
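Since the MATLAB script is not reproduced here, the following is a minimal Python sketch of the same RRT idea (world size, parameters, and names are our own; for brevity only nodes, not edges, are collision-checked):

```python
# Minimal RRT in a 10x10 world with circular obstacles (x, y, radius).
import math
import random

def rrt(start, goal, obstacles, step=0.5, iters=2000, seed=1):
    """Grow a tree toward random samples until a node lands near the goal;
    return the path from start as a list of points, or None."""
    rng = random.Random(seed)               # fixed seed for reproducibility
    nodes = [start]
    parent = {start: None}
    for _ in range(iters):
        # 10% goal bias: occasionally steer directly toward the goal
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10), rng.uniform(0, 10))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if any(math.dist(new, (ox, oy)) <= r for ox, oy, r in obstacles):
            continue                        # new node falls inside an obstacle
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) < step:     # close enough: trace the path back
            path, n = [], new
            while n is not None:
                path.append(n)
                n = parent[n]
            return path[::-1]
    return None

path = rrt((1.0, 1.0), (9.0, 9.0), obstacles=[(5.0, 5.0, 1.5)])
```

RRT tends to find a feasible (not necessarily optimal) route quickly, which is why we explored it as a faster alternative to the A* global planner.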
Example of path obtained:
Using Gazebo and rviz, we can see how, when a goal destination is set, the robot creates a route and follows the path:
We can see how, when some objects are placed in the scene, the global planner makes a route through the objects, but once the sensors detect them, the robot dodges them using the local planner:
Larger video on youtube: video
Antonio Fernández Campagna Github LinkedIn