
Videos

Videos from Open Source Team

[TurtleBot3 57 ROS2 Navigation2]

[TurtleBot3 56 ROS2 Cartographer]

[TurtleBot3 55 ROS2 Tutorials]

[TurtleBot3 54 Additional Sensors with TurtleBot3]

[TurtleBot3 53 AutoRace on RDS]

[TurtleBot3 52 RDS Task Mission Tutorial]

[TurtleBot3 51 Machine Learning tutorial 3]

[TurtleBot3 50 Machine Learning tutorial 2]

[TurtleBot3 49 Machine Learning tutorial 1]

[TurtleBot3 48 Autorace with Gazebo]

[TurtleBot3 47 Reinforcement Learning]

[TurtleBot3 46 Pick and Place Tutorial by TurtleBot3 with OpenMANIPULATOR]

[TurtleBot3 45 TurtleBot3 with OpenMANIPULATOR]

[TurtleBot3 44 Automatic Parking Vision]

[TurtleBot3 43 TurtleBot3 AutoRace Tutorial 6: Tunnel]

[TurtleBot3 42 TurtleBot3 AutoRace Tutorial 5: Level Crossing]

[TurtleBot3 41 TurtleBot3 AutoRace Tutorial 4: Node Optimization]

[TurtleBot3 40 TurtleBot3 AutoRace Tutorial 3: Parking]

[TurtleBot3 39 TurtleBot3 AutoRace Tutorial 2: Lane Tracking]

[TurtleBot3 38 TurtleBot3 AutoRace Tutorial 1: Traffic Light]

[TurtleBot3 37 Gazebo Simulation Tutorial]

[TurtleBot3 36 Waffle Pi Camera]

[TurtleBot3 35 How to use LDS]

[TurtleBot3 34 Basic Operation]

[TurtleBot3 33 Automatic Parking]

[TurtleBot3 32 AutoRace RBiz Challenge 2017]

[TurtleBot3 31 Burger Assembly]

[TurtleBot3 30 Follow Demo]

[TurtleBot3 29 Friends Example]

[TurtleBot3 28 Navigation Example]

[TurtleBot3 27 SLAM Example]

[TurtleBot3 26 Laser Distance Sensor (LDS) Example]

[TurtleBot3 25 Gazebo Simulator Example]

[TurtleBot3 24 Intel® RealSense™ Example]

[TurtleBot3 23 Fake Node]

[TurtleBot3 22 Hardware]

[TurtleBot3 21 Friends - Auto]

[TurtleBot3 20 Friends - OpenMANIPULATOR Chain]

[TurtleBot3 19 Friends - Road Train]

[TurtleBot3 18 Friends - Monster]

[TurtleBot3 17 Friends - Segway]

[TurtleBot3 16 Assembling the TurtleBot3 Premium]

[TurtleBot3 15 Assembling the TurtleBot3 Basic]

[TurtleBot3 14 Friends - Tank]

[TurtleBot3 13 Friends - Car]

[TurtleBot3 12 Friends - Omni & Mecanum]

[TurtleBot3 11 Friends - Real TurtleBot meets R2D2]

[TurtleBot3 10 Friends - Real TurtleBot]

[TurtleBot3 09 SLAM using Gmapping and Cartographer]

[TurtleBot3 08 Teleoperation Example]

[TurtleBot3 07 Friends - Conveyor]

[TurtleBot3 06 Payload]

[Turtlebot3 05 Manipulator X6 RViz]

[Turtlebot3 04 Gmapping and Cartographer]

[Turtlebot3 03 Navigation]

[Turtlebot3 02 SLAM]

[Turtlebot3 01 Assembling and Example]

Videos from ROBOTIS Channel

[TurtleBot3 AutoRace @RobotWorld 2017 R-BIZ Challenge]

[TurtleBot3 AutoRace TestDrive 2017 (RBIZ @ RobotWorld)]

[TurtleBot3 Delivery Service Demo @ICRA2017 (2)]

[TurtleBot3 Delivery Service Demo @ICRA2017 (1)]

[TurtleBot3 Following Demo @ICRA2017]

[TurtleBot3 @ICRA]

[TurtleBot3 - Behind Scenes]

[TurtleBot3 - Official Product Video]


Topic Monitor

WARNING: Be careful when running the robot on the table as the robot might fall.

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are intended to be run on the Remote PC. Please run the commands below on your Remote PC.
  • Make sure to run the Bringup instructions before running the commands below.

In order to check the topics of TurtleBot3, we will use rqt, which is provided by ROS. rqt is a Qt-based framework for developing ROS GUIs, and its Topic Monitor plugin lets users easily check topic status by listing all topics along with their name, type, bandwidth, rate (Hz), and latest value.

[Remote PC] Run the rqt.

$ rqt

TIP: If the Topic Monitor is not displayed, select Plugins -> Topics -> Topic Monitor from the rqt menu.

When rqt is first run, no topic values are monitored. To monitor a topic, click the checkbox next to it.

If you want to see the topic messages in more detail, click the button next to each checkbox.

  • /battery_state contains information about the battery condition, such as the current battery voltage and remaining capacity.

  • /diagnostics contains the status of the components connected to the TurtleBot3, such as the MPU9250, DYNAMIXEL X, HLS-LFCD-LDS, battery, and OpenCR.

  • /odom contains the odometry of the TurtleBot3. This topic provides the position and orientation computed from the encoder data.

  • /sensor_state contains the encoder values, battery state, and torque.

  • /scan contains all of the LDS data, such as angle_max/min, range_max/min, intensities, and ranges.

In addition, whenever a new topic is added, you can monitor it through rqt as well.
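If you prefer to check topic values programmatically rather than in the rqt GUI, a minimal rospy subscriber can print them as well. The sketch below is only an illustration (the node name and output format are arbitrary); it assumes Bringup is running so that /odom and /scan are being published.

#!/usr/bin/env python
# Minimal sketch: print the robot pose from /odom and the nearest range from /scan.
import rospy
from nav_msgs.msg import Odometry
from sensor_msgs.msg import LaserScan

def odom_cb(msg):
    p = msg.pose.pose.position
    rospy.loginfo("odom position: x=%.2f y=%.2f", p.x, p.y)

def scan_cb(msg):
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo("nearest obstacle: %.2f m", min(valid))

if __name__ == '__main__':
    rospy.init_node('topic_monitor_example')   # illustrative node name
    rospy.Subscriber('/odom', Odometry, odom_cb)
    rospy.Subscriber('/scan', LaserScan, scan_cb)
    rospy.spin()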


Tensorflow

Install TensorFlow (https://www.tensorflow.org/install). We recommend installing it with Virtualenv for Python 3.

Activate the Virtualenv environment by issuing one of the following commands:

$ source ~/tensorflow/bin/activate # bash, sh, ksh, or zsh
$ source ~/tensorflow/bin/activate.csh  # csh or tcsh

Install the pip3 packages.

pip3 install Keras numpy pandas matplotlib Pillow gym h5py scikit-image

Install rospkg with PyCharm.

Go to [File] -> [Settings] -> [Project: FILE.py] -> [Project Interpreter] and change the Project Interpreter to Python 3.5.

Install setuptools and rospkg.

TurtleBot3 Fake Node Implementation

Tip : The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. Shortcut key for terminal is Ctrl-Alt-T.

Install dependent packages for TurtleBot3 Simulation.

Note : The turtlebot3_simulations package requires the turtlebot3 package as a prerequisite.

$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
$ cd ~/catkin_ws && catkin_make


Step by Step to Install Ubuntu on Intel® Joule™

[Intel® Joule™] Connect micro HDMI to HDMI cable, power connector supplied by OpenCR1.0, USB devices including Bootable USB drive, mouse and keyboard. You might need a USB hub to plug multiple USB devices into Joule.

[Intel® Joule™] Installation proceeds as shown in the images below. When the Joule is turned on, the monitor will blink about 3 times after 5 seconds and then display the menu screen. Press F7 to enter the Boot Manager.

[Intel® Joule™] Select USB Device.

[Intel® Joule™] Select Erase disk and install Ubuntu then continue.

[Intel® Joule™] Intel® Joule™ has two different disk drives: a 16GB microSD card and 16GB eMMC. In this instruction, it is highly recommended to install the alternative Ubuntu for Joule on the 16GB eMMC. Select MMC/SD card #2 (mmcblk1) - 15.7 GB MMC 016G32, then continue.

[Intel® Joule™] Installation will take about 10 minutes.

[Intel® Joule™] When installation is completed, click Restart Now.

[Intel® Joule™] Remove bootable USB drive from Joule.

[Intel® Joule™] Don’t press any key. It will boot from the 16GB eMMC, which is the default boot device.

[Intel® Joule™] Finish the rest of the settings.


Specifications

Hardware Specifications

Items Burger Waffle (Discontinued) Waffle Pi
Maximum translational velocity 0.22 m/s 0.26 m/s 0.26 m/s
Maximum rotational velocity 2.84 rad/s (162.72 deg/s) 1.82 rad/s (104.27 deg/s) 1.82 rad/s (104.27 deg/s)
Maximum payload 15kg 30kg 30kg
Size (L x W x H) 138mm x 178mm x 192mm 281mm x 306mm x 141mm 281mm x 306mm x 141mm
Weight (+ SBC + Battery + Sensors) 1kg 1.8kg 1.8kg
Threshold of climbing 10 mm or lower 10 mm or lower 10 mm or lower
Expected operating time 2h 30m 2h 2h
Expected charging time 2h 30m 2h 30m 2h 30m
SBC (Single Board Computers) Raspberry Pi 3 Model B and B+ Intel® Joule™ 570x Raspberry Pi 3 Model B and B+
MCU 32-bit ARM Cortex®-M7 with FPU (216 MHz, 462 DMIPS) 32-bit ARM Cortex®-M7 with FPU (216 MHz, 462 DMIPS) 32-bit ARM Cortex®-M7 with FPU (216 MHz, 462 DMIPS)
Remote Controller - - RC-100B + BT-410 Set (Bluetooth 4, BLE)
Actuator Dynamixel XL430-W250 Dynamixel XM430-W210 Dynamixel XM430-W210
LDS(Laser Distance Sensor) 360 Laser Distance Sensor LDS-01 360 Laser Distance Sensor LDS-01 360 Laser Distance Sensor LDS-01
Camera - Intel® Realsense™ R200 Raspberry Pi Camera Module v2.1
IMU Gyroscope 3 Axis, Accelerometer 3 Axis, Magnetometer 3 Axis (same for all models)
Power connectors 3.3V / 800mA, 5V / 4A, 12V / 1A (same for all models)
Expansion pins GPIO 18 pins, Arduino 32 pins (same for all models)
Peripheral UART x3, CAN x1, SPI x1, I2C x1, ADC x5, 5pin OLLO x4 UART x3, CAN x1, SPI x1, I2C x1, ADC x5, 5pin OLLO x4 UART x3, CAN x1, SPI x1, I2C x1, ADC x5, 5pin OLLO x4
Dynamixel ports RS485 x 3, TTL x 3 RS485 x 3, TTL x 3 RS485 x 3, TTL x 3
Audio Several programmable beep sequences Several programmable beep sequences Several programmable beep sequences
Programmable LEDs User LED x 4 User LED x 4 User LED x 4
Status LEDs Board status LED x 1, Arduino LED x 1, Power LED x 1 (same for all models)
Buttons and Switches Push buttons x 2, Reset button x 1, Dip switch x 2 Push buttons x 2, Reset button x 1, Dip switch x 2 Push buttons x 2, Reset button x 1, Dip switch x 2
Battery Lithium polymer 11.1V 1800mAh / 19.98Wh 5C Lithium polymer 11.1V 1800mAh / 19.98Wh 5C Lithium polymer 11.1V 1800mAh / 19.98Wh 5C
PC connection USB USB USB
Firmware upgrade via USB / via JTAG via USB / via JTAG via USB / via JTAG
Power adapter (SMPS) Input: 100-240V AC, 50/60Hz, 1.5A @max / Output: 12V DC, 5A (same for all models)

Dimension and Mass

Data of TurtleBot3 Burger

Data of TurtleBot3 Waffle

Data of TurtleBot3 Waffle Pi

Components

SBCs

Sensors

Embedded Board

  • OpenCR1.0
    • TurtleBot3 Burger, Waffle, Waffle Pi

Actuators


Simulation

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are intended to be run on the Remote PC. Please run the commands below on your Remote PC.

TurtleBot3 supports a simulation development environment that can be programmed and developed with a virtual robot. There are two ways to do this: one uses a fake node together with the 3D visualization tool RViz, and the other uses the 3D robot simulator Gazebo.

The fake node method is suitable for testing the robot model and its movement, but it cannot simulate sensors. If you need to test SLAM and Navigation, we recommend using Gazebo, which can simulate sensors such as the IMU, LDS, and camera.

TurtleBot3 Simulation using Fake Node

To use turtlebot3_fake_node, you need the turtlebot3_simulation metapackage. Install the package as shown in the following command.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

NOTE: The turtlebot3_simulation metapackage requires turtlebot3 metapackage and turtlebot3_msgs package as a prerequisite. If you did not install it in the Install Dependent ROS Packages of PC Setup section, install it first.

$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
$ cd ~/catkin_ws && catkin_make

To launch the virtual robot, execute the turtlebot3_fake.launch file in the turtlebot3_fake package as shown below. The turtlebot3_fake node is a very simple simulation node that can be run without an actual robot. You can even control the virtual TurtleBot3 in RViz with a teleoperation node.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_fake turtlebot3_fake.launch
$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
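As an alternative to the teleoperation node, the virtual robot can also be driven by publishing geometry_msgs/Twist messages on the /cmd_vel topic directly. The following rospy sketch is only an illustration (the node name and velocity values are arbitrary, not part of the TurtleBot3 packages).

#!/usr/bin/env python
# Minimal sketch: drive the fake TurtleBot3 by publishing /cmd_vel at 10 Hz.
import rospy
from geometry_msgs.msg import Twist

if __name__ == '__main__':
    rospy.init_node('fake_drive_example')   # illustrative node name
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)
    twist = Twist()
    twist.linear.x = 0.1    # m/s, within the Burger limit of 0.22 m/s
    twist.angular.z = 0.5   # rad/s
    while not rospy.is_shutdown():
        pub.publish(twist)
        rate.sleep()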

TurtleBot3 Simulation using Gazebo

There are two ways to simulate using Gazebo. The first method is to use Gazebo with ROS through the turtlebot3_gazebo package, and the second is to use only Gazebo with the turtlebot3_gazebo_plugin, without ROS.

If you want to use the first method, see the ROS packages for Gazebo instructions directly below. For the second method, see the Standalone Gazebo Plugin section further down.

ROS packages for Gazebo

NOTE: If you are running Gazebo for the first time on your Remote PC, it takes a bit longer than usual.

Simulate in Various World

1) Empty World

The following command can be used to test the virtual TurtleBot3 on the empty world of the gazebo default environment.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch

2) TurtleBot3 World

TurtleBot3 World is a map that consists of simple objects making up the shape of the TurtleBot3 symbol. It is mainly used for testing features such as SLAM and Navigation.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch

3) TurtleBot3 House

TurtleBot3 House is a map made from house drawings. It is suitable for testing more complex tasks.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_house.launch

NOTE : If TurtleBot3 House is executed for the first time, downloading the map file takes a couple of minutes or more depending on the download speed.

Drive TurtleBot3

1) Teleoperation on Gazebo

In order to control a TurtleBot3 with a keyboard, launch the teleoperation node with the command below in a new terminal window.

$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
2) Collision Avoidance

In order to autonomously drive a TurtleBot3 around the TurtleBot3 world, open a new terminal window and enter the command below.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch

Open a new terminal window and enter the command below.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_simulation.launch

Execute RViz

RViz visualizes published topics while the simulation is running. You can launch RViz in a new terminal window by entering the command below.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch

Virtual SLAM with TurtleBot3

For virtual SLAM in Gazebo, instead of running the actual robot, you can select one of the environments and robot models mentioned above; the SLAM-related commands use the same ROS packages as in the SLAM section.

Virtual SLAM Execution Procedure

The following commands are examples of using the TurtleBot3 Waffle Pi model and the turtlebot3_world environment.

  • Launch Gazebo
    $ export TURTLEBOT3_MODEL=waffle_pi
    $ roslaunch turtlebot3_gazebo turtlebot3_world.launch
    
  • Launch SLAM
    $ export TURTLEBOT3_MODEL=waffle_pi
    $ roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping
    
  • Remotely Control TurtleBot3
    $ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
    
  • Save the Map
    $ rosrun map_server map_saver -f ~/map
    

When you run the packages above and move the robot around the virtual space, a map is created as shown in the figure below.

Virtual Navigation with TurtleBot3

For virtual Navigation in Gazebo, instead of running the actual robot, you can select one of the environments and robot models mentioned above; the Navigation-related commands use the same ROS packages as in the Navigation section.

Virtual Navigation Execution Procedure

Terminate all applications that were executed during the virtual SLAM practice, then execute the related packages in the following instructions; the robot will appear on the previously generated map. After setting the initial pose of the robot on the map, set the destination to run the navigation, as shown in the figure below. The initial pose only needs to be set once.

  • Execute Gazebo
    $ export TURTLEBOT3_MODEL=waffle_pi
    $ roslaunch turtlebot3_gazebo turtlebot3_world.launch
    
  • Execute Navigation
    $ export TURTLEBOT3_MODEL=waffle_pi
    $ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml
    

Virtual SLAM by Multiple TurtleBot3s

1) Call Three TurtleBot3s in TurtleBot3 House
$ roslaunch turtlebot3_gazebo multi_turtlebot3.launch

The initial position and orientation of each loaded TurtleBot3 are preset in the launch file.

2) Execute SLAM
$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_slam turtlebot3_gmapping.launch set_base_frame:=tb3_0/base_footprint set_odom_frame:=tb3_0/odom set_map_frame:=tb3_0/map
$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_slam turtlebot3_gmapping.launch set_base_frame:=tb3_1/base_footprint set_odom_frame:=tb3_1/odom set_map_frame:=tb3_1/map
$ ROS_NAMESPACE=tb3_2 roslaunch turtlebot3_slam turtlebot3_gmapping.launch set_base_frame:=tb3_2/base_footprint set_odom_frame:=tb3_2/odom set_map_frame:=tb3_2/map

3) Merge the Map Data from Each TurtleBot3

Before launching these nodes, please make sure the arguments for the position and orientation of each TurtleBot3 are set correctly.

$ sudo apt-get install ros-kinetic-multirobot-map-merge
$ roslaunch turtlebot3_gazebo multi_map_merge.launch
4) Execute RViz
$ rosrun rviz rviz -d `rospack find turtlebot3_gazebo`/rviz/multi_turtlebot3_slam.rviz
5) Teleoperation
$ ROS_NAMESPACE=tb3_0 rosrun turtlebot3_teleop turtlebot3_teleop_key
$ ROS_NAMESPACE=tb3_1 rosrun turtlebot3_teleop turtlebot3_teleop_key
$ ROS_NAMESPACE=tb3_2 rosrun turtlebot3_teleop turtlebot3_teleop_key
6) Save the Map
$ rosrun map_server map_saver -f ~/map

TurtleBot3 AutoRace with Gazebo

Go to AutoRace with Gazebo.

Turtlebot3 with OpenMANIPULATOR

Go to OpenMANIPULATOR with Gazebo

Standalone Gazebo Plugin

NOTE: This tutorial is intended only for users who want to simulate TurtleBot3 without ROS. However, we highly recommend simulating robots with ROS.

How to use Gazebo Plugin

1) Install Library for Gazebo7
$ sudo apt-get install libgazebo7-dev
2) Download Source Code from Github
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_gazebo_plugin
3) Add Gazebo Plugin Path in .bashrc File
$ nano ~/.bashrc

TIP: turtlebot3_gazebo_plugin path = ~/turtlebot3_gazebo_plugin

export GAZEBO_PLUGIN_PATH=$GAZEBO_PLUGIN_PATH:${turtlebot3_gazebo_plugin path}/build
export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:${turtlebot3_gazebo_plugin path}/models
4) Make and Build
$ cd ${turtlebot3_gazebo_plugin path}
$ mkdir build
$ cd build
$ cmake ..
$ make
5) Execute Plugin

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ cd ${turtlebot3_gazebo_plugin}
$ gazebo worlds/turtlebot3_${TB3_MODEL}.world

6) Teleoperation by Keyboard
w - set linear velocity up
x - set linear velocity down
d - set angular velocity up
a - set angular velocity down
s - set all velocity to zero
7) Topic Subscribe Commands
  • Show all topics
$ gz topic -l
  • Subscribe to scan data
$ gz topic -e /gazebo/default/user/turtlebot3_${TB3_MODEL}/lidar/hls_lfcd_lds/scan
  • Subscribe to image data

Waffle

$ gz topic -e /gazebo/default/user/turtlebot3_waffle/image/intel_realsense_r200/image

Waffle Pi

$ gz topic -e /gazebo/default/user/turtlebot3_waffle_pi/image/raspberry_pi_cam/image
8) Execute Listener
$ cd ${turtlebot3_gazebo_plugin}/build
$ ./lidar_listener ${TB3_MODEL}

Open a new terminal window and enter the command below.

$ cd ${turtlebot3_gazebo_plugin}/build
$ ./image_listener ${TB3_MODEL}
Reference


ROS2

NOTE:

  • These instructions were tested on Ubuntu 18.04 and ROS2 Crystal Clemmys.
  • These instructions are intended to be run on the Remote PC. Please run the instructions below on your [Remote PC]. However, the parts marked [TurtleBot] run on the SBC of the TurtleBot3.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

This chapter shows some demos using TurtleBot3 with ROS2 and Gazebo9. In order to implement these demos, you have to install some packages.

Setup

PC setup

NOTE: All demos have been tested on Ubuntu 18.04 and macOS High Sierra with ROS2 Crystal Clemmys installed. If you get stuck during installation, please refer to ROS Answers or the ROS2 issue tracker.

[Install Ubuntu on Remote PC]

[Install ROS2 on Remote PC]

[Install TurtleBot3 ROS2 Packages]

[Remote PC] Download turtlebot3 packages and install some dependencies for ROS2

# Install Cartographer dependencies
$ sudo apt install -y \
    google-mock \
    libceres-dev \
    liblua5.3-dev \
    libboost-dev \
    libboost-iostreams-dev \
    libprotobuf-dev \
    protobuf-compiler \
    libcairo2-dev \
    libpcl-dev \
    python3-sphinx
# Install Gazebo9
$ curl -sSL http://get.gazebosim.org | sh
# Install Navigation2 dependencies
$ sudo apt install -y \
    libsdl-image1.2 \
    libsdl-image1.2-dev \
    libsdl1.2debian \
    libsdl1.2-dev
$ mkdir -p ~/turtlebot3_ws/src
$ cd ~/turtlebot3_ws
$ wget https://raw.githubusercontent.com/ROBOTIS-GIT/turtlebot3/ros2/turtlebot3.repos
$ vcs import src < turtlebot3.repos
$ colcon build --symlink-install
$ echo 'source ~/turtlebot3_ws/install/setup.bash' >> ~/.bashrc
$ source ~/.bashrc

NOTE: If you get any build errors or warnings from dependencies, please refer to the documents below.

SBC setup

[TurtleBot] Install Raspbian Stretch

  1. Download Raspbian Stretch with desktop and recommended software.
  2. Unzip the downloaded file and burn the image to your microSD card (>8GB).
  3. Follow the instructions in How to setup TurtleBot3 with ROS2.

OpenCR setup

[TurtleBot] Upload firmware for ROS2.

  • TurtleBot3 Burger
$ cd ~/turtlebot3
$ rm -rf ./opencr_update.tar.bz2
$ wget https://github.com/ROBOTIS-GIT/OpenCR-Binaries/raw/master/turtlebot3/ROS2/latest/opencr_update.tar.bz2
$ tar -xf ./opencr_update.tar.bz2

$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=burger
$ cd ./opencr_update
$ ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr

If uploading the firmware succeeds, the message below will be displayed in the terminal.

armv7l
arm
OpenCR Update Start..
opencr_ld_shell ver 1.0.0
opencr_ld_main 
[  ] file name   	: burger.opencr 
[  ] file size   	: 168 KB
[  ] fw_name     	: burger 
[  ] fw_ver      	: V180903R1 
[OK] Open port   	: /dev/ttyACM0
[  ]
[  ] Board Name  	: OpenCR R1.0
[  ] Board Ver   	: 0x17020800
[  ] Board Rev   	: 0x00000000
[OK] flash_erase 	: 0.96s
[OK] flash_write 	: 1.92s 
[OK] CRC Check   	: 10E28C8 10E28C8 , 0.006000 sec
[OK] Download 
[OK] jump_to_fw 
  • TurtleBot3 Waffle or Waffle_Pi
$ cd ~/turtlebot3
$ rm -rf ./opencr_update.tar.xz
$ wget https://github.com/ROBOTIS-GIT/OpenCR-Binaries/raw/master/turtlebot3/ROS2/latest/opencr_update.tar.xz
$ tar -xf ./opencr_update.tar.xz

$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=waffle
$ cd ./opencr_update
$ ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr

If uploading the firmware succeeds, the message below will be displayed in the terminal.

armv7l
arm
OpenCR Update Start..
opencr_ld_shell ver 1.0.0
opencr_ld_main 
[  ] file name   	: waffle.opencr 
[  ] file size   	: 168 KB
[  ] fw_name     	: waffle 
[  ] fw_ver      	: V180903R1 
[OK] Open port   	: /dev/ttyACM0
[  ]
[  ] Board Name  	: OpenCR R1.0
[  ] Board Ver   	: 0x17020800
[  ] Board Rev   	: 0x00000000
[OK] flash_erase 	: 0.96s
[OK] flash_write 	: 1.92s 
[OK] CRC Check   	: 10E28C8 10E28C8 , 0.006000 sec
[OK] Download 
[OK] jump_to_fw 

[TurtleBot] Reset OpenCR using RESET button.

Bringup

[Bringup TurtleBot3]

[TurtleBot, RemotePC] Sync time between TurtleBot and RemotePC

$ sudo apt-get install ntpdate
$ sudo ntpdate ntp.ubuntu.com

[TurtleBot] Run Micro-XRCE-DDS Agent for OpenCR

$ cd ~/turtlebot3 && MicroXRCEAgent serial /dev/ttyACM0

[TurtleBot] Run Micro-XRCE-DDS Agent for Lidar

$ cd ~/turtlebot3 && MicroXRCEAgent udp 2018

[TurtleBot] Run Lidar application

$ ./turtlebot3/turtlebot3_lidar

[Remote PC] Launch the robot, including robot_state_publisher and turtlebot3_node.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 launch turtlebot3_bringup robot.launch.py

If the node is successfully launched, the following output will appear in the terminal window.

[INFO] [launch]: process[robot_state_publisher-1]: started with pid [24824]
[INFO] [launch]: process[turtlebot3_ros-2]: started with pid [24825]
Initialize urdf model from file: /home/ost/turtlebot3_ws/install/turtlebot3_description/share/turtlebot3_description/urdf/turtlebot3_burger.urdf
Parsing robot urdf xml string.
Link base_link had 5 children
Link caster_back_link had 0 children
Link imu_link had 0 children
Link base_scan had 0 children
Link wheel_left_link had 0 children
Link wheel_right_link had 0 children
got segment base_footprint
got segment base_link
got segment base_scan
got segment caster_back_link
got segment imu_link
got segment wheel_left_link
got segment wheel_right_link
[INFO] [turtlebot3_node]: Init TurtleBot3 Node Main
Adding fixed segment from base_footprint to base_link
Adding fixed segment from base_link to caster_back_link
Adding fixed segment from base_link to imu_link
Adding fixed segment from base_link to base_scan
Adding moving segment from base_link to wheel_left_link
Adding moving segment from base_link to wheel_right_link

You can check the topic list as shown below:

$ ros2 topic list
/cmd_vel
/imu
/joint_states
/motor_power
/odom
/parameter_events
/reset
/robot_description
/scan
/scan_half
/sensor_state
/sound
/tf
/tf_static
/time_sync
/version_info
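The same data can be accessed from Python through rclpy. The sketch below is only an illustration (the node name is arbitrary) and assumes the bringup above is running; note that on distributions newer than Crystal, create_subscription additionally requires a QoS depth argument.

# Minimal rclpy sketch: print the nearest LDS range from /scan.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

class ScanWatcher(Node):
    def __init__(self):
        super().__init__('scan_watcher')   # illustrative node name
        # On ROS2 releases after Crystal, pass a QoS depth as well, e.g. (..., self.scan_cb, 10).
        self.create_subscription(LaserScan, 'scan', self.scan_cb)

    def scan_cb(self, msg):
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid:
            self.get_logger().info('nearest obstacle: %.2f m' % min(valid))

def main():
    rclpy.init()
    rclpy.spin(ScanWatcher())
    rclpy.shutdown()

if __name__ == '__main__':
    main()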

Rviz2

[Remote PC] Run Rviz2

$ ros2 launch turtlebot3_bringup rviz2.launch.py

Teleoperation

[Remote PC] Run teleoperation node

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 run turtlebot3_teleop teleop_keyboard

If the node is successfully run, the following output will appear in the terminal window.

Control Your TurtleBot3!
---------------------------
Moving around:
        w
   a    s    d
        x

w/x : increase/decrease linear velocity (Burger : ~ 0.22, Waffle and Waffle Pi : ~ 0.26)
a/d : increase/decrease angular velocity (Burger : ~ 2.84, Waffle and Waffle Pi : ~ 1.82)

space key, s : force stop

CTRL-C to quit

Cartographer

SLAM (Simultaneous Localization and Mapping) is a technique for drawing a map while estimating the current location in an arbitrary space. SLAM has been a well-known feature of TurtleBot since its predecessors. The video here shows how accurately TurtleBot3 can draw a map with its compact and affordable platform.

[TurtleBot, RemotePC] Sync time between TurtleBot and RemotePC

$ sudo apt-get install ntpdate
$ sudo ntpdate ntp.ubuntu.com

[Remote PC]

$ cd ~/turtlebot3_ws && colcon build
$ ros2 launch turtlebot3_cartographer cartographer.launch.py

[Remote PC] Save the map

$ ros2 run nav2_map_server map_saver -f ~/map

Navigation2

Navigation is to move the robot from one location to the specified destination in a given environment. For this purpose, a map that contains the geometry information of furniture, objects, and walls of the given environment is required. As described in the previous Cartographer section, the map is created from the distance information obtained by the sensor and the pose information of the robot itself.

The navigation enables a robot to move from the current pose to the designated goal pose on the map by using the map, robot’s encoder, IMU sensor, and distance sensor. The procedure for performing this task is as follows.

[Run Navigation2 Nodes]

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 launch turtlebot3_navigation2 navigation2.launch.py

Simulation

[Remote PC] Add GAZEBO_MODEL_PATH

  $ echo '# Add gazebo model path' >> ~/.bashrc
  $ echo 'export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:~/turtlebot3_ws/src/turtlebot3/turtlebot3_simulations/turtlebot3_gazebo/models' >> ~/.bashrc
  $ source ~/.bashrc

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

[Remote PC] Load TurtleBot3 on turtlebot3 world

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
$ ros2 param set /gazebo use_sim_time True

WARNING: If you get error messages about TF_OLD_DATA, retry setting the use_sim_time parameter on the /gazebo node.

[Remote PC] Launch Cartographer

$ ros2 launch turtlebot3_cartographer cartographer.launch.py use_sim_time:=True

[Remote PC] Run teleoperation node

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 run turtlebot3_teleop teleop_keyboard

[Remote PC] Save the map

$ ros2 run nav2_map_server map_saver -f ~/map

[Remote PC] Load Navigation2

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ros2 launch turtlebot3_navigation2 navigation2.launch.py

You should set some parameters to use simulation time. If you need further information about this, please refer to the navigation2 repository.

$ ros2 param set /world_model use_sim_time True
$ ros2 param set /global_costmap/global_costmap use_sim_time True
$ ros2 param set /local_costmap/local_costmap use_sim_time True


AWS RoboMaker with Turtlebot3

AWS RoboMaker is a service that makes it easy to develop, test, and deploy intelligent robotics applications at scale. RoboMaker extends the most widely used open-source robotics software framework, Robot Operating System (ROS), with connectivity to cloud services. This includes AWS machine learning services, monitoring services, and analytics services that enable a robot to stream data, navigate, communicate, comprehend, and learn. RoboMaker provides a robotics development environment for application development, a robotics simulation service to accelerate application testing, and a robotics fleet management service for remote application deployment, update, and management.

Developer Guide

AWS RoboMaker example with Turtlebot3

AWS RoboMaker – Develop, Test, Deploy, and Manage Intelligent Robotics Apps

AWS RoboMaker Reinforcement Learning example with Turtlebot3

How to Train a Robot Using Reinforcement Learning


Projects

TurtleBot3 Collaboration Project

TurtleBot3 is a collaboration project among Open Robotics, ROBOTIS, and more partners such as The Construct, Intel, Onshape, OROCA, AuTURBO, ROS in Robotclub Malaysia, Astana Digital, Polariant Experiment, Tokyo University of Agriculture and Technology, GVlab, Networked Control Robotics Lab at National Chiao Tung University, and SIM Group at TU Darmstadt. Open Robotics is in charge of software and community activities, while ROBOTIS is in charge of manufacturing and global distribution.

The most important part of this TurtleBot3 collaboration project is open source based software, hardware, and content. We are encouraging more partners and research collaborators to participate in this project to enrich the robotics field, so we have prepared this page as an introduction to the awesome projects of TurtleBot3 partners and research collaborators.

If you are interested in a partnership with us to realize open source robotics, please fill out the form here.

TurtleBot3 Providers

TurtleBot3 Partners and Research Collaborators

* Each collaboration member’s web page can be found here.

TurtleBot3 Distributors

* Each collaboration member’s web page can be found here.

TurtleBot3 Awesome Projects

[TurtleBot3 Autorace Simulation Circuit]

[Multi-robot exploration]

[turtlebot3 ros move base gazebo dwa planner]

[Tutoriel TurtleBot3/ROS/Docker/Darknet pour mac]

[TurtleBot3 Automatic Obstacle Avoidance]

[Robotclub TB3 (Episode 4): Track and push color balls to goals with avoiding obstacles]

[TurtleBot3 SLAM (Gmapping) & Navigation (AMCL)]

[Robotclub TB3 (Episode 3): Track and push color balls to goals]

[Robotclub TB3 (Episode 2): Track and push ball to goal]

[Robotclub TB3 (Episode 1) : Ball Tracking]

[TurtleBot3 Maze Solving at FIRA Malaysia 2018]

[TurtleBot3 Maze Explorer]

[PLS(Polarized Light Sensing) for Robotics]

[TurtleBot3 ROS Intelligent Robot]

[BallBot Project based on TurtleBot3]

[GdR TurtleBot Challenge 2018 - Final Match - “Monka” vs. “Ninja Turtle”]

[TurtleBot3 Maze Solving]

[TurtleBot3 with OpenTLD and IBVS Controller]

[Autonomous Radiation Mapping - Carma Project]

[Controlling TurtleBot 3 from Windows using NEP]

[Demo video of KAIST ‘Introduction to Robotics’ in 2017 Fall]

[TurtleBot3 Voice Teleop Testing]

[TurtleBot3 Follows DashGo]

[TurtleBot3 Hector SLAM on Gazebo]

[Q-Learning to Obstacle Avoidance using TurtleBot3]


Overview

TurtleBot

TurtleBot is a ROS standard platform robot. "Turtle" is derived from the Turtle robot, which was driven by the educational computer programming language Logo in 1967. In addition, the turtlesim node, which first appears in the basic tutorial of ROS, is a program that mimics the command system of the Logo turtle program. The Turtle is also used as the symbol of ROS: the nine dots in the ROS logo are derived from the turtle's back shell. TurtleBot, which originated from the Logo Turtle, is designed to teach people who are new to ROS as easily as Logo teaches computer programming. Since then, TurtleBot has become the standard platform of ROS and the most popular platform among developers and students.

TurtleBot3

There are 3 versions of the TurtleBot series. TurtleBot1 was developed by Tully (Platform Manager at Open Robotics) and Melonee (CEO of Fetch Robotics) from Willow Garage on top of the iRobot’s Roomba-based research robot, Create, for ROS deployment. It was developed in 2010 and has been on sale since 2011. In 2012, TurtleBot2 was developed by Yujin Robot based on the research robot, iClebo Kobuki. In 2017, TurtleBot3 was developed with features to supplement the lacking functions of its predecessors, and the demands of users. The TurtleBot3 adopts ROBOTIS smart actuator Dynamixel for driving. For more information on the TurtleBot series, please see the following link.

TurtleBot3 is a small, affordable, programmable, ROS-based mobile robot for use in education, research, hobby, and product prototyping. The goal of TurtleBot3 is to dramatically reduce the size of the platform and lower the price without sacrificing functionality and quality, while at the same time offering expandability. The TurtleBot3 can be customized in various ways depending on how you reconstruct the mechanical parts and use optional parts such as the computer and sensor. In addition, TurtleBot3 has evolved with a cost-effective, small-sized SBC suitable for robust embedded systems, a 360-degree distance sensor, and 3D printing technology.

The TurtleBot3’s core technology is SLAM, Navigation, and Manipulation, making it suitable for home service robots. The TurtleBot3 can run SLAM (simultaneous localization and mapping) algorithms to build a map and can drive around your room. Also, it can be controlled remotely from a laptop, joypad, or Android-based smartphone. The TurtleBot3 can also follow a person’s legs as they walk in a room. In addition, the TurtleBot3 can be used as a mobile manipulator capable of manipulating an object by attaching a manipulator such as the OpenMANIPULATOR. The OpenMANIPULATOR has the advantage of being compatible with the TurtleBot3 Waffle and Waffle Pi. Through this compatibility, it can compensate for the lack of degrees of freedom and achieve greater completeness as a service robot with the SLAM and navigation capabilities of the TurtleBot3.

TurtleBot3 Introduction Video

TurtleBot3 Collaboration Project

TurtleBot3 is a collaboration project among Open Robotics, ROBOTIS, and more partners such as The Construct, Intel, Onshape, OROCA, AuTURBO, ROS in Robotclub Malaysia, Astana Digital, Polariant Experiment, Tokyo University of Agriculture and Technology, GVlab, Networked Control Robotics Lab at National Chiao Tung University, and SIM Group at TU Darmstadt. Open Robotics is in charge of software and community activities, while ROBOTIS is in charge of manufacturing and global distribution.

The most important part of this TurtleBot3 collaboration project is open source based software, hardware, and content. We are encouraging more partners and research collaborators to participate in this project to enrich the robotics field.

If you are interested in a partnership with us to realize open source robotics, please fill out the form here.

TurtleBot3 Providers

TurtleBot3 Partners and Research Collaborators

* Each collaboration member’s web page can be found here.

TurtleBot3 Distributors

* Each collaboration member’s web page can be found here.

TurtleBot3 Map


Notices

Publish

OpenMANIPULATOR

News


Navigation

WARNING: Be careful when running the robot on the table as the robot might fall.

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are intended to be run on the Remote PC. Please run the commands below on your Remote PC.
  • The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.
  • Make sure to run the Bringup instructions before running the instructions below.
  • Navigation uses a map created in SLAM. Please make sure you have a map file.

Navigation is to move the robot from one location to the specified destination in a given environment. For this purpose, a map that contains geometry information of furniture, objects, and walls of the given environment is required. As described in the previous SLAM section, the map was created with the distance information obtained by the sensor and the pose information of the robot itself.

The navigation enables a robot to move from the current pose to the designated goal pose on the map by using the map, robot’s encoder, IMU sensor, and distance sensor. The procedure for performing this task is as follows.

Run Navigation Nodes

[Remote PC] Run roscore.

$ roscore

[TurtleBot] Bring up basic packages to start TurtleBot3 applications.

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[Remote PC] Launch the navigation file.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml

TIP: When you run the above command, the visualization tool RViz is also executed. If you want to run RViz separately, use the following command.

$ rviz -d `rospack find turtlebot3_navigation`/rviz/turtlebot3_navigation.rviz

Estimate Initial Pose

[Remote PC] First, the initial pose estimation of the robot should be performed. When you press 2D Pose Estimate in the menu of RViz, a very large green arrow appears. Move it to the pose where the actual robot is located on the given map, and while holding down the left mouse button, drag the green arrow toward the direction the robot's front is facing, following the instructions below.

  • Click the 2D Pose Estimate button.
  • Click on the approximate point in the map where the TurtleBot3 is located and drag the cursor to indicate the direction where the TurtleBot3 faces.

Then move the robot back and forth with tools like the turtlebot3_teleop_keyboard node to collect the surrounding environment information and find out where the robot is currently located on the map.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

When this process is completed, the robot estimates its actual position and orientation using the position and orientation specified by the green arrow as the initial pose. Each small green arrow stands for an expected pose of the TurtleBot3. The laser scanner draws the approximate outline of the walls on the map. If the drawn outline does not match the map correctly, repeat the localization by clicking the 2D Pose Estimate button again.

TIP: The turtlebot3_teleop_keyboard node used for estimating the initial pose should be terminated after use. If it is not, the robot may behave strangely because its commands overlap with the /cmd_vel topic from the navigation node in the next step.
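For reference, the 2D Pose Estimate button simply publishes a geometry_msgs/PoseWithCovarianceStamped message on the /initialpose topic, which AMCL uses to re-initialize its particle filter. The same thing can be done from code; the sketch below is only an illustration (node name, pose, and covariance values are placeholders, with the covariance chosen roughly as RViz does).

#!/usr/bin/env python
# Minimal sketch: set the AMCL initial pose by publishing /initialpose once.
import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped

if __name__ == '__main__':
    rospy.init_node('set_initial_pose')   # illustrative node name
    pub = rospy.Publisher('/initialpose', PoseWithCovarianceStamped,
                          queue_size=1, latch=True)
    msg = PoseWithCovarianceStamped()
    msg.header.frame_id = 'map'
    msg.header.stamp = rospy.Time.now()
    msg.pose.pose.position.x = 0.0        # illustrative pose on the map
    msg.pose.pose.position.y = 0.0
    msg.pose.pose.orientation.w = 1.0     # facing the +x direction
    msg.pose.covariance[0] = 0.25         # x variance
    msg.pose.covariance[7] = 0.25         # y variance
    msg.pose.covariance[35] = 0.068       # yaw variance
    pub.publish(msg)
    rospy.sleep(1.0)                      # keep the node alive so the latched message is delivered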

Send Navigation Goal

[Remote PC] When everything is ready, let’s try the move command from the navigation GUI. If you press 2D Nav Goal in the menu of RViz, a very large green arrow appears. This green arrow is a marker that can specify the destination of the robot. The root of the arrow is the x and y position of the robot, and the orientation pointed by the arrow is the theta direction of the robot. Click this arrow at the position where the robot will move, and drag it to set the orientation like the instruction below.

  • Click the 2D Nav Goal button.
  • Click on a specific point in the map to set a goal position and drag the cursor to the direction where TurtleBot should be facing at the end.

The robot will create a path to avoid obstacles to its destination based on the map. Then, the robot moves along the path. At this time, even if an obstacle is suddenly detected, the robot moves to the target point avoiding the obstacle.

Setting a goal position might fail if a path to the goal position cannot be created. If you wish to stop the robot before it reaches the goal position, set the current position of TurtleBot3 as the goal position.
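A goal can also be sent programmatically through the move_base action interface that the 2D Nav Goal button uses under the hood. The sketch below is only an illustration; the goal coordinates are placeholders and must lie inside your map.

#!/usr/bin/env python
# Minimal sketch: send one navigation goal to move_base and wait for the result.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == '__main__':
    rospy.init_node('send_goal_example')   # illustrative node name
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0     # illustrative goal position
    goal.target_pose.pose.position.y = 0.5
    goal.target_pose.pose.orientation.w = 1.0  # face the +x direction at the goal

    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo('navigation finished with state %d', client.get_state())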

Tuning Guide

The navigation stack has many parameters for tuning performance for different robots. You can get more information from the ROS Wiki or refer to Chapter 11 of the ROS Robot Programming book.

This tuning guide gives some tips for configuring the important parameters. If you want to adjust performance for your environment, these tips may help you and save time. A runtime tuning sketch follows the parameter list below.

inflation_radius

  • turtlebot3_navigation/param/costmap_common_param_$(model).yaml
  • This parameter sets the inflation area around obstacles. Paths are planned so that they do not cross this area. It is safer to set this value larger than the robot radius. For more information, refer to the costmap_2d wiki page.

cost_scaling_factor

  • turtlebot3_navigation/param/costmap_common_param_$(model).yaml
  • This factor is multiplied into the cost value. Because the relationship is an inverse proportion, increasing this parameter decreases the cost.

The best path is for the robot to pass through the center between obstacles. Set this factor smaller to keep the path farther away from obstacles.

max_vel_x

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • This parameter sets the maximum translational velocity.

min_vel_x

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • This parameter sets the minimum translational velocity. If it is set to a negative value, the robot can move backwards.

max_trans_vel

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the maximum translational velocity. The robot can not be faster than this.

min_trans_vel

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the minimum translational velocity. The robot can not be slower than this.

max_rot_vel

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the maximum rotational velocity. The robot can not be faster than this.

min_rot_vel

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the minimum rotational velocity. The robot can not be slower than this.

acc_lim_x

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the translational acceleration limit.

acc_lim_theta

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • Actual value of the rotational acceleration limit.

xy_goal_tolerance

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • The x,y distance allowed when the robot reaches its goal pose.

yaw_goal_tolerance

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • The yaw angle allowed when the robot reaches its goal pose.

sim_time

  • turtlebot3_navigation/param/dwa_local_planner_params_$(model).yaml
  • This parameter sets the forward simulation time in seconds. Too low a value leaves insufficient time to pass through narrow areas, while too high a value prevents rapid rotation. You can see the difference in the length of the yellow line in the image below.
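Many of the DWA parameters above can also be changed at runtime with dynamic_reconfigure, which is convenient while tuning. The sketch below is only an illustration; it assumes the navigation stack is running and that the DWA local planner exposes its parameters under /move_base/DWAPlannerROS (the usual namespace when it runs inside move_base), and the values shown are examples rather than recommendations.

#!/usr/bin/env python
# Minimal sketch: adjust DWA local planner parameters at runtime.
import rospy
import dynamic_reconfigure.client

if __name__ == '__main__':
    rospy.init_node('dwa_tuning_example')   # illustrative node name
    client = dynamic_reconfigure.client.Client('/move_base/DWAPlannerROS', timeout=5)
    client.update_configuration({
        'max_vel_x': 0.22,           # example values only
        'min_vel_x': -0.22,
        'xy_goal_tolerance': 0.05,
        'yaw_goal_tolerance': 0.17,
        'sim_time': 2.0,
    })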

References


Manipulation

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • If you want more specific information about the OpenMANIPULATOR, please refer to the OpenMANIPULATOR e-Manual.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

TurtleBot3 with OpenMANIPULATOR

The OpenMANIPULATOR by ROBOTIS is one of the manipulators that support ROS, and it has the advantage of being easy to manufacture at a low cost by using DYNAMIXEL actuators with 3D-printed parts.

The OpenMANIPULATOR has the advantage of being compatible with the TurtleBot3 Waffle and Waffle Pi. Through this compatibility, it can compensate for the lack of degrees of freedom and achieve greater completeness as a service robot with the SLAM and Navigation capabilities of the TurtleBot3. TurtleBot3 and OpenMANIPULATOR can be used together as a mobile manipulator and can do things like those shown in the following videos.

Software Setup

NOTE: Before you install the open_manipulator_with_tb3 packages, please make sure the turtlebot3 and open_manipulator packages have been installed on the Remote PC and that the Raspberry Pi 3 has been set up.

  • Install dependent packages for the OpenMANIPULATOR with TurtleBot3.

[Remote PC]

$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/open_manipulator_with_tb3.git
$ git clone https://github.com/ROBOTIS-GIT/open_manipulator_with_tb3_msgs.git
$ git clone https://github.com/ROBOTIS-GIT/open_manipulator_with_tb3_simulations.git
$ git clone https://github.com/ROBOTIS-GIT/open_manipulator_perceptions.git
$ sudo apt-get install ros-kinetic-smach* ros-kinetic-ar-track-alvar ros-kinetic-ar-track-alvar-msgs
$ cd ~/catkin_ws && catkin_make
  • If catkin_make command is completed without any errors, the preparation for OpenMANIPULATOR is done. Then load a TurtleBot3 Waffle or Waffle Pi with OpenMANIPULATOR on RViz.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

[RemotePC]

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch open_manipulator_with_tb3_description open_manipulator_with_tb3_rviz.launch

Hardware Setup

  • CAD files (TurtleBot3 Waffle Pi + OpenMANIPULATOR)

  • First, detach the LiDAR sensor and shift it to the front of the TurtleBot3 (the red circles represent bolt positions).
  • Second, attach the OpenMANIPULATOR to the TurtleBot3 (the yellow circles represent bolt positions).

OpenCR Setup

NOTE: You can choose either method for uploading the firmware, but we highly recommend using the shell script. If you need to modify the TurtleBot3 firmware, use the second method.

  • Method #1: Shell Script, upload the pre-built binary file using the shell script.
  • Method #2: Arduino IDE, build the provided source code and upload the generated binary file using the Arduino IDE.

Shell Script

[TurtleBot3]

$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=om_with_tb3
$ rm -rf ./opencr_update.tar.bz2
$ wget https://github.com/ROBOTIS-GIT/OpenCR-Binaries/raw/master/turtlebot3/ROS1/latest/opencr_update.tar.bz2 && tar -xvf opencr_update.tar.bz2 && cd ./opencr_update && ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr && cd ..

When firmware upload is completed, jump_to_fw text string will be printed on the terminal.

Arduino IDE

[Remote PC]

  • Before following these steps, please set up the Arduino IDE.

  • Arduino IDE for using OpenCR

  • The OpenCR firmware (or its source) for TurtleBot3 with OpenMANIPULATOR controls the DYNAMIXELs and sensors in ROS. The firmware is located in the OpenCR examples, which are downloaded by the Board Manager.

  • Go to File -> Examples -> TurtleBot3 -> turtlebot3_with_open_manipulator -> turtlebot3_with_open_manipulator_core.

  • Click Upload button to upload the firmware to OpenCR.

NOTE: If an error occurs while uploading the firmware, go to Tools -> Port and check whether the correct port is selected. Press the Reset button on the OpenCR and try to upload the firmware again.

TIP: The DYNAMIXEL IDs can be changed in open_manipulator_driver.h in the turtlebot3_with_open_manipulator folder.

  • When firmware upload is completed, jump_to_fw text string will be printed on the screen.

Bringup

NOTE: Please double check the OpenCR usb port name in turtlebot3_core.launch

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

[TurtleBot3] Launch rosserial and lidar node

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ ROS_NAMESPACE=om_with_tb3 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:=om_with_tb3 set_lidar_frame_id:=om_with_tb3/base_scan

[TurtleBot3] Launch rpicamera node

$ ROS_NAMESPACE=om_with_tb3 roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

[Remote PC] Launch the robot_state_publisher node

$ ROS_NAMESPACE=om_with_tb3 roslaunch open_manipulator_with_tb3_tools om_with_tb3_robot.launch

SLAM

[Remote PC] Launch slam node

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch open_manipulator_with_tb3_tools slam.launch use_platform:=true

[Remote PC] Launch teleop node

$ ROS_NAMESPACE=om_with_tb3 roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

[Remote PC] Launch map_saver node

$ ROS_NAMESPACE=om_with_tb3 rosrun map_server map_saver -f ~/map

Navigation

[Remote PC] Launch the navigation node

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch open_manipulator_with_tb3_tools navigation.launch use_platform:=true

MoveIt!

  • In order to run MoveIt!, open a new terminal window and enter the command below.

    $ export TURTLEBOT3_MODEL=${TB3_MODEL}
    $ roslaunch open_manipulator_with_tb3_tools manipulation.launch use_platform:=true
    

  • The rqt plugins below show an example of controlling the OpenMANIPULATOR.

  • Get the robot status using rosservice calls.

$ rosservice call /arm/moveit/get_joint_position "planning_group: 'arm'" 
joint_position: 
  joint_name: [joint1, joint2, joint3, joint4]
  position: [-0.003067961661145091, -0.42644667625427246, 1.3084856271743774, -0.8452234268188477]
  max_accelerations_scaling_factor: 0.0
  max_velocity_scaling_factor: 0.0
$ rosservice call /arm/moveit/get_kinematics_pose "planning_group: 'arm'
end_effector_name: ''" 
header: 
  seq: 0
  stamp: 
    secs: 1550714737
    nsecs: 317547871
  frame_id: "/base_footprint"
kinematics_pose: 
  pose: 
    position: 
      x: 0.0918695085861
      y: -0.000263644738325
      z: 0.218597669468
    orientation: 
      x: 1.82347658316e-05
      y: 0.023774433021
      z: -0.000766773548775
      w: 0.999717054001
  max_accelerations_scaling_factor: 0.0
  max_velocity_scaling_factor: 0.0
  tolerance: 0.0
  • In order to control the gripper of the OpenMANIPULATOR (range: -0.01 to 0.01), the service command below may help.
$ rosservice call /om_with_tb3/gripper "planning_group: ''
joint_position:
  joint_name:
  - ''
  position:
  - 0.15
  max_accelerations_scaling_factor: 0.0
  max_velocity_scaling_factor: 0.0
path_time: 0.0"

Simulation

  • Load TurtleBot3 with OpenMANIPULATOR on Gazebo simulator and click Play button
$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch open_manipulator_with_tb3_gazebo empty_world.launch

  • Type rostopic list to check which topic is activated.
$ rostopic list
/clock
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/gazebo_gui/parameter_descriptions
/gazebo_gui/parameter_updates
/gazebo_ros_control/pid_gains/gripper/parameter_descriptions
/gazebo_ros_control/pid_gains/gripper/parameter_updates
/gazebo_ros_control/pid_gains/gripper/state
/gazebo_ros_control/pid_gains/gripper_sub/parameter_descriptions
/gazebo_ros_control/pid_gains/gripper_sub/parameter_updates
/gazebo_ros_control/pid_gains/gripper_sub/state
/gazebo_ros_control/pid_gains/joint1/parameter_descriptions
/gazebo_ros_control/pid_gains/joint1/parameter_updates
/gazebo_ros_control/pid_gains/joint1/state
/gazebo_ros_control/pid_gains/joint2/parameter_descriptions
/gazebo_ros_control/pid_gains/joint2/parameter_updates
/gazebo_ros_control/pid_gains/joint2/state
/gazebo_ros_control/pid_gains/joint3/parameter_descriptions
/gazebo_ros_control/pid_gains/joint3/parameter_updates
/gazebo_ros_control/pid_gains/joint3/state
/gazebo_ros_control/pid_gains/joint4/parameter_descriptions
/gazebo_ros_control/pid_gains/joint4/parameter_updates
/gazebo_ros_control/pid_gains/joint4/state
/om_with_tb3/camera/parameter_descriptions
/om_with_tb3/camera/parameter_updates
/om_with_tb3/camera/rgb/camera_info
/om_with_tb3/camera/rgb/image_raw
/om_with_tb3/camera/rgb/image_raw/compressed
/om_with_tb3/camera/rgb/image_raw/compressed/parameter_descriptions
/om_with_tb3/camera/rgb/image_raw/compressed/parameter_updates
/om_with_tb3/camera/rgb/image_raw/compressedDepth
/om_with_tb3/camera/rgb/image_raw/compressedDepth/parameter_descriptions
/om_with_tb3/camera/rgb/image_raw/compressedDepth/parameter_updates
/om_with_tb3/camera/rgb/image_raw/theora
/om_with_tb3/camera/rgb/image_raw/theora/parameter_descriptions
/om_with_tb3/camera/rgb/image_raw/theora/parameter_updates
/om_with_tb3/cmd_vel
/om_with_tb3/gripper_position/command
/om_with_tb3/gripper_sub_position/command
/om_with_tb3/imu
/om_with_tb3/joint1_position/command
/om_with_tb3/joint2_position/command
/om_with_tb3/joint3_position/command
/om_with_tb3/joint4_position/command
/om_with_tb3/joint_states
/om_with_tb3/odom
/om_with_tb3/scan
/rosout
/rosout_agg
/tf
/tf_static
  • The OpenMANIPULATOR in Gazebo is controlled by ROS messages. For example, the command below publishes a joint position (in radians).
$ rostopic pub /om_with_tb3/joint4_position/command std_msgs/Float64 "data: -0.21" --once

Pick and Place

We provide a pick-and-place example for mobile manipulation. This example uses smach (a task-level architecture) to send actions to the robot.

Bring up the Gazebo simulator

$ roslaunch open_manipulator_with_tb3_gazebo rooms.launch use_platform:=false

Launch navigation and MoveIt!

$ roslaunch open_manipulator_with_tb3_tools rooms.launch use_platform:=false

Launch the task controller

$ roslaunch open_manipulator_with_tb3_tools task_controller.launch 

TIP: Smach provides a state graph. Run the smach viewer to see how the robot picks and places: rosrun smach_viewer smach_viewer.py


Machine Learning

Machine learning is a data analysis technique that teaches computers to do what comes naturally to people and animals: learning from experience. There are three types of machine learning: supervised learning, unsupervised learning, and reinforcement learning.

This application uses reinforcement learning with DQN (Deep Q-Learning). Reinforcement learning is concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward.

This section shows reinforcement learning with TurtleBot3 in Gazebo. The reinforcement learning applies the DQN (Deep Q-Learning) algorithm with the LDS.
We are preparing a four-step reinforcement learning tutorial.

Installation

To follow this tutorial, you need to install TensorFlow, Keras, and Anaconda on Ubuntu 16.04 with ROS Kinetic.

Anaconda

You can download Anaconda 5.2 for Python 2.7.

After downloading Anaconda, go to the directory containing the downloaded file and enter the following command.

$ bash Anaconda2-x.x.x-Linux-x86_64.sh

After installing Anaconda,

$ source ~/.bashrc
$ python -V

If Anaconda is installed, you can see Python 2.7.xx :: Anaconda, Inc..

ROS dependency packages

To use ROS and Anaconda together, you must additionally install ROS dependency packages.

$ pip install -U rosinstall msgpack empy defusedxml netifaces

Tensorflow

You can install TensorFlow.

$ conda create -n tensorflow pip python=2.7

This tutorial uses Python 2.7 (CPU only). If you want to use another Python version or a GPU, please refer to the TensorFlow documentation.

$ pip install --ignore-installed --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-1.8.0-cp27-none-linux_x86_64.whl

Keras

Keras is a high-level neural networks API, written in Python and capable of running on top of TensorFlow.

$ pip install keras

Machine Learning packages

WARNING: Please install turtlebot3, turtlebot3_msgs and turtlebot3_simulations package before installing this package.

$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_machine_learning.git
$ cd ~/catkin_ws && catkin_make

Set parameters

The goal of the DQN agent is to get TurtleBot3 to the goal while avoiding obstacles. When TurtleBot3 gets closer to the goal, it receives a positive reward; when it gets farther away, it receives a negative reward. The episode ends when TurtleBot3 crashes into an obstacle or after a certain period of time. During the episode, TurtleBot3 gets a big positive reward when it reaches the goal and a big negative reward when it crashes into an obstacle.

Set state

A state is an observation of the environment and describes the current situation. Here, state_size is 26: 24 LDS values, the distance to the goal, and the angle to the goal.

The TurtleBot3 LDS is set to 360 samples by default. You can modify the number of LDS samples in turtlebot3/turtlebot3_description/urdf/turtlebot3_burger.gazebo.xacro.

<xacro:arg name="laser_visual" default="false"/>   <!-- Visualization of the LDS. Set to true to see the LDS rays. -->
<scan>
  <horizontal>
    <samples>360</samples>            <!-- The number of samples. Modify it to 24. -->
    <resolution>1</resolution>
    <min_angle>0.0</min_angle>
    <max_angle>6.28319</max_angle>
  </horizontal>
</scan>
(Figure: LDS visualization with samples = 360 vs. samples = 24)

Set action

An action is what the agent can do in each state. Here, TurtleBot3 always moves with a linear velocity of 0.15 m/s; the angular velocity is determined by the action (see the sketch after the table below).

Action Angular velocity(rad/s)
0 -1.5
1 -0.75
2 0
3 0.75
4 1.5
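
As a minimal sketch of how such an action table can be turned into a velocity command (the mapping mirrors the table above; the function and constant names are illustrative, not taken from the package):

# Minimal sketch: map a DQN action index to a velocity command.
# Linear velocity is fixed at 0.15 m/s; angular velocity follows the table above.
from geometry_msgs.msg import Twist

ANGULAR_VEL = [-1.5, -0.75, 0.0, 0.75, 1.5]   # rad/s, indexed by action 0..4

def action_to_twist(action):
    twist = Twist()
    twist.linear.x = 0.15                  # constant forward speed
    twist.angular.z = ANGULAR_VEL[action]  # the action selects the turn rate
    return twist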

Set reward

When TurtleBot3 takes an action in a state, it receives a reward. The reward design is very important for learning. A reward can be positive or negative. When TurtleBot3 reaches the goal, it gets a big positive reward. When TurtleBot3 collides with an obstacle, it gets a big negative reward. If you want to apply your own reward design, modify the setReward function in /turtlebot3_machine_learning/turtlebot3_dqn/src/turtlebot3_dqn/environment_stage_#.py (an illustrative sketch follows).
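
The exact reward shaping lives in the environment_stage_#.py file mentioned above; the sketch below only illustrates the idea described in this paragraph (positive when approaching the goal, a large bonus at the goal, a large penalty on collision). The thresholds and magnitudes are made-up examples, not the package defaults.

# Illustrative reward sketch (NOT the package defaults): reward approaching the goal,
# give a large bonus on reaching it and a large penalty on collision.
def set_reward(current_goal_distance, previous_goal_distance, collision, goal_reached):
    if collision:
        return -200.0    # big negative reward on crash (example value)
    if goal_reached:
        return 200.0     # big positive reward at the goal (example value)
    # small shaping term: positive when getting closer, negative when moving away
    return 10.0 * (previous_goal_distance - current_goal_distance)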

Set hyper parameters

This tutorial trains the agent with DQN. DQN is a reinforcement learning method that approximates the action-value function (Q-value) with a deep neural network. The agent has the following hyperparameters, defined in /turtlebot3_machine_learning/turtlebot3_dqn/nodes/turtlebot3_dqn_stage_# (a usage sketch follows the table).

Hyper parameter default description
episode_step 6000 The time step of one episode.
target_update 2000 Update rate of target network.
discount_factor 0.99 Represents how much future events lose their value according to how far away they are.
learning_rate 0.00025 Learning speed. If the value is too large, learning does not work well, and if it is too small, learning time is long.
epsilon 1.0 The probability of choosing a random action.
epsilon_decay 0.99 Reduction rate of epsilon. When one episode ends, epsilon is reduced.
epsilon_min 0.05 The minimum of epsilon.
batch_size 64 Size of a group of training samples.
train_start 64 Start training if the replay memory size is greater than 64.
memory 1000000 The size of replay memory.
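
For example, epsilon, epsilon_decay and epsilon_min from the table control the exploration schedule roughly as in the sketch below. This is a simplified illustration of how these hyperparameters interact, not the actual node code; the function names are made up.

# Simplified illustration of the epsilon-greedy schedule from the table above.
import random

epsilon, epsilon_decay, epsilon_min = 1.0, 0.99, 0.05
action_size = 5

def choose_action(q_values):
    # With probability epsilon take a random action, otherwise the greedy one.
    if random.random() < epsilon:
        return random.randrange(action_size)
    return int(max(range(action_size), key=lambda a: q_values[a]))

def on_episode_end():
    global epsilon
    epsilon = max(epsilon_min, epsilon * epsilon_decay)   # decay once per episode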

Run Machine Learning

Stage 1 (No Obstacle)

Stage 1 is a 4x4 map with no obstacles.

$ roslaunch turtlebot3_gazebo turtlebot3_stage_1.launch
$ roslaunch turtlebot3_dqn turtlebot3_dqn_stage_1.launch

Stage 2 (Static Obstacle)

Stage 2 is a 4x4 map with four cylinders of static obstacles.

$ roslaunch turtlebot3_gazebo turtlebot3_stage_2.launch
$ roslaunch turtlebot3_dqn turtlebot3_dqn_stage_2.launch

Stage 3 (Moving Obstacle)

Stage 3 is a 4x4 map with four cylinders of moving obstacles.

$ roslaunch turtlebot3_gazebo turtlebot3_stage_3.launch
$ roslaunch turtlebot3_dqn turtlebot3_dqn_stage_3.launch

Stage 4 (Combination Obstacle)

Stage 4 is a 5x5 map with walls and two cylinders of moving obstacles.

$ roslaunch turtlebot3_gazebo turtlebot3_stage_4.launch
$ roslaunch turtlebot3_dqn turtlebot3_dqn_stage_4.launch

If you want to see the graph, launch the graph launch file.

$ roslaunch turtlebot3_dqn result_graph.launch


July 17, 2019

Locomotion

In the video, watch how TurtleBot3 can be assembled and reassembled with a few additional parts. The waffle plate, which is the biggest part among the TurtleBot3 components, can be assembled in various sizes and shapes thanks to its diverse holes for bolts and nuts.

With this open-ended component, a handful of TurtleBot3 friends with various characteristics can be built. You can create a totally new robot that has never been seen before. Create a variety of robots based on open hardware and try out new kinds of locomotion.

TurtleBot3 Friends List

Components List ( BOM )

Single Item Quantity Set Item Quantity Purchase Links
XL430-W250-T (e-Manual) 2 XL430-W250-T 2 ROBOTIS SHOP
ROBOT CABLE-X4P 240mm 2 ROBOT CABLE-X4P 240mm 2 ROBOTIS SHOP
OpenCR1.0 (e-Manual) 1 OpenCR1.0 1 ROBOTIS SHOP
TB3 Waffle Plate-IPL-01 6 TB3 Waffle Plate-IPL-01 1 ROBOTIS SHOP
TB3 PCB Support-IBB-01 4 TB3 PCB Support-IBB-01 1 ROBOTIS SHOP
Rivet-Mg(n) 4 Rivet-Mg(n) 4 ROBOTIS SHOP
TB3 Wheel/Tire 2 TB3 Wheel/Tire Set-ISW-01 1 ROBOTIS SHOP
PHS M3x8mm 20 None None Online Store
PHS M2.5x8mm 4 None None Online Store
PHS M2.5x12mm 16 None None Online Store
WB_M2x4mm 8 None None Online Store
WB_M2.5x20mm 8 None None Online Store
NUT_M2.5 4 None None Online Store
NUT_M3 8 None None Online Store
SB-S3-45 4 None None Online Store
SB-S3-35 2 None None Online Store
3D printing parts(HV Converter) 4 None None Onshape

BOM spreadsheets

  • Get the source code and make friends! Go to Examples → turtlebot3 → turtlebot3_friends in the Arduino IDE for OpenCR.

    NOTE: Any suggestions and ideas for TurtleBot3 Friends Project are always welcomed. Tell us about your creative TurtleBot3 Friends. We can introduce your friend to the world through this wiki page! E-Mail: ost@robotis.com :)

TurtleBot3 Friends: Car

  • Type: RC Car
  • Features: With about a 1:2 gear ratio and differential gears, this Car wants to be in Formula E!
  • Components: Two Dynamixel X 430 Series (One for steering, one for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, OLLO plastic frames.
  • Hardware: Due to the complex hardware configuration, it will be released as a later improved version. :)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_car
  • Video:

TurtleBot3 Friends: OpenManipulator

  • Type: Manipulator 4 DOF + 1 Gripper
  • Features: Compatible with TurtleBot3 Waffle Pi; it has a linear gripper for pick and place.
  • Components: Five Dynamixel X 430 Series (four for joints, one for the gripper), an OpenCR1.0 Board, 3D printed chassis.
  • BOM: Please refer to the Parts of OpenManipulator
  • DIY Manual: Please refer to the link
  • Hardware for TurtleBot3 Waffle Pi + OpenManipulator (Onshape, Thingiverse)
  • Hardware for OpenManipulator (Onshape, Thingiverse)
  • Software (We are preparing for OpenCR Example)
  • OpenManipulator Wiki
  • Video:

TurtleBot3 Friends: Segway

NOTE:

  • Two Dynamixel X 430 Series need to be set to PWM Mode.
  • The Filters library has to be downloaded and included in the Arduino IDE. Github Link
  • Type: Segway robot
  • Features: Balancing with only two DYNAMIXEL by applying PID controller.
  • Components: Two Dynamixel X 430 Series (All for balancing), an OpenCR1.0 Board, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_segway
  • Video:

TurtleBot3 Friends: Conveyor

  • Type: 4 Wheel parallel translation vehicle
  • Features: 4 Joints and 4 wheels will become a futuristic technology on transportation society by overcoming fuel-consuming mechanics.
  • Components: Eight Dynamixel X 430 Series (Four for steering, four for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_conveyor
  • Video:

TurtleBot3 Friends: Monster

  • Type: 4WD Car
  • Features: 4 big wheels make it strong on rough terrain and even large differences in elevation.
  • Components: Four Dynamixel X 430 Series (All for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_monster
  • Video:

TurtleBot3 Friends: Tank

  • Type: Caterpillar
  • Features: Caterpillar units connected and assembled on sprocket wheels make it strong on rough terrain.
  • Components: Two Dynamixel X 430 Series (All for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, Caterpillar Unit, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_tank
  • Video:

TurtleBot3 Friends: Omni

  • Type: Omni wheel
  • Features: Omni wheels have additional discs around the circumference that make it able to drive laterally.
  • Components: Three Dynamixel X 430 Series (All for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_omni
  • Video:

TurtleBot3 Friends: Mecanum

  • Type: Mecanum wheel
  • Features: Mecanum wheels have additional discs around the circumference that make it able to drive laterally.
  • Components: Four Dynamixel X 430 Series (All for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_mechanum
  • Video: See in the video TurtleBot3 Friends: Omni above.

TurtleBot3 Friends: Bike

  • Type: 3-DOF Motorcycle
  • Features: Cute 3-wheeled bikey reveals its existence on the “Car” film as a brother of the “Car”.
  • Components: Three Dynamixel X 430 Series (One for steering, two for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_bike
  • Video: See in the videos TurtleBot3 Friends: Car and TurtleBot3 Friends: monster above.

TurtleBot3 Friends: Road Train

  • Type: Road train
  • Features: Road train can connect vehicles and it can serve various things!
  • Components: Two Dynamixel X 430 Series (two for driving), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_road_train
  • Video:

TurtleBot3 Friends: Real TurtleBot

  • Type: 8-DOF legged robot (a.k.a. Real TurtleBot)
  • Features: A real TurtleBot will make most of the fanpics in the TurtleBot society!
  • Components: Ten Dynamixel X 430 Series (Four for leg joint, another four for shoulder joint, two for head), an OpenCR1.0 Board, a RC100 Remote Controller with BT410 master-slave Bluetooth modules, TurtleBot3 Chassis and Battery, 3D printed chassis.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_friends → turtlebot3_realturtlebot
  • Video:

TurtleBot3 Friends: Carrier

  • Type: 2 Wheel mobile based platform as service robot
  • Features: A 2-wheeled mobile platform that will serve whatever you want.
  • Components: A TurtleBot3 Waffle, 6 supports for 4th layer, another 6 supports for 5th layer construction, extra Waffle Plates in each layers, customized 3D printed wheel and ball caster.
  • BOM
  • Hardware (Onshape, Thingiverse)
  • Software: Examples → turtlebot3 → turtlebot3_waffle
  • Video:


July 17, 2019

Learn

The Construct

This MASTERING WITH ROS: TurtleBot3 lecture is made by The Construct. In this course, you are going to learn how to start working with the TurtleBot3 robot, explore its functionalities, and build interesting ROS applications. If you want to learn more about ROS, please check the ROBOT IGNITE ACADEMY, which contains a series of online ROS tutorials tied to online simulations, giving you the tools and knowledge to understand and create any ROS-based robotics development.

MASTERING WITH ROS: TurtleBot3

What you will learn from MASTERING WITH ROS: TurtleBot3 (Python):

  1. Basic Usage and control of the TurtleBot3 robot
  2. How to perform Navigation with TurtleBot3
  3. Follow a line with TurtleBot3
  4. Object Recognition with TurtleBot3
  5. Motion Planning in Moveit with TurtleBot3
  6. Grasping with TurtleBot3

Have a TurtleBot3 simulation running in 5 minutes with RDS v2.0

TurtleBot3 Laser Scan subscription

TurtleBot3 Blockly

Programming with Blockly to run TurtleBot3 (this tutorial is built by Dabit Industries)

  • Blockly Wiki: A detailed documentation on how to use Blockly (free and open source software) with TurtleBot3.

TurtleBot3 Simulation on ROS Indigo

TurtleBot3 simulator on Ubuntu 14.04 (this tutorial is built by Cyaninfinite)

Youtube Course

These ROS courses are a ROS robot programming guide based on the experience we accumulated from ROS projects like TurtleBot3, OpenCR and OpenManipulator. We tried to make this a comprehensive guide that covers all aspects necessary for a beginner in ROS. Topics such as embedded systems, mobile robots, and robot arms programmed with ROS are included. For those who are new to ROS, there are footnotes throughout the courses providing more information on the web. Through these ROS courses and the book, we hope that more people will become aware of and participate in bringing forward the ever-accelerating collective knowledge of Robotics Engineering.

Books

This Handbook is written for

College students and graduate students who want to learn robot programming based on ROS (Robot Operating system) and also for professional researchers and engineers who work on robot development or software programming. We have tried to offer detailed information we learned while working on TurtleBot3 and OpenManipulator. We hope this book will be the complete handbook for beginners in ROS and more people will contribute to the ever-growing community of open robotics.

What you will learn from this book

From the basic concept to practical robot application programming!

  • ROS Kinetic Kame : Basic concept, instructions and tools
  • How to use sensor and actuator packages on ROS
  • Embedded board for ROS : OpenCR1.0
  • SLAM & Navigation with TurtleBot3
  • How to program a delivery robot using ROS Java
  • OpenManipulator simulation using MoveIt! and Gazebo

ROS Robot Programming (Language: English, Chinese, Japanese, Korean)


July 17, 2019

Joule Setup

WARNING: Setup work requires power and time, so a battery is not suitable. We recommend using the SMPS (AC adapter) during this work.

Install Linux (Ubuntu)

In this section, the Alternative Ubuntu Desktop 16.04 LTS will be installed on Intel® Joule™.

[Remote PC] Download Ubuntu image Alternative Ubuntu 16.04 for Intel® Joule™ from the below link.

[Remote PC] In order to make a bootable installation USB drive, please follow the Alternative install(Ubuntu Desktop 16.04 LTS) section from the below link.

[Remote PC] Before getting started, the board needs to have its BIOS updated to BIOS version #193 in order to install the Ubuntu image. Download BIOS version #193 and flash it onto the Joule by following the instructions in the link below.

WARNING: Updating to the latest BIOS (1J2 or higher) may cause unexpected problems with Intel® Joule™ on Ubuntu 16.04 LTS. Please use only the recommended BIOS version #193.

WARNING: Intel® Joule™ comes with a passive heatsink in the package. It is recommended to use the heatsink. In order to operate the Joule without the heatsink, please follow the extra instructions.

If you need the remaining installation steps, please refer to the link below.

Install ROS

WARNING: The contents in this chapter corresponds to the Intel® Joule™ which will be the main computer of TurtleBot3 Waffle. Do NOT apply this instruction to your Remote PC (your desktop PC or laptop).

NOTE: This instruction takes about 2 hours to install ROS and related packages for TurtleBot3. Elapsed time may vary depending on network environment.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. Shortcut key for terminal is Ctrl-Alt-T.

[TurtleBot] Install ROS

$ sudo apt-get update
$ sudo apt-get upgrade
$ wget https://raw.githubusercontent.com/ROBOTIS-GIT/robotis_tools/master/install_ros_kinetic.sh && chmod 755 ./install_ros_kinetic.sh && bash ./install_ros_kinetic.sh

NOTE: In order to check which packages are installed, please check this link out. install_ros_kinetic

NOTE: After installing ROS, please reboot the Intel® Joule™.

If you prefer manual installation, please follow the link below.

Install Dependent Packages

The next step is to install dependent packages for TurtleBot3 control.

[TurtleBot] Download packages from github

$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3.git

NOTE: If you want to use Intel® RealSense™, please check related appendix for Intel® RealSense™

[TurtleBot] Delete some packages that are not needed in TurtleBot SBC

$ cd ~/catkin_ws/src/turtlebot3
$ sudo rm -r turtlebot3_description/ turtlebot3_teleop/ turtlebot3_navigation/ turtlebot3_slam/ turtlebot3_example/

[TurtleBot] Install dependent packages

$ sudo apt-get install ros-kinetic-rosserial-python ros-kinetic-tf

NOTE: After installing the packages, please reboot the Intel® Joule™.

[TurtleBot] Build packages

$ cd ~/catkin_ws && catkin_make

If catkin_make command is completed without any errors, the preparation for TurtleBot3 is done.

USB Settings

[TurtleBot] The following commands allow the USB port for OpenCR1.0 to be used without acquiring root permission.

$ rosrun turtlebot3_bringup create_udev_rules

Network Configuration

ROS requires IP addresses in order to communicate between TurtleBot3 and remote PC.

Enter the below command on the terminal window of the SBC in TurtleBot3 to find out the IP address of TurtleBot3.

$ ifconfig

The text in the rectangle is the IP address of the TurtleBot.

Enter the following command.

$ nano ~/.bashrc

Press ‘ Alt+/ ‘ to move to the last line of the file.

Replace the localhost in the ROS_MASTER_URI address with the IP address acquired from Remote PC Network Configuration. Also replace the localhost in the ROS_HOSTNAME address with the IP address acquired from the above terminal window, which is the IP address of TurtleBot3.
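
For example, if the Remote PC's IP address were 192.168.1.100 and the TurtleBot's IP address were 192.168.1.120 (both hypothetical values), the lines in ~/.bashrc would look like this; 11311 is the default ROS master port.

export ROS_MASTER_URI=http://192.168.1.100:11311
export ROS_HOSTNAME=192.168.1.120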

Then, source the bashrc with below command.

$ source ~/.bashrc


July 17, 2019

Hardware Setup

Part List

TurtleBot3 has three different models: Burger, Waffle and Waffle Pi. The following list shows their components. The big differences between the three models are the motors, the SBC (Single Board Computer) and the sensors. The TurtleBot3 Waffle model is discontinued due to the discontinuation of Intel® Joule™.

  Part Name Burger Waffle Waffle Pi
Chassis Parts Waffle Plate 8 24 24
. Plate Support M3x35mm 4 12 12
. Plate Support M3x45mm 10 10 10
. PCB Support 12 12 12
. Wheel 2 2 2
. Tire 2 2 2
. Ball Caster 1 2 2
. Camera Bracket 0 0 1
Motors DYNAMIXEL (XL430-W250-T) 2 0 0
. DYNAMIXEL (XM430-W210-T) 0 2 2
Boards OpenCR1.0 1 1 1
. Raspberry Pi 3 1 0 1
. Intel® Joule™ 0 1 0
. USB2LDS 1 1 1
Remote Controllers BT-410 Set (Bluetooth 4, BLE) 0 0 1
. RC-100B (Remote Controller) 0 0 1
Sensors LDS (HLS-LFCD2) 1 1 1
. Intel® Realsense™ R200 0 1 0
. Raspberry Pi Camera Module v2.1 0 0 1
Memorys MicroSD Card 1 0 1
Cables Raspberry Pi 3 Power Cable 1 0 1
. Intel® Joule™ Power Cable 0 1 0
. Li-Po Battery Extension Cable 1 1 1
. DYNAMIXEL to OpenCR Cable 2 2 2
. USB Cable 2 2 2
. Camera Cable 0 0 1
Powers SMPS 12V5A 1 1 1
. A/C Cord 1 1 1
. LIPO Battery 11.1V 1,800mAh 1 1 1
. LIPO Battery Charger 1 1 1
Tools Screw driver 1 1 1
. Rivet tool 1 1 1
. USB3.0 Hub 0 1 0
Miscellaneous PH_M2x4mm_K 8 8 8
. PH_T2x6mm_K 4 8 8
. PH_M2x12mm_K 0 4 4
. PH_M2.5x8mm_K 16 12 16
. PH_M2.5x12mm_K 0 18 20
. PH_T2.6x12mm_K 16 0 0
. PH_M2.5x16mm_K 4 4 4
. PH_M3x8mm_K 44 140 140
. NUT_M2 0 4 4
. NUT_M2.5 20 18 24
. NUT_M3 16 96 96
. Rivet_1 14 20 22
. Rivet_2 2 2 2
. Spacer 4 4 4
. Silicone Spacer 0 0 4
. Bracket 5 8 6
. Adapter Plate 1 1 1

Assembly Manual

TurtleBot3 is delivered as unassembled parts in boxes. Follow the instructions to assemble TurtleBot3.

Assembly Video

If it is difficult to assemble with the assembly manual, please refer to the following assembly video.

TurtleBot3 Burger

TurtleBot3 Waffle

Open Source Hardware

The core components of TurtleBot3 are the following: chassis, motors, wheels, OpenCR, SBC, sensors and battery. The chassis consists of Waffle Plates that hold the other components. The Waffle Plate plays an important role as a chassis although it is only as big as your palm. The Waffle Plate is manufactured with the injection molding method to lower the manufacturing cost; however, the CAD data of the Waffle Plate for 3D printing is also available via Onshape. TurtleBot3 Burger is a two-wheeled differential drive platform, but it is customizable structurally and mechanically in many ways: Segway, Tank, Bike, Trailer and so on.

The CAD data is released on Onshape, a full-cloud 3D CAD editor. Get access through a web browser from your PC or from portable devices. Onshape allows drawing and assembling parts together with co-workers.


July 17, 2019

Getting Started

This page is for users who are new to TurtleBot3. The manual has an enormous amount of content, but this page explains how information is divided.

About TurtleBot3

First of all, collect information from the Overview, Notices, Features, and Specifications pages to get an overall understanding of TurtleBot3.

First steps for using TurtleBot3

When you have enough understanding about TurtleBot3 from above step, here are the software and hardware setups. Be aware that it is a time-saver to set up the SBC and your PC first, rather than assembling the robot. It is recommended to proceed in the following order.

  1. PC Setup: Install Linux, ROS and application software for TurtleBot3 on your Remote PC.
  2. SBC Setup: Install Linux, ROS and hardware related software to control the TurtleBot3 on your TurtleBot PC.
  3. OpenCR Setup: Upload latest firmware of TurtleBot3 to OpenCR embedded board.
  4. Hardware Setup: TurtleBot3 is delivered as unassembled parts in the box. Follow the instructions to assemble TurtleBot3. The prepared SBC and OpenCR will be mounted on the robot.

If you want to use other products instead of SBCs and Sensors included in the TurtleBot3 package, please refer to the Compatible Devices page.

Let’s try the basic operation

Once you have completed the above steps, run the robot through the provided Bringup package and move it remotely with the teleoperation feature.
Next, check the values of the various sensors mounted on the robot or learn how to control the robot by reading the Basic Operation page.

Keep TurtleBot3’s various technologies with you

The TurtleBot3’s core technologies are SLAM, Navigation and Manipulation, making it suitable for home service robots. These technologies can be applied either on a real robot or on a virtual robot with the Simulation feature.
Applications such as Autonomous Driving and Machine Learning can also be implemented on TurtleBot3. In addition, we introduce 12 different types of Locomotion as TurtleBot3 Friends, beyond the differential drive mobile robot. With these open-ended components, a handful of TurtleBot3 friends with various characteristics can be built. You can create a totally new robot that has never been seen before. Interesting applications such as the Follower Demo, Panoramic Demo and Automatic Parking are also available. See the Applications page for more application examples.

Learn and Explore more

The above are just a few examples of using TurtleBot3. You can learn more and challenge yourself with the following information.

You can Learn more through the ROS courses provided by the Construct, the various lectures created by TurtleBot3 users, web content, YouTube courses, free books, and more. In addition, various Videos produced by ROBOTIS will be helpful, and use cases using TurtleBot3 can be checked through various Projects released by TurtleBot3 research collaborators and TurtleBot3 users. You can also try a variety of challenges through Challenges.

References and Contacts

The Appendixes contain information on components used in TurtleBot3 such as DYNAMIXEL, OpenCR and LDS. The open source used by TurtleBot3 is listed on the OpenSource and Licenses page, which contains information about each license. If you have any questions about TurtleBot3, please refer to our FAQ or leave your contact information.

Features

July 17, 2019

Features

TurtleBot is the most popular open source robot for education and research. The new generation TurtleBot3 is a small, low cost, fully programmable, ROS based mobile robot. It is intended to be used for education, research, hobby and product prototyping.

Affordable Cost

TurtleBot was developed to meet the cost-conscious needs of schools, laboratories and companies. TurtleBot3 is the most affordable robot among the SLAM-able mobile robots equipped with a 360° Laser Distance Sensor LDS-01.

Small Size

The dimensions of TurtleBot3 Burger are only 138mm x 178mm x 192mm (L x W x H). Its size is about 1/4 of the size of its predecessor. Imagine keeping TurtleBot3 in your backpack and developing and testing your program anywhere you go.

ROS Standard

The TurtleBot brand is managed by Open Robotics, which develops and maintains ROS. Nowadays, ROS has become the go-to platform for roboticists around the world. TurtleBot can be integrated with existing ROS-based robot components, and TurtleBot3 is an affordable platform for those who want to get started learning ROS.

Extensibility

TurtleBot3 encourages users to customize its mechanical structure with some alternative options: open source embedded board (as a control board), computer and sensors. TurtleBot3 Burger is a two-wheeled differential drive type platform but it is able to be structurally and mechanically customized in many ways: Cars, Bikes, Trailers and so on. Extend your ideas beyond imagination with various SBC, sensors and motors on a scalable structure.

Modular Actuator for Mobile Robot

TurtleBot3 is able to get precise spatial data by using 2 DYNAMIXELs in the wheel joints. The DYNAMIXEL XM series can be operated in one of 6 operating modes (XL series: 4 operating modes): velocity control mode for wheels, torque control mode or position control mode for joints, etc. DYNAMIXEL can even be used to make a mobile manipulator, which is light but can be precisely controlled with velocity, torque and position control. DYNAMIXEL is a core component that makes TurtleBot3 perfect. It is easy to assemble, maintain, replace and reconfigure.

Open Control Board for ROS

The control board is open-sourced both hardware-wise and software-wise for ROS communication. The open source control board OpenCR1.0 is powerful enough to control not only DYNAMIXELs but also the ROBOTIS sensors that are frequently used for basic recognition tasks, in a cost-effective way. Various sensors such as the touch sensor, infrared sensor, color sensor and a handful more are available. The OpenCR1.0 has an IMU sensor inside the board so that it can enhance precise control for countless applications. The board has 3.3V, 5V and 12V power supplies to reinforce the available computer device lineup.

Strong Sensor Lineups

TurtleBot3 Burger uses an enhanced 360° LiDAR, a 9-axis Inertial Measurement Unit and precise encoders for your research and development. TurtleBot3 Waffle is equipped with the identical 360° LiDAR and additionally offers a powerful Intel® RealSense™ with its recognition SDK. TurtleBot3 Waffle Pi uses the widely used Raspberry Pi Camera. This is the best hardware solution for making a mobile robot.

Open Source

The hardware, firmware and software of TurtleBot3 are open source, which means that users are welcome to download, modify and share source code. All components of TurtleBot3 are manufactured with injection-molded plastic to achieve low cost; however, the 3D CAD data is also available for 3D printing. The 3D CAD data is released via Onshape, a full-cloud 3D CAD editor. Users can get access with a web browser on a desktop PC, laptop and even portable devices. Onshape allows users to draw 3D models and assemble them with colleagues. Besides, for users who want to make the OpenCR1.0 board by themselves, all details of the OpenCR1.0 board such as schematics, PCB gerber files, BOM and the firmware source code are fully opened under open-source licenses for users and the ROS community. You can modify the downloaded source code and hardware and share it with your friends.

Faq

July 17, 2019

FAQ

Enable SSH Server in Raspberry Pi

First you have to install SSH on Remote PC and Raspberry Pi.

$ sudo apt-get install ssh

In the case of Raspberry Pi (TurtleBot3 Burger and Waffle Pi), the SSH server of Ubuntu MATE 16.04.x and Raspbian is disabled by default. If you want to enable SSH, please refer to the documents below.

Alternatively, you can use the following commands.

$ sudo service ssh start
$ sudo ufw allow ssh

Before connecting via SSH, you need to check the host name; the red box in the image below shows it.

Then you can connect via SSH with the following command on the Remote PC.

$ ssh ${HOSTNAME}@xxx.xxx.xx.xx

Timesync between TurtleBot3 and Remote PC

NOTE: This solution requires that both your TurtleBot and Remote PC are connected to the internet and are on the same network.

  • Install ntpdate, and synchronize to NTP server on both TurtleBot and Remote PC.
$ sudo apt-get install ntpdate
$ sudo ntpdate ntp.ubuntu.com

Setup Dynamixels for TurtleBot3

WARNING: Please connect only single Dynamixel with OpenCR.

Download Setup Firmware

As shown in the image below, from the example menu go to turtlebot3 → turtlebot3_setup → turtlebot3_setup_motor, download the firmware to the OpenCR board, and proceed with the setup process.

After completing the setup, download the proper TurtleBot3 firmware to OpenCR.

Click the Upload button in the Arduino IDE to download the firmware. Once the download is completed, click the Serial Monitor icon in the upper right corner of the application as shown in the next image.

Connect the Dynamixel to the OpenCR. Note that this firmware works with only one Dynamixel, so connect only one Dynamixel at a time.

Change Dynamixel Setting

When the Serial Monitor is opened, a menu for the Dynamixel setup is displayed as shown in the image below. TurtleBot3 uses two Dynamixel actuators, one on the left and one on the right, so select the Dynamixel based on its assembly position. To set up the left motor, enter 1.

To prevent input mistakes, a confirmation menu is displayed once again. To proceed with the changes, enter Y.

If you enter Y, the setup tool starts searching for the connected Dynamixel using different baudrates and IDs. If a Dynamixel is found, it is reset to the TurtleBot3 configuration. When the setup is completed, an OK message is printed.

Dynamixel Test

Complete the setup procedure and verify that the change has been properly made. If you select one of the test menus for the motor, the connected Dynamixel with the correct configuration will rotate clockwise and counterclockwise repeatedly. To end the test, press the Enter key again. To test the left Dynamixel, enter 3 as shown in the image below, and enter 4 for the right Dynamixel.

Can I charge the battery when the battery is connected to TurtleBot3?

Charging and discharging the battery at the same time is NOT recommended and may void the warranty of the product. If TurtleBot3 needs to stay turned on while charging/replacing the battery, please follow the procedure below:

  1. Connect SMPS 12V 5A to OpenCR
  2. Disconnect the depleted battery from OpenCR
  3. Connect the depleted battery to battery charger or replace the depleted battery with a fully charged battery
  4. Connect the fully charged battery to OpenCR
  5. Disconnect SMPS 12V 5A from OpenCR

How to download the STL files of TurtleBot3

You can download it in the following way.

We released the TurtleBot3 Friends hardware design file at the link below.

So, you can download the STL files directly from each Onshape address as shown in the following figures.

  1. Sign in. (If you do not have an ID, you have to create one.)
  2. Click the “toggle tab manager” (A menu will appear on the left side of the browser.)
  3. Click the “Parts folder”
  4. Right-click on the icon of the file you want to download.
  5. Click the “Export…”
  6. Finally, you can download the output file type you want.

Intel® Joule™ USB-C port is not recognized on Windows 10

Some users have reported that the USB-C port is not recognized on Windows 10 when they were trying to update the BIOS. Please check the links below that describe the solution that worked (thanks to Rknlhrqy and VRAORESEARCH).

  1. ROS Discourse
  2. Intel Communities

Intel® Joule™ freezes while booting/installation

If BIOS firmware is not properly installed, this might happen. Please burn the BIOS firmware 193 release version again.

  1. Turn off the Joule.
  2. Proceed with the BIOS firmware #193 update using the file from the link below.
  3. Make sure that you see the message in the red box.

How to update software

[TurtleBot3]

$ cd ~/catkin_ws/src/
$ rm -rf turtlebot3/ turtlebot3_msgs/ hls_lfcd_lds_driver/
$ git clone https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3.git
$ cd ~/catkin_ws/src/turtlebot3
$ sudo rm -r turtlebot3_description/ turtlebot3_teleop/ turtlebot3_navigation/ turtlebot3_slam/ turtlebot3_example/
$ cd ~/catkin_ws/
$ rm -rf build/ devel/
$ cd ~/catkin_ws && catkin_make -j1

[RemotePC]

$ cd ~/catkin_ws/src/
$ rm -rf turtlebot3/ turtlebot3_msgs/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3.git
$ cd ~/catkin_ws/
$ rm -rf build/ devel/
$ cd ~/catkin_ws && catkin_make

How to update firmware

[TurtleBot3]

  • Burger
$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=burger
$ rm -rf ./opencr_update.tar.bz2
$ wget https://github.com/ROBOTIS-GIT/OpenCR/raw/master/arduino/opencr_release/shell_update/opencr_update.tar.bz2 && tar -xvf opencr_update.tar.bz2 && cd ./opencr_update && ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr && cd ..
  • Waffle or Waffle Pi
$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=waffle
$ rm -rf ./opencr_update.tar.bz2
$ wget https://github.com/ROBOTIS-GIT/OpenCR/raw/master/arduino/opencr_release/shell_update/opencr_update.tar.bz2 && tar -xvf opencr_update.tar.bz2 && cd ./opencr_update && ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr && cd ..

Export_turtlebot3_model

July 17, 2019

Export TURTLEBOT3_MODEL

TurtleBot3 has three models, burger, waffle, and waffle_pi, so you have to specify which model you are using before running. To do this, set the model with the export command.

export TURTLEBOT3_MODEL=burger
export TURTLEBOT3_MODEL=waffle
export TURTLEBOT3_MODEL=waffle_pi

The Bash export command is used to export a variable or function to the environment of all the child processes running in the current shell.

You can run this export command each time you open a terminal window, but it is inconvenient to set it every time.

Therefore, we recommend adding the setting to your bashrc file. The following example is for users of the TurtleBot3 Burger model. If you use a different model, just change the value of TURTLEBOT3_MODEL.

$ gedit ~/.bashrc
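
Add the export line for your model at the end of the file (the Burger model is used here, matching the export command shown above):

export TURTLEBOT3_MODEL=burger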

$ source ~/.bashrc


July 17, 2019

Contact US

About Open Robotics

Open Robotics is an independent non-profit organization founded by members of the global robotics community. The mission of Open Robotics is to support the development, distribution, and adoption of open source software for use in robotics research, education, and product development.

  • Address : 170 S Whisman Rd, Building D, Suite A, Mountain View, CA 94041, USA
  • E-Mail : turtlebot@osrfoundation.org

About ROBOTIS

ROBOTIS was derived from a simple response to the question, “What is a Robot?”.

Thus, ROBOTIS = “ROBOT IS…

There are many answers to this question, but we strive to develop and apply products that would impact our daily lives and make robots more personal rather than an intelligent appliance. Imaginations can become reality through personal robots.

ROBOTIS US Office

  • Address: 26228 Enterprise Ct Lake Forest, CA 92630, USA
  • Tel: +1-949-377-0377
  • Fax: +1-949-242-5112
  • Web: http://en.robotis.com/
  • E-Mail: america@robotis.com

ROBOTIS China Office

  • Address: Room 1103, Building B, Jiajing Tiancheng, Chaoyang District, (100102) Beijing, China
  • Tel: +86-10-5726-7179
  • Web: http://cn.robotis.com/
  • E-Mail: china@robotis.com

ROBOTIS Japan Office

  • Address: Haruka Building 3F, 2 Chome−12-14, Kanda Ogawamachi, Chiyoda-ku, Tokyo-to, Japan
  • Tel: +81-3-6869-8804
  • Web: http://jp.robotis.com/
  • E-Mail: japan@robotis.com

ROBOTIS Korea Office

  • Address: 37, Magok Jungang 5-ro 1-gil, Gangseo-gu, Seoul, Korea 07594
  • Tel: +82-70-8671-2609
  • Fax: +82-70-8230-1336
  • Web: http://www.robotis.com/
  • E-Mail: contactus2@robotis.com

About OST (Open Source Team)

  • Members : Ashe Kim, Daniel Seon, Darby Lim, Hancheol Cho, Jason Jin, Leon Jung, M. Y, Will Son, Yoonseok Pyo
  • Alumni : Christopher Tatsch, Yoshihiro Shibata
  • Supporter : JangHo Kim, Jinwook Kim, Woosik Yang, OROCA
  • Collaboration Team : Open Robotics, Intel, OROCA, Onshape, OSU

We are Open Source Team in ROBOTIS HQ. With open source hardwares and softwares, our robot friends are hoping to enrich our lives. We mainly develop and support Dynamixel SDK, OpenManipulator, OpenCM, OpenCR and OpenAutonomousCar. Our favorite platform is ROS! :) We are delighted to be in charge of developing TurtleBot3, the official reference platform for ROS, and in the future we will be the team that can support you to get closer to your robot dreams.


July 17, 2019

Compatible Devices

  • If you want to use other products instead of the computer and sensors included in the basic configuration, please refer to this page.

Computer

  • TurtleBot3’s main computer is a Raspberry Pi 3 (TurtleBot3 Burger and Waffle Pi) or an Intel Joule 570x (TurtleBot3 Waffle). These SBCs (Single Board Computers) are enough to use the basic features of TurtleBot3, but users may need more CPU performance, a GPU, or more RAM for other purposes. This section describes how to replace the SBC.

  • There are various types of SBCs, as shown in the following figure. The specifications of each SBC are different, but if you can install Linux and ROS on the SBC you want to use, you can use it as the main computer for TurtleBot3. In addition to SBCs, Intel NUCs, mini PCs and small notebooks are available.

Hardware assembly

  • Most SBCs can be assembled without problems using the PCB supports included with TurtleBot3. For reference, you can purchase additional parts such as PCB supports (link), or download the files shared on Onshape and print them with a 3D printer.

  • You can fix the SBC on the waffle plate of TurtleBot3 by using the PCB supports in the fixing holes of the SBC, as shown in the following figure.

Power supply

  • Hardware assembly of the SBC is simple, but the power supply is not. You need to modify the existing power cable or make a new one to match the power connector of the computer you are going to use.

  • As a basic part of TurtleBot3, the following power cables are provided. The left figure is for Raspberry Pi and the right figure is for Intel Joule 570x. The power cable must be made to match the power specifications of the computer you are using. OpenCR provides both a 5V (4A) supply and a 12V (1A) supply, which are commonly used by SBCs.

  • The power source for the SBC is the three connectors on the left in the OpenCR pinmap below.

Sensors

  • TurtleBot3 Burger uses an enhanced 360° LiDAR, a 9-axis Inertial Measurement Unit and precise encoders for your research and development. TurtleBot3 Waffle is equipped with the identical 360° LiDAR and additionally offers a powerful Intel® RealSense™ with its recognition SDK. TurtleBot3 Waffle Pi uses the widely used Raspberry Pi Camera. This is the best hardware solution for making a mobile robot.

  • If you use an additional sensor, attach it to the robot and then use it. ROS provides a development environment in which drivers and libraries for the aforementioned sensors can be used. Not all sensors are supported by ROS packages, but the number of sensor-related packages keeps increasing.

  • If you are looking for a new sensor, check out the Sensors page of the ROS Wiki to find the sensor and related ROS packages you want.

  • If you are using an analog sensor connected to the embedded board, you can use it with OpenCR. If you need to use an analog sensor other than one with USB or Ethernet communication, refer to the Additional Sensors page.

Common_notice

July 17, 2019

WARNING: Be careful when running the robot on the table as the robot might fall.

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are supposed to be run on the Remote PC. Please run the instructions below on your Remote PC.
  • Make sure to run the Bringup instructions before running the instructions below.

TIP: It is recommended to use a joystick pad instead of the keyboard for easier control. For more information on remote control, Please refer to Teleoperation page.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

Export TURTLEBOT3_MODEL


July 17, 2019

Challenges

Online Competition on RDS

Online Competition using TurtleBot3

We are preparing an online competition on ROS Development Studio (RDS) with TurtleBot3 AutoRace and a Task Mission using TurtleBot3 and OpenManipulator. You can participate free of charge in this online competition and learn about SLAM, Navigation, Autonomous Driving, and Manipulation under defined rules. Let’s compete with each other in this online competition!

TurtleBot3 AutoRace on RDS

If you need more information about it or you want to launch it in your remote PC, please visit Autonomous Driving section.

Task Mission using TurtleBot3 and OpenManipulator on RDS

If you need more information about it or you want to launch it in your remote PC, please visit Manipulation section.

ROS Development Studio (RDS)

ROS Development Studio (RDS) is an online IDE that allows you to program and test any robot using only a web browser. With RDS, you will be able to: develop ROS programs for robots in a faster way, with an already set up IDE environment that includes autocomplete; test the programs in real time on the provided simulated robots; use the provided simulations or upload your own; quickly see the results of your programming; debug using graphical ROS tools; and test what you have developed on RDS on the real robot (if you have one). All of this uses only a web browser, without any installation, and is not limited by any operating system: develop for ROS using Windows, Linux or OSX. Please refer to the following link for further information on TurtleBot3 related lectures and reference materials provided by The Construct.

Offline Competition

TurtleBot3 Maze Solving @ FIRA Malaysia 2018

Robosot (office task challenge) using TurtleBot3 @ FIRA Malaysia 2018

  • For more information, please see the following page.

GdR TurtleBot Challenge 2018 (TU Darmstadt)

Autonomous Mobile Robot Competition (Dankook University)

AutoRace - RBIZ Challenge 2017

  • For more information, please see the following page.

AutoRace - RBIZ Challenge 2018

  • The competition will be held in Daegu, Korea on November 15-17.


July 17, 2019

Autonomous Driving

NOTE: These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.

We are currently working on several projects related to autonomous driving with TurtleBot3.

TurtleBot3 AutoRace

The AutoRace is a competition for autonomous driving robot platforms. To provide various conditions for robot application development, the competition imposes as few structural regulations as possible. Everything is open, both software-wise (source code for the referee system) and hardware-wise (STP/DWG files of the game map).

The robots and even the referee system on the field run on ROS, which supports building many kinds of further content. Get the open source code used in each competition!

TurtleBot3 AutoRace Tutorials

  • Source code for the AutoRace tutorial: turtlebot3_autorace packages

  • Tutorial 1: Traffic Light

  • Tutorial 2: Lane Tracking

  • Tutorial 3: Parking

  • Tutorial 4: Node Optimization

  • Tutorial 5: Level Crossing

  • Tutorial 6: Tunnel

Tutorials: 1. Requirements

  • TurtleBot3 Burger
    • ROS and dependent ROS packages need to be installed on the robot
    • All functions of the TurtleBot3 Burger described in the TurtleBot3 e-Manual need to be tested before running the TurtleBot3 AutoRace source code
  • Remote PC (Laptop, Desktop, etc.)
    • ROS and dependent ROS packages need to be installed on the computer
    • All functions of the TurtleBot3 Burger described in the TurtleBot3 e-Manual need to be tested before running the TurtleBot3 AutoRace source code
  • Add-ons on TurtleBot3 Burger
    • Raspberry Pi Camera Type G (Fisheye Lens) : Available Here
      • Check the Features of the 4 screw holes on that page very carefully before mounting the camera on a frame made of any conductive material
    • Raspberry Pi Camera Mount
  • Track structure and Accessories, such as Traffic Signs, Traffic Lights, and other objects.

Tutorials: 2. Install AutoRace package

Remote PC & TurtleBot SBC Open terminal, then install AutoRace package.

$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_autorace.git
$ cd ~/catkin_ws && catkin_make

Tutorials: 3. Install Additional Dependent Packages

Remote PC & TurtleBot SBC Open new terminal, then enter

$ sudo apt-get install ros-kinetic-image-transport ros-kinetic-cv-bridge ros-kinetic-vision-opencv python-opencv libopencv-dev ros-kinetic-image-proc

Tutorials: 4. Calibration

Tutorials: 4.1. Camera Imaging Calibration

  1. Remote PC Open new terminal, then enter

     $ roscore
    
  2. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_camera_pi.launch
    
  3. Remote PC Open new terminal, then enter

     $ rqt_image_view
    

    then, click /camera/image/compressed or /camera/image/ topic in the select box. If everything works fine, the screen should show you the view from the robot.

  4. Remote PC Open new terminal, then enter

     $ rosrun rqt_reconfigure rqt_reconfigure
    

    then, click camera and adjust the parameter values until the camera shows a clean and sufficiently bright image. After that, overwrite the values in turtlebot3_autorace_camera/calibration/camera_calibration/camera.yaml. This makes the camera use the parameters you set here from the next launch.

Tutorials: 4.2. Intrinsic Camera Calibration

  1. Remote PC Open new terminal, then enter

     $ roscore
    
  2. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_camera_pi.launch
    
  3. Remote PC Print the checkerboard for camera calibration on A4-size paper. The checkerboard is in turtlebot3_autorace_camera/data/checkerboard_for_calibration.pdf. See the calibration manual and modify the parameter values in turtlebot3_autorace_camera/launch/turtlebot3_autorace_intrinsic_camera_calibration.launch.

  4. Remote PC Open new terminal, then enter

     $ export AUTO_IN_CALIB=calibration
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_intrinsic_camera_calibration.launch
    
  5. Remote PC After finishing the calibration, intrinsic camera calibration file will be saved in turtlebot3_autorace_camera/calibration/intrinsic_calibration/camerav2_320x240_30fps.yaml.

Tutorials: 4.3. Extrinsic Camera Calibration

  1. Remote PC Open new terminal, then enter

     $ roscore
    
  2. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_camera_pi.launch
    
  3. TurtleBot SBC Open new terminal, then enter

     $ export AUTO_IN_CALIB=action
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_intrinsic_camera_calibration.launch
    
  4. Remote PC Open new terminal, then enter

     $ export AUTO_EX_CALIB=calibration
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_extrinsic_camera_calibration.launch
    
  5. Remote PC Open new terminal, then enter

     $ rqt
    

    clicking plugins -> visualization -> Image view at the top of the screen will open an extra image monitor. Open 2 extra monitors in rqt this way. Then choose the /camera/image_extrinsic_calib/compressed and /camera/image_projected_compensated topics on each of the monitors. If everything works fine, one screen will show the image with a red rectangle, and the other will show the ground-projected (bird’s-eye) view derived from it.

  6. Remote PC Open new terminal, then enter

     $ rosrun rqt_reconfigure rqt_reconfigure
    

    then, adjust the parameter values in /camera/image_projection and /camera/image_compensation_projection, which carry out visual modifications on the image. The image_projection parameters change the shape of the red rectangle in the /camera/image_extrinsic_calib/compressed image. The extrinsic calibration transforms the image surrounded by the red rectangle and shows it as seen from above the lane. After that, overwrite the values in the yaml files in turtlebot3_autorace_camera/calibration/extrinsic_calibration/. This makes the camera use the parameters you set here from the next launch.

Tutorials: 4.4. Settings for Recognition

By now, all of the image preprocessing should have been tested.
  1. Remote PC Open new terminal, then enter

     $ roscore
    
  2. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_camera_pi.launch
    
  3. TurtleBot SBC Open new terminal, then enter

     $ export AUTO_IN_CALIB=action
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_intrinsic_camera_calibration.launch
    
  4. Remote PC Open new terminal, then enter

     $ export AUTO_EX_CALIB=action
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_extrinsic_camera_calibration.launch
    

    From here on, the following descriptions mainly adjust the feature detector / color filter for object recognition. Each adjustment is independent of the others. However, if you want to adjust the parameters one after another, finish each adjustment completely before continuing to the next.

Tutorials: 5. Missions

Tutorials: 5.1 Lane Detection

  1. Put the robot on the lane. If you placed the robot correctly, the yellow line should be on the left side of the robot and the white line on the right side. Make sure that the turtlebot3_robot node of the turtlebot3_bringup package is not yet launched. If it is running, the robot will suddenly start driving on the track.

  2. Remote PC Open new terminal, then enter

     $ export AUTO_DT_CALIB=calibration
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_lane.launch
    
  3. Remote PC Open new terminal, then enter

     $ rqt
    

    clicking plugins -> visualization -> Image view at the top of the screen will open an extra image monitor. Open 3 extra monitors in rqt this way. Then choose the /detect/image_yellow_lane_marker/compressed, /detect/image_lane/compressed and /detect/image_white_lane_marker/compressed topics on each of the monitors. If everything works fine, the left and right screens will show the filtered images of the yellow line and the white line, and the center screen will show the lane the robot should follow. In calibration mode, the left and right screens show white, and the center screen may show an abnormal result. From here, adjust the filter parameters until the correct lines and direction show up.

  4. Remote PC Open new terminal, then enter

     $ rosrun rqt_reconfigure rqt_reconfigure
    

    then, adjust the filter parameter values for lane detection (the hue / saturation / lightness bounds described in the TIP below) until the yellow and white lines are detected correctly. After that, overwrite the values in the lane.yaml file in turtlebot3_autorace_detect/param/lane/. This makes the detector use the parameters you set here from the next launch.

    TIP: The calibration process for line color filtering is sometimes quite difficult because of your physical environment, which includes the luminance of the light in the room, etc. Hence, be patient while carrying out this procedure. To speed things up, put the values from turtlebot3_autorace_detect/param/lane/lane.yaml into the reconfigure parameters, then start calibrating. Calibrate the hue low - high values first. (1) The hue value means the color, and every color, like yellow or white, has its own region of hue values (refer to an HSV map). Then calibrate the saturation low - high values. (2) Every color also has its own range of saturation. Finally, calibrate the lightness low - high values. (3) The source code has an auto-adjustment function, however, so calibrating the lightness low value is meaningless; just set the lightness high value to 255. A clearly filtered line image will give you a clear result for the lane. (A standalone HSV filtering sketch is shown after this list.)

  5. Remote PC After overwriting the calibration file, close rqt_reconfigure and turtlebot3_autorace_detect_lane, then enter

     $ export AUTO_DT_CALIB=action
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_lane.launch
    
  6. Check if the results come out well by entering

    Remote PC

     $ roslaunch turtlebot3_autorace_control turtlebot3_autorace_control_lane.launch
    

    TurtleBot SBC

     $ roslaunch turtlebot3_bringup turtlebot3_robot.launch
    

    After entering these commands, the robot will start driving along the lane.
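
The lane filters tuned above are essentially HSV color thresholds. The following stand-alone Python/OpenCV sketch illustrates that idea only; the threshold numbers and the input file name are placeholders, not the values shipped in lane.yaml or the package's own code.

    # Minimal HSV color-filter sketch (illustration only).
    # The threshold values are placeholders; tune them the same way the tutorial
    # tunes hue / saturation / lightness in rqt_reconfigure.
    import cv2
    import numpy as np

    image = cv2.imread('track.png')                 # placeholder input image
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

    # Hue / saturation / lightness low-high bounds for the yellow line.
    yellow_low, yellow_high = np.array([20, 100, 100]), np.array([35, 255, 255])
    # White is low saturation and high lightness.
    white_low, white_high = np.array([0, 0, 180]), np.array([179, 70, 255])

    yellow_mask = cv2.inRange(hsv, yellow_low, yellow_high)
    white_mask = cv2.inRange(hsv, white_low, white_high)

    cv2.imwrite('yellow_mask.png', yellow_mask)     # compare with /detect/image_yellow_lane_marker
    cv2.imwrite('white_mask.png', white_mask)       # compare with /detect/image_white_lane_marker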

Tutorials: 5.2. Traffic Sign

  1. Traffic sign detection needs pictures of the traffic signs. Take the pictures with the rqt_image_view node and edit their size and shape with any photo editor available in Linux. The node finds the traffic signs with the SIFT algorithm, so if you want to use customized traffic signs (ones not included in the autorace track), keep in mind that more edges in a traffic sign give better recognition results from SIFT. A minimal sketch of this matching step is shown after this list.

  2. Put the robot on the lane. This time, the traffic sign should be placed where the robot can see it easily. Make sure that the turtlebot3_robot node of the turtlebot3_bringup package is not yet launched; if it is running, the robot may suddenly start driving on the track.

  3. Remote PC Open new terminal, then enter

     $ rqt_image_view
    

    Then select the /camera/image_compensated topic in the select box. If everything works correctly, the screen will show the view from the robot.

  4. Remote PC Take the picture with Alt + Print Screen and edit the captured image with your preferred photo editor. After that, place the picture in [where you placed the turtlebot3_autorace package]/turtlebot3_autorace/turtlebot3_autorace_detect/file/detect_sign/ and rename it as you want. (If you change the default file names, you also have to change the file names referenced in the detect_sign.py source.)

  5. Remote PC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_sign.launch
    
  6. Remote PC Open new terminal, then enter

     $ rqt_image_view
    

    Then select the /detect/image_traffic_sign/compressed topic in the select box. If everything works correctly, the screen will show the result of traffic sign detection whenever it succeeds in recognizing a sign.
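
The detection described above is keypoint matching with SIFT: the stored sign images are matched against the camera frame, and more distinctive edges produce more good matches. The stand-alone Python/OpenCV sketch below illustrates that step only; the file names are placeholders and this is not the detect_sign.py code itself. Depending on your OpenCV build, SIFT may live under cv2.xfeatures2d instead.

    # Minimal SIFT keypoint-matching sketch (illustration only).
    import cv2

    sign = cv2.imread('my_sign.png', cv2.IMREAD_GRAYSCALE)        # placeholder file names
    scene = cv2.imread('camera_frame.png', cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()        # cv2.xfeatures2d.SIFT_create() on older OpenCV builds
    kp1, des1 = sift.detectAndCompute(sign, None)
    kp2, des2 = sift.detectAndCompute(scene, None)

    # FLANN matcher with Lowe's ratio test, as in typical SIFT pipelines.
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    matches = flann.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    # More distinctive edges on the sign -> more good matches.
    print('good matches: %d' % len(good))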

Tutorials: 5.3. Traffic Light

  1. Put the robot on the lane. When the robot is placed correctly, the yellow line should be on the left side of the robot and the white line on the right side. Make sure that the turtlebot3_robot node of the turtlebot3_bringup package is not yet launched; if it is running, the robot will suddenly start driving on the track.

  2. Remote PC Open new terminal, then enter

     $ export AUTO_DT_CALIB=calibration
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_traffic_light.launch
    
  3. Remote PC Open new terminal, then enter

     $ rqt
    

    Clicking Plugins -> Visualization -> Image View at the top of the screen creates an extra monitor for the camera view. Create an extra monitor for each topic in the rqt pane this way. Then choose the /detect/image_red_light, /detect/image_yellow_light, /detect/image_green_light and /detect/image_traffic_light topics, one on each monitor. If everything works correctly, three screens will show the filtered images of the red / yellow / green lights, and the other one will show the recognized color as a short string. In calibration mode, the three screens will show white, and the other screen may show a plain result. From here, adjust the filter parameters until the colors are detected correctly.

  4. Remote PC Open new terminal, then enter

     $ rosrun rqt_reconfigure rqt_reconfigure
    

    Then adjust the parameter values in /detect_traffic_light. Changing the color filter values will change the filtered view on each color's screen. After that, write the values into the traffic_light.yaml file in turtlebot3_autorace_detect/param/traffic_light/ so that the node uses these parameters the next time it is launched.

    Tip: same as 5.1 Lane Detection

  5. Remote PC After overwriting the calibration file, close rqt_reconfigure and turtlebot3_autorace_detect_traffic_light, then enter

     $ export AUTO_DT_CALIB=action
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_traffic_light.launch
    
  6. Use the rqt_image_view node to check that the results come out well.

Tutorials: 5.4. Parking Lot

  1. The parking mission needs only one preparation: traffic sign recognition.

  2. Place the dummy robot in one of the parking spots.

  3. Place the robot on the lane appropriately.

Tutorials: 5.5. Level Crossing

  1. Level crossing detection finds the 3 red rectangles on the crossing gate, determines whether the gate is open or closed, and estimates how close the robot has come to it.

  2. Put the robot on the lane correctly, then bring it in front of the closed gate.

  3. Remote PC Open new terminal, then enter

     $ export AUTO_DT_CALIB=calibration
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_level.launch
    
  4. Remote PC Open new terminal, then enter

     $ rqt
    

    Clicking Plugins -> Visualization -> Image View at the top of the screen creates an extra monitor for the camera view. Create an extra monitor for each topic in the rqt pane this way. Then choose the /detect/image_level_color_filtered and /detect/image_level topics, one on each monitor. If everything works correctly, one screen will show the filtered image of the red rectangles, and the other will draw a line connecting the rectangles. In calibration mode, one screen will show white, and the other may show a plain result. From here, adjust the filter parameters until the rectangles and the connecting line are detected correctly.

  5. Remote PC Open new terminal, then enter

     $ rosrun rqt_reconfigure rqt_reconfigure
    

    Then adjust the parameter values in /detect_level. Changing the color filter values will change the filtered view on the screen. After that, write the values into the level.yaml file in turtlebot3_autorace_detect/param/level/ so that the node uses these parameters the next time it is launched.

    Tip: same as 5.1 Lane Detection

  6. Remote PC After overwriting the calibration file, close rqt_reconfigure and turtlebot3_autorace_detect_level, then enter

     $ export AUTO_DT_CALIB=action
     $ roslaunch turtlebot3_autorace_detect turtlebot3_autorace_detect_level.launch
    
  7. Use the rqt_image_view node to check that the results come out well.

Tutorials: 5.6. Tunnel

  1. The tunnel node drives the robot from the tunnel entrance to the exit using the turtlebot3 navigation package. What you need to prepare is a map of the tunnel (unnecessary if you use the AutoRace track as it is) and the pose the robot should have right before it comes out of the tunnel (also unnecessary when you are using the default map).

  2. Remote PC Check the pose of the exit in RViz while the SLAM or navigation package is running. After that, write those values into the detect_tunnel.py file at line 144. A sketch of how such a pose is typically sent as a navigation goal is shown after this list.
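
For reference, the exit pose measured in RViz is ultimately used as a navigation goal. The stand-alone sketch below shows the usual actionlib pattern for sending one pose to move_base under ROS Kinetic; the coordinates are placeholders for the values you read off RViz, and this is not the package's own detect_tunnel.py code.

    #!/usr/bin/env python
    # Minimal sketch: send one navigation goal (e.g. the tunnel exit pose) to move_base.
    # The position / orientation values are placeholders for the pose checked in RViz.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node('send_tunnel_exit_goal')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0       # placeholder
    goal.target_pose.pose.position.y = -0.5      # placeholder
    goal.target_pose.pose.orientation.w = 1.0    # placeholder (no rotation)

    client.send_goal(goal)
    client.wait_for_result()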

Tutorials: 6. Run Autonomous Driving

  1. From now on, all the related nodes run in action mode. Close any ROS-related programs and terminals that are still open on the Remote PC and the TurtleBot SBC, then put the robot on the lane correctly.

  2. Remote PC Open new terminal, then enter

     $ roscore
    
  3. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_bringup turtlebot3_robot.launch
    
  4. TurtleBot SBC Open new terminal, then enter

     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_camera_pi.launch
    
  5. TurtleBot SBC Open new terminal, then enter

     $ export AUTO_IN_CALIB=action
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_intrinsic_camera_calibration.launch
    
  6. Remote PC Open new terminal, then enter

     $ export AUTO_EX_CALIB=action
     $ export AUTO_DT_CALIB=action
     $ export TURTLEBOT3_MODEL=burger
     $ roslaunch turtlebot3_autorace_core turtlebot3_autorace_core.launch
    
  7. Remote PC Open new terminal, then enter

     $ rostopic pub -1 /core/decided_mode std_msgs/UInt8 "data: 2"
    

    turtlebot3_autorace_core controls the whole system in the package (it opens and closes the launch files and nodes in the package). A rospy equivalent of the command above is sketched after this list.
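
For reference, the rostopic command above simply publishes a std_msgs/UInt8 with data 2 on /core/decided_mode. The sketch below does the same thing from rospy; it is only an illustration, with 2 being the mode used in this tutorial.

    #!/usr/bin/env python
    # rospy equivalent of:  rostopic pub -1 /core/decided_mode std_msgs/UInt8 "data: 2"
    import rospy
    from std_msgs.msg import UInt8

    rospy.init_node('autorace_start')
    pub = rospy.Publisher('/core/decided_mode', UInt8, queue_size=1, latch=True)
    rospy.sleep(1.0)              # give the connection time to establish
    pub.publish(UInt8(data=2))    # 2 starts autonomous driving, as in the tutorial
    rospy.sleep(1.0)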

TurtleBot3 AutoRace Online Competition

AutoRace RBIZ Challenge 2017

TurtleBot3 AutoRace 2017 Teaser

  • Official Teaser #1

  • Official Teaser #2

  • Official Final Video

TurtleBot3 AutoRace 2017 Challengers

  • Video - Team RealRiceThief (1st Place)

  • Video - Team Sherlotics (Introduction video)

  • TurtleBot3's driving autonomy was tested using the open source from MIT DuckieTown engineering.

AutoRace RBIZ Challenge 2018

  • (Coming Soon!)

AutoRace with Gazebo

AutoRace is also provided in Gazebo. We created a simulation environment of the TurtleBot3 AutoRace 2017 from the R-BIZ Challenge.

  • Recommended specification
CPU Intel Core i5 / 2 GHz Dual Core Processor
RAM 4 GB
Storage 20 GB of free hard drive space
GPU NVIDIA GeForce GTX 9 series

WARNING: Don’t confuse your real camera calibration configure files and Gazebo calibration configure files.

NOTE: The turtlebot3_autorace package requires turtlebot3_simulations package as a prerequisite. If you did not install it in the Installation TurtleBot3 Simulations, install it first.

  1. Remote PC Run AutoRace Gazebo. You can see the AutoRace 2017 map in Gazebo.

     $ roslaunch turtlebot3_gazebo turtlebot3_autorace.launch
    

  2. Remote PC Run Mission launch. You can see Traffic Light, Parked TurtleBot3 and Toll Gate in Gazebo. When TurtleBot3 approaches the mission area, they operate automatically.

     $ roslaunch turtlebot3_gazebo turtlebot3_autorace_mission.launch
    

  3. Remote PC Run the AutoRace launch file. If you want to run AutoRace on the real robot, you have to calibrate your camera.

     $ export GAZEBO_MODE=true
     $ export AUTO_IN_CALIB=action
     $ roslaunch turtlebot3_autorace_camera turtlebot3_autorace_intrinsic_camera_calibration.launch
    
  4. Remote PC Open new terminal, then enter

     $ export AUTO_EX_CALIB=action
     $ export AUTO_DT_CALIB=action
     $ export TURTLEBOT3_MODEL=burger
     $ roslaunch turtlebot3_autorace_core turtlebot3_autorace_core.launch
    
  5. Remote PC Open new terminal, then enter

     $ rostopic pub -1 /core/decided_mode std_msgs/UInt8 "data: 2"
    
  • Video : AutoRace with Gazebo


July 17, 2019

Applications

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are supposed to be run on the Remote PC. Please run the instructions below on your Remote PC. However, the parts marked [TurtleBot] run on the SBC of the TurtleBot3.
  • Make sure to run the Bringup instructions before running the instructions below.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

This chapter shows some demos using TurtleBot3. In order to implement these demos, you have to install the turtlebot3_applications and turtlebot3_applications_msgs packages.

[Remote PC] Go to your catkin workspace directory (/home/(user_name)/catkin_ws/src) and clone the turtlebot3_applications and turtlebot3_applications_msgs repositories. Then run catkin_make to build the new packages.

$ sudo apt-get install ros-kinetic-ar-track-alvar
$ sudo apt-get install ros-kinetic-ar-track-alvar-msgs
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications_msgs.git
$ cd ~/catkin_ws && catkin_make

TurtleBot Follower Demo

NOTE:

  • The follower demo was implemented using only a 360 Laser Distance Sensor LDS-01. A classification algorithm, fitted in advance on samples of person and obstacle positions, decides which action to take. The robot follows someone in front of it within a 50 centimeter range and a 140 degree arc; a minimal sketch of reading that scan window follows this note.
  • Running the follower demo in an area with obstacles may not work well. Therefore, it is recommended to run the demo in an open area without obstacles.
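
To make the sensing region concrete, the sketch below reads the LDS scan and reports the nearest return in the front 140-degree window within 0.5 m. This is only an illustration of the sensor window, not the demo's trained classifier, and it assumes index 0 of the scan points straight ahead, as on the TurtleBot3 LDS.

    #!/usr/bin/env python
    # Illustration only: report the nearest return in the front 140-degree,
    # 0.5 m window of the LDS scan (the region the follower demo reacts to).
    import rospy
    from sensor_msgs.msg import LaserScan

    def callback(scan):
        # LDS-01 publishes 360 ranges, one per degree; index 0 is assumed to face forward.
        window = list(scan.ranges[0:70]) + list(scan.ranges[290:360])
        close = [r for r in window if 0.0 < r < 0.5]
        if close:
            rospy.loginfo('nearest object in front window: %.2f m', min(close))

    rospy.init_node('follow_window_sketch')
    rospy.Subscriber('scan', LaserScan, callback)
    rospy.spin()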

[TurtleBot] In order to run this demo, a parameter in the LIDAR launch file has to be modified. In the example below, Pluma is used to edit the launch file. In the param tag named frame_id, replace base_scan with odom and save the file.

$ pluma ~/catkin_ws/src/turtlebot3/turtlebot3_bringup/launch/turtlebot3_lidar.launch

NOTE: The TurtleBot Follower Demo requires the scikit-learn, NumPy and SciPy packages.

[Remote PC] Install the scikit-learn, NumPy and SciPy packages with the commands below.

$ sudo apt-get install python-pip
$ sudo pip install -U scikit-learn numpy scipy
$ sudo pip install --upgrade pip

[Remote PC] When the installation is complete, run roscore on the Remote PC with the command below.

$ roscore

[TurtleBot] Launch the bringup

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[Remote PC] Launch turtlebot3_follow_filter with below command.

$ roslaunch turtlebot3_follow_filter turtlebot3_follow_filter.launch

[Remote PC] Launch turtlebot3_follower with below command.

$ roslaunch turtlebot3_follower turtlebot3_follower.launch

TurtleBot Panorama Demo

NOTE:

  • The turtlebot3_panorama demo uses pano_ros to take snapshots and stitch them together into a panoramic image.
  • The panorama demo requires the raspicam_node package. Instructions for installing this package can be found at the GitHub link.
  • The panorama demo requires the OpenCV and cv_bridge packages. Instructions for installing OpenCV can be found at the OpenCV tutorial link.

[TurtleBot] Launch the turtlebot3_rpicamera file

$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

[Remote PC] Launch panorama with below command.

$ roslaunch turtlebot3_panorama panorama.launch

[Remote PC] To start the panorama demo, please enter below command.

$ rosservice call turtlebot3_panorama/take_pano 0 360.0 30.0 0.3

Parameters that can be sent to the rosservice to get a panoramic image are:

  • Mode for taking the pictures.

    • 0 : snap&rotate (i.e. rotate, stop, snapshot, rotate, stop, snapshot, …)
    • 1 : continuous (i.e. keep rotating while taking snapshots)
    • 2 : stop taking pictures and create panoramic image
  • Total angle of panoramic image, in degrees
  • Angle interval (in degrees) when creating the panoramic image in snap&rotate mode, time interval (in seconds) otherwise
  • Rotating velocity (in radians/s)

[Remote PC] To view the result image, please enter below command.

$ rqt_image_view image:=/turtlebot3_panorama/panorama
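
If you prefer to save the stitched result to disk instead of viewing it in rqt_image_view, a minimal subscriber such as the following works; it assumes the result is published as sensor_msgs/Image on /turtlebot3_panorama/panorama, the topic used above, and the output file name is a placeholder.

    #!/usr/bin/env python
    # Sketch: save the stitched panorama published by the demo to a local file.
    import rospy
    import cv2
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    def callback(msg):
        cv_image = CvBridge().imgmsg_to_cv2(msg, desired_encoding='bgr8')
        cv2.imwrite('panorama.jpg', cv_image)        # placeholder output path
        rospy.loginfo('panorama saved to panorama.jpg')

    rospy.init_node('save_panorama')
    rospy.Subscriber('/turtlebot3_panorama/panorama', Image, callback)
    rospy.spin()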

Automatic Parking

NOTE:

  • The turtlebot3_automatic_parking demo uses a 360 Laser Distance Sensor LDS-01 and reflective tape. The LaserScan topic carries intensity as well as distance data from the LDS, and the TurtleBot3 uses the intensity to locate the reflective tape; a minimal sketch of that idea follows this note.
  • The turtlebot3_automatic_parking demo requires NumPy package.
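
To illustrate the intensity idea, the sketch below subscribes to the scan and lists the angles whose intensity stands out, which is how reflective tape shows up in the data. The threshold is a placeholder that depends on your tape and environment, and this is not the demo's own parking logic.

    #!/usr/bin/env python
    # Illustration only: list scan angles whose intensity exceeds a threshold,
    # the way reflective tape stands out in the LDS data.
    import rospy
    from sensor_msgs.msg import LaserScan

    INTENSITY_THRESHOLD = 1000.0    # placeholder; tune for your reflective tape

    def callback(scan):
        hits = [i for i, (r, s) in enumerate(zip(scan.ranges, scan.intensities))
                if s > INTENSITY_THRESHOLD and r > 0.0]
        if hits:
            rospy.loginfo('reflective returns at angles (deg): %s', hits)

    rospy.init_node('reflective_tape_sketch')
    rospy.Subscriber('scan', LaserScan, callback)
    rospy.spin()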

[Remote PC] Install the NumPy package with the commands below. If NumPy is already installed, you can skip this step.

$ sudo apt-get install python-pip
$ sudo pip install -U numpy
$ sudo pip install --upgrade pip

[Remote PC] Run roscore.

$ roscore

[TurtleBot] Bring up basic packages to start TurtleBot3 applications.

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[Remote PC] If you use TurtleBot3 Burger, set the model of TurtleBot3 like command below.

TIP: Before executing this command, you have to specify the model name of TurtleBot3. The ${TB3_MODEL} is the name of the model you are using in burger, waffle, waffle_pi. If you want to permanently set the export settings, please refer to Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}

[Remote PC] Run RViz.

$ roslaunch turtlebot3_bringup turtlebot3_remote.launch
$ rosrun rviz rviz -d `rospack find turtlebot3_automatic_parking`/rviz/turtlebot3_automatic_parking.rviz

[Remote PC] Launch the automatic parking file.

$ roslaunch turtlebot3_automatic_parking turtlebot3_automatic_parking.launch

  • You can select the LaserScan topic in RViz.

  • /scan

  • /scan_spot

Automatic Parking Vision

NOTE:

  • The turtlebot3_automatic_parking_vision demo uses the Raspberry Pi camera, so the default platform for this demo is the TurtleBot3 Waffle Pi. Since it parks by finding an AR marker on a wall, a printed AR marker should be prepared. The whole process relies on the image from the camera, so if it does not work well, adjust the camera parameters such as brightness and contrast.
  • The turtlebot3_automatic_parking_vision demo uses a rectified image produced by the image_proc node. To get the rectified image, the robot needs the optical calibration data for the Raspberry Pi camera. (The downloaded turtlebot3 packages already include the camera calibration data for the Raspberry Pi Camera v2 by default.)
  • The turtlebot3_automatic_parking_vision package requires ar_track_alvar package.

[Remote PC] Run roscore.

$ roscore

[TurtleBot] Bring up basic packages to start TurtleBot3 applications.

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[TurtleBot] Start the raspberry pi camera nodes.

$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

[Remote PC] The Raspberry Pi camera package publishes a compressed image for fast communication. However, image rectification in the image_proc node needs a raw image, so the compressed image has to be converted to a raw image. A short illustration of the compressed format follows the command below.

$ rosrun image_transport republish compressed in:=raspicam_node/image raw out:=raspicam_node/image
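
For reference, the compressed stream is sensor_msgs/CompressedImage (JPEG bytes), which is why the republish step above is needed before image_proc. The sketch below decodes that stream directly with OpenCV, assuming the topic name raspicam_node/image/compressed implied by the command above; it is only an illustration, not part of the demo.

    #!/usr/bin/env python
    # Illustration only: decode the camera's CompressedImage stream with OpenCV.
    import rospy
    import cv2
    import numpy as np
    from sensor_msgs.msg import CompressedImage

    def callback(msg):
        frame = cv2.imdecode(np.frombuffer(msg.data, np.uint8), cv2.IMREAD_COLOR)
        rospy.loginfo('decoded frame: %dx%d', frame.shape[1], frame.shape[0])

    rospy.init_node('compressed_image_sketch')
    rospy.Subscriber('raspicam_node/image/compressed', CompressedImage, callback)
    rospy.spin()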

[Remote PC] Then, the image rectification should be carried out.

$ ROS_NAMESPACE=raspicam_node rosrun image_proc image_proc image_raw:=image _approximate_s=true _queue_size:=20

[Remote PC] Now start the AR marker detection. Before running the related launch file, export the TurtleBot3 model used by this example. After running the launch file, RViz will start automatically with a preset configuration.

$ export TURTLEBOT3_MODEL=waffle_pi
$ roslaunch turtlebot3_automatic_parking_vision turtlebot3_automatic_parking_vision.launch

Load Multiple TurtleBot3s

NOTE: This application requires firmware version 1.2.1 or higher.

[Remote PC] Run roscore.

$ roscore

Bring up multiple TurtleBot3s, each with a different namespace. We recommend namespaces that share a common prefix, such as tb3_0, tb3_1 or my_robot_0, my_robot_1.

[TurtleBot(tb3_0)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the tf prefix and set_lidar_frame_id for the lidar frame id. These parameters must all use the same name.

$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_0" set_lidar_frame_id:="tb3_0/base_scan"

[TurtleBot(tb3_1)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the tf prefix and set_lidar_frame_id for the lidar frame id. These parameters must all use the same name, but it must be different from the other robots'.

$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_1" set_lidar_frame_id:="tb3_1/base_scan"

Then the terminal where you launched tb3_0 will show the messages below. You can see that the TF messages have the prefix tb3_0.

SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.13
 * /tb3_0/turtlebot3_core/baud: 115200
 * /tb3_0/turtlebot3_core/port: /dev/ttyACM0
 * /tb3_0/turtlebot3_core/tf_prefix: tb3_0
 * /tb3_0/turtlebot3_lds/frame_id: tb3_0/base_scan
 * /tb3_0/turtlebot3_lds/port: /dev/ttyUSB0

NODES
  /tb3_0/
    turtlebot3_core (rosserial_python/serial_node.py)
    turtlebot3_diagnostics (turtlebot3_bringup/turtlebot3_diagnostics)
    turtlebot3_lds (hls_lfcd_lds_driver/hlds_laser_publisher)

ROS_MASTER_URI=http://192.168.1.2:11311

process[tb3_0/turtlebot3_core-1]: started with pid [1903]
process[tb3_0/turtlebot3_lds-2]: started with pid [1904]
process[tb3_0/turtlebot3_diagnostics-3]: started with pid [1905]
[INFO] [1531356275.722408]: ROS Serial Python Node
[INFO] [1531356275.796070]: Connecting to /dev/ttyACM0 at 115200 baud
[INFO] [1531356278.300310]: Note: publish buffer size is 1024 bytes
[INFO] [1531356278.303516]: Setup publisher on sensor_state [turtlebot3_msgs/SensorState]
[INFO] [1531356278.323360]: Setup publisher on version_info [turtlebot3_msgs/VersionInfo]
[INFO] [1531356278.392212]: Setup publisher on imu [sensor_msgs/Imu]
[INFO] [1531356278.414980]: Setup publisher on cmd_vel_rc100 [geometry_msgs/Twist]
[INFO] [1531356278.449703]: Setup publisher on odom [nav_msgs/Odometry]
[INFO] [1531356278.466352]: Setup publisher on joint_states [sensor_msgs/JointState]
[INFO] [1531356278.485605]: Setup publisher on battery_state [sensor_msgs/BatteryState]
[INFO] [1531356278.500973]: Setup publisher on magnetic_field [sensor_msgs/MagneticField]
[INFO] [1531356280.545840]: Setup publisher on /tf [tf/tfMessage]
[INFO] [1531356280.582609]: Note: subscribe buffer size is 1024 bytes
[INFO] [1531356280.584645]: Setup subscriber on cmd_vel [geometry_msgs/Twist]
[INFO] [1531356280.620330]: Setup subscriber on sound [turtlebot3_msgs/Sound]
[INFO] [1531356280.649508]: Setup subscriber on motor_power [std_msgs/Bool]
[INFO] [1531356280.688276]: Setup subscriber on reset [std_msgs/Empty]
[INFO] [1531356282.022709]: Setup TF on Odometry [tb3_0/odom]
[INFO] [1531356282.026863]: Setup TF on IMU [tb3_0/imu_link]
[INFO] [1531356282.030138]: Setup TF on MagneticField [tb3_0/mag_link]
[INFO] [1531356282.033628]: Setup TF on JointState [tb3_0/base_link]
[INFO] [1531356282.041117]: --------------------------
[INFO] [1531356282.044421]: Connected to OpenCR board!
[INFO] [1531356282.047700]: This core(v1.2.1) is compatible with TB3 Burger
[INFO] [1531356282.051355]: --------------------------
[INFO] [1531356282.054785]: Start Calibration of Gyro
[INFO] [1531356284.585490]: Calibration End

[Remote PC] Launch the robot state publisher with the same namespaces.

$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_0
$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_1
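
With this namespacing, each robot has its own copy of the usual topics (for example /tb3_0/cmd_vel, as the subscriber list in the log above shows), so commands no longer collide. The sketch below, a simple illustration with placeholder velocity values, drives tb3_0 forward briefly by publishing to its namespaced cmd_vel topic.

    #!/usr/bin/env python
    # Sketch: drive one namespaced robot by publishing to its own cmd_vel topic.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('tb3_0_test_drive')
    pub = rospy.Publisher('/tb3_0/cmd_vel', Twist, queue_size=1)

    twist = Twist()
    twist.linear.x = 0.05          # slow forward motion (placeholder value)
    rate = rospy.Rate(10)
    t_end = rospy.Time.now() + rospy.Duration(3.0)
    while not rospy.is_shutdown() and rospy.Time.now() < t_end:
        pub.publish(twist)
        rate.sleep()
    pub.publish(Twist())           # stop the robot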

Before starting another application, check the topics and the TF tree by opening rqt

$ rqt

With this setup, each TurtleBot3 builds a map using SLAM and the maps are merged simultaneously by the multi_map_merge package. You can get more information about this in the Virtual SLAM by Multiple TurtleBot3s section.


July 17, 2019

Appendixes

DYNAMIXEL

OpenCR1.0

LDS-01

  • LDS-01: Lidar Sensor of TurtleBot3

RealSense™

Raspberry Pi Camera


July 17, 2019

RealSense™

Overview

Intel® RealSense™ is a platform for implementing gesture-based human-computer interaction techniques. It consists of a series of consumer-grade 3D cameras together with an easy-to-use machine perception library. The Intel® RealSense™ R200 camera is a USB 3.0 device that can provide color, depth, and infrared video streams. The TurtleBot3 Waffle model adopts the Intel® RealSense™ R200 to enable 3D SLAM and navigation, and it is possible to build various applications such as gesture recognition, object recognition and scene recognition based on the 3D depth information obtained using RealSense™'s Active Stereo Technology.

Specifications

Technical Specifications

Items Specifications
RGB Video Resolution 1920 x 1280, 2M
IR Depth Resolution 640 x 480, VGA
Laser Projector Class 1 IR Laser Projector (IEC 60825-1:2007 Edition 2)
Frame Rate 30 fps (RGB), 60 fps (IR depth)
FOV (Field-of-View) 77° (RGB), 70° (IR depth), Diagonal Field of View
Range 0.3m ~ 4.0m
Operating Supply Voltage 5V (via USB port)
USB Port USB 3.0
Dimensions 101.56mm length x 9.55mm height x 3.8mm width
Mass Under 35g

Minimum System Requirements

Items Specifications
Processors 4th Generation and future Intel® Core™ processors
Disk Storage 1GB
Memory 2GB
Interface USB 3.0
Operating System for SDK Ubuntu 14.04 and 16.04 LTS (GCC 4.9 toolchain)
  Windows 8.1 and Windows 10 (Visual Studio 2015 Update 2)
  Mac OS X 10.7+ (Clang toolchain)
  Ostro

Here is the detail specification document: Intel® RealSense™ Datasheet

Intel® RealSense™ R200 for TurtleBot3

The Intel® RealSense™ R200 is applied on TurtleBot3 Waffle.

Introduction Video

The TurtleBot3 Waffle uses Intel® RealSense™ Camera R200 as a default vision sensor. Check this video out that shows how Intel® RealSense™ Camera R200 can be used in TurtleBot3 Waffle.

User Guide

Intel® RealSense™ packages enable the use of Intel® RealSense™ R200, F200, SR300 and ZR300 cameras with ROS. The table below describes the packages required to operate Intel® RealSense™. You will be guided through installing these packages in the next section.

Package Description
librealsense Underlying library driver for communicating with Intel® RealSense™ camera
realsense_camera ROS Intel® RealSense™ camera node for publishing camera data

Installation

Warning! There are installation prerequisites for the Intel® RealSense™ package installation in http://wiki.ros.org/librealsense

[TurtleBot] The following commands will install relevant Intel® RealSense™ library.

$ sudo apt-get install linux-headers-generic
$ sudo apt-get install ros-kinetic-librealsense

[TurtleBot] To run the Intel® RealSense™ with ROS, the following package is needed. There are stable and unstable version packages. Choose one and install it.

[Stable]
$ cd ~/catkin_ws/src
$ git clone https://github.com/intel-ros/realsense.git
$ cd realsense
$ git checkout 1.8.0
$ cd ~/catkin_ws && catkin_make -j2
[Unstable]
$ sudo apt-get install ros-kinetic-realsense-camera

Run realsense_camera Node

[TurtleBot] Run the following command

$ roslaunch realsense_camera r200_nodelet_default.launch

While the realsense_camera node is running, you can view various data from Intel® RealSense™ by launching rqt_image_view.

[Remote PC] Run the following command

$ rqt_image_view

Once the GUI application appears on the screen, you can select a data topic related to Intel® RealSense™ from the drop-down menu at the top of the application.

(Optional) To Try as the Example Video Shows

[TurtleBot] Press Ctrl + C to quit the previously run camera node, then run the other realsense_camera launch file

$ roslaunch realsense_camera r200_nodelet_rgbd.launch

[TurtleBot] Run the turtlebot3_bringup node to get the data needed for SLAM

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[Remote PC] Run the turtlebot3_slam node to perform SLAM

$ roslaunch turtlebot3_slam turtlebot3_slam.launch

[Remote PC] Run RViz

$ rosrun rviz rviz -d `rospack find turtlebot3_slam`/rviz/turtlebot3_slam.rviz

[Remote PC] Click Panels - Views to open the view window

[Remote PC] Click TopDownOrtho (rviz) and change it into XYOrbit (rviz)

[Remote PC] Click add - By topic and find the PointCloud2 type /points topic in /camera/depth, then click it

[Remote PC] Click the PointCloud2 type topic in the left window, then change Color Transformer from Intensity to AxisColor. This will show the depth of each point using color.

[Remote PC] Click add - By topic and find the Image type /image_color topic in /camera/rgb, then click it. This will show the view of the RGB camera.

References


July 17, 2019

Raspberry Pi Camera

Overview

The Raspberry Pi Camera Module v2 replaced the original Camera Module in April 2016. The v2 Camera Module has a Sony IMX219 8-megapixel sensor (compared to the 5-megapixel OmniVision OV5647 sensor of the original camera). The Camera Module can be used to take high-definition video, as well as stills photographs. It’s easy to use for beginners, but has plenty to offer advanced users if you’re looking to expand your knowledge. There are lots of examples online of people using it for time-lapse, slow-motion, and other video cleverness. You can also use the libraries we bundle with the camera to create effects.

Specifications

Hardware Specifications

Items Specifications
Net price $25
Size Around 25 × 24 × 9 mm
Weight 3g
Still resolution 8 Megapixels
Video modes 1080p30, 720p60 and 640 × 480p60/90
Linux integration V4L2 driver available
C programming API OpenMAX IL and others available
Sensor Sony IMX219
Sensor 3280 × 2464 pixels
Sensor 3.68 x 2.76 mm (4.6 mm diagonal)
Pixel size 1.12 µm x 1.12 µm
Optical size 1/4”
Full-frame SLR lens equivalent 35 mm
S/N ratio 36 dB
Dynamic range 67 dB @ 8x gain
Sensitivity 680 mV/lux-sec
Dark current 16 mV/sec @ 60 C
Well capacity 4.3 Ke-
Fixed focus 1 m to infinity
Focal length 3.04 mm
Horizontal field of view 62.2 degrees
Vertical field of view 48.8 degrees
Focal ratio (F-Stop) 2.0

Hardware features

Available Implemented
Chief ray angle correction Yes
Global and rolling shutter Rolling shutter
Automatic exposure control (AEC) No - done by ISP instead
Automatic white balance (AWB) No - done by ISP instead
Automatic black level calibration (ABLC) No - done by ISP instead
Automatic 50/60 Hz luminance detection No - done by ISP instead
Frame rate up to 120 fps Max 90fps. Limitations on frame size for the higher frame rates (VGA only for above 47fps)
AEC/AGC 16-zone size/position/weight control No - done by ISP instead
Mirror and flip Yes
Cropping No - done by ISP instead (except 1080p mode)
Lens correction No - done by ISP instead
Defective pixel cancelling No - done by ISP instead
10-bit RAW RGB data Yes - format conversions available via GPU
Support for LED and flash strobe mode LED flash
Support for internal and external frame synchronisation for frame exposure mode No
Support for 2 × 2 binning for better SNR in low light conditions Anything output res below 1296 x 976 will use the 2 x 2 binned mode
Support for horizontal and vertical sub-sampling Yes, via binning and skipping
On-chip phase lock loop (PLL) Yes
Standard serial SCCB interface Yes
Digital video port (DVP) parallel output interface No
MIPI interface (two lanes) Yes
32 bytes of embedded one-time programmable (OTP) memory No
Embedded 1.5V regulator for core power Yes

Software features

Software features  
Picture formats JPEG (accelerated), JPEG + RAW, GIF, BMP, PNG, YUV420, RGB888
Video formats raw h.264 (accelerated)
Effects negative, solarise, posterize, whiteboard, blackboard, sketch, denoise, emboss, oilpaint, hatch, gpen, pastel, watercolour, film, blur, saturation
Exposure modes auto, night, nightpreview, backlight, spotlight, sports, snow, beach, verylong, fixedfps, antishake, fireworks
Metering modes average, spot, backlit, matrix
Automatic white balance modes off, auto, sun, cloud, shade, tungsten, fluorescent, incandescent, flash, horizon
Triggers Keypress, UNIX signal, timeout
Extra modes demo, burst/timelapse, circular buffer, video with motion vectors, segmented video, live preview on 3D models

Mechanical Drawing

  • Camera Module v2 PDF

Here is the detail specification document: Raspberry Pi Camera Module v2 Datasheet

Raspberry Pi Camera for TurtleBot3

The Raspberry Pi Camera Module v2 is applied on TurtleBot3 Waffle Pi.

Introduction Video

The TurtleBot3 Waffle Pi uses Raspberry Pi Camera Module v2 as a default vision sensor. Check this video out that shows how Raspberry Pi Camera Module v2 can be used in TurtleBot3 Waffle Pi.

User Guide

Raspberry Pi Camera packages enable the use of the Raspberry Pi Camera Module v1.x and v2.x with ROS. The table below describes the packages required to operate the Raspberry Pi Camera. You will be guided through installing these packages in the next section.

Package Description
Raspberry Pi Camera Underlying library driver for communicating with Raspberry Pi Camera

Installation

[TurtleBot] Setting up the camera hardware

$ sudo raspi-config

Select 3 Interfacing Options

Select P1 Camera

Enable camera interface

After rebooting the Raspberry Pi, test that the camera is installed and working with the following command:

$ raspistill -v -o test.jpg

The display should show a five-second preview from the camera and then take a picture, saved to the file test.jpg

[TurtleBot] The following commands will install relevant Raspberry Pi Camera packages on your ROS system.

$ cd ~/catkin_ws/src
$ git clone https://github.com/UbiquityRobotics/raspicam_node.git
$ sudo apt-get install ros-kinetic-compressed-image-transport ros-kinetic-camera-info-manager
$ cd ~/catkin_ws && catkin_make

Run raspicam Node

[TurtleBot] Run the following command

$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

or

$ roslaunch raspicam_node camerav2_1280x960.launch

While the raspicam node is running, you can view various data from Raspberry Pi Camera by launching rqt_image_view.

Warning! Before you run RViz on the Remote PC, check that your Raspberry Pi 3 and Remote PC are connected.

[Remote PC] Run the following command

$ rqt_image_view

Once the GUI application appears on the screen, you can select a data topic related to the Raspberry Pi Camera from the drop-down menu at the top of the application.

References


July 17, 2019

OpenCR1.0

Overview

OpenCR is the main controller board of the TurtleBot3. OpenCR (Open-source Control module for ROS) is developed for ROS embedded systems to provide completely open-source hardware and software. Everything about the board, including the schematics, PCB Gerber files, BOM and the firmware source code for the TurtleBot3, is free to distribute under open-source licenses for users and the ROS community.

The main chip on the OpenCR board is from the STM32F7 series, based on a powerful ARM Cortex-M7 with a floating point unit. The development environment for OpenCR ranges from the Arduino IDE and Scratch for young students to traditional firmware development for experts.

OpenCR provides digital and analog input/output pins that can interface with extension boards or various sensors. It also features various communication interfaces: USB for connecting to a PC, and UART, SPI, I2C and CAN for other embedded devices.

OpenCR provides a strong solution when used with an SBC. It supports 12V, 5V and 3.3V power outputs for SBCs and sensors, and it supports hot-swap power input between battery and SMPS.

OpenCR will be the best solution for implementing your embedded control design.

Specifications

NOTE: Hot swap power switch between shore power(12V, 5A SMPS) and mobile power(battery) from OpenCR board enables UPS(Uninterrupted Power Supply) feature.

Items Specifications
Microcontroller STM32F746ZGT6 / 32-bit ARM Cortex®-M7 with FPU (216MHz, 462DMIPS)
Sensors Gyroscope 3Axis, Accelerometer 3Axis, Magnetometer 3Axis (MPU9250)
Programmer ARM Cortex 10pin JTAG/SWD connector
USB Device Firmware Upgrade (DFU)
Serial
Extension pins 32 pins (L 14, R 18) *Arduino connectivity
Sensor module x 4 pins
Extension connector x 18 pins
Communication circuits USB (Micro-B USB connector/USB 2.0/Host/Peripheral/OTG)
TTL (B3B-EH-A / Dynamixel)
RS485 (B4B-EH-A / Dynamixel)
UART x 2 (20010WS-04)
CAN (20010WS-04)
LEDs and buttons LD2 (red/green) : USB communication
User LED x 4 : LD3 (red), LD4 (green), LD5 (blue)
User button x 2
Powers External input source
5 V (USB VBUS), 7-24 V (Battery or SMPS)
Default battery : LI-PO 11.1V 1,800mAh 19.98Wh
Default SMPS: 12V 5A
External output source
12V@1A(SMW250-02), 5V@4A(5267-02A), 3.3V@800mA(20010WS-02)
External battery Port for RTC (Real Time Clock) (Molex 53047-0210)
Power LED: LD1 (red, 3.3 V power on)
Reset button x 1 (for power reset of board)
Power on/off switch x 1
Dimensions 105(W) X 75(D) mm
Mass 60g

User Guide

Run turtlebot3_core node

$ rosrun rosserial_python serial_node.py __name:=turtlebot3_core _port:=/dev/ttyACM0 _baud:=115200

Testing

$ rostopic echo /imu

header:
  seq: 179
  stamp:
    secs: 1486448047
    nsecs: 147523921
  frame_id: imu_link
orientation:
  x: 0.0165222994983
  y: -0.0212152898312
  z: 0.276503056288
  w: 0.960632443428
orientation_covariance: [0.0024999999441206455, 0.0, 0.0, 0.0, 0.0024999999441206455, 0.0, 0.0, 0.0, 0.0024999999441206455]
angular_velocity:
  x: 2.0
  y: 1.0
  z: -1.0
angular_velocity_covariance: [0.019999999552965164, 0.0, 0.0, 0.0, 0.019999999552965164, 0.0, 0.0, 0.0, 0.019999999552965164]
linear_acceleration:
  x: 528.0
  y: 295.0
  z: 16648.0
linear_acceleration_covariance: [0.03999999910593033, 0.0, 0.0, 0.0, 0.03999999910593033, 0.0, 0.0, 0.0, 0.03999999910593033]
---
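
The same check can be done from a small rospy node instead of rostopic echo. The sketch below subscribes to the /imu topic shown above (sensor_msgs/Imu) and prints the orientation quaternion; it is only an illustration.

    #!/usr/bin/env python
    # rospy equivalent of checking /imu with rostopic echo: print the orientation
    # quaternion published by turtlebot3_core.
    import rospy
    from sensor_msgs.msg import Imu

    def callback(msg):
        q = msg.orientation
        rospy.loginfo('orientation: x=%.3f y=%.3f z=%.3f w=%.3f', q.x, q.y, q.z, q.w)

    rospy.init_node('imu_listener')
    rospy.Subscriber('imu', Imu, callback)
    rospy.spin()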

Debugging

turtlebot3_core.ino includes debugging code to check the odometry, the connected sensors and the state of the TurtleBot3 or the Dynamixels. This may help you implement and test code without a ROS connection.

First, prepare an LN-101 or any USB-to-serial converter.

Second, open the turtlebot3_core_config.h file and enable DEBUG. After that, upload the firmware to OpenCR.

Third, connect the converter to UART2 on OpenCR.

Fourth, install minicom and configure the baud rate (57600) and port name.

  $ sudo apt-get install minicom
  $ minicom -s

Fifth, press the reset button on OpenCR; you will see turtlebot3_core.ino start up and print data like the following.

Success to init Motor Driver
Success to init Sensor
Success to init Diagnosis
Success to init Controller
---------------------------------------
EXTERNAL SENSORS
---------------------------------------
Bumper : 2
Cliff : 204.00
Sonar : 1.00
Illumination : 480.00
---------------------------------------
OpenCR SENSORS
---------------------------------------
Battery : 12.15
Button : 0
IMU :
    w : 1.00
    x : 0.00
    y : -0.00
    z : 0.00
---------------------------------------
DYNAMIXELS
---------------------------------------
Torque : 1
Encoder(left) : 876
Encoder(right) : 4001
---------------------------------------
TurtleBot3
---------------------------------------
Odometry :
         x : 0.00
         y : 0.00
     theta : 0.00

How to modify ROS library code

You can modify ROS library in below paths.

Your/Arduino/Board/Package/Directory/OpenCR/hardware/OpenCR/Version/libraries/turtlebot3

Your/Arduino/Board/Package/Directory/OpenCR/hardware/OpenCR/Version/libraries/turtlebot3_ros_lib

If you want to modify it in the sketch folder, move the above two folders to the "sketchbook/libraries" folder.

How to add topic messages

To generate topic header files, run the commands below, as described in section 2.2 of the rosserial tutorial.

$ cd <sketchbook>/libraries
$ rm -rf ros_lib
$ rosrun rosserial_arduino make_libraries.py .

Since some hardware-related libraries generated by the above command are different from OpenCR, you must copy only the necessary topic header files.

Then copy the generated header file to the path below.

/turtlebot3_ros_lib

Open Source Software

You can modify the downloaded source code and share it with your friends.

Open Source Hardware

If you want to manufacture your own OpenCR, you can download the necessary files such as the PCB Gerber files and BOM. When the board is ready, the firmware source code can be flashed onto the MCU.

e-Manual


July 17, 2019

LDS-01

Overview

  • 360 Laser Distance Sensor LDS-01 is a 2D laser scanner capable of sensing 360 degrees that collects a set of data around the robot to use for SLAM (Simultaneous Localization and Mapping) and Navigation.
  • The LDS-01 is used for TurtleBot3 Burger, Waffle and Waffle Pi models.
  • It supports a USB interface (USB2LDS) and is easy to install on a PC.
  • It supports a UART interface for embedded boards.

Introduction Video

[Video #01] How to use the LDS-01

  • Contents
    1. Specification
    2. ROS
    3. Windows, Linux, macOS
    4. Embedded Board
    5. SLAM and Navigation
    6. Self-Parking
    7. 3D Sensing
    8. for Makers

[Video #02] Laser Distance Sensor (LDS) Example.

[Video #03] ROS Hector SLAM demo using only a 360 Laser Distance Sensor LDS-01 made by HLDS (Hitachi-LG Data Storage).

[Video #04] ROS Gmapping and Cartographer SLAM demo using TurtleBot3 and 360 Laser Distance Sensor LDS-01.

Specifications

General Specifications

Items Specifications
Operating supply voltage 5V DC ±5%
Light source Semiconductor Laser Diode(λ=785nm)
LASER safety IEC60825-1 Class 1
Current consumption 400mA or less (Rush current 1A)
Detection distance 120mm ~ 3,500mm
Interface 3.3V USART (230,400 bps) 42bytes per 6 degrees, Full Duplex option
Ambient Light Resistance 10,000 lux or less
Sampling Rate 1.8kHz
Dimensions 69.5(W) X 95.5(D) X 39.5(H)mm
Mass Under 125g

Measurement Performance Specifications

Items Specifications
Distance Range 120 ~ 3,500mm
Distance Accuracy (120mm ~ 499mm) ±15mm
Distance Accuracy(500mm ~ 3,500mm) ±5.0%
Distance Precision(120mm ~ 499mm) ±10mm
Distance Precision(500mm ~ 3,500mm) ±3.5%
Scan Rate 300±10 rpm
Angular Range 360°
Angular Resolution 1°

Detail Specification Document

The following link contains information about basic performance, measurement performance, mechanism layout, optical path, data information, pin description and commands.

Here is the detail specification document : PDF

NOTE: The 360 Laser Distance Sensor LDS-01 for TurtleBot3 uses molex 51021-0800 and 53048-0810 instead of the basic housing and connector.

LDS for TurtleBot3

The LDS-01 is used for TurtleBot3 Burger, Waffle and Waffle Pi models.

User Guide (for ROS)

We offer a ROS package for the LDS. The hls_lfcd_lds_driver package provides a driver for the HLS (Hitachi-LG Sensor) LFCD LDS (Laser Distance Sensor).

NOTE: Due to a firmware update (for units purchased after Oct. 2017), the sensor starts running as soon as the power is on.

Installation

$ sudo apt-get install ros-kinetic-hls-lfcd-lds-driver

Set Permission for LDS-01

$ sudo chmod a+rw /dev/ttyUSB0

Run hlds_laser_publisher Node

$ roslaunch hls_lfcd_lds_driver hlds_laser.launch

Run hlds_laser_publisher Node with RViz

$ roslaunch hls_lfcd_lds_driver view_hlds_laser.launch

User Guide (for Driver)

  • In addition to ROS, the LDS-01 supports Windows, Linux, and MacOS development environments for general purposes.
  • The software requirement is:
    • GCC (for Linux and macOS), MinGW (for Windows)
    • Boost library (Lib for boost system, tested on v1.66.0)

Download

  • Download the LDS-01’s driver
$ git clone https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git

Build

  • The makefile used here is set up for Linux; for Windows and macOS it should be changed to match the development environment.
$ cd hls_lfcd_lds_driver/applications/lds_driver/
$ make

Run

  • You can see the raw data in the terminal when you run the driver of LDS-01. Please check the source code for details.
$ ./lds_driver
r[359]=0.438000,r[358]=0.385000,r[357]=0.379000,...

User Guide (for GUI)

  • We provide a basic GUI tool for visually checking the data of the LDS-01.
  • It supports Linux, Windows, and macOS.
  • The software requirement is:
    • Qt Creator and Libs (tested on Qt Creator v4.5.0 and Qt Libs v5.10.0)
    • GCC (for Linux and macOS), MinGW (for Windows), This can be installed together while installing Qt.
    • Boost library (Lib for boost system, tested on v1.66.0)

Download

  • Download the LDS-01’s driver and GUI source code.
$ git clone https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git

Build

  • Run the Qt Creator
  • Open file (Ctrl-O) the lds_polar_graph.pro file (hls_lfcd_lds_driver/applications/lds_polar_graph/lds_polar_graph.pro)
  • Change the port name in the source code to match your setup
  • Build all (Ctrl-Shift-B)

Run

  • Run the application (Ctrl-R)

User Guide (for Embedded Board)

  • We provide a way to connect to an embedded board.
  • The data from the LDS-01 can be used on embedded boards such as OpenCR and Arduino, and it can be displayed on an LCD as a graph like the one below.

Preparations

  • A dedicated interface board is not provided, but you can connect the sensor to the power and UART pins of the embedded board as shown below.

WARNING: The wiring colors of the LDS-01 may differ from the picture depending on the manufacturer.

  • OpenCR firmware is developed and downloaded through the Arduino IDE. Therefore, you must install the Arduino IDE and the OpenCR board package in advance. Install them by following the linked document.

Download firmware and run

  1. After connecting USB to PC, select Tools -> Board -> OpenCR Board in Arduino IDE.
  2. Change Tools-> Port to the port to which the board is connected.
  3. In the Arduino IDE Examples, select the firmware for LDS (File -> Examples -> OpenCR -> Etc -> LDS -> drawLDS).
  4. Click the Upload icon in the Arduino IDE to build and download the firmware. When the download is completed, the firmware runs automatically.

Certifications

Please contact us for information regarding unlisted certifications.

FCC


July 17, 2019

DYNAMIXEL

Overview

DYNAMIXEL X-Series is a line-up of high-performance networked actuator modules, which has been widely used for building various types of robots with reliability and expandability.

Two different types of DYNAMIXEL are adopted in the TurtleBot3 Burger and the Waffle / Waffle Pi, as they have different requirements. The DYNAMIXEL X-Series shares a common design, so users can replace actuators depending on the application.

  • Basic Features
    • Improved Torque with Compact Size
    • Enhanced Durability and Expandability
    • Hollow Back Case Minimizes Cable Stress (3-way-routing)
    • Direct Screw Assembly to the Case (without Nut Insert)
    • Improved Heat Sink Featuring Aluminum Case
  • Various Control Functions
    • 6 Operating Modes
    • Current-Based Torque Control (4096 steps, 2.69mA/step)
    • Profile Control for Smooth Motion Planning
    • Trajectory Data and Moving Status (In-Position, Following Error, etc.)
    • Energy Saving (Reduced Current from 100mA to 40mA)

Specifications

Items XL430-W250 (for Burger) XM430-W210 (for Waffle and Waffle Pi)
Microcontroller ST CORTEX-M3 (STM32F103C8 @ 72Mhz, 32bit) ST CORTEX-M3 (STM32F103C8 @ 72Mhz, 32bit)
Position Sensor Contactless Absolute Encoder (12bit, 360°) Contactless Absolute Encoder (12bit, 360°)
Motor Cored Motor Coreless Motor
Baud Rate 9600 bps ~ 4.5 Mbps 9600 bps ~ 4.5 Mbps
Control Modes Velocity, Position, Extended Position, PWM Velocity, Position, Extended Position, PWM, Current, Current-base Position
Gear Ratio 258.5 : 1 212.6 : 1
Stall Torque 1.0 N.m (@ 9V, 1A) 2.7 N.m (@ 11.1V, 2.1A)
  1.4 N.m (@ 11.1V, 1.3A) 3.0 N.m (@ 12V, 2.3A)
  1.5 N.m (@ 12V, 1.4A) 3.7 N.m (@ 14.8V, 2.7A)
No Load Speed 47rpm (@ 9V) 70rpm (@ 11.1V)
  57rpm (@ 11.1V) 77rpm (@ 12V)
  61rpm (@ 12V) 95rpm (@ 14.8V)
Communication TTL Level Multi Drop Bus TTL Level / RS485 Multi Drop Bus
Material Engineering Plastic Full Metal Gear, Metal Body, Engineering Plastic
Standby Current 52mA 40mA

  • More information about the actuators can be found in the ROBOTIS e-Manual.

Dynamixel SDK

The ROBOTIS Dynamixel SDK is a software development library that provides Dynamixel control functions using packet communication. The API is designed for Dynamixel actuators and Dynamixel-based platforms. TurtleBot3 uses the Dynamixel SDK in OpenCR to control the actuators. A minimal PC-side sketch using the SDK's Python API is shown below.
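
On the TurtleBot3 the SDK runs inside the OpenCR firmware, but the same API is available on a PC. The sketch below is only an illustration of the SDK's Python interface: it assumes the actuator is attached to the PC through a USB serial interface such as U2D2 (not through OpenCR as on an assembled TurtleBot3), and that the control-table addresses match your X-series model (64 = Torque Enable, 104 = Goal Velocity; check the e-Manual control table for your actuator).

    #!/usr/bin/env python
    # Minimal Dynamixel SDK (Python) sketch: enable torque and set a small wheel velocity.
    # Port, baud rate, ID and control-table addresses are assumptions for an X-series
    # actuator wired to the PC; adjust them to your setup.
    from dynamixel_sdk import PortHandler, PacketHandler

    PORT, BAUD, DXL_ID = '/dev/ttyUSB0', 57600, 1     # placeholders

    port = PortHandler(PORT)
    packet = PacketHandler(2.0)        # X-series uses Protocol 2.0
    port.openPort()
    port.setBaudRate(BAUD)

    packet.write1ByteTxRx(port, DXL_ID, 64, 1)        # Torque Enable = on
    packet.write4ByteTxRx(port, DXL_ID, 104, 50)      # small Goal Velocity

    port.closePort()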

TurtleBot3 additional_sensors

July 17, 2019

Additional Sensors

Additional sensors can be attached to the TurtleBot3. The examples here show how to use additional sensors such as IR, ultrasonic and switch sensors with the OpenCR of the TurtleBot3.

Bumper

(Front side)

(Back side)

  • Default PIN
Device PIN
Front sensor ROBOTIS_5-PIN 3
Back sensor ROBOTIS_5-PIN 4

Tip : If you want to use another PIN, refer to OpenCR PIN Map.

  • Run with Turtlebot3

WARNING : Make sure to run the Bringup instruction before performing Example.

[Remote PC] Launch the bumper launch file.

$ roslaunch turtlebot3_example turtlebot3_bumper.launch

  • Run with Arduino IDE

This example can be opened in the Arduino IDE.

Select File -> Examples -> ROS -> 2. Sensors -> a_Bumper and upload it to OpenCR.

[Remote PC] Run ros serial_node package.

$ rosrun rosserial_python serial_node.py __name:=turtlebot3_core _port:=/dev/ttyACM0 _baud:=115200

WARNING : If you upload an example to OpenCR, you have to re-upload the turtlebot3_core firmware afterwards.

IR

  • Default PIN
Device PIN
IR sensor ROBOTIS_5-PIN 2

Tip : If you want to use another PIN, refer to OpenCR PIN Map.

  • Run with Turtlebot3

WARNING : Make sure to run the Bringup instruction before performing Example.

[Remote PC] Launch the cliff launch file.

$ roslaunch turtlebot3_example turtlebot3_cliff.launch

  • Run with Arduino IDE

This example can be opened in the Arduino IDE.

Select File -> Examples -> ROS -> 2. Sensors -> b_Cliff and upload it to OpenCR.

[Remote PC] Run ros serial_node package.

$ rosrun rosserial_python serial_node.py __name:=turtlebot3_core _port:=/dev/ttyACM0 _baud:=115200

Ultrasonic

  • Device - Ultrasonic sensor (HC-SR04)

  • Default PIN:
Device PIN
Trigger BDPIN_GPIO_1
Echo BDPIN_GPIO_2

Tip : If you want to use another PIN, refer to OpenCR PIN Map.

  • Run with Turtlebot3

WARNING : Make sure to run the Bringup instruction before performing Example.

[Remote PC] Launch the sonar launch file.

$ roslaunch turtlebot3_example turtlebot3_sonar.launch

  • Run with Arduino IDE

This example can be opened in the Arduino IDE.

Select File -> Examples -> ROS -> 2. Sensors -> c_Ultrasonic and upload it to OpenCR.

[Remote PC] Run ros serial_node package.

$ rosrun rosserial_python serial_node.py __name:=turtlebot3_core _port:=/dev/ttyACM0 _baud:=115200

Illumination

  • Device - LDR sensor (Flying-Fish MH-sensor)

  • Default PIN
Device PIN
Analog A1

Tip : If you want to use another PIN, refer to OpenCR PIN Map.

  • Run with Turtlebot3

WARNING : Make sure to run the Bringup instruction before performing Example.

[Remote PC] Launch the illumination launch file.

$ roslaunch turtlebot3_example turtlebot3_illumination.launch

  • Run with Arduino IDE

This example can be opened in the Arduino IDE.

Select File -> Examples -> ROS -> 2. Sensors -> d_Illumination and upload it to OpenCR.

[Remote PC] Run ros serial_node package.

$ rosrun rosserial_python serial_node.py __name:=turtlebot3_core _port:=/dev/ttyACM0 _baud:=115200

LED

  • Device - led (led101)

  • Default PIN
Device PIN
Front_left BDPIN_GPIO_4
Front_right BDPIN_GPIO_6
Back_left BDPIN_GPIO_8
Back_right BDPIN_GPIO_10

Tip : If you want to use another PIN, refer to OpenCR PIN Map.

  • Run

This example is always active when the LEDs are connected. The LEDs show a specific pattern depending on the linear and angular velocity of the TurtleBot3.
