Lab 3: Robot Setup, Basic Motion, and Visualization
In this lab, you will configure your Ubuntu 24.04 VM to communicate with a physical TurtleBot 4 via a Discovery Server, visualize real-time sensor data in RViz2, and write a Python node to navigate the robot in a specific geometric path.
1. Collaboration & Submission Policy
Students work in groups of three to share one physical robot. You are encouraged to collaborate on setup and troubleshooting. However, each student must submit their own unique motion script. To ensure individual work, each member of your trio must choose a different shape from Task 4 (Rectangle, Triangle, or Pentagon).
2. Power Management
The TurtleBot 4 uses two systems: the Raspberry Pi 4 and the Create® 3 Base. Correct shutdown is essential to prevent SD card corruption.
2.1. Startup
- Place the robot on the charger dock.
- Wait ~3 minutes for the robot to initialize:
- First, you will hear a chime indicating that the Create 3 base has booted.
- About 45 seconds later, you will hear a second chime while a fast purple light rotates around the ring. This indicates the RPi and Create 3 are communicating, and the robot is ready to use!
2.2. Graceful Shutdown
This procedure ensures a graceful shutdown of both the Raspberry Pi and the Create 3 base, preventing SD card corruption and hardware-level data loss.
- Move the robot off the charger.
- In your SSH session, run:
sudo shutdown -h now
- Wait ~30 seconds until the LiDAR stops spinning, then hold the Power Button (the large ring button) for about 8 seconds until you hear a chime and the LED turns off.
3. Environment Setup
3.1. ROS2 Environment Setup
To ensure you only communicate with your assigned robot, we use specific Domain IDs and a FastDDS Discovery Server.
Run the commands below to pull the latest changes and run the setup script:
cd ~/vm_setup
git pull
./setup_robot_env.sh
Instruct your VM to use the robot's Discovery Server instead of the local one it used during simulation:
set-ros-env robot
Now, reboot your VM so the new environment setup takes effect:
sudo reboot
3.2. Connect your computer to the Lab Wifi!
3.3. Setup Bridged Network Mode
Your VM must be in Bridged Network mode to act as a physical device on the local network.
Verify Connection
Run this command in your VM terminal:
ip route get 8.8.8.8
The output should show a local IP address (e.g., 192.168.50.x). If it shows 172.x.x.x or 10.x.x.x, you are in NAT mode and must switch to Bridged mode in your VM settings (VMware/VirtualBox) and restart.
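The same check can be automated. Below is a small, hedged sketch (pure Python, standard library only) of the heuristic described above; the exact NAT ranges your hypervisor uses may differ, so treat the ranges here as an assumption, not a guarantee:

```python
import ipaddress

# Ranges commonly handed out by VMware/VirtualBox NAT (assumption:
# your lab network uses addresses outside these blocks, e.g. 192.168.50.x).
NAT_HINTS = [ipaddress.ip_network("172.16.0.0/12"),
             ipaddress.ip_network("10.0.0.0/8")]

def looks_like_nat(ip_str: str) -> bool:
    """Return True if the address falls in a range commonly used for NAT."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in NAT_HINTS)

print(looks_like_nat("172.16.158.131"))  # True  -> probably NAT mode
print(looks_like_nat("192.168.50.42"))   # False -> likely Bridged mode
```

If the function reports True for the address `ip route get 8.8.8.8` gave you, switch your VM to Bridged mode before continuing.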
4. Task 1: Remote Access and Hardware Check
4.1. Ping the Robot
ping turtlebot
turtlebot is a hostname that resolves to the IP address of your specific robot, configured by the setup_robot_env.sh script. It saves you from having to remember your robot's IP address.
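Under the hood, this kind of mapping is often just a hosts-file entry. The line below is purely illustrative (the IP address is made up, and the script may use a different mechanism entirely):

```
192.168.50.101   turtlebot   # hypothetical entry added by setup_robot_env.sh
```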
4.2. SSH into the Robot
SSH (Secure Shell) is the industry-standard method for connecting to a remote computer (like your TurtleBot 4 or a VM) over a network. It encrypts all traffic to prevent eavesdropping.
It creates an encrypted tunnel to the robot. Once you authenticate, your local terminal "hands over" its interface to the remote machine. You are no longer typing on your laptop; your command prompt effectively "teleports" to the robot, and every command executes on the robot's processor.
ssh ubuntu@turtlebot
Password:
turtlebot
You will see a warning: "The authenticity of host… can't be established." when you connect to a remote machine for the first time. Type yes and press Enter.
4.3. Check Nodes and Topics in the Robot
Once logged in via SSH, verify the sensors are broadcasting:
ros2 node list
You should see nodes like /motion_control and /oakd.
4.4. Check Nodes and Topics in the VM
In your VM terminal:
ros2 node list
Check that you can see topic messages:
ros2 topic echo /robot_<XX>/battery_state
<XX> is your Robot ID.
If your VM environment is set up correctly, you should see a battery state message every few seconds. If you do not, check ROS Communication Issues or ask for help.
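The `percentage` field of a sensor_msgs/BatteryState message is reported on a 0.0–1.0 scale. As a rough sanity check while you work, you can turn it into a runtime estimate; note that the discharge rate below is a made-up illustrative number, not a measured TurtleBot 4 value:

```python
def estimated_runtime_min(percentage: float, drain_per_min: float = 0.004) -> float:
    """Estimate minutes of runtime left.

    percentage    -- BatteryState.percentage, in [0.0, 1.0]
    drain_per_min -- assumed fraction of capacity used per minute
                     (0.004 is an illustrative figure, not measured)
    """
    return percentage / drain_per_min

print(round(estimated_runtime_min(0.80)))  # 200 (minutes, under the assumed drain)
```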
5. Task 2: Sensor Visualization (RViz2)
RViz2 (ROS Visualization) is a 3D visualizer that allows you to see what the (real or simulated) robot "sees." It takes abstract data, like sensor readings, camera feeds, and coordinate frames, and renders them into a 3D environment that humans can understand.
Read this RViz2 User Guide before proceeding further.
You do not need to install RViz2; it is pre-installed in your VM.
Launch RViz2 on your VM (locally, not via SSH) to visualize the robot's "eyes."
- Run RViz: Type run-lab-rviz in your VM terminal. This is a custom RViz launch command that takes your robot namespace into account.
- Global Options: Set Fixed Frame to rplidar_link. Note: the scan messages use this frame ID. If you use map without SLAM/Localization running, the data will not appear. We will learn how to build a map later in the course.
- Add Displays:
  - LaserScan: Topic /robot_<XX>/scan (set Reliability to Best Effort). <XX> is your Robot ID. The LiDAR does not spin while the TurtleBot is docked!
  - TF: To see the robot's transform tree.
  - Image: Topic /robot_<XX>/oakd/rgb/preview/image_raw to see the OAK-D camera feed.
RViz is super helpful when debugging your assignments, so play around with the GUI and familiarize yourself with the UI. Change the Fixed Frame, add other topics, change colors of the data visualized, save RViz configs, etc.
6. Task 3: Manual Control and Message Inspection
Before writing your script, you should manually drive the robot to observe how the /robot_<XX>/cmd_vel topic translates movement into data. In the previous lab, you did this with a simulated robot.
6.1. Run Teleop (VM Terminal)
Open a new terminal on your VM and run the keyboard teleop node:
run-lab-teleop
This command runs the same teleop node you used in simulation, with the correct namespace for the robot you are using.
6.2. Echo the Velocity Topic (Second VM Terminal)
Open another terminal and "eavesdrop" on the commands being sent to the robot:
ros2 topic echo /robot_<XX>/cmd_vel
The teleop_twist_keyboard node publishes the appropriate commands to the topic /robot_<XX>/cmd_vel.
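For reference, echoing a plain geometry_msgs/Twist while driving straight prints something like the YAML below (if your robot's cmd_vel carries TwistStamped instead, as the Task 4 template assumes, the same fields appear nested under a twist: key, with a header: block above them):

```
linear:
  x: 0.5
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0
---
```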
6.3. Observe the Patterns
Try the following movements and watch the terminal output:
- Go Straight: Notice that only linear.x has a value.
- Turn in Place: Notice that only angular.z has a value.
- Curved Path: Use the "diagonal" keys (like u or o). Observe that both linear.x and angular.z have non-zero values simultaneously.
Reflection: When you write your Python node, you are simply automating these same numbers!
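Those "same numbers" also have a simple geometric meaning: driving with constant linear.x = v and angular.z = w traces a circle of radius r = v / w. A minimal sketch (pure Python, no ROS needed):

```python
def arc_radius(linear_x: float, angular_z: float) -> float:
    """Turn radius (m) for a constant-velocity arc: r = v / w.

    angular_z must be non-zero; w = 0 means a straight line.
    """
    if angular_z == 0.0:
        raise ValueError("angular.z = 0 means the path is a straight line")
    return linear_x / angular_z

print(arc_radius(0.2, 0.5))  # 0.4 -> the robot traces a 0.4 m radius arc
```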
7. Task 4: Robot Motion Node
Do NOT write your ROS 2 nodes directly on the robot. Since you are sharing hardware and ROS 2 is distributed, you must program and run your nodes within packages on your respective VMs.
Each student must implement one of the following shapes. Coordinate with your team to ensure no duplicates.
All the lengths are approximate values.
| Shape | Required Logic | External Turn Angle |
|---|---|---|
| Choice A: Rectangle | 4 sides (Length: 1.0m, Width: 0.5m) | 90° |
| Choice B: Triangle | 3 equal sides (Length: 0.8m) | 120° |
| Choice C: Pentagon | 5 equal sides (Length: 0.6m) | 72° |
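The turn angles in the table are just the exterior angles of the polygon, 360°/n (every corner of the rectangle is likewise a 90° turn). A quick sketch you can use to sanity-check your own shape:

```python
def exterior_angle_deg(num_sides: int) -> float:
    """Exterior (turn) angle of an equiangular polygon with num_sides sides."""
    return 360.0 / num_sides

# Matches the table above:
print(exterior_angle_deg(4))  # 90.0  (rectangle)
print(exterior_angle_deg(3))  # 120.0 (triangle)
print(exterior_angle_deg(5))  # 72.0  (pentagon)
```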
Starter Code Template (robot_drive.py)
Below is reference code. You are free to modify the structure or implementation as needed.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TwistStamped  # TwistStamped = Twist plus a header (timestamp + frame)
import time


class RobotDrive(Node):
    def __init__(self):
        super().__init__('robot_drive')
        self.publisher_ = self.create_publisher(TwistStamped, '/cmd_vel', 10)
        time.sleep(1)  # give the publisher time to connect to subscribers
        self.get_logger().info('Starting Motion Loop...')
        self.execute_shape()

    def create_stamped_msg(self, linear_x=0.0, angular_z=0.0):
        """Helper to create a TwistStamped message with current time."""
        msg = TwistStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'base_link'  # common frame for velocity commands
        msg.twist.linear.x = linear_x
        msg.twist.angular.z = angular_z
        return msg

    def move_forward(self, duration):
        msg = self.create_stamped_msg(linear_x=0.2)  # 0.2 m/s forward
        self.publisher_.publish(msg)
        time.sleep(duration)
        self.stop()

    def turn_robot(self, duration):
        msg = self.create_stamped_msg(angular_z=0.5)  # 0.5 rad/s counter-clockwise
        self.publisher_.publish(msg)
        time.sleep(duration)
        self.stop()

    def stop(self):
        msg = self.create_stamped_msg()  # defaults to zeros
        self.publisher_.publish(msg)
        time.sleep(0.5)

    def execute_shape(self):
        # TODO: Implement your choice (A, B, or C) here using loops
        pass


def main():
    rclpy.init()
    node = RobotDrive()  # the motion loop runs inside __init__
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
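One way to fill in execute_shape is to convert each side length and turn angle into a publish duration, using the speeds hard-coded in the template's helpers (0.2 m/s and 0.5 rad/s). The standalone sketch below shows that arithmetic; the triangle numbers are just one example, and since the motion is open loop, expect drift in practice (which is why the lengths above are approximate):

```python
import math

LINEAR_SPEED = 0.2   # m/s, matches move_forward() in the template
ANGULAR_SPEED = 0.5  # rad/s, matches turn_robot() in the template

def forward_duration(distance_m: float) -> float:
    """Seconds to drive distance_m at LINEAR_SPEED (open loop)."""
    return distance_m / LINEAR_SPEED

def turn_duration(angle_deg: float) -> float:
    """Seconds to rotate angle_deg at ANGULAR_SPEED (open loop)."""
    return math.radians(angle_deg) / ANGULAR_SPEED

# Triangle (Choice B): 3 sides of 0.8 m with 120-degree turns
print(round(forward_duration(0.8), 2))  # 4.0
print(round(turn_duration(120.0), 2))   # 4.19
```

Inside execute_shape you would then loop over the sides, calling self.move_forward(...) and self.turn_robot(...) with the computed durations.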
8. Submission Instructions
- Your Script: robot_drive.py (include your shape choice in the comments).
- Video Submission: A clear, unedited video showing, in the order below:
  - You running hostname; check-ros in your VM terminal on your laptop.
  - The physical robot completing the shape on the floor.
  - After the robot has stopped, a recording of your laptop screen showing RViz with live LiDAR data.