What is the outcome?
By the end of this program, you will understand how professional robotics systems are architected, deployed, and integrated in real industrial environments.
After completing this pathway, you will be able to:
- Architect ROS and ROS2 systems instead of copying tutorial code
- Build reusable ROS2 architectures instead of fragile demo projects
- Integrate robot motion, computer vision, and AI into a unified pipeline
- Deploy robotics applications from simulation to real hardware
- Structure scalable automation systems with Behavior Trees
- Interface robots with PLCs, Jetson devices, and industrial environments
- Think like a robotics systems engineer — not a tutorial follower
This program is designed to transform you from a ROS user into a Robotics Software Engineer capable of delivering real automation solutions.
The Engineering Stack Behind Real Robotics Deployment
Most robotics learners lose years navigating fragmented documentation, inconsistent examples, and partial explanations.
This course was built to eliminate that inefficiency.
Starting from the fundamentals, you will progressively build:
- Core ROS/ROS2 architecture understanding
- Modular system design skills
- Motion planning with MoveIt
- 2D and 3D vision integration
- Simulation validation in Gazebo
- Real-world deployment with an integrated vision pipeline
Every lesson follows a clear methodology:
- What you are building
- Why the architecture is designed that way
- How to implement it correctly
Instead of trial-and-error learning, you follow a coherent engineering roadmap that accelerates your competence without sacrificing depth.
By the end of the program, you won’t just know commands —
you will know how to design, build, and deploy reliable robotic systems in a structured, scalable way.
Motion Planning & Robot Control
We will use the Robot Operating System (ROS) to program industrial applications, thanks to its modularity and its integration with motion-planning and inverse-kinematics frameworks such as MoveIt, as well as computer vision tools.
Simulation-to-Real Validation
We will test our robotics applications in Gazebo, a physics-based simulator that replicates real-world dynamics, so you can invest in hardware only after a feasibility analysis.
Vision-Guided Robotics
We will sense the environment with 2D and 3D tools such as OpenCV and the Point Cloud Library. You will learn to interface cameras with ROS and implement object detection and much more, integrated with your robotics application.
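The core of vision-guided picking is back-projecting a detected pixel, together with its depth reading, into a 3D point in the camera frame using the camera intrinsics (the pinhole model). A minimal sketch of that computation — the intrinsic values in the example are made up for illustration, not taken from a real camera:

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters into a 3D point
    (x, y, z) in the camera frame, using the pinhole camera model:
        x = (u - cx) * z / fx,  y = (v - cy) * z / fy,  z = depth.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example with assumed intrinsics (fx = fy = 600 px, principal point 320, 240):
# a pixel at (400, 300) seen at 0.5 m depth.
point = deproject_pixel(400, 300, 0.5, 600.0, 600.0, 320.0, 240.0)
```

In a real pipeline the intrinsics come from the camera driver (e.g. a ROS `CameraInfo` message or the RealSense SDK), and the resulting point is then transformed from the camera frame into the robot base frame before planning a grasp.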
Docker Development for Robotics
Build reproducible robotics environments for ROS2, MoveIt, and computer vision using Docker-based development workflows.
Real Deployment with FR3WML + SoftGripper + Suction cup
Bridge simulation to physical hardware using a real robot, depth camera, and inference pipeline.
Industrial Behavior Tree Architectures
Design scalable and fault-tolerant robotics applications using industrial Behavior Tree architectures and reusable ROS2 frameworks.
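The key idea behind Behavior Trees is composing small success/failure nodes instead of writing one monolithic script: a Sequence fails as soon as a child fails, while a Fallback tries alternatives until one succeeds. A minimal sketch of that composition logic — the node names and the pick-and-place steps are illustrative, not the course framework:

```python
SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Sequence:
    """Ticks children in order; fails on the first child that fails."""
    def __init__(self, children): self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class Fallback:
    """Ticks children in order; succeeds on the first child that succeeds."""
    def __init__(self, children): self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == SUCCESS:
                return SUCCESS
        return FAILURE

class Action:
    """Leaf node wrapping a callable that returns True on success."""
    def __init__(self, name, fn): self.name, self.fn = name, fn
    def tick(self):
        return SUCCESS if self.fn() else FAILURE

# Illustrative pick-and-place: if the gripper grasp fails, fall back
# to the suction cup, then place the object.
tree = Sequence([
    Action("detect_object", lambda: True),
    Fallback([
        Action("grasp_with_gripper", lambda: False),  # simulated failure
        Action("grasp_with_suction", lambda: True),
    ]),
    Action("place_object", lambda: True),
])
result = tree.tick()  # → "SUCCESS" (gripper fails, suction recovers)
```

Production frameworks such as BehaviorTree.CPP additionally support a RUNNING state, blackboards, and reactive ticking; this sketch only shows the success/failure composition that makes the architecture fault-tolerant.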
Build Real Capability — Without Wasting Years
Learning robotics through random tutorials often leads to:
- Incomplete understanding
- Fragile architectures
- Systems that break outside demos
This program compresses the learning curve by providing a structured path —
so you reach real deployment competence in a fraction of the time it typically takes.
Not by skipping fundamentals.
But by organizing them correctly.
What you will learn
- Repository of this Module
- Migration strategy (9:22)
- Migrate custom robot description package (28:23)
- ROS2 Control: understanding and deploying Joint Group Position Controller (54:23)
- ROS2 Control: setup and deploy Gripper Action Controller (16:20)
- ROS2 Control: setup and deploy Joint Trajectory Controller (12:19)
- ROS2 Control: Control the robot sending trajectory_msgs from C++ script (11:43)
- Replicate the .world from ROS1 (8:31)
- Deploy depth camera and the pick object (25:45)
- Create MoveIt2 configuration package for a custom robot (55:40)
- How to interface MoveIt2 with Gazebo (18:24)
- Create an inverse kinematics node with MoveIt2 in C++ (23:39)
- How to simulate grasping with LinkAttacher in Gazebo (11:38)
- Pick and place with a Custom robot: Gazebo interfaced with Rviz (30:27)
- UR5, Depth Camera setup in Gazebo with MoveIt (81:24)
- Add Robotiq gripper to UR5 and setup in Gazebo with MoveIt (62:50)
- Setup the world and LinkAttacher for grasping (15:34)
- Test inverse kinematics with UR5 (20:46)
- Run Pick and Place with UR5 and Robotiq gripper (8:26)
- Point Cloud Library and 3D processing in ROS2 (52:12)
- Pick and Place with UR5 Robotiq Gripper using 3D camera in ROS2 (31:58)
- Resources and GitHub repository of the master class
- How to make your packages portable
- Fixing MoveIt Setup Assistant (MSA) Crash after ROS2 Updates
- Creating the ROS2 Workspace & Cloning xarm_ros2 (12:02)
- Building the Custom Package my_xarm6: Robot, World, and Camera Setup (63:25)
- Creating the my_xarm6_app Package: First Motion Node (26:46)
- CNC Motion Node: Drawing Circles and Squares with Constant Orientation (37:50)
- Blind Pick and Place: Fixed Object Positions (29:03)
- Vision Integration with OpenCV: Depth Reconstruction via Intrinsics (38:32)
- Vision-Based Pick and Place + Custom Service Definition (35:17)
- Introduction to Ollama and LLM Integration (12:17)
- LLM Task Pipeline: Designing the Action-Oriented Architecture and System Prompt engineering (55:45)
- LLM-Driven Pick and Place: Updating System Prompt + Vision + Motion Integration (45:48)
- Test Robotics Tasks with LLM Ollama integrated in ROS2 (21:26)
- General Task LLM: Unified Prompt and Full Robot Commanding (CNC, pick and place with vision, end effector control) (24:59)
- Module available in April 2026
- Simulation to Reality – Building a Real Industrial Robotic Pipeline - Introduction (9:06)
- Set up communication between the PC and the robot hardware controller (8:22)
- Understanding the Robot Driver Layer: How to Interface Any Robot Controller with ROS2 (26:54)
- How to Create a Bridge to Interface MoveIt with Robot Controller (33:50)
- Gripper Hardware setup (6:54)
- Bridging Digital Outputs to ROS2 Gripper Control (20:54)
- Building a Custom ROS2 Robot Package: Adding a Gripper and Camera to the FR3WML (18:17)
- Designing a Real ROS2 Robot Architecture with MoveIt (12:21)
- Integrating a Gripper (SoftGripper) into MoveIt (47:59)
- Control Robot Speed from ROS2 and MoveIt (24:33)
- Move the Real Robot with MoveIt Inverse Kinematics (22:42)
- Hardware Architecture for Pick and Place with 3D Camera (17:31)
- Jetson Orin Nano Setup for Real-Time RealSense Streaming with Docker (31:24)
- Software Architecture – From Camera Streaming to Local AI Inference (29:47)
- Create the Inference Container on Jetson Orin Nano (YOLO + ROS2) (21:44)
- 3D Pose Estimation with YOLO + RealSense on Jetson (17:56)
- Handling Noisy Real Images and 3D Pose Publishing (17:29)
- 6D Pose Estimation with YOLO + RealSense on Jetson (36:21)
- Industrial Pick and Place Framework: Bottle from Crate and Capsule on Bottle Neck (29:08)
- From Monolithic Code to an Industrial Behavior Tree Framework (28:41)