Results
Unfortunately, we were never able to fully integrate all of the project components. The video above shows the current functionality of the integrated system in Gazebo simulation. Because of the roadblocks and challenges we faced throughout the quarter, we reduced the scope of what we hoped to complete, focusing on a single loop of our state machine: driving the robot toward a block of the desired color, picking up the block, navigating to a fixed resource station location, and dropping the block.
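One loop of the state machine described above can be sketched as a simple transition table. This is an illustrative stand-alone sketch, not our actual implementation; the state names are hypothetical labels for the four steps (navigate to block, pick, navigate to station, drop).

```python
from enum import Enum, auto

class State(Enum):
    NAVIGATE_TO_BLOCK = auto()    # drive toward a block of the desired color
    PICK_BLOCK = auto()           # pick up the block with the arm
    NAVIGATE_TO_STATION = auto()  # drive to the fixed resource station
    DROP_BLOCK = auto()           # release the block at the station
    DONE = auto()                 # one loop complete

# Each state transitions to the next once its action succeeds.
TRANSITIONS = {
    State.NAVIGATE_TO_BLOCK: State.PICK_BLOCK,
    State.PICK_BLOCK: State.NAVIGATE_TO_STATION,
    State.NAVIGATE_TO_STATION: State.DROP_BLOCK,
    State.DROP_BLOCK: State.DONE,
}

def run_one_loop():
    """Step through one loop of the state machine and return the
    sequence of states visited."""
    state = State.NAVIGATE_TO_BLOCK
    visited = [state]
    while state is not State.DONE:
        state = TRANSITIONS[state]
        visited.append(state)
    return visited
```

In the real system, each transition fires only when a node reports success (for example, a completed pick); here the transitions are unconditional to keep the sketch self-contained.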
Code Repository
Check out our GitHub to view the low-level implementation of the code. To run what is shown above, we run the shell script "launch_locobot_gazebo_moveit.sh". This script launches the Interbotix MoveIt package, "gazebo_moveit_arm.launch", "matching_ptcld_serv", and "locobot_gazebo_example.launch". "gazebo_moveit_arm.launch" starts the ROS node "locobot_arm_motion.py", which handles motion planning for the pick-and-place functionality. "matching_ptcld_serv" is a C++ node that handles perception for the robot. "locobot_gazebo_example.launch" starts the ROS node "locobot_base_motion.py", which navigates the robot base to a desired goal. Using ROS topics, the nodes share information such as the completion of a successful pick, the location of a detected block, and successful arrival at a navigation goal. Efficient communication between nodes is crucial for the robot to know when to switch states.
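The topic-based coordination between nodes follows the standard ROS publish/subscribe pattern. As a minimal sketch of that pattern, the in-process "bus" below mimics what ROS topics do for us: one node publishes a status message, and another node's callback reacts to it. The class and the topic names ("/goal_reached", "/pick_complete") are illustrative assumptions; the actual topic names live in the repository's node code.

```python
from collections import defaultdict

class TopicBus:
    """Minimal in-process stand-in for ROS topics (illustrative only):
    nodes publish messages to named topics, and every subscriber's
    callback is invoked with the message."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, msg):
        for callback in self._subscribers[topic]:
            callback(msg)

# The arm node waits for the base node to report arrival before picking,
# and the base node waits for a successful pick before driving to the station.
events = []
bus = TopicBus()
bus.subscribe("/goal_reached", lambda msg: events.append("start_pick"))
bus.subscribe("/pick_complete", lambda msg: events.append("drive_to_station"))

bus.publish("/goal_reached", True)   # base motion node signals arrival at the block
bus.publish("/pick_complete", True)  # arm motion node signals a successful pick
```

In the real system, `rospy.Publisher` and `rospy.Subscriber` play the role of this bus, with messages crossing process boundaries between the base, arm, and perception nodes.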
The scripts for our unintegrated implementations of perception, motion, and mapping can also be found in the GitHub repository. These scripts are labeled "BlockDetection.py", "Obstacle_Avoidance.py", and "FastSLAM.py", respectively.