GSoC 2022: Update ROS integration for Non-GPS navigation and off-board path-planning


Introduction

Hello AP community! I am Sanket Sharma, a second-year undergraduate student currently pursuing a B.E. in Mechatronics in India. I am glad to announce that I have been selected this year to work with ArduPilot under GSoC’22 with my mentors @rmackay9 and @jmachuca. In this blog I’ll be discussing my GSoC project to update ROS integration for non-GPS navigation and off-board path planning.

This project involves testing the existing instructions for non-GPS navigation and path planning on a multicopter or a rover, updating the underlying code, and, depending on time, integrating offboard object avoidance with ArduPilot’s Auto mode. It covers developing and modifying scripts for ArduPilot and Cartographer SLAM, along with setting up a companion computer so that everything works together on the multicopter and rover.

Time permitting, the integration of offboard object avoidance with ArduPilot’s Auto mode would also require modifying and updating the MAVROS package, which bridges ROS and the MAVLink protocol.

Current Scenario
ArduPilot currently has instructions on the wiki for setting up non-GPS navigation and path planning, but they are incomplete in places, which makes it difficult for new users to set up navigation properly.

Terms
Definitions of the important terms:

  • ROS - The Robot Operating System is an open-source robotics framework that provides a range of libraries, device drivers, and high-quality developer tools that help students, hobbyists, and enthusiasts develop their robotics software.

  • Cartographer SLAM - Cartographer SLAM is a combination of two connected subsystems, Local SLAM and Global SLAM. Local SLAM builds successive submaps. Global SLAM’s main job is to find loop-closure constraints between nodes and submaps and then optimise the resulting pose graph. The core concept behind Cartographer is pose-graph optimization.

  • Occupancy grid - We will use an occupancy grid to represent the area mapped with Cartographer SLAM. An occupancy grid divides the area into map pixels, also known as cells, and marks each one as occupied or free: ‘1’ for occupied and ‘0’ for free. This is also called a binary occupancy grid (a short sketch of this representation follows this list).

  • Costmap - A costmap is a grid in which every cell gets a value (difficulty or cost) based on how hard that area is to traverse. For a rover moving over rough ground, the costmap could be a grid where cells with lower values represent flat surfaces and cells with higher values represent rough areas, obstacles, or walls.

  • Trajectory Rollout and Dynamic Window - Trajectory Rollout and Dynamic Window are reactive motion planners that take the local information available within a sensor footprint and a global objective defined in a map coordinate frame, and identify a locally feasible, collision-free path to follow that makes progress toward the goal.
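
To make the occupancy-grid idea above concrete, here is a minimal Python sketch (not part of the project code) that thresholds a nav_msgs/OccupancyGrid into the binary occupied/free form described above. The /map topic name is the cartographer_ros default, and the 65% threshold is an assumption.

```python
#!/usr/bin/env python3
# Minimal sketch: threshold a nav_msgs/OccupancyGrid into a binary occupied/free grid.
# "/map" is the cartographer_ros default topic; the 65% threshold is an assumption.
import rospy
import numpy as np
from nav_msgs.msg import OccupancyGrid

def map_callback(msg):
    # data holds int8 values: -1 = unknown, 0..100 = occupancy probability
    grid = np.array(msg.data, dtype=np.int8).reshape(msg.info.height, msg.info.width)
    # '1' for occupied, '0' for free; unknown (-1) cells are treated as free for simplicity
    binary = (grid >= 65).astype(np.uint8)
    rospy.loginfo("map %dx%d @ %.2f m/cell, %d occupied cells",
                  msg.info.width, msg.info.height, msg.info.resolution, int(binary.sum()))

if __name__ == "__main__":
    rospy.init_node("binary_grid_example")
    rospy.Subscriber("/map", OccupancyGrid, map_callback, queue_size=1)
    rospy.spin()
```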

Approach
I have divided the approach into two steps:

Step 1 - Setting up the SITL simulation environment, non-GPS navigation, and path planning:

This task revolves around setting up the environment and dependencies, configuring ArduPilot’s build system, integrating ROS, and bringing up a simulated drone for software-in-the-loop (SITL) testing.

In this task, we’ll set up a simulated environment with either a multicopter or a rover and set up non-GPS navigation and path planning for the vehicle. This lets us make major and minor modifications to the existing code base before testing on real hardware.

This task would be accomplished in the following subtasks:

  1. Setting up ROS Noetic in Ubuntu 20.04

  2. Setting up ardupilot SITL and ardupilot_gazebo

  3. Installing the main required ROS packages into our ROS workspace:
    a. cartographer
    b. cartographer_ros
    c. move_base
    d. ap_navigation / ros_navigation
    e. mavlink & mavros

  4. Set up and modify the required files as mentioned in the ArduPilot docs.
    Changes are needed to configure ArduPilot to use positioning through visual odometry instead of relying on GPS. All the changes are/will be mentioned on this page:
    A hedged sketch of setting these parameters through MAVROS also follows this list.

  5. Modifying the parameters in the Lua configuration file to tune Cartographer SLAM.
    For Cartographer to map and localize the rover/multicopter properly, its parameters need to be modified.

    For tuning Cartographer, the following parameters need to be adjusted while testing with the LiDAR:
    ➢ TRAJECTORY_BUILDER_2D.ceres_scan_matcher.translation_weight
    ➢ TRAJECTORY_BUILDER_2D.ceres_scan_matcher.rotation_weight
    ➢ TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.translation_delta_cost_weight
    ➢ TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.rotation_delta_cost_weight
    ➢ TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.linear_search_window

    All the parameters and configurations are/will be updated in this PR.

  6. Test the mapping and navigation part.

    Full map generated by cartographer using LiDAR data from multicopter:
    [image: gsoc_maze]


    Path planning and obstacle avoidance:

    As we’ll be using the move_base package from the ros_navigation stack, we’ll be tuning parameters for the local and global costmaps and for the trajectory planner in the base_local_planner ROS package.
    The ap_navigation package from the ArduPilot community has some of the parameters required for tuning trajectory planning and the costmaps, but several other parameters need to be added or modified.
    After adding and modifying some of these parameters, the result:


    The green line here depicts the planned path; for motion planning, velocity commands are sent to the vehicle. A sketch of sending such a navigation goal to move_base also follows this list.
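
As a companion to subtask 4, the sketch below sets the non-GPS / visual-odometry related ArduPilot parameters through the mavros parameter service instead of a GCS. The parameter names and values here are assumptions recalled from the ArduPilot non-GPS wiki and should be checked against the page referenced above before use.

```python
#!/usr/bin/env python3
# Hedged sketch: set the ExternalNav-related ArduPilot parameters via the mavros
# /mavros/param/set service. Values below are assumptions - verify against the wiki.
import rospy
from mavros_msgs.srv import ParamSet
from mavros_msgs.msg import ParamValue

ASSUMED_PARAMS = {
    "AHRS_EKF_TYPE": 3,   # use EKF3
    "EK3_SRC1_POSXY": 6,  # ExternalNav for horizontal position
    "EK3_SRC1_VELXY": 6,  # ExternalNav for horizontal velocity
    "EK3_SRC1_POSZ": 1,   # barometer for altitude
    "EK3_SRC1_YAW": 6,    # ExternalNav for yaw
    "VISO_TYPE": 1,       # visual odometry over MAVLink
    "GPS_TYPE": 0,        # no GPS
}

if __name__ == "__main__":
    rospy.init_node("set_non_gps_params")
    rospy.wait_for_service("/mavros/param/set")
    set_param = rospy.ServiceProxy("/mavros/param/set", ParamSet)
    for name, value in ASSUMED_PARAMS.items():
        resp = set_param(param_id=name, value=ParamValue(integer=value, real=0.0))
        rospy.loginfo("%s -> %s", name, "ok" if resp.success else "FAILED")
```

And for the path-planning test in subtask 6, a single 2D navigation goal can be sent to move_base with a small actionlib client like the one below; the goal coordinates and the "map" frame id are placeholders.

```python
#!/usr/bin/env python3
# Sketch: send one 2D navigation goal to move_base and wait for the result.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == "__main__":
    rospy.init_node("send_test_goal")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 2.0     # placeholder target, metres in the map frame
    goal.target_pose.pose.position.y = 1.0
    goal.target_pose.pose.orientation.w = 1.0  # face along +x
    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo("move_base finished with state %d", client.get_state())
```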

Step 2 - Setting up the vehicle and companion computer environment:

Here we’ll set up the companion computer with Ubuntu 20.04/18.04 and ROS Noetic/Melodic and make all the necessary connections to an autopilot flashed with ArduPilot.

Almost all the packages will be installed just as in simulation, with the addition of the rplidar_ros package if we use an RPLiDAR to publish the LiDAR data as a ROS topic.
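
Once the LiDAR driver is running on the companion computer, a tiny node like the sketch below can confirm that scans really are arriving as a ROS topic before moving on to Cartographer tuning. The /scan topic name is the rplidar_ros default and may need remapping.

```python
#!/usr/bin/env python3
# Minimal sketch: confirm that LiDAR scans are arriving as a ROS topic.
# "/scan" is the default topic published by rplidar_ros; adjust if remapped.
import rospy
from sensor_msgs.msg import LaserScan

def scan_callback(msg):
    valid = [r for r in msg.ranges if msg.range_min <= r <= msg.range_max]
    rospy.loginfo_throttle(1.0, "scan: %d/%d valid ranges, angle increment %.4f rad",
                           len(valid), len(msg.ranges), msg.angle_increment)

if __name__ == "__main__":
    rospy.init_node("scan_check")
    rospy.Subscriber("/scan", LaserScan, scan_callback, queue_size=1)
    rospy.spin()
```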

After all this, the majority of the work is in tuning Cartographer SLAM. The parameters that need to be modified were discussed above in the simulation part. Generally, two types of scan matching need to be tuned:

  • CeresScanMatcher: it interpolates the already-created submap and, starting from an initial guess, finds the pose where the scan fits best. If the sensors are trustworthy enough, this matcher on its own is preferred.

  • RealTimeCorrelativeScanMatcher: it can be enabled if we don’t trust our sensors as much. It matches the scan against the current submap in a way similar to how loop closures are found, and the best match is then used as the prior for the CeresScanMatcher.

Git link: https://github.com/snktshrma/Ardupilot-Cartographer-SLAM


Integrate the offboard object avoidance with ArduPilot Auto mode

If time allows, the final task will be to integrate offboard avoidance with ArduPilot’s Auto mode. For this task, we need to modify the MAVROS plugins and scripts.

  • MAVLink: MAVLink is a very lightweight messaging protocol for communicating with drones (and between onboard drone components).
    MAVLink follows a modern hybrid publish-subscribe and point-to-point design pattern: data streams are sent / published as topics, while configuration sub-protocols such as the mission protocol or parameter protocol are point-to-point with retransmission.

  • MAVROS: MAVROS is a ROS package that enables MAVLink extendable communication between computers running ROS for any MAVLink enabled autopilot, ground station, or peripheral.
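
As a quick illustration of the pattern described above, here is a small pymavlink sketch that connects to SITL and reads a few streamed messages. The UDP endpoint 127.0.0.1:14550 is the usual MAVProxy output for SITL and is an assumption about your setup.

```python
#!/usr/bin/env python3
# Sketch: connect to ArduPilot SITL over MAVLink with pymavlink and read streamed data.
# udp:127.0.0.1:14550 is the usual MAVProxy output port and is an assumption.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()  # wait for the vehicle's broadcast HEARTBEAT
print("Heartbeat from system", master.target_system, "component", master.target_component)

# Streamed ("published") data: read a few ATTITUDE messages as they arrive.
# This assumes a GCS/MAVProxy has already requested the attitude stream.
for _ in range(5):
    msg = master.recv_match(type="ATTITUDE", blocking=True, timeout=5)
    if msg is not None:
        print("roll %.2f pitch %.2f yaw %.2f" % (msg.roll, msg.pitch, msg.yaw))
```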


Note: the above architecture for ArduPilot is for the SITL environment, but the connection to the GCS or MAVProxy remains the same in real-life usage.

ArduPilot communicates with the simulator (e.g. Gazebo) to receive sensor data from the simulated world and send motor and actuator values. In real life, the sensors and actuators will be connected directly to the FCU.

The architecture below represents PX4 on SITL, that is, the integration of ROS/Gazebo with PX4. It is very similar to ArduPilot’s, except for the simulator node, which on a real drone is replaced by actual connections of actuators and sensors to the FCU.

Assuming the above architecture is very similar to that of ArduPilot once connected to ROS through MAVROS: when ROS (API/offboard) is available on a companion computer, ROS communicates with ArduPilot and the GCS over MAVLink using the MAVROS package, which bridges ROS topics to MAVLink and ArduPilot conventions.

Ardupilot communicates with the GCS and an Offboard API (e.g. ROS) to send telemetry from the vehicle and receive commands.

For the completion of this task, modification of MAVROS plugins is required.

The general flow to solve this task is as follows:

  1. The final target is sent from the GCS/MAVProxy while the vehicle is in Auto mode.
  2. The final target is received by ArduPilot and by ROS over MAVROS, continuously at 1 Hz.
  3. ROS offboard path planning calculates a path around the obstacles and sends appropriate velocity commands to ArduPilot.

This problem is also mentioned in this PR.

The way I see it is to create a new MAVROS plugin that subscribes to the POSITION_TARGET_GLOBAL_INT MAVLink message, converts the position targets to ENU, and publishes a ROS topic with the position target so that the ROS offboard path planner can calculate velocity commands. These velocity commands will then be sent back to the FCU to command the vehicle around the obstacle. A rough sketch of the ROS side of this flow is shown below.
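
To make the proposed flow a little more concrete, here is a rough Python sketch of the ROS side only. The real target conversion would live in the new C++ MAVROS plugin, and the actual path planner would replace the naive proportional controller shown here. The /ap/position_target_local topic is hypothetical and does not exist in mavros today; the mavros local-position and velocity-setpoint topics do.

```python
#!/usr/bin/env python3
# Rough sketch of the ROS side of the proposed flow, with obstacle avoidance omitted:
# consume a local-frame position target and stream velocity setpoints back through MAVROS.
# "/ap/position_target_local" is a hypothetical topic the proposed plugin would publish
# after converting POSITION_TARGET_GLOBAL_INT to ENU.
import rospy
from geometry_msgs.msg import PointStamped, PoseStamped, Twist

class OffboardVelocityNode:
    def __init__(self):
        self.target = None
        self.pose = None
        rospy.Subscriber("/ap/position_target_local", PointStamped, self.target_cb)  # hypothetical
        rospy.Subscriber("/mavros/local_position/pose", PoseStamped, self.pose_cb)
        self.vel_pub = rospy.Publisher("/mavros/setpoint_velocity/cmd_vel_unstamped",
                                       Twist, queue_size=1)

    def target_cb(self, msg):
        self.target = msg

    def pose_cb(self, msg):
        self.pose = msg

    def spin(self):
        rate = rospy.Rate(10)  # velocity setpoints must be streamed continuously
        gain = 0.5             # naive proportional controller; the planner would go here
        while not rospy.is_shutdown():
            if self.target and self.pose:
                cmd = Twist()
                cmd.linear.x = gain * (self.target.point.x - self.pose.pose.position.x)
                cmd.linear.y = gain * (self.target.point.y - self.pose.pose.position.y)
                self.vel_pub.publish(cmd)
            rate.sleep()

if __name__ == "__main__":
    rospy.init_node("offboard_velocity_sketch")
    OffboardVelocityNode().spin()
```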

It would be fascinating to hear what members of the community and the development team have to say about this approach. Suggestions and feedback are always appreciated, and we can discuss them in the comments section below.

Thank you!


We need CI for the avoidance stuff! :stuck_out_tongue_winking_eye:

  1. ROS Noetic is the last ROS1 release and it will be EOL in 2025. It’s also not supported on Ubuntu 22.04. IMO you should switch to ROS2 and navigation2.
  2. rtabmap is far more versatile than cartographer.
  3. Your work seems to be restricted to rover 2D planning/avoidance.

Thank you very much for your suggestions @soldierofhell!

As for the questions you asked:

  1. Yes! Shifting to ROS2 and Navigation2 is among the future tasks for this project. For now, working on ROS1 gives us an idea of the overall workflow for non-GPS navigation, and shifting to ROS2 afterwards will be much easier.

  2. I am not very well acquainted with the workings of RGB-D SLAM, but it is surely much more versatile than cartographer. I would love to learn more about it in depth and use rtabmap as another non-GPS navigation approach in the future.

  3. Yes, it’s right that the project is based on 2D obstacle avoidance and planning, but it is not restricted to the rover; it works pretty well on multi-rotors as well. That said, 3D planning and avoidance will remain a future task alongside rtabmap.

: )

Looking forward to your progress. Great information roadmap. I am just setting up the VOXL CAM Perception Engine | ModalAI, Inc.
I hope to follow along with your progress and hopefully will be able to learn your methods vs the program I am following with the VOXL engine.


Thanks @local_ad : )
Would also love to know what you are up to using VOXL CAM!

Sounds like a long-term learning session. I have to finish my 3/4-scale build, which should be done in a week; I am trying to use an all-CAN-bus interface, ESCs, and GPS along with the ModalAI flight controller. I designed and 3D printed my drone “Rooster”
but thought it was too large for an indoor spying drone, so I just printed it at 3/4 scale.
The ModalAI uses PX4 and they have a detailed process that I am attempting to follow, including 4G cellular.

3/4 scale

Full size build. I am still working on reducing the number of parts and reducing waste in the printing process

I have 5 builds using different controllers and motors, trying to gather data for the ultimate utility drone.


Wow, that’s great! I’d surely love to see the final build! All the best @local_ad : )

@snktshrma that’s a good roadmap! Looking forward to your updates!


Update: I got overwhelmed trying to set up the SIM card and VPN for the cellular 4G connection.
A lot of my problems were because I wanted to, and had to, use Linux as the OS. So there was compounded learning, and then Docker.
I am talking about a printed version that is 80 percent size, using the VOXL flight computer and camera setup. The VOXL camera is functional and the software provided by ModalAI is working in Linux. I have a build almost complete with great motors and ESCs; hope to have posts of flights soon. I purchased a motor test stand so I could record the motor/blade performance and make adjustments as needed.



That’s great progress. Would love to see a flight!