Simple object avoidance using OAK D Lite camera and pymavlink is not working

Hardware:

  • Jetson Nano
  • Oak D Lite stereo depth camera

I have a Python script that runs on the Nano, getting depth data from the Oak D. I can see the companion computer’s heartbeat and the distance_sensor_send current distance data (at about 30 Hz) in Mission Planner. The proximity radar in MP also shows the distance from perceived obstacles.

Problem is, my copter does not maintain distance to, nor back away from, any obstacles. I’ve tried positioning obstacles from 1 meter to 5 meters away.

The log plot shows only brief periods (a few seconds) where PRX data is captured, despite the distance sensor plot and proximity radar I can see in MP as mentioned above.

I’ve previously been able to get an LW20/C working for simple and BendyRuler avoidance, but this companion computer setup is not working. Any ideas on what the cause could be and a potential fix?

distance_sensor_send fields:

        # Send DISTANCE_SENSOR message and reset last distance sensor time
        # see https://mavlink.io/en/messages/common.html for fields
        if current_time - last_distance_sensor_time > 1/distance_sensor_frequency:
            jetson_nano.mav.distance_sensor_send(
                time_boot_ms=0,  # timestamp in ms since system boot
                min_distance=40,  # minimum distance the sensor can measure (cm)
                max_distance=900,  # maximum distance the sensor can measure (cm)
                current_distance=int(depth),  # current distance measured (cm)
                type=4,  # type of distance sensor: 0 = laser, 4 = unknown. See MAV_DISTANCE_SENSOR
                id=1,  # onboard ID of the sensor
                orientation=0,  # forward facing, see MAV_SENSOR_ORIENTATION
                covariance=0,  # measurement covariance in centimeters, 0 for unknown / invalid readings
            )
            last_distance_sensor_time = current_time

MP params:
PRX1_TYPE = 2
PRX1_MIN = 0.4 m
PRX1_MAX = 9.5 m

AVOID_ENABLE = 3
AVOID_BEHAVE = 1 (stop)
AVOID_MARGIN = 3 m
AVOID_DIST_MAX = 5 m
AVOID_BACKUP_DZ = 2 m

Hello,

I assume you followed steps similar to
https://ardupilot.org/copter/docs/common-realsense-depth-camera.html#common-realsense-depth-camera

with a script that looks like

I would suggest you disable FENCE for indoor testing: AVOID_ENABLE = 1

I have seen that page but I used a different implementation and script. My script is far less sophisticated and only handles sending distance_sensor_send for now, whereas the d4xx_to_mavlink script also handles send_obstacle_distance_message. Perhaps I should be using that function instead to send messages for proximity sensing?
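
For what it’s worth, here is my rough understanding of what an OBSTACLE_DISTANCE sender could look like with pymavlink (just a sketch based on the MAVLink message definition, not the d4xx_to_mavlink code; the single forward sector, the 5 degree increment, and the serial port are my assumptions):

    import os
    os.environ["MAVLINK20"] = "1"  # OBSTACLE_DISTANCE (msg id 330) needs MAVLink 2

    import time
    from pymavlink import mavutil

    conn = mavutil.mavlink_connection('/dev/ttyTHS1', baud=115200)

    def send_obstacle_distance(depth_cm):
        # 72 sectors of 5 degrees; 65535 means "no obstacle detected" in that sector
        distances = [65535] * 72
        distances[0] = int(depth_cm)  # sector 0 = straight ahead
        conn.mav.obstacle_distance_send(
            int(time.time() * 1e6),                       # time_usec
            mavutil.mavlink.MAV_DISTANCE_SENSOR_UNKNOWN,  # sensor_type
            distances,                                    # uint16[72], cm
            0,                                            # increment (deg); 0 so increment_f is used
            40,                                           # min_distance (cm)
            900,                                          # max_distance (cm)
            5.0,                                          # increment_f (deg per sector)
            0.0,                                          # angle_offset (deg)
            mavutil.mavlink.MAV_FRAME_BODY_FRD,           # frame
        )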

Another key difference is that I’m not using MAVROS or APSync. Is this potentially a problem? I’m just running:

  • OAK D Lite (depth processing) → script running on the Jetson Nano (started over SSH from my computer before flight) → pymavlink serial /dev/ttyTHS1 → heartbeat and distance_sensor_send → Pixhawk ↔ Mission Planner (~30 Hz distance sensor data and the Jetson Nano heartbeat are visible in the MAVLink inspector).

I’m doing outdoor testing so I’d imagine this shouldn’t matter?

Can you provide a log file so we can check the signals and parameters?

Sure thing, thank you @ppoirier. It was too large to upload directly:

You have to make your file link accessible

Oh… I think I see the problem now: if you send the DISTANCE_SENSOR MAVLink message, your Oak is acting as a rangefinder.

So you should process the signal as a MAVLink rangefinder:
RNGFND1_TYPE = 10 (MAVLink)
and set
PRX1_TYPE = 4 to enable using the first rangefinder as a “proximity sensor”.

Set the RNGFNDx_ORIENT parameter (i.e. RNGFND1_ORIENT) to specify the direction the camera is pointing: 0 = Forward.

https://ardupilot.org/copter/docs/common-simple-object-avoidance.html
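
If you want to push those settings from the companion computer, a minimal pymavlink sketch could look like this (the serial port and baud rate are assumptions from this thread; setting the same values in Mission Planner’s Full Parameter List works just as well):

    from pymavlink import mavutil

    # Sketch: write the suggested rangefinder/proximity parameters over MAVLink.
    conn = mavutil.mavlink_connection('/dev/ttyTHS1', baud=115200)
    conn.wait_heartbeat()  # learn the target system/component before sending PARAM_SET

    # RNGFND1_TYPE = 10  -> MAVLink rangefinder
    # PRX1_TYPE = 4      -> use rangefinder(s) as a proximity sensor
    # RNGFND1_ORIENT = 0 -> forward facing
    for name, value in [('RNGFND1_TYPE', 10), ('PRX1_TYPE', 4), ('RNGFND1_ORIENT', 0)]:
        conn.param_set_send(name, value)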

@ppoirier I don’t have access to the log so I can’t comment accurately.
But you don’t need the RNGFND_ params for DISTANCE_SENSOR.
All you need to do is send the MAVLink message and configure PRX_TYPE = 2. As long as the orientation field is between 0 and 7 it should work.
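
For reference, those orientation values are the eight horizontal 45 degree yaw sectors (MAV_SENSOR_ROTATION_NONE through MAV_SENSOR_ROTATION_YAW_315), clockwise from straight ahead:

    # DISTANCE_SENSOR orientation values usable for horizontal proximity sectors
    SECTOR_ORIENTATIONS = {
        0: "forward (0 deg)",
        1: "forward-right (45 deg)",
        2: "right (90 deg)",
        3: "back-right (135 deg)",
        4: "back (180 deg)",
        5: "back-left (225 deg)",
        6: "left (270 deg)",
        7: "forward-left (315 deg)",
    }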

Thanks @ppoirier. My understanding, similar to what @rishabsingh3003 said, was that I don’t need the RNGFND params set for proximity sensing when using distance_sensor_send?

I’ve given access to the file now Rishab, thanks.

One thought I had: the sparse PRX.D0 data points could be due to noisy data, and setting PRX_FILT could help? Just an idea; I’ve tried everything I could think of!
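
If noise is the culprit, one option (purely as an illustration, not something my current script does) would be to smooth the depth reading on the companion computer before sending it, for example with a short median window:

    from collections import deque
    from statistics import median

    # Sketch: median-filter the last few depth readings to suppress single-frame outliers.
    class DepthFilter:
        def __init__(self, window=5):
            self.samples = deque(maxlen=window)

        def update(self, depth_cm):
            self.samples.append(depth_cm)
            return median(self.samples)

    # inside the main loop:
    # depth_filter = DepthFilter()
    # smoothed = depth_filter.update(depth)  # send 'smoothed' instead of 'depth'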

If it helps, here is the script I run on my Nano:

import cv2
import numpy as np  # note: not used below
import time

import depthai as dai
from pymavlink import mavutil

newConfig = False  # note: not used below

########################## 
# Setup DepthAI pipeline #
##########################

# Create DepthAI pipeline
pipeline = dai.Pipeline()

# Define DepthAI sources and outputs
monoLeft = pipeline.create(dai.node.MonoCamera)
monoRight = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
spatialLocationCalculator = pipeline.create(dai.node.SpatialLocationCalculator)

xoutDepth = pipeline.create(dai.node.XLinkOut)
xoutSpatialData = pipeline.create(dai.node.XLinkOut)
xinSpatialCalcConfig = pipeline.create(dai.node.XLinkIn)

xoutDepth.setStreamName("depth")
xoutSpatialData.setStreamName("spatialData")
xinSpatialCalcConfig.setStreamName("spatialCalcConfig")

# Setting properties for depth AI pipeline
monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
monoLeft.setCamera("left")
monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
monoRight.setCamera("right")

stereo.setDefaultProfilePreset(dai.node.StereoDepth.PresetMode.HIGH_DENSITY)
stereo.setLeftRightCheck(True)
stereo.setSubpixel(True)

# Config
topLeft = dai.Point2f(0.45, 0.45)
bottomRight = dai.Point2f(0.55, 0.55)

config = dai.SpatialLocationCalculatorConfigData()
config.depthThresholds.lowerThreshold = 400
config.depthThresholds.upperThreshold = 9000
calculationAlgorithm = dai.SpatialLocationCalculatorAlgorithm.MEDIAN
config.roi = dai.Rect(topLeft, bottomRight)

spatialLocationCalculator.inputConfig.setWaitForMessage(False)
spatialLocationCalculator.initialConfig.addROI(config)

# Linking
monoLeft.out.link(stereo.left)
monoRight.out.link(stereo.right)

spatialLocationCalculator.passthroughDepth.link(xoutDepth.input)
stereo.depth.link(spatialLocationCalculator.inputDepth)

spatialLocationCalculator.out.link(xoutSpatialData.input)
xinSpatialCalcConfig.out.link(spatialLocationCalculator.inputConfig)

# Create RGB camera
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setPreviewSize(640, 360)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)

# Create XLinkOut for RGB frames
xoutRgb = pipeline.create(dai.node.XLinkOut)
xoutRgb.setStreamName("rgb")
camRgb.preview.link(xoutRgb.input)

########################## 
# Setup MAVLINK connection #
##########################

# Create a MAVLink connection
jetson_nano = mavutil.mavlink_connection('/dev/ttyTHS1', baud=115200)

# Distance sensor frequency should be much faster than heartbeat
heartbeat_frequency = 1  # 1 Hz
distance_sensor_frequency = 40  # 40 Hz

last_heartbeat_time = time.time()
last_distance_sensor_time = time.time()

########################## 
# Setup video writer #
##########################

# Define the codec using VideoWriter_fourcc() and create a VideoWriter object.
# Ensure the frame size matches the dimensions of the frames you're writing.
fourcc = cv2.VideoWriter_fourcc(*'mp4v')
# Variables to write video to sd card
video_file_path = '/home/tsungxu/video_test.mp4'
frame_size = (640, 360)
frame_rate = 30.0
# instantiate cv2.VideoWriter before the loop
out = cv2.VideoWriter(video_file_path, fourcc, frame_rate, frame_size)

##########################
# Connect to device and start pipeline
##########################

with dai.Device(pipeline) as device:

    # Output queue will be used to get the depth frames from the outputs defined above
    depthQueue = device.getOutputQueue(name="depth", maxSize=4, blocking=False)
    spatialCalcQueue = device.getOutputQueue(name="spatialData", maxSize=4, blocking=False)
    spatialCalcConfigInQueue = device.getInputQueue("spatialCalcConfig")

    color = (255, 255, 255)

    # print("Use WASD keys to move ROI!")

    #!!! rgb output
    rgbQueue = device.getOutputQueue(name="rgb", maxSize=4, blocking=False)

    while True:
        inDepth = depthQueue.get() # Blocking call, will wait until new data has arrived

        inRgb = rgbQueue.get()

        # !!! Convert the retrieved rgb frame into a numpy array
        frame = inRgb.getCvFrame()

        # Write the frame to the file
        out.write(frame)

        spatialData = spatialCalcQueue.get().getSpatialLocations()

        ##################
        # send heartbeat to flight controller #
        ##################

        # Send a heartbeat message every second. https://mavlink.io/en/mavgen_python/#heartbeat
        current_time = time.time()
        
        if current_time - last_heartbeat_time > 1 / heartbeat_frequency:  # compare against the period (1 s at 1 Hz)
            # The arguments for the heartbeat message are type, autopilot, base_mode, custom_mode, system_status, and mavlink_version.
            jetson_nano.mav.heartbeat_send(
                18,  # Type of the system - MAV_TYPE_ONBOARD_CONTROLLER: https://mavlink.io/en/messages/minimal.html#MAV_TYPE
                mavutil.mavlink.MAV_AUTOPILOT_INVALID,  # Autopilot type
                0,  # System mode https://mavlink.io/en/messages/common.html#MAV_MODE
                0,  # Custom mode, this is system specific
                3,  # System status https://mavlink.io/en/messages/common.html#MAV_STATE
            )
            last_heartbeat_time = current_time

        for depthData in spatialData:
            roi = depthData.config.roi
            roi = roi.denormalize(width=frame.shape[1], height=frame.shape[0])  # adjust this to frame's dimensions
            xmin = int(roi.topLeft().x)
            ymin = int(roi.topLeft().y)
            xmax = int(roi.bottomRight().x)
            ymax = int(roi.bottomRight().y)

            depthMin = depthData.depthMin
            depthMax = depthData.depthMax

            fontType = cv2.FONT_HERSHEY_TRIPLEX

            cv2.rectangle(frame, (xmin, ymin), (xmax, ymax), color, 1)
            cv2.putText(frame, f"X: {int(depthData.spatialCoordinates.x)} mm", (xmin + 10, ymin + 20), fontType, 0.5, color)
            cv2.putText(frame, f"Y: {int(depthData.spatialCoordinates.y)} mm", (xmin + 10, ymin + 35), fontType, 0.5, color)
            cv2.putText(frame, f"Z: {int(depthData.spatialCoordinates.z)} mm", (xmin + 10, ymin + 50), fontType, 0.5, color)

            ##################
            # send distance sensor data to flight controller #
            ##################

            # Get the depth in cm (assuming depthData.spatialCoordinates.z is in mm)
            depth = int(depthData.spatialCoordinates.z) / 10

            # Send DISTANCE_SENSOR message and reset last distance sensor time
            # see https://mavlink.io/en/messages/common.html for fields
            if current_time - last_distance_sensor_time > 1/distance_sensor_frequency:
                jetson_nano.mav.distance_sensor_send(
                    time_boot_ms=0,  # timestamp in ms since system boot
                    min_distance=40,  # minimum distance the sensor can measure (cm)
                    max_distance=900,  # maximum distance the sensor can measure (cm)
                    current_distance=int(depth),  # current distance measured (cm)
                    type=4,  # type of distance sensor: 0 = laser, 4 = unknown. See MAV_DISTANCE_SENSOR
                    id=1,  # onboard ID of the sensor
                    orientation=0,  # forward facing, see MAV_SENSOR_ORIENTATION
                    covariance=0,  # measurement covariance in centimeters, 0 for unknown / invalid readings
                )
                last_distance_sensor_time = current_time


            # # DEBUG print distance sensor data
            # if(depth > 300):
            #     print(depth)

        # !!! Show the rgb frame
        # cv2.imshow("rgb", frame)

        # Note: cv2.waitKey only registers key presses while an imshow window is open,
        # so with imshow commented out this 'q' check never triggers; stop with Ctrl+C instead.
        key = cv2.waitKey(1) & 0xFF  # Get ASCII value of key
        if key == ord('q'):
            break

    # release video writer
    out.release()

Yes, right, the proximity library can consume the MAVLink distance message as well.

I see in the log that proximity is set up as PRX_1 but logged under PRX_2; that is not normal.

It might be interesting to set your system up as a rangefinder, just to check whether the MAVLink distance is logged correctly as rangefinder distance.

The PRX2 log is a little bit of logging weirdness that I added because I could not figure out how else to log 3D proximity values.
So essentially, PRX2 holds the values logged for obstacles detected in the horizontal plane around the vehicle.


@tsung seems like you are getting some MAVLink data… but just very sparse… Can you put some debug statements in your python script to figure out what rate you are sending those distances at?
Also, can you temporarily set min_distance = 0 and max_distance to a very large number, so we know you aren’t filtering anything out by mistake (and do the same with PRX_MIN/MAX; you can actually set those to 0 since you don’t really need them, because you are setting the range in the MAVLink message).
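
Something along these lines would do for the rate check (just an illustrative counter; adapt the names to your script):

    import time

    # Sketch: call note_sent() right after each distance_sensor_send; it prints the
    # achieved send rate roughly once per second.
    sent_count = 0
    window_start = time.time()

    def note_sent():
        global sent_count, window_start
        sent_count += 1
        now = time.time()
        if now - window_start >= 1.0:
            print(f"DISTANCE_SENSOR send rate: {sent_count / (now - window_start):.1f} Hz")
            sent_count = 0
            window_start = now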

@tsung

Just tested your code on a Jetson Xavier with an Orange Cube (log not armed).

Seems to work pretty well on the bench indoors.
The differences in my setup are:
I am using an FTDI USB-serial converter on ttyUSB0

and I modified the heartbeat command:

    conn.mav.heartbeat_send(mavutil.mavlink.MAV_TYPE_ONBOARD_CONTROLLER,
                            mavutil.mavlink.MAV_AUTOPILOT_GENERIC,
                            0,
                            0,
                            0)

Edit: additional tests made me modify the MAVLink rate:
distance_sensor_frequency = 30  # 30 makes the MAVLink output come out at about 20 Hz; going faster (40) makes the proximity radar signal disappear


@ppoirier

I also tested this on a bench. With the setting at 30 Hz I can confirm the proximity radar signal in MP does not disappear, whereas at 40 Hz it often does.

Strangely, my logs for this unarmed bench test do not show a PRX field nor any D0 data.

Also, at 40 Hz, while the radar signal sometimes disappears, the distance sensor data in the MAVLink inspector (and graphing it) does not show any dropped data points.

Re heartbeat: does choosing MAV_AUTOPILOT_GENERIC as the autopilot type matter here?

@rishabsingh3003

Agreed! I added debug statements during a benchtop test of about 10 seconds. They show the send frequency is between

  • 25.5 to 30.06 Hz when distance_sensor_frequency = 40
  • 17.41 to 20.33 Hz when distance_sensor_frequency = 30

These frequencies align with what I see in MP (for example when set at 30 Hz):

I’ve also set PRX1_MIN and PRX1_MAX to zero and will test that in the field tomorrow too.

Just tested with the frequency set to 30 Hz and simple avoidance now works as expected.

The logs are now showing far more readings. As @ppoirier pointed out, having a consistent radar signal was indicative that this would work.

I do wonder why 40 Hz was too high a setting for distance_sensor_frequency.

Regardless, thanks to both of you for the help!
