Question about customizing the obstacle avoidance

Hello,

To my understanding, there are two obstacle-avoidance scenarios so far; please correct me if I am wrong:

  1. In Loiter or AltHold mode, while controlling the drone manually, if it faces an obstacle it will react by stopping or sliding, depending on which behaviour we set.
  2. In Auto, RTL, or Guided mode, there are two algorithms (BendyRuler and Dijkstra's) which help the drone find the shortest path.

I want to implement my own obstacle-avoidance function for both cases above on the companion computer (manual and auto modes). My questions are:

  1. In manual mode, what is the optimal way to stay in Loiter and still move based on commands sent from the companion computer (which is the concept of Guided)? I do not want to switch from Loiter to Guided, move the drone, and then switch back to Loiter. For example, while I keep pushing the pitch stick (RC 1), the drone would find its own way around an obstacle ahead.

  2. Conversely, when the drone is in Auto or Guided mode, how could I use the RC controller to move the drone, and have it automatically return to the plan when I release the sticks? Testing a drone with a self-implemented algorithm always causes problems, so I want to minimize crashes as much as possible.

I intended to stick with the RC-override method, but it does not seem like the smart way, so I am raising these questions to get your advice.

Thanks in advance!

Right.
I would add a third (my preferred) option. By using a companion computer connected over MAVLink to the flight controller, you can freely implement and scale the obstacle-avoidance solution to your needs.
That can start with a simple Arduino or Raspberry Pi which takes sensor readings and, e.g., sends a Brake command to the FC. Here is an example I implemented on a Solo: it takes lidar distances and puts the drone into Brake if there is an obstacle ahead, or gives visual and voice feedback if there is an obstacle to the right or left of the current flight path.
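To make the idea concrete, here is a minimal sketch of such a brake trigger, assuming pymavlink. This is not the actual Solo code linked above: `read_lidar_cm()`, the serial port, and the 3 m threshold are placeholders for your own sensor driver and wiring.

```python
# Illustrative sketch only -- not the Solo example linked above.
# Assumes pymavlink; read_lidar_cm(), the port, and the threshold
# are placeholders for your own setup.

def should_brake(distance_cm, threshold_cm=300):
    """True if a valid reading ahead is closer than the threshold."""
    return 0 < distance_cm < threshold_cm

def brake_loop(port='/dev/ttyAMA0', baud=921600, threshold_cm=300):
    from pymavlink import mavutil  # imported here so the sketch loads without hardware
    master = mavutil.mavlink_connection(port, baud=baud)
    master.wait_heartbeat()
    BRAKE = 17  # ArduCopter custom mode number for Brake
    while True:
        if should_brake(read_lidar_cm(), threshold_cm):  # read_lidar_cm: your driver
            master.mav.set_mode_send(
                master.target_system,
                mavutil.mavlink.MAV_MODE_FLAG_CUSTOM_MODE_ENABLED,
                BRAKE)
```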

And you can scale up to an Nvidia or Intel supercomputer which allows you to run, e.g., neural networks for real-time image processing. Here is another example, of autonomous navigation based on neural networks.

Now to your second question about RC overriding. If you want your drone to do anything more sophisticated than stopping or sliding sideways (as in BendyRuler), you have to switch it to Guided. It is not possible to keep it in Loiter and send directions to the FC in parallel.
Vice versa, it is not possible to override the movements with your RC while the drone is in Guided. BUT: you can use a joystick. However, I would not use the RC_OVERRIDE option.

The Redtail project I linked above includes the implementation of a ROS controller which takes the input from the Nvidia trailnet DNN for autonomous navigation along a track and mixes it with the inputs from a gamepad controller (I use a Logitech 710). Joystick movements always override the autonomous direction commands, and the results are sent via MAVROS to the FC in Guided mode.
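The mixing logic can be sketched in a few lines. This is a hedged illustration of the idea (joystick wins whenever a stick moves past a deadband), not the actual px4_controller code; the deadband value is an assumption.

```python
# Sketch of the override idea, not the actual Redtail controller code.
# dnn_cmd and joy_cmd are tuples of normalized axis values.

DEADBAND = 0.05  # normalized stick units; an assumed value

def pick_command(dnn_cmd, joy_cmd, deadband=DEADBAND):
    """Use the pilot's joystick command if any axis moved, else the DNN's."""
    if any(abs(axis) > deadband for axis in joy_cmd):
        return joy_cmd
    return dnn_cmd
```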

In case of an emergency, you can always switch back to Loiter on your RC and have full RC control.

I have skimmed your code and see a lot of things to learn; it is well organized! I like the lidar tilt adjustment a lot: it keeps the sensor from picking up the ground as the pitch angle changes when moving forward. Very nice! In my case, I have successfully interfaced with ArduPilot from an Arduino or Raspberry Pi by sending MAVLink messages. I have a few replies:

  1. The joystick you mention is based on feeding RCx channels into ArduPilot via MAVLink, I guess?

  2. My bigger question was about doing something more sophisticated than just stopping at the obstacle in manual modes. For example, in my case I have a radar module which returns the X, Y coordinates of objects in its field of view. I take them all and look for the closest object in each of a number of sectors I define (right now, 20 sectors across the sensor's 120-degree field of view). From the closest distance in each sector, I will develop an appropriate algorithm. The default ArduPilot OA provides sectors of 45 degrees each; mine could be much finer. My intention is to generate a velocity vector which guides the drone's direction. As far as I know, the appropriate way to do that is in Guided. But what about staying in Loiter, still pushing RC1 (pitch), and having the drone follow the "guiding" velocity vector that I create? There is a hint in AC_Avoid.cpp where the drone decides to stop or "slide". What if, instead of "slide", it would "follow" some direction?


(See AC_Avoid.cpp, line 341.)

What do you think?
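For what it is worth, the closest-object-per-sector step described above could be sketched like this. The sector count, FOV, and sensor frame (x forward, in metres) come from the description; everything else is an assumption for illustration, not ArduPilot code.

```python
import math

# Sketch: split a 120-degree radar FOV into 20 sectors and keep the
# closest return per sector, as a first step toward an avoidance vector.

FOV_DEG = 120.0
N_SECTORS = 20

def closest_per_sector(points, fov_deg=FOV_DEG, n=N_SECTORS):
    """points: list of (x, y) in metres, sensor frame, x forward.
    Returns n minimum distances (None where a sector saw nothing)."""
    width = fov_deg / n
    mins = [None] * n
    for x, y in points:
        ang = math.degrees(math.atan2(y, x))  # bearing from boresight
        if abs(ang) > fov_deg / 2:
            continue                          # outside the FOV
        idx = min(int((ang + fov_deg / 2) // width), n - 1)
        d = math.hypot(x, y)
        if mins[idx] is None or d < mins[idx]:
            mins[idx] = d
    return mins
```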

Re the radar's input to object avoidance: simple avoidance only makes use of 45-degree sectors, but BendyRuler has a 2D object database where it stores the locations of objects that have been sensed by the lidar. To get the radar's input into AP's object avoidance we would probably need a new driver, or you could send in the data over MAVLink using the OBSTACLE_DISTANCE message.
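For the MAVLink route, a hedged sketch of sending OBSTACLE_DISTANCE with pymavlink might look like the following. The 2-degree increment and the min/max range values are assumptions, and `master` is an existing mavutil connection created elsewhere; 65535 (UINT16_MAX) marks "no obstacle" per the message spec.

```python
import time

INVALID = 65535  # UINT16_MAX = no obstacle in this slot

def build_distances(sector_mins_cm, total=72, invalid=INVALID):
    """Copy per-sector minimum distances into the 72-slot message array."""
    out = [invalid] * total
    for i, d in enumerate(sector_mins_cm[:total]):
        if d is not None:
            out[i] = int(d)
    return out

def send_obstacle_distance(master, distances_cm, increment_deg=2):
    # master: a pymavlink mavutil connection (not created in this sketch)
    master.mav.obstacle_distance_send(
        int(time.time() * 1e6),   # time_usec
        0,                        # sensor_type: MAV_DISTANCE_SENSOR_LASER
        distances_cm,             # 72 distances in cm
        increment_deg,            # angular width per element (assumed)
        20, 3000)                 # min/max distance in cm (assumed)
```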

If you’re looking to write your own obstacle avoidance library you could extend the AC_Avoidance library to add a 3rd path planning method (the 1st and 2nd methods being BendyRuler and Dijkstra’s).

Doing obstacle avoidance is quite difficult but if you’re interested in contributing to AP’s avoidance features there are a number of ways that we know we can improve it and these are in the issues list. Coincidentally I’ve added links to some of them from our GSoC page. I’m not suggesting you’re a GSoC student of course but it’s just one place that I’ve listed up some of the top priority items.

Yes. The Redtail controller takes input from two nodes: one is the pose data coming from the trailnet DNN, the other is the input from the /joy node. By default the trailnet data is used. The controller periodically checks for any stick movements, e.g. when you see the drone behave unexpectedly and try to override its flight direction. In that case it takes the stick inputs instead of the trailnet output to calculate the new pose.positions before they are sent to the FC via MAVROS. See line 842 and following: https://github.com/mtbsteve/redtail/blob/12b36815e0a764a3a54c258f00f6958f76c1104c/ros/packages/px4_controller/src/px4_controller.cpp#L842
The RC controller is ideally only used in emergency situations, to take over full human control by switching to Loiter or another human-controlled flight mode.

Actually, that was my idea as well when I started to play with my obstacle-avoidance projects… but I could not find a way to do this in Loiter or other non-autonomous modes. So if you find a solution along the lines of what @rmackay9 suggested, I would be all ears and very interested :slight_smile:

@mtbsteve Thanks for explaining the mechanism of the joystick in detail. Technically it is RC override, so I guess we read the RC_IN message and write the RC_OUT back to AP. It will be tested soon.

About my algorithm: basically, it is developed based on this publication https://www.researchgate.net/publication/224668909_Mobile_robot_obstacle_avoidance_in_a_computerized_travel_aid_for_the_blind. I think you may want to have a look. My radar returns a beam of nearly 110 degrees with multiple objects, so my job is mapping many objects into smaller sectors (the 72 elements of the array described in the OBSTACLE_DISTANCE message). I will share the results when they come.
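That mapping could be sketched as follows, assuming a 110-degree FOV centred on the nose (bearings from -55 to +55 degrees) and keeping the closest return per element; the numbers are illustrative, not taken from the paper.

```python
# Sketch: fold multiple radar detections (bearing in degrees, range in cm)
# into the 72-element OBSTACLE_DISTANCE array, closest return per slot.

INVALID = 65535  # UINT16_MAX = no obstacle

def map_detections(detections, fov_deg=110.0, n=72):
    width = fov_deg / n          # ~1.53 degrees per element here
    half = fov_deg / 2
    out = [INVALID] * n
    for bearing_deg, range_cm in detections:
        if abs(bearing_deg) > half:
            continue             # outside the radar's FOV
        idx = min(int((bearing_deg + half) // width), n - 1)
        out[idx] = min(out[idx], int(range_cm))
    return out
```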

By the way, let me ask this question:

The picture below shows the "red circles": they are the 72 elements mapped from my 110-degree radar. Does a smaller circle radius mean better resolution? I saw @rmackay9's video about implementing BendyRuler and there were a lot of circles with different radii (definitely 360 degrees with a spinning lidar).

[image]

@bigboy061293,

re the circle size: yes, smaller means better resolution. There is an OA_DB_BEAM_WIDTH parameter that specifies how wide the beam is for objects coming in from the proximity sensor. So a wider beam will lead to fewer objects, but of a larger size.
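As a back-of-envelope illustration of the geometry (not AP's actual code), the circle radius grows with both the range and the configured beam width:

```python
import math

# Rough illustration only, not ArduPilot's implementation: an object seen
# through a beam of the given width subtends roughly this radius.

def approx_object_radius(distance_m, beam_width_deg):
    return distance_m * math.tan(math.radians(beam_width_deg) / 2)
```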

Not that it matters too much, but this parameter is probably in the wrong place: it should probably be in the Proximity and/or RangeFinder library. For the moment, though, it is only used by the object database, so I put it there.

P.S. this parameter is quite new and not included in Copter-4.0.3. It will go out with Copter-4.0.4 though.

@rmackay9,

I am going to test this.

By the way, I only input OBSTACLE_DISTANCE, not DISTANCE_SENSOR, yet Mission Planner still reports it in the proximity display window (Ctrl+F, Proximity Sensor). I inject an array of 72 elements covering -50 degrees to +50 degrees.

  1. What is the mechanism for choosing the 0th sector of the 8-sector proximity view? Is it the middle element of my array, or an average of several elements?

  2. So in manual modes (Loiter, …), ArduPilot will use that information to perform the avoidance (slide, stop)?

[image]

[image]

One more thing: can we prevent it from taking the value from the 72-element array, and instead use another source sent via DISTANCE_SENSOR?

I believe it chooses the smallest distance in specific sectors:
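If that guess is right, the reduction would look something like this. This is a sketch of the guessed behaviour only; the sector alignment and numbering are assumptions, not checked against the AP_Proximity source.

```python
# Sketch: collapse the 72-element OBSTACLE_DISTANCE array into 8 proximity
# sectors of 45 degrees each by taking the minimum valid distance in each.

INVALID = 65535  # UINT16_MAX = no obstacle

def to_eight_sectors(distances, n=72):
    per_sector = n // 8  # 9 elements per 45-degree sector
    out = []
    for s in range(8):
        chunk = [d for d in distances[s * per_sector:(s + 1) * per_sector]
                 if d != INVALID]
        out.append(min(chunk) if chunk else INVALID)
    return out
```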

However, it does not make sense to me, because the proximity sector shows a non-middle element of the array. Photos below:

  1. The console shows my array, sent to AP every 10 ms by a Raspberry Pi via MAVLink.
  2. The proximity UI and MAVLink inspector did not show the middle element (89), and sometimes they did not update at all.

[image]

[image]

[image]

I have checked the angle offset, but I do not think that is the problem.
@rmackay9, could you please have a look at this? It would be greatly appreciated!

UPDATE:

When looking at the MAVLink inspector, I observe that DISTANCE_SENSOR behaves very strangely: for 5 seconds after rebooting, this message goes out at a high rate, depending on the SR1_EXTRA3 parameter. After 5 seconds, it no longer goes out that fast; sometimes it was 0.3 Hz, sometimes 0 Hz.

[image]

Moreover, even if I send a dummy 72-element array of all zeros with the middle element (#36) set to 500, the DISTANCE_SENSOR rate still "blinks" between 0 Hz and 0.3 Hz.

[image]

Am I missing something?

MORE UPDATE:

I have solved the "blinking" DISTANCE_SENSOR problem by adjusting the stream rate in Mission Planner; I guess it was overloaded. However, the shortest distances are still incorrect. I sent out a dummy array again with the first element 0 cm, the middle element 200 cm, the last element 150 cm, and the rest all 500 cm, but here is what I see in the proximity UI:

[image]

I think I have found the problem:

Although the "red circles" are displayed correctly relative to the quad's heading, the sectors' shortest distances are shifted counter-clockwise by 45 degrees. Please note the distance information in my console: the red values are the shortest distances, supposed to appear in sectors 7, 0, and 1, but in the UI they appear in sectors 6, 7, and 0.

[image]
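One way to check the suspected offset is to compare the two sector conventions directly. This is pure speculation about the cause (whether sector 0 is centred on the nose or starts at the nose), sketched so it can be tested against the console data:

```python
# An obstacle just left of centre lands in sector 7 under one convention
# and sector 0 under the other -- a half-sector (22.5 deg) disagreement
# that would look exactly like a one-sector shift in the UI.

def sector_centred(angle_deg):
    """Sector 0 centred on 0 deg (boundaries at +/-22.5, 67.5, ...)."""
    return int(((angle_deg + 22.5) % 360) // 45)

def sector_edge_aligned(angle_deg):
    """Sector 0 starting at 0 deg (boundaries at 0, 45, 90, ...)."""
    return int((angle_deg % 360) // 45)
```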

I have solved this temporarily by checking whether the drone is in a manual mode (Loiter, AltHold) or Auto. If manual, I just send DISTANCE_SENSOR with the appropriate sectors and minimum distances; if Auto, I send OBSTACLE_DISTANCE as before.
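A sketch of the manual-mode half of that workaround, assuming pymavlink and the MAV_SENSOR_ROTATION yaw enums (0 = forward, 1 = yaw 45°, … 7 = yaw 315°); the range limits and reusing the orientation value as the sensor id are assumptions, and `master` is an existing mavutil connection:

```python
# Sketch: one DISTANCE_SENSOR message per valid proximity sector.

INVALID = 65535  # UINT16_MAX = no obstacle

def sector_messages(sector_mins_cm):
    """Pair each valid sector minimum with its orientation enum value."""
    return [(orient, int(d))
            for orient, d in enumerate(sector_mins_cm[:8])
            if d != INVALID]

def send_sectors(master, sector_mins_cm, boot_ms):
    for orient, d in sector_messages(sector_mins_cm):
        master.mav.distance_sensor_send(
            boot_ms,    # time since boot (ms)
            20, 3000,   # sensor min/max range in cm (assumed)
            d,          # measured distance in cm
            0,          # type: MAV_DISTANCE_SENSOR_LASER
            orient,     # sensor id (reusing the sector index: an assumption)
            orient,     # orientation: MAV_SENSOR_ROTATION_YAW_* enum value
            0)          # covariance: unknown
```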

I would greatly appreciate it if someone could point out where the problem comes from!

@Michael_Oborne @bugobliterator

Can I implement this code on ArduPlane? My frame is a quadplane with 4 tilt motors.

Also, can I add the other modes?

Yes, you can, if you change the source code and get it to work.

Which source code? Could you tell me specifically? I think that is master, right?

Yes, code changes and pull requests must be based on the master branch.
