Camera follow another vehicle

I seem to remember functionality where you specify the SYS_ID of another vehicle, and your vehicle’s camera mount starts tracking that SYS_ID. I can’t immediately find documentation for this. Did I dream this up?

Antenna Tracker has SYSID_TARGET, but plane and copter don’t show this parameter in the full parameter list. Maybe I just imagined this functionality outside of antenna tracker.

OK, I’m not crazy. The “Gimbal / Mount Controls” page in the Copter documentation describes a “SysId Target” mode for the camera mount. It lists 7 modes:

  1. Retract Mode
  2. Neutral Mode
  3. MAVLink Targeting
  4. RC Targeting
  5. GPS Point
  6. SysId Target
  7. Home Location

The entry for SysId Target says, “the gimbal points at another vehicle with a specified MAVLink system id. Users never need to actively set the gimbal to this mode and there are no known GCSs that support setting the system id.” I’m not entirely sure how to interpret this sentence. Does it mean that I need to send custom MAVLink to set the SysId? Does this mean some other mechanism exists that sets the gimbal to this mode as a side effect of some other action?

The modes seem to correspond to values for MNTx_DEFLT_MODE, except that 5 (SysId Target) is missing:

Value  Meaning
0      Retracted
1      Neutral
2      MavLink Targeting
3      RC Targeting
4      GPS Point
6      Home Location

@rmackay9 Randy, I think this is your domain?

Ari.

What camera are you using?

I have a Siyi arriving next week, but I would imagine that this should work the same for any mount, including simple servo pan/tilt contraptions.

Ari.

@iter,

Yes, you’re right. It should work but as far as I know it has never been tested and no GCSs support it. Most users will also have no way to send another vehicle’s position to the following autopilot.

Not that it matters too much but I suspect what happened is that one of the other core AP devs did 80% of the work (make the gimbal point at a vehicle, which was the easy bit) but didn’t finish the last 20% (testing, wiki, ground station) which is more difficult.

Thank you, @rmackay9 . Is that dev still around? Should I pick up the slack? In my mind’s eye, the way this would work is similar to Antenna Tracker’s SYSID_TARGET: “The identifier of the vehicle being tracked. This should be zero (to auto detect) or be the same as the SYSID_THISMAV parameter of the vehicle being tracked.”

I’m building a tracking camera to film my rocket planes. It’s hard to capture them with stationary cameras, and asking kind strangers to film produces inconsistent results. I’m doing more than SYSID_TARGET would do by itself–for example, since I know the distance to the target, I can set zoom and focus appropriately if the camera allows it. I’m doing most of this as a standalone Python script, but if support in ArduPilot is close to done, I’m happy to support that effort instead.
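
Roughly what I have in mind, as a sketch (the wingspan and frame-fill numbers are made up for illustration, and the camera-side mapping is still to be worked out):

import math

def framing_for_target(distance_m, wingspan_m=1.5, fill_fraction=0.5):
    # How wide the scene needs to be at the target so the plane fills
    # roughly half the frame.
    frame_width_m = wingspan_m / fill_fraction
    # Horizontal field of view that produces that scene width at this distance.
    fov_deg = 2.0 * math.degrees(math.atan(frame_width_m / (2.0 * distance_m)))
    # Focus distance is simply the distance to the subject, in meters.
    return distance_m, fov_deg

focus_m, fov_deg = framing_for_target(distance_m=120.0)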

Ari.

@iter,

I can help you with the coding part of this if necessary or feel free to modify AP and ideally raise a PR later if you’re up for that.

I think that MNTx_DEFLT_MODE can be set to 5 even though the parameter description doesn’t have this option. Making it so that option does appear in the GCSs is just a matter of modifying the parameter description in the AP_Mount library.
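
For example (untested, and assuming the first mount instance), the parameter can be written directly with pymavlink even though the GCS doesn’t offer the value in a drop-down:

from pymavlink import mavutil

conn = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # connection string is just an example
conn.wait_heartbeat()

# 5 = SysId Target; the parameter description simply doesn't list it yet
conn.mav.param_set_send(conn.target_system, conn.target_component,
                        b'MNT1_DEFLT_MODE', 5,
                        mavutil.mavlink.MAV_PARAM_TYPE_REAL32)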

Re the system id that the gimbal is following, this can apparently be set by sending a DO_SET_ROI_SYSID command (within a command_long or command_int) to AP. This is documented on the MAVLink interface page, including some example commands that can be manually copy-pasted into MAVProxy in case you’re using that GCS. MAVProxy is aimed more at developers, but anyone can use it.
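
A rough pymavlink equivalent (untested; the target system id 2 here is just an example):

from pymavlink import mavutil

conn = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # adjust for your setup
conn.wait_heartbeat()

# Tell the mount to track the vehicle whose SYSID_THISMAV is 2
conn.mav.command_long_send(
    conn.target_system, conn.target_component,
    mavutil.mavlink.MAV_CMD_DO_SET_ROI_SYSID,
    0,      # confirmation
    2,      # param1: system id of the vehicle to point at
    0, 0, 0, 0, 0, 0)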

Thank you for your guidance, @rmackay9. I never got the hang of MAVProxy, but I’m handy with pymavlink.

As long as we’re talking about my project, I want to ask a question higher up the decision tree. I’m building a ground camera. My first inclination was to use Antenna Tracker as the closest vehicle type to what I’m doing. But AT assumes the IMU is mounted on the gimbal and is the only source of attitude data. AT, unlike other vehicle types, lacks support for gimbal mounts. So the next place for me to go was to use ArduPlane and try to get ROI/SYSID to work. Maybe that’s the wrong direction to go? Maybe the thing to do is either add mount support to AT, or introduce a different configuration (“frame type?”) to AT where the IMU is stationary and the gimbal can rotate by specific angles relative to the IMU base?

Ari.

Hi @iter,

Very interesting.

The strategic solution for this problem would be, as you’ve described, to extend antenna tracker so it supports the gimbal library and doesn’t require the autopilot to be mounted on the tracker’s camera platform. I’ve definitely thought of doing this as well but I’ve got a full plate at the moment.

Using another vehicle as a tracker should work too. I would probably have used Rover instead of Plane but perhaps that’s because I’m more familiar with Rover and it’s much simpler.

FYI @peterbarker @iampete

@iter,

I very much like the idea of setting the zoom and focus based on the known distance to the vehicle being filmed. I was thinking of doing something similar recently for a “dronie” using the Siyi A8, although my aim was not to keep the target (e.g. the pilot) in focus but rather to produce that odd effect we sometimes see in TV and movies where the camera moves back as it zooms in, leaving the subject apparently unchanged while the background changes dramatically.

@rmackay9 The dolly zoom. Here’s a YT video that talks about it. I think A8 only has a digital “zoom;” I doubt you’d get the same effect. But ZR10 has a 10x optical zoom. It would be cool to have a canned software setup (a mission item?) that would allow you to do this repeatably and without human intervention.

This brings up an important issue in protocol design. Many camera protocols use arbitrary units for controls, either because they are designed for human use (like a macro/tele button on a camcorder, or the Siyi zoom command) or because the vendor doesn’t want to bother with the specifics of a lens and just uses 100 or 255 to represent full extension, whatever “full” may mean. Like a 1.5ms servo ;=) So if you’re designing a protocol (like MAVLink or AP_Camera), or if you have the ear of a camera vendor like Siyi, I always push for real units. The unit of focus is meters (from focal plane to subject). The unit of zoom is degrees of field-of-view. Don’t let anyone tell you different :=)
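
To make “degrees of field-of-view” concrete, here’s the conversion to and from focal length, which is what a lens spec usually gives you (the sensor width below is a placeholder, not the ZR10’s real number):

import math

def fov_deg_from_focal_length(focal_mm, sensor_width_mm=5.6):
    # Horizontal field of view in degrees for a given focal length.
    return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_mm)))

def focal_length_from_fov_deg(fov_deg, sensor_width_mm=5.6):
    # Inverse: focal length in mm that produces the requested field of view.
    return sensor_width_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))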

Ari.

@iter,

Txs for the video explanation. Re the canned mission idea, I was planning on writing it as a Lua script which can be integrated into missions via the NAV_SCRIPT_TIME mission command.

Re scaling, I bumped into this issue with the Siyi driver’s rotation rate so I measured the gimbal’s actual rate and then the driver does a conversion from desired rate to the gimbal API’s value. It would be nice for the SDK to accept a real rate but, you know, we’ve seen a lot worse sins in other sensor APIs so I’m not too worried about it.
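
The conversion itself is just a scale and clamp, something like this (the 90 deg/s maximum and the -100..100 range are assumptions for illustration, not the measured Siyi numbers):

SIYI_MAX_RATE_DEGS = 90.0   # placeholder: measured rotation rate at full scale
SIYI_MAX_SCALE = 100        # assumed dimensionless -100..100 range in the gimbal API

def rate_to_siyi_scale(rate_degs):
    # Convert a desired rate in deg/s to the gimbal API's arbitrary value.
    scale = int(round(rate_degs / SIYI_MAX_RATE_DEGS * SIYI_MAX_SCALE))
    return max(-SIYI_MAX_SCALE, min(SIYI_MAX_SCALE, scale))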

You can’t control vendors, and I agree that Siyi’s documentation is well above average (it took me all of three evenings to implement the protocol, with unit tests). Where I’m going with this is that you do have control over ArduPilot’s APIs, and I encourage you to define them in terms of physical units. That would make them much easier for users like me to adopt, and would keep us from rolling our own :=)

https://github.com/arikrupnik/tracker/blob/master/siyi.py

@iter,

Ah right. Which interfaces in particular do you mean? We’re definitely not going to take the blame for MAVLink’s interface… some of it is our fault but not most of it…

When you design the zoom/focus API, remember me :=)

AP_Mount uses degrees and degrees/second, that’s excellent:

void set_angle_target(float roll_deg, float pitch_deg, float yaw_deg, bool yaw_is_earth_frame);
void set_rate_target(float roll_degs, float pitch_degs, float yaw_degs, bool yaw_lock);

AP_Camera has dimensionless zoom_pos in

void control(float session, float zoom_pos, float zoom_step, float focus_lock, float shooting_cmd, float cmd_id);

and only stepwise in/out control in

bool set_zoom_step(int8_t zoom_step);
bool set_manual_focus_step(int8_t focus_step);

Now if you had something like this, and in addition made that available in Lua… I realize this is a lot of work for drivers of cameras that don’t support this directly, but you did this for Siyi pan-and-tilt, and that sets expectations high :=)

bool set_zoom(float deg_fov);
bool set_focus(float m);
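
Driver-side, mapping degrees of field-of-view onto whatever the camera actually accepts could be as simple as this (the 62-degree wide-end FOV and the 10x range are my assumptions, not verified ZR10 numbers):

import math

WIDE_FOV_DEG = 62.0      # assumed horizontal FOV at 1x
MAX_ZOOM_RATIO = 10.0    # ZR10 is advertised as 10x optical

def zoom_ratio_for_fov(deg_fov):
    # FOV scales with 1/focal_length, so ratio = tan(wide/2) / tan(requested/2).
    ratio = math.tan(math.radians(WIDE_FOV_DEG) / 2.0) / math.tan(math.radians(deg_fov) / 2.0)
    return max(1.0, min(MAX_ZOOM_RATIO, ratio))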

And just to clarify where I am: I’m expecting my ZR10 next week, and I’m looking forward to testing your suggestions with it. Until then, I’m just thinking out loud.

I’m taking your advice to heart. I’m using Rover. And instead of setting the gimbal on a tripod, I’m setting it up on a rock crawler chassis. Might as well drive the camera instead of carrying it around the flying field :=) Someone gave me this crawler over 10 years ago and it’s mostly sat in the garage. Maybe finally a good use for it. The body is going back in the box. (yes, that’s my 35-year-old BigTrak on the shelf above)

Ari.

Something like this.

Very nice!

Thanks for the advice on the scaling of the zoom and focus. I’ve taken this to heart and it is good timing to review this 'cuz we’re actively working on camera enhancements.