GSoC 2018: Balance Bot with Ardupilot


It is the year 2018.
Ardupilot forces have victoriously swept over most of the RC world. Enthusiast and academic communities alike have been taken over by this brand of elite and deadly-reliable autopilot software. Current reports from the ground indicate that their arsenal includes sophisticated Multicopters, Helicopters, Planes (a HUGE list) and now, more recently… what’s that? BALANCE BOTS too!

The serious stuff begins…
The concept of a Self-Balancing Robot (Balance Bot) running on Ardupilot software, though still a very refreshing idea, is not entirely new. It was done way back by Jason Short, one of the first ArduCopter developers. More recently, Jonathan Challinger also made his Balance Bot based on Ardupilot. The robust and modular software structure, extensive hardware support and easy-to-implement libraries make Ardupilot an ideal platform for a Balance Bot. The aim of this project is to build on this idea and turn the Balance Bot into a fully supported Ardupilot platform, as an extension of the Rover vehicle class.

How it works

We use a two-stage control algorithm for the balance bot. The first order of business is to ensure the balance bot does not topple. The inner loop of the controller uses the IMU sensors to get the present pitch (lean) angle, compares it with a desired pitch angle, and sends an appropriate throttle value to the motors. This desired pitch angle is zero (upright) when the bot is not moving. What happens when it wants to move? In that case, the desired pitch is set to a small angle in the direction of movement. If the bot tries to maintain a forward pitch, it has to accelerate forward to keep its balance, and likewise for reverse. The outer control loop controls linear velocity. It basically sees to it that the acceleration due to pitching in any direction does not get out of hand, by modifying the desired pitch accordingly. Balance bots by design use a differential drive (skid steering), so turning is done by setting different velocities for each wheel.
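The two loops described above can be sketched roughly as follows. This is only an illustration of the cascade idea: the P-only loops, gains and function names are my assumptions for clarity, not ArduPilot's actual PID implementation.

```python
# Illustrative sketch of the two-stage controller. Gains and the
# simple P-only loops are assumptions; the real code uses full PIDs.

def pitch_loop(desired_pitch, measured_pitch, kp=4.0):
    """Inner loop: convert pitch error (rad) into a throttle command."""
    return kp * (desired_pitch - measured_pitch)

def speed_loop(desired_speed, measured_speed, kp=0.1, max_pitch=0.15):
    """Outer loop: convert speed error into a small desired pitch (rad),
    capped so the bot never demands a lean it cannot recover from."""
    pitch = kp * (desired_speed - measured_speed)
    return max(-max_pitch, min(max_pitch, pitch))

def mix_skid_steer(throttle, steering):
    """Differential drive: turning is an offset between the wheels."""
    return throttle + steering, throttle - steering

# Stationary and upright -> desired pitch is zero, no throttle needed.
desired_pitch = speed_loop(desired_speed=0.0, measured_speed=0.0)
throttle = pitch_loop(desired_pitch, measured_pitch=0.0)
left, right = mix_skid_steer(throttle, steering=0.0)
```

A forward speed demand produces a small forward desired pitch, the inner loop chases that pitch, and the resulting lean makes the bot accelerate forward.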

Phase one: Simulation
Simulation for Ardupilot is more than just a convenience tool for testing and development. The SITL simulator forms the bedrock of Ardupilot’s continuous integration checks: tests that each patch to the main code has to clear before it is accepted. This ensures that no patch can corrupt or break the existing code, while keeping code contribution simple and open to all. So basically, no new vehicle code can be accepted without an SITL simulator backing it.
The fact that SITL is one of those libraries without any examples or documentation makes it a demon to wrestle for newbies to the code base. I intend to change that after this project. Still, after a week or two of frustration, dead ends, lots of studying physics :joy: and thanks to my two amazing mentors, we managed to get a reasonably good simulator running. It still needs a bit more work to make it accurate. These are the sources I used to model the balance bot:
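For a feel of what such a model involves: a balance bot is usually treated as an inverted pendulum on wheels. Here is a toy Euler-integration sketch in that spirit. It is not the actual SITL backend; the point-mass equation and the parameter values are assumptions for illustration.

```python
import math

# Toy inverted-pendulum-on-wheels physics, Euler-integrated.
# Point-mass assumption; G and L are illustrative values.

G = 9.81   # gravity, m/s^2
L = 0.2    # height of centre of gravity above the axle, m

def step(theta, theta_dot, wheel_accel, dt=0.001):
    """Advance the pitch angle theta (rad) by one time step.
    Gravity topples the bot; accelerating the wheels under the
    lean (positive wheel_accel for positive theta) rights it."""
    theta_ddot = (G * math.sin(theta) - wheel_accel * math.cos(theta)) / L
    theta_dot += theta_ddot * dt
    theta += theta_dot * dt
    return theta, theta_dot

# With no control input, a small initial lean grows: the bot falls over.
theta, theta_dot = 0.01, 0.0
for _ in range(500):          # 0.5 s of simulated time
    theta, theta_dot = step(theta, theta_dot, wheel_accel=0.0)
```

Running the inner pitch loop against a model like this is what lets the controller be tuned in SITL before the real robot ever moves.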

Phase two: Testing on the real robot
Here is a video of one of my first tests:

Evidently, it was very unstable. Even after some painstaking PID tuning, it still wouldn’t stabilise properly. One reason, as I later figured out (when Jonathan Challinger told me), was that all the weight was at the bottom. This means a very low centre of gravity and hence low inertia, making it incredibly difficult to balance. Remember trying to balance a stick with a lump of clay on it? Same thing, really. This could be fixed by moving the battery to the top. The second reason was that the adaptor between the motors and wheels had a considerable amount of backlash, which is OK for a rover but unforgivable for a balance bot. That meant getting new wheels.
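The stick-and-clay intuition can be made quantitative: for an inverted pendulum, the characteristic time in which a small lean grows scales as sqrt(L/g), so the lower the centre of gravity, the less time the controller has to react. A quick check, with heights that are illustrative guesses rather than measurements of the actual robot:

```python
import math

def fall_time_constant(cog_height_m, g=9.81):
    """Time constant (s) of an inverted pendulum's unstable pole:
    small lean angles grow roughly as exp(t / tau), tau = sqrt(L/g)."""
    return math.sqrt(cog_height_m / g)

low = fall_time_constant(0.05)   # battery at the bottom (assumed 5 cm CoG)
high = fall_time_constant(0.20)  # battery on top (assumed 20 cm CoG)
```

Quadrupling the CoG height doubles the time constant, which is why moving the battery up made such a difference.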

Phase three: Manual Mode (Current)
Manual mode in Rover means the user inputs are mapped directly to the motors without any control system in between. For the balance bot, we decided to implement only the inner loop for pitch control in Manual mode; the speed control is up to the user. One downside is that it is incredibly hard to move the robot without toppling it, because constant pitching can easily make it accelerate out of control. But then again, Manual mode is for the experts :stuck_out_tongue: (which I definitely am not)
In this trial, I moved the battery to the top, but the wheels are still the same. So the robot is a bit twitchy when it moves. Take a look:

What’s next?
The short version: there’s more left to be done than has been done. The immediate tasks ahead are:

  1. Improve the accuracy of the SITL model
  2. Add the second control loop for speed control (using wheel encoders for feedback)
  3. Test and verify each rover mode
  4. User Documentation
  5. Unit Tests
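On task 2 above, here is a sketch of how quadrature-encoder counts might be turned into the speed feedback the outer loop needs. The counts-per-revolution and wheel radius are assumed values for illustration, not the actual build's figures.

```python
import math

# Illustrative encoder-to-speed conversion for the speed loop's feedback.
COUNTS_PER_REV = 1440   # quadrature counts per wheel revolution (assumed)
WHEEL_RADIUS_M = 0.05   # wheel radius in metres (assumed)

def wheel_speed(delta_counts, dt):
    """Linear speed (m/s) of one wheel from counts seen over dt seconds."""
    revs = delta_counts / COUNTS_PER_REV
    return revs * 2.0 * math.pi * WHEEL_RADIUS_M / dt

def body_speed(left_counts, right_counts, dt):
    """Average the two wheels to estimate the body's forward speed."""
    return 0.5 * (wheel_speed(left_counts, dt) + wheel_speed(right_counts, dt))
```

Averaging the two wheels cancels the differential component used for turning, leaving just the forward speed the outer loop should regulate.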

Parts used

  1. Pixfalcon (switching to a Pixhawk clone, to use wheel encoders and a DIR/PWM motor driver)
  2. SiK Telemetry radio
  3. Quicrun 1060 brushed ESC (switching to a DIR/PWM motor driver)
  4. Flysky FS-i6s RX/TX
  5. 200 RPM 12V motors (switching to 600 RPM motors with encoders)
  6. 3000mAh LiPo battery
  7. Ublox M8N GPS (currently disabled for testing indoors :stuck_out_tongue:)

On a closing note, I’d like to add that the real brains behind the operation have been my mentors @tridge and @peterbarker. Without these guys, I really would have been lost at sea (probably still figuring out the SITL stuff :wink:). Special mention to @rmackay9 and @jschall for all the help and advice.


Thanks for a great posting! It is a real pleasure working with you on this project.
It is also worth mentioning your current pull request for adding balance bot support to master:

I expect that will be accepted into master soon.


I would like to know if it is possible to use the Pixracer (or other low-cost FCs based on the F405, like Matek, Omnibus…) for this Balance Bot and, in general, for a skid-steering Rover with wheel encoders.
From this docs page I see that 2 PWM outputs and 4 inputs for the quadrature signals from the encoders are needed.
The aforementioned boards have 6 PWM outputs; is it possible to set them up as 2 PWM outputs and 4 inputs?


I don’t think so. Encoders and PWM/DIR motor drivers use aux pins, which all the smaller Pixhawk variants don’t have. That’s the way the libraries are written as of now. I am hoping to change that, but I’m not sure of its feasibility. I would recommend getting one of those Pixhawk clones.
As for Matek/Omnibus, I’m not really sure; the right people to answer would be @tridge or @bugobliterator.

It is possible to do that if the outputs are on the main MCU, as they are on the Matek, Omnibus etc. boards. Just set BRD_PWM_COUNT=2 and the rest can be used for wheel encoding. For the Pixracer it’s possible in the same way.
For something like a Pixfalcon it isn’t nearly as easy, as all 8 outputs are on the IOMCU, which means they are not directly accessible from the main MCU.
Cheers, Tridge


Ah! Just our luck, we ended up with the pixfalcon!

Good to know @tridge

Hey @Ebin_Philip, if you ever need some mechanical advice, I may be able to help out. I’m not much of a code guy myself, but when it comes to the prototype, my mechanical engineering education does help a bit. Haha

Real cool work bro!!


Oh sure! Thanks. Any help would be appreciated. As of now, there is someone who has designed a chassis for the balance bot (ArduRoller - balanced car on pixhawk). Do take a look and see if we can make it better! @Rijoe_Samuel_Mathew Thanks :blush:

Docs are up:
Time to start testing, people!


I have this thing, with an Arduino and stepper motors, controlled with an R/C radio. It is driven manually in that video.

What should I do to replace the Arduino with a Pixhawk while keeping the stepper motors (no encoders)? Is that supported by the code?

The final objective is to place an RTK GPS on it and run a mission consisting of laps on that circuit, which is for 1/18-scale cars (around 2 m wide).

I’ve been designing this robot for a while now; the goal is to have it recognize faces and say hello when it recognizes a person.

I’m using a pixhawk drone brain for the balancing part using ardupilot balance bot code to get it all working with minimal redevelopment. So, thanks a bunch for the project and all the awesome code and discussions that you guys have posted. I wouldn’t have been able to do it without you.

The reason why I am posting is that part of my design is to have a retractable “LEG” to come down and support the robot while it is in park mode. This will require the balancing to be turned on and off depending on whether the leg is down or up. Also, it will require the balance bot to fall in a particular direction when the leg is down and it wants to “Park”.

Does anyone have any suggestions on the best way to accomplish this using the existing code? I was thinking that a special mode could be created that would allow it to only fall in one direction, but my programming ability is perhaps not fully up to snuff to carry this off.

Any help would be awesome. Thanks!