Hello everyone, I am trying to build a drone guided by computer vision. In a SITL simulation it works without problems, but with the real drone I cannot get it to move, even though I can read all of its parameters.
I am using ArduCopter 3.5 and a Raspberry Pi with DroneKit-Python.
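For context, the connection itself is opened with DroneKit's connect(); something along these lines (the device path and baud rate are just placeholders for my wiring, not values anyone else should copy):

from dronekit import connect

# Connect to the flight controller from the Raspberry Pi.
# The device path and baud rate below are placeholders; they depend on
# how the Pi is wired to the autopilot (serial, USB, UDP, ...).
vehicle = connect('/dev/serial0', baud=921600, wait_ready=True)
print("Mode: %s  Armed: %s" % (vehicle.mode.name, vehicle.armed))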
import math

def to_quaternion(roll = 0.0, pitch = 0.0, yaw = 0.0):
    # Convert Euler angles in degrees to a quaternion [w, x, y, z]
    t0 = math.cos(math.radians(yaw * 0.5))
    t1 = math.sin(math.radians(yaw * 0.5))
    t2 = math.cos(math.radians(roll * 0.5))
    t3 = math.sin(math.radians(roll * 0.5))
    t4 = math.cos(math.radians(pitch * 0.5))
    t5 = math.sin(math.radians(pitch * 0.5))

    w = t0 * t2 * t4 + t1 * t3 * t5
    x = t0 * t3 * t4 - t1 * t2 * t5
    y = t0 * t2 * t5 + t1 * t3 * t4
    z = t1 * t2 * t4 - t0 * t3 * t5

    return [w, x, y, z]
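As a quick sanity check of the conversion (not flight code), a pure 90-degree yaw should come out as roughly [0.7071, 0, 0, 0.7071]:

# Sanity check: 90 degrees of yaw only.
q = to_quaternion(roll=0.0, pitch=0.0, yaw=90.0)
print(["%.4f" % c for c in q])  # expected: ['0.7071', '0.0000', '0.0000', '0.7071']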
def set_attitude(roll_angle = 0.0, pitch_angle = 0.0, yaw_rate = 0.0, thrust = 0.5):
    global vehicle
    # Thrust > 0.5: ascend
    # Thrust == 0.5: hold current altitude
    # Thrust < 0.5: descend
    msg = vehicle.message_factory.set_attitude_target_encode(
        0,                                       # time_boot_ms
        1,                                       # Target system
        1,                                       # Target component
        0b00000000,                              # Type mask: bit 1 is LSB
        to_quaternion(roll_angle, pitch_angle),  # Attitude quaternion
        0,                                       # Body roll rate in radians/second
        0,                                       # Body pitch rate in radians/second
        math.radians(yaw_rate),                  # Body yaw rate in radians/second
        thrust                                   # Thrust
    )
    vehicle.send_mavlink(msg)
    vehicle.flush()
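For completeness, the calling side looks roughly like this (a simplified sketch: the GUIDED_NOGPS mode, the arming step, the resend rate and the pitch sign are my assumptions about what ArduCopter expects, not tested values):

from dronekit import VehicleMode
import time

# Sketch of the calling side. Assumptions: the copter must be in
# GUIDED_NOGPS mode and armed, and the attitude target is re-sent
# continuously, since ArduCopter is expected to stop acting on a
# SET_ATTITUDE_TARGET that has gone stale.
vehicle.mode = VehicleMode("GUIDED_NOGPS")
vehicle.armed = True
while not vehicle.armed:
    time.sleep(0.5)

# Pitch the nose down 5 degrees for 3 seconds while holding altitude,
# re-sending the target at about 10 Hz.
start = time.time()
while time.time() - start < 3:
    set_attitude(pitch_angle=-5.0, thrust=0.5)
    time.sleep(0.1)

# Level out again.
set_attitude(thrust=0.5)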
Could somebody help me? Maybe the way I am sending the data is not correct.