GPS_HDOP_GOOD has a range of 100-900, yet the HDOP value displayed in MP is typically less than 1.
Why the discrepancy in units, and what is a good default value for GPS_HDOP_GOOD?
100 corresponds to an HDOP of 1.0; the parameter is simply the HDOP scaled by 100. So the default of 140 means an HDOP of 1.4. I can’t explain why it is like this.
Integer representations can be stored and compared exactly, without floating-point rounding.
However, for small values that only need two decimal places of precision, the parameter could just as easily be represented as a float with no real loss of precision. It’s probably a holdover from earlier design decisions, or perhaps it avoids a type conversion in the GPS backend.
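For anyone curious, here is a minimal sketch (hypothetical names, not the actual ArduPilot source) of what the scaled-integer convention looks like in practice: the HDOP is kept as an integer equal to the real HDOP times 100, compared against the parameter in integer form, and only converted to a float when displayed, as a GCS like MP does.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Parameter value as stored: GPS_HDOP_GOOD = 140 means HDOP 1.40
    const uint16_t gps_hdop_good = 140;
    // Example value from the receiver, scaled the same way: 92 means HDOP 0.92
    const uint16_t reported_hdop = 92;

    // The comparison happens entirely in integers, no float conversion needed.
    const bool hdop_ok = reported_hdop <= gps_hdop_good;

    // Only convert to a float for display, which is why MP shows values like 0.92.
    std::printf("HDOP %.2f, threshold %.2f, ok=%d\n",
                reported_hdop * 0.01, gps_hdop_good * 0.01, hdop_ok);
    return 0;
}
```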