suggestion about get_accuracy_from_hdop() formula
Submitted by Fabrice Bellet
Description
I think the HDOP-to-accuracy formula should be tweaked a bit, to provide more realistic values if possible. I understand that no universal formula exists, as it seems to depend on the hardware precision of each GPS device, according to documents I have read (*). According to this information, the HDOP should roughly be multiplied by this internal hardware precision, which is usually considered to be 5-10m.
(*) http://www.developerfusion.com/article/4652/writing-your-own-gps-applications-part-2/3/
The primary reason for this change is to avoid returning an accuracy of zero when HDOP is below 1.0. Such a value has no real meaning, and should be avoided IMO. Why not, for example, return 5*max(hdop) for each range tested in get_accuracy_from_hdop()?
With my Garmin GPS device, a "good" accuracy is generally between 5m-15m while moving in an urban environment. I can reach 3m-5m with a perfectly clear view of the sky (non-urban environment). I guess the antenna of this device may be of good quality.
For example, I traced the NMEA data of a cheap Bluetooth GPS device in an urban environment, and the HDOP value was approximately in the range [0.8, 2.0], with the majority of the values around 1.0.
What about these values?

    if (hdop <= 1)
    -        return 0;
    +        return 5;
    else if (hdop <= 2)
    -        return 1;
    +        return 10;
    else if (hdop <= 5)
    -        return 3;
    +        return 25;
    else if (hdop <= 10)
             return 50;
    else if (hdop <= 20)
             ...
Or simply hdop * 5 when hdop <= 20?