I was reading this question here:
What datatype to use when storing latitude and longitude data in SQL databases?
And it seems the general consensus is that DECIMAL(9,6) is the way to go. The question for me is: how much accuracy do I really need?
For instance, Google's API returns a result like:
"lat": 37.4219720,
"lng": -122.0841430
Out of -122.0841430, how many digits do I need? I've read several guides but I can't make enough sense out of them to figure this out.
To be more precise: if I want to be accurate to within 50 feet of the exact location, how many decimal places do I need to store?
Perhaps the better question is really a non-programming one: how much more accuracy does each decimal place give you?
Is it this simple?
- x00 = 6000 miles
- xx0 = 600 miles
- xxx = 60 miles
- xxx.x = 6 miles
- xxx.xx = .6 miles
- etc?
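Here's a rough sketch of my guess in code (this assumes ~69 miles per degree of latitude, which I believe is about right; I know a degree of longitude shrinks toward the poles, so treat it as a latitude-only estimate):

```python
# Back-of-envelope: how much ground distance does each stored
# decimal place of a latitude value represent?
MILES_PER_DEGREE = 69.0  # approximate; longitude degrees are smaller away from the equator
FEET_PER_MILE = 5280

def precision_miles(decimal_places: int) -> float:
    """Approximate ground distance covered by a change in the last stored digit."""
    return MILES_PER_DEGREE / 10 ** decimal_places

if __name__ == "__main__":
    for d in range(7):
        miles = precision_miles(d)
        print(f"{d} decimal places ~= {miles:10.6f} miles ({miles * FEET_PER_MILE:,.0f} ft)")
```

By this estimate, 4 decimal places gets you to roughly 36 feet, which would already be within my 50-foot target, and each further place divides the error by 10.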