[Openstreetmap] How do we handle large amount of GPS points?

Petter Reinholdtsen pere at hungry.com
Sat Nov 20 16:10:28 GMT 2004


[SteveC]
> thats fantastic!

Yes, it is. :)

I'm still investigating how to convert the complete dump to GPX on my
small machine, but was able to get a subset out.  I fetched all points
with longitude 10.71 - 10.76, and latitude 59.93 - 59.955, and
exported it to
<URL:http://developer.skolelinux.no/~pere/openstreetmap/oslo-strip-10.71-10.76-59.93-59.955.gpx.gz>.
This file includes 537047 points.  When I try to import it into qgis,
it eats all the memory on my 256 MB machine, and crashes (or is killed
by the kernel, not sure which :).  We need to find tools capable of
handling large data sets.
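For reference, the bounding-box subset above can be extracted with a
simple filter.  A minimal Python sketch, assuming the points are
available as (lon, lat) pairs (the sample list below is made up; the
real data lives in the database dump):

```python
# Hypothetical sample points as (longitude, latitude) pairs.
points = [(10.72, 59.94), (10.80, 59.95), (10.75, 59.90)]

# The bounding box quoted above: longitude 10.71-10.76, latitude 59.93-59.955.
LON_MIN, LON_MAX = 10.71, 10.76
LAT_MIN, LAT_MAX = 59.93, 59.955

def in_bbox(lon, lat):
    """True if the point falls inside the Oslo strip bounding box."""
    return LON_MIN <= lon <= LON_MAX and LAT_MIN <= lat <= LAT_MAX

subset = [p for p in points if in_bbox(*p)]
```

Filtering a stream this way, point by point, avoids loading the whole
dump into memory the way qgis does.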

I plan to export on time instead, and split the dump into week or
month bulks.  But that code isn't done yet, so this will do for now.
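The time-based split would bucket each point by its timestamp.  A
rough sketch of the idea (the timestamps here are invented examples,
and ISO-format timestamps are an assumption about the dump):

```python
from collections import defaultdict
from datetime import datetime

def bucket_key(timestamp, by="week"):
    """Map an ISO timestamp to an ISO (year, week) or (year, month) bucket."""
    t = datetime.fromisoformat(timestamp)
    if by == "week":
        year, week, _day = t.isocalendar()
        return (year, week)
    return (t.year, t.month)

# Group hypothetical point timestamps into monthly bulks.
buckets = defaultdict(list)
for stamp in ["2004-11-01T12:00:00", "2004-11-20T16:10:28",
              "2004-12-02T08:00:00"]:
    buckets[bucket_key(stamp, by="month")].append(stamp)
```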

Part of the problem is that my home network is without Internet
connection at the moment.  This makes it harder to get any real work
done, as most of my tools and documentation are on the web.

> Now, my table looks like lat,lon,alt,time,userid

My table includes (lat,lon)::point, altitude, quality
(0=none, 1=gps, 2=dgps), satellites, and hor. dilution.  I fetch data
using the GGA NMEA sentence.
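A GGA sentence carries exactly those fields.  A minimal parser sketch
(the field layout is the standard NMEA 0183 GGA layout; the sample
sentence in the usage note is the common textbook example, not real
data from my logs):

```python
def parse_gga(sentence):
    """Extract lat, lon, fix quality, satellite count, HDOP and altitude
    from a NMEA GGA sentence (checksum validation omitted for brevity)."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_degrees(value, hemisphere):
        # NMEA encodes coordinates as ddmm.mmmm; convert to decimal degrees.
        head, minutes = divmod(float(value), 100)
        degrees = head + minutes / 60.0
        return -degrees if hemisphere in ("S", "W") else degrees

    return {
        "lat": to_degrees(fields[2], fields[3]),
        "lon": to_degrees(fields[4], fields[5]),
        "quality": int(fields[6]),      # 0 = no fix, 1 = GPS, 2 = DGPS
        "satellites": int(fields[7]),
        "hdop": float(fields[8]),       # horizontal dilution of precision
        "altitude": float(fields[9]),   # metres above mean sea level
    }
```

For example,
parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
yields quality 1 (plain GPS fix) with 8 satellites and HDOP 0.9.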

> The question is, what other fields are important to people? FWIW my
> views are:

I believe the exchange format should include enough data to filter it
using DGPS info, and that the long-term storage format should include
some info on the perceived accuracy.  Not sure which values are most
useful for this.

> fix - you guys might want that?

It might be useful, or it might be a complete waste.  I'm not sure
yet.

I'll try to take a look at the XML-RPC stuff when my network at home
comes back.



