[OSM-dev] Evaluating the Full-History-Dump
osm-lists at mazdermind.de
Thu Oct 14 21:14:38 BST 2010
Am 12.10.2010 00:52, schrieb Anthony:
> On Mon, Oct 11, 2010 at 5:07 PM, Peter Körner <osm-lists at mazdermind.de> wrote:
>> These dumps could be used as test-cases to benchmark tools and
>> compare results without the need of a huge 64GB RAM server or 100
>> MBit Internet connection.
> FWIW, I've been able to download and parse the full history dumps on
> my server with less than a gig of RAM, connected to a cable modem. It
> took about a week and a half, IIRC.
My problem is not processing the full dump with a ready-made tool, but
testing a tool under development against the dump. It always takes
minutes just to skip past those >5000k changesets.
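One way to avoid re-skipping the bulk of the dump on every test run is to cut a small excerpt off the front of the file once and develop against that. A minimal sketch in Python (the function name `head_of_dump` is hypothetical, not part of any existing tool) that copies only the first n top-level elements of an OSM XML stream:

```python
import io
import xml.etree.ElementTree as ET

def head_of_dump(xml_file, n):
    """Build a new <osm> root holding only the first n top-level
    elements (nodes/ways/relations) of an OSM XML stream.
    Uses iterparse, so the full dump is never loaded into RAM."""
    out = ET.Element("osm")
    depth = 0
    taken = 0
    for event, elem in ET.iterparse(xml_file, events=("start", "end")):
        if event == "start":
            depth += 1
        else:
            depth -= 1
            if depth == 1:  # a direct child of <osm> just closed
                out.append(elem)
                taken += 1
                if taken >= n:
                    break
    return out

# Example: keep the first 2 elements of a tiny in-memory dump.
sample = b'<osm><node id="1"/><node id="2"/><node id="3"/></osm>'
excerpt = head_of_dump(io.BytesIO(sample), 2)
```

Writing `excerpt` back out with `ET.ElementTree(excerpt).write(...)` then gives a small, self-contained test file.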
I wrote a tool that splits a planet or full-experimental dump into 4
parts. It runs best on a multicore system (1 decompress, 4 compress and
1 splitter process). I'm currently running it on a full-experimental
dump; I think it will take some more hours, but it would be useful to
share the results on planet.osm.org.
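The 1-decompress / 1-split / 4-compress layout can be sketched roughly as below. This is an illustrative Python sketch, not the actual tool: it splits already-decompressed input round-robin into 4 buckets and bzip2-compresses each bucket in a separate worker process (a real splitter would of course cut on OSM element boundaries, not on arbitrary lines):

```python
import bz2
from multiprocessing import Pool

def compress_part(args):
    """One worker: bz2-compress one output part."""
    idx, text = args
    return idx, bz2.compress(text.encode())

def split_dump(lines, parts=4):
    """Splitter stage: distribute decompressed input round-robin into
    `parts` buckets, then compress the buckets in parallel.
    Returns {part_index: compressed_bytes}."""
    buckets = ["" for _ in range(parts)]
    for i, line in enumerate(lines):
        buckets[i % parts] += line + "\n"
    with Pool(parts) as pool:  # 4 compress workers on a multicore box
        return dict(pool.map(compress_part, enumerate(buckets)))

# Example: split 8 fake elements into 4 compressed parts.
lines = ['<node id="%d"/>' % i for i in range(8)]
result = split_dump(lines, parts=4)
```

The win on a multicore system is that bzip2 compression, by far the most expensive stage, runs on all four output streams at once instead of serially.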