[Imports] First questions

Pieren pieren3 at gmail.com
Mon Jul 13 16:34:53 BST 2009


Hello list,

Thanks for creating this dedicated list for imports!

Here in France, we are preparing an import of a large amount of
landuse data (11,300,000 nodes defining 200,000 polygons).
We are currently running tests in which we create the polygons one
after the other, each complete with its nodes. What we want to avoid
is an import that creates the 11 million nodes and only starts
creating the ways two or three weeks later: the risk is too high that
many nodes would be manually removed in the interim.
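
To illustrate what "one after the other" means in practice, here is a
minimal Python sketch (not our actual tooling; the coordinates and
the landuse tag are made-up examples) of how one polygon's nodes and
its way can travel in a single osmChange payload, using the negative
placeholder IDs that the API 0.6 diff upload accepts:

    # Minimal sketch: one polygon's nodes and its way go together in
    # a single diff upload, so they can never be split weeks apart.
    # Coordinates and the landuse tag are made-up examples.
    import xml.etree.ElementTree as ET

    def polygon_to_osmchange(coords, changeset_id):
        """coords: list of (lat, lon); the first point is not
        repeated, the way is closed by re-referencing node -1."""
        root = ET.Element("osmChange", version="0.6",
                          generator="import-sketch")
        create = ET.SubElement(root, "create")
        # Negative IDs are placeholders; the server assigns real IDs.
        for i, (lat, lon) in enumerate(coords, start=1):
            ET.SubElement(create, "node", id=str(-i),
                          changeset=str(changeset_id),
                          lat=str(lat), lon=str(lon))
        way = ET.SubElement(create, "way",
                            id=str(-(len(coords) + 1)),
                            changeset=str(changeset_id))
        for i in range(1, len(coords) + 1):
            ET.SubElement(way, "nd", ref=str(-i))
        ET.SubElement(way, "nd", ref="-1")  # close the ring
        ET.SubElement(way, "tag", k="landuse", v="forest")
        return ET.tostring(root, encoding="unicode")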

Some questions here:
- Because all the polygons are connected to each other (adjacent
landuse areas share nodes), it is very hard to split the import into
small parts without either creating duplicate nodes or merging data
(download area, merge, upload). Is the best approach to create a
single .osm file with this whole amount of data? (For test purposes
we have already generated such a file; it is 1.4 GB uncompressed.)
- About the import itself, what are the recommendations?
  - use a script (like bulk_upload.py) that continuously uploads
small changesets (about 1,000 changes each)?
  - create bigger changesets?
  - insert delays between changesets (30 to 60 seconds)? (a sketch
combining this with small changesets follows this list)
  - or send the file to one of the server administrators, who could
load the data more directly (thus saving some bandwidth)?
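
To make the first and third options concrete, here is the kind of
throttled loop we are considering. This is a rough sketch only: the
chunk size, the 45-second delay, the changeset="0" placeholder
convention and the credentials are assumptions, while the endpoints
are the standard API 0.6 changeset calls.

    # Rough sketch of a throttled bulk upload: one changeset per
    # chunk of ~1000 changes, with a pause between changesets.
    # Chunks are assumed to be pre-built osmChange strings that use
    # changeset="0" as a placeholder attribute value.
    import time
    import requests

    API = "https://api.openstreetmap.org/api/0.6"
    AUTH = ("user", "password")  # placeholder credentials

    def upload_chunks(chunks, delay=45):
        for chunk in chunks:
            # Open one changeset per chunk.
            r = requests.put(API + "/changeset/create", auth=AUTH,
                             data='<osm><changeset><tag k="comment" '
                                  'v="landuse import"/></changeset>'
                                  '</osm>',
                             headers={"Content-Type": "text/xml"})
            r.raise_for_status()
            cs_id = r.text.strip()  # server returns the new id
            # Diff upload: nodes and ways go up in the same request.
            payload = chunk.replace('changeset="0"',
                                    'changeset="%s"' % cs_id)
            r = requests.post(API + "/changeset/%s/upload" % cs_id,
                              auth=AUTH, data=payload,
                              headers={"Content-Type": "text/xml"})
            r.raise_for_status()
            requests.put(API + "/changeset/%s/close" % cs_id,
                         auth=AUTH)
            time.sleep(delay)  # give the server breathing room

Would something like that be acceptable load-wise, or should the
delay be longer?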

Thanks for your help,
Pieren
