I've set up a local OpenStreetMap instance for a project I'm currently working on, and we need to load a very large amount of data into our OpenStreetMap database. The original data is stored in a FileGDB of around 40 GB. It has been converted into the OSM structure (with negative ids and no version, changeset, etc. attributes).
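To give an idea of the structure, an element in the converted file looks roughly like this (the id, coordinates, and tag here are made up; the point is the negative id and the missing version/changeset/timestamp attributes):

    <node id="-12345" lat="45.50" lon="-73.60">
      <tag k="name" v="example"/>
    </node>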
My question is: what would be the ideal way to proceed with loading this initial set of data?

I've tried bulk_upload.py, which takes about 1 hour to process 1/30000 of the data. Either my local API 0.6 instance needs to be tuned, or the API is simply not designed for bulk imports of this size.
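For reference, I'm invoking it more or less like this (option names as I recall them; the script's --help has the authoritative list, and the credentials shown are placeholders):

    python bulk_upload.py -i converted.osm -u myuser -p mypassword -c "Initial import"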
Osmosis requires that there be changeset and version attributes, which do not exist in the data I currently have. Writing directly to the database without going through the API would probably speed up the process, but I haven't found a way to do this.
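The only workaround I can think of for the Osmosis complaint is to stamp placeholder attributes onto every element before feeding it the file. A rough, untested sketch of that idea (the script name, placeholder values, and overall approach are mine, not part of any official tool):

    import sys
    import xml.sax
    from xml.sax.saxutils import XMLGenerator

    class Stamper(xml.sax.handler.ContentHandler):
        """Stream-copy an .osm file, adding placeholder version/changeset/
        timestamp attributes to every node, way and relation."""

        def __init__(self, out):
            xml.sax.handler.ContentHandler.__init__(self)
            self.out = XMLGenerator(out, "utf-8")

        def startDocument(self):
            self.out.startDocument()

        def startElement(self, name, attrs):
            attrs = dict(attrs)
            if name in ("node", "way", "relation"):
                attrs.setdefault("version", "1")    # fake version
                attrs.setdefault("changeset", "1")  # fake changeset id
                attrs.setdefault("timestamp", "1970-01-01T00:00:00Z")
            self.out.startElement(name, attrs)

        def endElement(self, name):
            self.out.endElement(name)

        def characters(self, content):
            self.out.characters(content)

    if __name__ == "__main__":
        # usage: python stamp_attrs.py input.osm output.osm
        with open(sys.argv[2], "w", encoding="utf-8") as f:
            xml.sax.parse(sys.argv[1], Stamper(f))

SAX keeps memory usage flat, which matters for a 40 GB dataset, but I have no idea whether Osmosis or anything downstream will accept fabricated versions and changesets, so I'd welcome advice on whether this is sane at all.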
Thank you in advance for your advice.