[Talk-us] How to apply changesets

David Niklas doark at mail.com
Mon Mar 13 21:43:02 UTC 2017

On 03/12/2017(Sun) 05:14
Frederik Ramm <frederik at remote.org> wrote:
> Hi,
> On 11.03.2017 15:48, David Niklas wrote:
> > 1: Do I need every change set or just the latest of the year?
> You don't say exactly what you want to achieve so I can't tell you
> exactly what you need. In your subject you talk about changesets, but it
> seems that you are concerned about diff updates published on
> openstreetmap.org. If that's the case: they are not cumulative; you have
> to find out what the timestamp of your data set is and download those
> between then and now.
I apologize for being unclear, I'm referring to the
changeset-XXXXXX.osm.bz2 files.

> Osmosis' "read-replication-interval" action can help you keep track of
> how current your data is, and download all changesets between then and
> now, and combine them into one automatically.
Thanks, but that is not an option for me, as my laptop's disk is too small
and my desktop can't be connected to the internet (I live in a rural
area).
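For reference, a replication setup on a connected machine might look like
the sketch below; the working directory path and file names are
placeholders, and the commands assume Osmosis is installed:

```shell
# One-time setup: writes configuration.txt and state.txt into the working
# directory (the path here is just an example).
osmosis --read-replication-interval-init workingDirectory=/var/osm/replication

# After pointing configuration.txt at the desired replication base URL,
# this fetches every diff between the recorded state and now and merges
# them into a single change file.
osmosis --read-replication-interval workingDirectory=/var/osm/replication \
        --simplify-change \
        --write-xml-change file=changes.osc.gz
```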

> > 2: If I need multiple changesets, can I apply them all at once using
> > multiple --read-xml-change directives?  
> Yes, *and* multiple --apply-change of course. But if you use
> "read-replication-interval" then you will only deal with one diff file
> that has all the changes.
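A chained invocation along these lines is one way to sketch the
multiple-diff case (file names are examples, and the inputs must be
genuine osmChange/.osc files):

```shell
# Sketch: each --apply-change consumes one entity stream and one change
# stream from the pipeline, so the diffs are applied in sequence.
osmosis --read-xml file=planet-old.osm \
        --read-xml-change file=week1.osc --apply-change \
        --read-xml-change file=week2.osc --apply-change \
        --write-xml file=planet-new.osm
```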

Unfortunately, I'm getting an error:
SEVERE: Thread for task 1-read-xml-change failed
org.openstreetmap.osmosis.core.OsmosisRuntimeException: This does not
appear to be an OSM Change XML file.

Here is my (minimal) command line. I checked the hashes and decompressed
the changesets beforehand, and I also checked the hash of the planet
file:
lbzip2 -dkc planet-160718.osm.bz2 | pv -WcarbB 100m | \
  osmosis --read-xml-change file=changesets-160725.osm compressionMethod=none \
          --read-xml file=/dev/stdin compressionMethod=none \
          --apply-change \
          --write-xml file=- compressionMethod=none | \
  pv -WcarbB 100m | lbzip2 -z4c5 > planet-160725.osm.bz2

In the unlikely event that decompression resulted in a corrupted file, I
did try using the compressed and md5sum verified originals, but still got
the same error.
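One thing worth checking is the root element of the input file:
--read-xml-change expects an osmChange document (root <osmChange>, as in
the .osc diff files), whereas the changesets-*.osm dumps on
planet.openstreetmap.org contain changeset *metadata* (an <osm> root
wrapping <changeset> elements), which would produce exactly this error. A
quick sketch of such a check, using a stand-in sample file:

```shell
# Stand-in for the real input; the planet changeset dumps have this
# general shape (an <osm> root wrapping <changeset> metadata elements).
cat > sample.osm <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<osm version="0.6" generator="example">
  <changeset id="1" created_at="2016-07-20T00:00:00Z"/>
</osm>
EOF

# Grab the first element name starting with "osm": an applyable diff
# reports <osmChange>, changeset metadata reports plain <osm>.
root=$(grep -o -m1 '<osm[A-Za-z]*' sample.osm | head -n1)
if [ "$root" = "<osmChange" ]; then
  echo "osmChange file: usable with --read-xml-change"
else
  echo "root element is $root: not an osmChange document"
fi
```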

