[OSM-talk] Live Data - all new Data in OSM

Matt Amos zerebubuth at gmail.com
Wed May 13 18:18:38 BST 2009


On Wed, May 13, 2009 at 5:15 PM, Tom Hughes <tom at compton.nu> wrote:
> Ian Dees wrote:
>> The whole argument I'm making is that after the initial
>> implementation**, streaming the data is a lot less resource intensive
>> than what we are currently doing. Perhaps I don't have the whole picture
>> of what goes on in the backend, but at some point the changeset XML
>> files are applied to the database. At this point, we already have the
>> XML changeset that was created by the client. The stream would simply be
>> mirroring that out to anyone listening over a compressed HTTP channel.
>
> You don't want Potlatch's changes then? or changes made by changing
> individual objects rather than uploading diffs?

+1

or even the diffs? any diff where someone creates an element has
negative placeholder IDs, so extra work would have to be done altering
the XML to match the IDs returned by the database.
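to illustrate the kind of extra work involved, here's a minimal sketch (hypothetical structure, not the actual API code) of rewriting the negative placeholder IDs in an uploaded osmChange document to the real IDs assigned by the database, as reported back in the server's diffResult response:

```python
import xml.etree.ElementTree as ET

# a created way referencing a created node, both with placeholder IDs
osm_change = """<osmChange version="0.6">
  <create>
    <node id="-1" lat="51.5" lon="-0.1"/>
    <way id="-2"><nd ref="-1"/></way>
  </create>
</osmChange>"""

# placeholder -> assigned ID, per element type; in reality this mapping
# comes from the diffResult the server returns after the upload
id_map = {"node": {-1: 123456}, "way": {-2: 7890}}

def rewrite_ids(xml_text, id_map):
    root = ET.fromstring(xml_text)
    # rewrite the IDs on the elements themselves
    for elem_type, mapping in id_map.items():
        for el in root.iter(elem_type):
            old = int(el.get("id"))
            if old in mapping:
                el.set("id", str(mapping[old]))
    # node references inside ways also carry placeholder IDs
    for nd in root.iter("nd"):
        ref = int(nd.get("ref"))
        if ref in id_map["node"]:
            nd.set("ref", str(id_map["node"][ref]))
    return ET.tostring(root, encoding="unicode")

rewritten = rewrite_ids(osm_change, id_map)
```

(and this only covers nd refs; relation members would need the same treatment.)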

and the HTTP stream would contain many osmChange documents? that won't
really work with any XML parser i know of... you'd need to pre-parse
it into separate XML documents first.
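for example, one assumed framing (not an actual OSM API feature) would split the stream on the closing root tag before handing each chunk to a normal parser, since a well-formed XML document has exactly one root element:

```python
import xml.etree.ElementTree as ET

def split_documents(stream_text, close_tag="</osmChange>"):
    """Yield each complete osmChange document found in the stream so far."""
    start = 0
    while True:
        end = stream_text.find(close_tag, start)
        if end == -1:
            break  # remainder is an incomplete document; wait for more data
        end += len(close_tag)
        yield stream_text[start:end].strip()
        start = end

# two concatenated osmChange documents, as they might arrive on the wire
stream = (
    '<osmChange version="0.6"><create><node id="1"/></create></osmChange>'
    '<osmChange version="0.6"><delete><node id="2"/></delete></osmChange>'
)

# each chunk now parses as a well-formed document on its own
docs = [ET.fromstring(doc) for doc in split_documents(stream)]
```

(naive string matching like this also breaks the moment the tag appears inside CDATA or a comment, which is part of the point: it's extra machinery, not free.)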

and how would you take these XML documents on the API servers and
merge them into a consistent ordered stream, ensuring all data
dependencies are satisfied?

all of that in less work than osmosis' diff queries?

cheers,

matt



