[OSM-dev] Full History Files

Brett Henderson brett at bretth.com
Thu Apr 23 03:04:00 BST 2009


Stefan de Konink wrote:
> Brett Henderson wrote:
>> Anyway, happy to discuss.  And I'd like to hear if there are better 
>> alternatives or if people think this is a bad idea.
>
> What about setting up a RSS with torrent files that seed the history?
Perhaps it could form part of the solution.  It is a good way of 
distributing large files that aren't updated at short intervals (i.e. the 
planet), so it would be well suited to distributing a single large history 
file or a large directory of history files.

I'm not sure that BitTorrent is a good way of distributing new files on a 
daily basis though.

If I'm downloading changes I use the osmosis --read-change-interval task, 
which reads the timestamp file on the planet server, reads the local 
timestamp file, then downloads all intermediate changes and merges them 
into a single change stream which can then be applied to a local file or 
database.  It can be launched from cron on a regular basis (every minute 
if necessary) to keep a local replica up to date.  It requires very 
little effort on the client side and is robust in ensuring changes 
get applied in the correct order even in the face of networking 
problems.  That would be very difficult to achieve with BitTorrent and 
would add a lot of latency.
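
For anyone who hasn't used it, the setup looks roughly like the following. 
The paths and file names are only examples, and the exact arguments are 
from memory so check the osmosis usage docs; it assumes a working 
directory that has already been initialised with the change server URL 
and a local timestamp file:

  # Download everything newer than the local timestamp and merge it
  # into a single change file.
  osmosis --read-change-interval workingDirectory=/var/lib/osm-sync \
          --write-xml-change file=/var/lib/osm-sync/latest.osc.gz

  # Or apply the merged change stream straight to a local planet file.
  osmosis --read-change-interval workingDirectory=/var/lib/osm-sync \
          --read-xml file=planet-old.osm \
          --apply-change \
          --write-xml file=planet-new.osm

Either command can be dropped into a cron job; the local timestamp file 
is updated as changes are consumed, so repeated runs simply pick up where 
the previous one left off.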

So BitTorrent could definitely be used as a way for people to seed their 
initial database with historical data up to a point in time, but if 
they subsequently want to keep in sync with changes they may be better 
served by switching to direct changeset downloads.  My suggestion would 
be to provide weekly merged changesets via BitTorrent and make 
daily/hourly (and perhaps minute) files available via direct download.  
What do you think?

Brett




