Frederik Ramm wrote:
> Hi,
>
> Ian Dees wrote:
>> That's disappointing. Am I the only one that cares about this? With
>> permission I'd be happy to try and write something for the
>> openstreetmap.org <http://openstreetmap.org> site to make this feasible
>> and not kill the API frontend servers.

I don't think you're the only one who cares about this; it has come up
fairly often in the past.

> I am very much in favour of supplying, and periodically updating, a full
> OSM history dump. I think the main reasons why this is not being done
> currently are
>
> (a) nobody wrote a program for it

I have, but not in a single file format.

> (b) any program that does get written would have to be engineered in a
> clever way in order not to put too much strain on the database and
> still produce something consistent

The easiest way to avoid strain is to extract deltas based on timestamp
and merge them offline.
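
For illustration, the extraction step could look something like this (a
rough sketch only; --read-apidb-change and its interval parameters are
my recollection of the Osmosis task names, so double-check them against
the documentation before using them):

  # Sketch: extract one day's worth of changes from the API database by
  # timestamp, writing a change file that can be merged offline later.
  osmosis --read-apidb-change host=localhost database=openstreetmap \
          user=osm password=secret \
          intervalBegin=2009-09-14_00:00:00 \
          intervalEnd=2009-09-15_00:00:00 \
          --write-xml-change file=changes-20090914.osc.gz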

> So my suggestion would be to write such a program and then campaign for
> it being used on openstreetmap.org. Third-party servers importing that
> data could then offer all sorts of APIs for history queries.

Something like this already exists in Osmosis; the only reason it isn't
being run regularly is lack of disk space on dev:

http://planet.openstreetmap.org/history/

They are full history diffs, broken into daily chunks. Grouping them
into larger chunks is fairly trivial and can be done offline (see the
sketch below).

Once the new services server comes online I'll run it daily and keep it
up to date. A new full history file could then be generated every so
often (e.g. weekly) by merging the most recent daily files into an
existing rollup file.
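
As a rough illustration of that rollup step (the --append-change task
and its sourceCount parameter are assumptions from memory; an older
Osmosis may need a different task for combining change streams, so
check the documentation):

  # Sketch: merge a daily full-history change file into a rollup file.
  osmosis --read-xml-change file=rollup.osc.gz \
          --read-xml-change file=20090914-20090915.osc.gz \
          --append-change sourceCount=2 \
          --write-xml-change file=rollup-new.osc.gz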

If you wish to keep an offline database up to date with full history
you can use:

http://planet.openstreetmap.org/minute-replicate/
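
The consumer side could be a minimal loop along these lines (a sketch
only: the file name below is purely illustrative of whatever the actual
naming scheme is, and --write-apidb-change plus its parameters are from
memory, so verify them against the Osmosis documentation):

  # Sketch: pull down one replication file and load it into a local
  # database using the API schema, which stores full history natively.
  wget http://planet.openstreetmap.org/minute-replicate/<some-file>.osc.gz
  osmosis --read-xml-change file=<some-file>.osc.gz \
          --write-apidb-change host=localhost database=osm \
          user=osm password=secret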

Note that the above replication files are very experimental at the
moment. They are full history files, which means they may contain
multiple changes for a single entity. They appear to work, but I can't
be 100% sure they're not missing data; any missing data would be due to
bugs in code, not in the replication technique. However, they are 100%
up to date (i.e. no 5-minute delay). They use PostgreSQL transaction
ids to select data and don't use timestamps at all. They're generated
once per minute, but the data inside them is not aligned to minute
boundaries. Existing Osmosis tasks should be used with care because
most of them have not been verified with full history files (e.g. the
--apply-change task will probably fail).
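
If you need to feed one of these files into a task that expects a
single change per entity, the --simplify-change task (which, if I
remember its behaviour correctly, collapses a full-history change
stream down to the latest change per entity, and may require sorted
input) could be a workaround:

  # Sketch: collapse a full-history change file to at most one change
  # per entity before handing it to something like --apply-change.
  osmosis --read-xml-change file=minute.osc.gz \
          --simplify-change \
          --write-xml-change file=minute-simple.osc.gz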

Brett