[OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps
nospam-abuse at bloodgate.com
Fri May 8 14:34:13 BST 2009
On Thursday 07 May 2009 01:46:36 you wrote:
> Hi, Tels -
> > > It's not been optimized yet, so loading is a little slow, but I'm
> > optimistic
> > > that it will scale.
> > Based on my experience, I can tell you right away it won't scale :)
> > Not to discourage you, but:
> > * the amount of data is really huge. Throwing a few dozen megabytes
> > of XML or even
> > JSON at the browser will bring it to its knees. Not to mention the
> > data you need
> > to render even a small city.
> > * re-rendering everything takes a long time. You want to avoid that
> > :)
> I was actually talking about server-side load time. I'm running it
> off the 0.6 API, so it packs up XML, sends it to my server, i unpack,
> re-encode to JSON, send to the browser, render. Obviously that's
> SUPER inefficient, so I'm looking forward to cutting a lot of that
> out in the next week or so.
Heh, that sounds like my setup :)
My client requests data in (currently) 0.1° tiles, like 7.4,50.1 - 7.5,
50.2 from a proxy. The public proxy runs at
http://bloodgate.com/cgi-bin/osmapi and serves these requests as
gzipped JSON. It is also possible to run one locally so you don't need
the public one (I intend to document it, please give me a day or two).
* The proxy receives XML from the api or xapi server. Currently it
requests the full dataset.
* Then it removes unnecessary tags (like note, fixme, attribution and a
whole bunch of others that are not needed for rendering). Some of them
are very minor, but 10000 nodes with "attribution=very very long string
here" can make up like 40% of all the data, and just clog the line.
* The data is then converted into a structure that is easier to work
with (nodes as hash, ways as list, multipolygon ways attached to the
outer polygon to avoid issues with the render order etc.).
* The data is then pruned into (currently 3) levels and stored in the
cache:
* level 0 - full
* level 1 - no POI, no paths, streams, tracks etc.; used for zoom 11
* level 2 - no tertiary roads etc.; used for zoom 10 and below
* The client is served the level it currently requested as JSON.gz.
That scheme can surely be improved but it works for now.
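The filter-and-prune steps above could be sketched roughly like this (tag names and level rules here are illustrative stand-ins, not my actual ruleset):

```javascript
// Tags that are never needed for rendering (illustrative list).
const DROP_TAGS = ['note', 'fixme', 'attribution', 'created_by', 'source'];

// Step: strip tags that only clog the line.
function stripTags(tags) {
  const kept = {};
  for (const k in tags) {
    if (!DROP_TAGS.includes(k)) kept[k] = tags[k];
  }
  return kept;
}

// Step: prune a way list into a detail level.
// level 0 = full, level 1 = no paths/streams/tracks, level 2 = also no tertiary.
function pruneLevel(ways, level) {
  return ways.filter(w => {
    const hw = w.tags && w.tags.highway;
    const ww = w.tags && w.tags.waterway;
    if (level >= 1 && (hw === 'footway' || hw === 'path' ||
                       hw === 'track' || ww === 'stream')) return false;
    if (level >= 2 && hw === 'tertiary') return false;
    return true;
  });
}
```

The real proxy of course works on the full node/way structure, but the idea is the same: filter once on the server so the browser never sees the dead weight.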
The problem with that scheme is that:
* There are three servers in the list (api.openstreetmap,
xapi.informationfreeway and tagwatch) and a lot of the requests do not
complete (internal error, not implemented etc. etc.). It can take a
lot of retries to finally get the data.
* Even when you get the data, it takes seconds (10..40 seconds
is "normal") to minutes - upwards of 360 seconds just to serve one
request.
So currently all received data is stored in the cache for 7 days to
avoid the very very long loading times.
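The 7-day caching boils down to a freshness check before hitting the slow upstream servers. A minimal sketch (names hypothetical):

```javascript
// Cache entries older than 7 days are re-fetched from upstream.
const CACHE_TTL_MS = 7 * 24 * 60 * 60 * 1000;

function isFresh(entry, now) {
  return Boolean(entry) && (now - entry.fetchedAt) < CACHE_TTL_MS;
}

// Serve from cache when fresh, otherwise do the (slow) upstream fetch
// and remember the result.
function getTile(cache, key, now, fetchFn) {
  const entry = cache[key];
  if (isFresh(entry, now)) return entry.data;
  const data = fetchFn(key);
  cache[key] = { data: data, fetchedAt: now };
  return data;
}
```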
Ideas of fetching the full dataset and pre-computing the cache simply
don't work because I don't have a big enough machine and no big enough
online account to store the resulting JSON :(
> Actually, rendering in the browser's been pretty good - for example
> this page loaded with no noticeable slowdown, and I haven't even
> begun optimizing:
> But you're right, it's a challenge. I'm impressed that you rendered a
> whole city like Berlin - do you have some code online so I can see,
> or a screenshot? I bet it looks great...
Actually, I managed to render Bonn, Germany, but Berlin, Germany is out
because the amount of data exceeds Firefox's stack limit. Oops.
I'll upload a new screenshot soon.
> What I'm looking at now is:
> a) rendering only some tags per zoom-level, so no rendering footpaths
> and buildings as you zoom out... but that's dependent on the xapi,
> which I haven't been able to fetch from reliably (help anyone?)
I already have that implemented: each render rule specifies for which
zoom levels its feature applies. Plus a few other hard-coded rules
like "no borders from zoom X upwards", but these should be pushed into
the ruleset, just like the default map background should be pushed
there. (Nice idea :)
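A rule-driven zoom filter of that kind might look like the following sketch (the rule format and zoom values are made up for illustration):

```javascript
// Hypothetical ruleset: each rule matches a tag/value and a zoom range.
const rules = [
  { tag: 'highway',  value: 'motorway', minZoom: 5,  maxZoom: 18 },
  { tag: 'highway',  value: 'footway',  minZoom: 14, maxZoom: 18 },
  { tag: 'building', value: '*',        minZoom: 15, maxZoom: 18 },
];

// A feature is drawn when at least one rule matches its tags
// and the current zoom falls inside the rule's range.
function shouldRender(feature, zoom) {
  return rules.some(r =>
    feature.tags[r.tag] !== undefined &&
    (r.value === '*' || feature.tags[r.tag] === r.value) &&
    zoom >= r.minZoom && zoom <= r.maxZoom
  );
}
```

Things like "no borders from zoom X upwards" or the map background would then just be more entries in the same table instead of hard-coded special cases.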
> b) cutting the API out of the loop and running direct from a
> planet.osm, but then you can't use it to view live edits, like you
> can here: http://vimeo.com/4435969
Also, somehow processing 150 Gbyte of XML into JSON will prove to be a
challenge.
> c) trying to serve partial polygons... I'd like to try plotting only
> every 3rd or 10th node... do the polygons collapse? Can i cull nodes
> in a more intelligent way? Someone on this list or geowanking pointed
> to a company that can serve lower-res polys over an API. I'm sure
> folks have worked on this in tile systems, so if you know anything
> about it and are willing to share, I'm all ears. This becomes really
> relevant as you zoom out... don't want to render every node for the
> coast of Argentina, for example.
Yes, but reducing the polygons is also a lot of work :) I haven't
started on this yet, because on zoom 12 or higher you need to render
almost everything, anyway. Plus, you would then need to cache the
partial data somehow (computing it is expensive in JS..)
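The "every Nth node" idea from point c) could be sketched as a naive decimation that keeps the endpoints, so closed ways stay closed. (A real implementation would want an error-bounded simplifier like Douglas-Peucker instead, since blind sampling can collapse small polygons.)

```javascript
// Keep the first and last node, and every n-th interior node.
// Naive: no geometric error bound, just uniform sampling.
function decimate(nodes, n) {
  if (nodes.length <= 2) return nodes.slice();
  const out = [nodes[0]];
  for (let i = 1; i < nodes.length - 1; i++) {
    if (i % n === 0) out.push(nodes[i]);
  }
  out.push(nodes[nodes.length - 1]);
  return out;
}
```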
> d) oh, and localStorage. I've partially implemented that but haven't
> had much testing... other work... ugh. So caching on a few levels,
I fail to see what localStorage actually gains, as the delivered JSON is
put into the browser cache anyway, and the rest is cached in memory.
Could you maybe explain what your idea was?
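For reference, a localStorage tile cache as in point d) might look like the sketch below. The key scheme is hypothetical; the storage object is passed in so that window.localStorage or any getItem/setItem stand-in works:

```javascript
// Store a tile's JSON under a "tile:" key; localStorage only takes
// strings, so the data is serialized. Quota (typically ~5 MB) can be
// exceeded, in which case we silently skip caching.
function cacheTile(storage, key, data) {
  try {
    storage.setItem('tile:' + key, JSON.stringify(data));
  } catch (e) {
    // quota exceeded or storage disabled: fall back to no caching
  }
}

// Returns the parsed tile, or null on a cache miss.
function loadTile(storage, key) {
  const raw = storage.getItem('tile:' + key);
  return raw === null ? null : JSON.parse(raw);
}
```

The open question raised above still stands, though: the HTTP cache already holds the gzipped JSON, so localStorage would mainly help when the browser cache is small or evicted aggressively.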
> What strategies have you employed, if you're willing to share?
* I spent more time writing code than in-depth technical
explanations on my wiki, so please bear with me as I write them and
clean up my code.
* There is a talk I proposed for State of the Map and I don't want to
spoil everything beforehand :)
As for the code, you can access a packed version here:
Both the "original" and the cgi-proxy are not public yet, as they
need a bit more polishing. But maybe I can give you a copy offlist?
> Also agreed that GSS is not technically spectacular - the driving
> motivation is that it is legible to those new to mapping, being
> CSS-like. So really an adoption decision, though the
In my opinion they violate the "don't mix data and code" rule and are
thus out. Plus, you wouldn't be able to load rules dynamically from a
random webserver and interpret them due to cross-domain-security
violations in the browser.
Of course, semi-dynamic rules like "color them according to feature X by
formula Y" are still useful and fun, and avoid the problems above.
(Like: "use maxspeed as the color index ranging from red over green to
blue.")
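Such a semi-dynamic rule can stay a plain formula instead of executable stylesheet code, e.g. mapping maxspeed onto an HSL hue ramp (the thresholds here are made up):

```javascript
// Map a maxspeed value onto a red (0) -> green (120) -> blue (240)
// hue ramp, clamped to the [lo, hi] speed range. Illustrative only.
function maxspeedColor(maxspeed, lo, hi) {
  lo = lo === undefined ? 30 : lo;
  hi = hi === undefined ? 130 : hi;
  const t = Math.min(1, Math.max(0, (maxspeed - lo) / (hi - lo)));
  const hue = Math.round(t * 240);
  return 'hsl(' + hue + ',100%,50%)';
}
```

The ruleset then only carries data ("color by maxspeed, formula Y"), and the renderer owns the code that evaluates it, which sidesteps the cross-domain problem above.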
> Anyways, I'm excited to hear you've been working on this kind of
> stuff too. I'm happy to collaborate or just share information, the
> whole codebase is at http://code.google.com/p/cartagen/.
Thanx, I had a brief look, will look further in the next days. :)
All the best,
Signed on Fri May 8 15:12:05 2009 with key 0x93B84C15.
Get one of my photo posters: http://bloodgate.com/posters
PGP key on http://bloodgate.com/tels.asc or per email.
"Duke Nukem Forever will come out before Unreal 2."
-- George Broussard, 2001 (http://tinyurl.com/6m8nh)