[OSM-dev] API 0.6 returns 403 forbidden
warren at mit.edu
Thu May 14 17:40:54 BST 2009
Tels - I'd like to reiterate my offer of a hosted machine with lots of
bandwidth and space for a planet dump for our applications. But I also agree
that another solution might be better long-term.
For the short term, are you interested in working with me to get a dump
running and serving... JSON?
On Thu, May 14, 2009 at 12:35 PM, Tels <nospam-abuse at bloodgate.com> wrote:
> On Thursday 14 May 2009 09:31:56 Tom Hughes wrote:
> > Jeffrey Warren wrote:
> > > Hi, I suddenly started getting "403 - Forbidden" from any
> > > machine. I also get a 403 when I simply go to
> > > www.openstreetmap.org in a browser (just discovered
> > > this...). Have I been blacklisted or is this an outage?
> > I suspect you're the person I blocked last night because you were
> > apparently trying to scrape map data from the API with a large number
> > of presumably automated map calls.
> > If you are the person I blocked then in the last four days you've
> > made a total of 8186 map calls, an average of about 1.5 a minute.
> > If you want bulk data, please use the planet dump not the API to get
> > it.
> I presume my own application (http://bloodgate.com/wiki/map) will then
> be blocked, too.
> Using the "planet dump + diffs" sounds easy, but is in fact impossible
> for me. My online server does not have the capacity to store the entire
> world, and I do not have the computing power and time to convert it
> up-front, either. Not to mention uploading the converted data to my
> server over my home DSL line. :(
> (There is also the question of what happens if importing an hourly
> diff regularly takes longer than 60 minutes. So far the only data I have on
> that is someone mentioning that it "takes between 40 and 70 minutes
> with a fast harddisk"...)
> Also, to download the whole dump, update it with minute diffs and
> pregenerate a huge database is vastly more complicated and uses a lot
> more processing power than just fetching (and caching for X days) the
> few areas someone is actually looking at.
> So, how will OSM handle this situation in the future? Will there be a
> fast API server (plus a few "shadow" copy servers) that can deliver
> tile data in (semi) real-time, or will each and every person that wants
> to create a "view" of the data have to go through the hassle of dealing with
> the entire "download dump, update it and convert it" scheme?
> The latter seems like it will rule out quite a few applications just
> because it places a really high burden on each and every project to
> duplicate the same work. And that is without the small detail that
> next year the dump might easily be twice as large :)
> All the best,
> Signed on Thu May 14 18:26:38 2009 with key 0x93B84C15.
> View my photo gallery: http://bloodgate.com/photos
> PGP key on http://bloodgate.com/tels.asc or per email.
> ". . . my work, which I've done for a long time, was not pursued in
> order to gain the praise I now enjoy, but chiefly from a craving after
> knowledge, which I notice resides in me more than in most other men.
> And therewithal, whenever I found out anything remarkable, I have
> thought it my duty to put down my discovery on paper, so that all
> ingenious people might be informed thereof."
> -- Antony van Leeuwenhoek. Letter of June 12, 1716
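The "fetch and cache for X days" approach Tels describes can be sketched roughly as follows. This is a minimal illustration, not OSM code: `fetch` stands in for a hypothetical function that issues one `GET /api/0.6/map?bbox=...` call, and the TTL and in-memory store are assumptions for the sketch.

```python
import time

class BBoxCache:
    """Cache per-bounding-box map responses for a fixed number of days,
    so repeated views of the same area cost only one API call."""

    def __init__(self, fetch, ttl_days=7):
        self.fetch = fetch              # hypothetical API call, one bbox in, data out
        self.ttl = ttl_days * 86400     # freshness window in seconds
        self.store = {}                 # bbox tuple -> (fetched_at, data)

    def get(self, bbox):
        entry = self.store.get(bbox)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]             # still fresh: serve from cache
        data = self.fetch(bbox)         # stale or missing: hit the API once
        self.store[bbox] = (time.time(), data)
        return data
```

A real deployment would persist the store on disk and respect the API's usage policy, but the point stands: only the areas users actually view are ever requested, which is far lighter than maintaining a full planet database.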