[OSM-talk] Wiki Offline Copy?

Alexander Menk menk-you.should.remove.this.for.permanent.contact at mestrona.net
Mon Nov 10 15:20:57 GMT 2008


Hi!

Grant Slater wrote:
>> maybe a good intermediate solution would be to provide zipped "wget -r" 
>> copies of the wiki? Of course not everybody should do "wget -r 
>> wiki.openstreetmap.org" because this would cause too much load, but I 
>> could offer some webspace and do weekly snapshots or something like that?
>>   
> 
> Dude! No no no.
Sorry, I did not mean to scare you - I was just asking...

> ~23100 pages x multiple page views x all history x special page links = 
> big scary number. Do not do it.
I would have excluded the same things that are excluded in robots.txt 
(history, special pages, etc.)

> Bots are causing most of the load problems on the wiki, do not start 
> another one.
"but I want the bot ... ;-(" (Over the hedge / Hammy - "but I want the 
cookie")

>> Unfortunately wget is excluded by robots.txt, that's why I ask before I 
>> do so..
>>   
> 
> Yes, wget is blocked for good reason. Wikipedia blocks it too.
> 
> The server which runs the wiki will be upgraded soon. If the community 
> then agrees, we could setup a weekly data dump. Something like: 
> http://meta.wikimedia.org/wiki/Data_dumps

[x] agree

Unfortunately, a data dump is not as handy as a ZIP file of plain HTML 
files that you can just pass around on CDs to people with modems who are 
interested in OSM. With a data dump I could, of course, set up a 
MediaWiki instance of my own and run wget against that instead. But 
doing so is not trivial, especially getting all the images included.

That's why I came up with the wget suggestion. Of course, only ONE 
person should run such a wget, and only during a timeframe with low load 
on the wiki (though I suspect no such timeframe exists...)
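For illustration, a load-friendly mirror run might look something like the
sketch below. This is purely hypothetical - robots.txt currently forbids it,
the exclusion patterns are my guesses at typical MediaWiki URLs, and
--reject-regex only exists in newer wget releases:

```shell
# Hypothetical polite mirror of the wiki (do NOT run without permission):
#   --wait/--random-wait  pause between requests to keep server load low
#   --reject-regex        skip special pages and history views, as robots.txt does
#   --page-requisites     also fetch images and CSS so pages work offline
#   --convert-links       rewrite links for local browsing
wget --recursive --level=inf \
    --wait=2 --random-wait \
    --reject-regex 'Special:|action=history|oldid=' \
    --page-requisites --convert-links --no-parent \
    https://wiki.openstreetmap.org/
```

The resulting directory tree could then be zipped and burned to CD as-is.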

Alex
