[Talk-us] A Friendly Guide to 'Bots and Imports
Nathan Edgars II
neroute2 at gmail.com
Fri Aug 6 21:41:35 BST 2010
On Fri, Aug 6, 2010 at 3:40 PM, Kevin Atkinson <kevin at atkinson.dhs.org> wrote:
> If I had to go through a formal approval process I would not have even
> bothered with my script. I'm not sure I would really call it a bot,
> because I manually downloaded the data, ran a script on the data, then
> manually uploaded the data, as opposed to something completely automatic.
> And what exactly constitutes a bot? Would the cleanup of Florida's county
> routes' "ref" tagging have been a bot? It's a large-scale, systematic change,
> even if a script (I think he used search and replace in an editor) was not
It seems that both of these are akin to using AutoWikiBrowser on
Wikipedia - you need to be approved to use AWB (and you're
automatically approved as long as you have a bit of experience and no
major problems), but then you can do whatever you want with it (and of
course get blocked if you screw up). AWB is mainly used to do
complicated find-replace actions while looking at each individual page
before saving. With an approved bot you can skip that individual
review.
My county route ref change was somewhere in the middle - I did an
automated find-replace for most, but did some abnormal cases by hand
and then ran a few checks before saving.
Bot imports on Wikipedia are extremely rare. The only one I know of
that went successfully was the creation of stubs for all US populated
places from Census Bureau data back in 2002.
> If you want to go through with this I think you need a better definition of a
> "bot", which should consist of at least one of:
> 1) Large Scale Change
> 2) Fully automatic
> Defining 1) would be tricky: something covering the whole United States counts. But
> what about an entire state, if the change is limited in scope?
What about the whole world if you're fixing a small number of
incorrect tags (barrier=bollards to barrier=bollard for example)?
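As a minimal sketch of that kind of mechanical fix (this is not anyone's actual script, just an illustration of the download-edit-upload pattern discussed above), here is how a small Python script might rewrite barrier=bollards to barrier=bollard in a downloaded .osm XML extract. The element and attribute names follow the OSM XML format; the sample data is made up.

```python
# Hypothetical example: correct barrier=bollards to barrier=bollard
# in an OSM XML extract, counting how many tags were changed.
import xml.etree.ElementTree as ET


def fix_bollard_tags(osm_xml: str) -> tuple[str, int]:
    """Return the corrected XML and the number of tags changed."""
    root = ET.fromstring(osm_xml)
    changed = 0
    # Tags appear on nodes, ways, and relations as <tag k="..." v="..."/>.
    for tag in root.iter("tag"):
        if tag.get("k") == "barrier" and tag.get("v") == "bollards":
            tag.set("v", "bollard")
            changed += 1
    return ET.tostring(root, encoding="unicode"), changed


# Made-up sample extract with one bad tag and one correct tag.
sample = """<osm version="0.6">
  <node id="1" lat="0" lon="0">
    <tag k="barrier" v="bollards"/>
  </node>
  <node id="2" lat="1" lon="1">
    <tag k="barrier" v="gate"/>
  </node>
</osm>"""

fixed_xml, n = fix_bollard_tags(sample)
print(n)  # → 1
```

The point is that even a worldwide change like this is small in scope (one key, one bad value), which is exactly why the "large scale" test alone is a poor definition of a bot.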