[OSM-dev] Data source for robot

Matt Amos zerebubuth at gmail.com
Tue Oct 12 18:12:02 BST 2010


On Tue, Oct 12, 2010 at 5:50 PM, Andrew <wynndale at lavabit.com> wrote:
> Matt Amos <zerebubuth <at> gmail.com> writes:
>
>> of course, the best thing is that these automated edits never happen
>> at all, instead that tools are provided (like the geofabrik inspector,
>> keep-right or the duplicate nodes map) to help the community fix these
>> errors themselves. we need to start cracking down on these disruptive
>> and counter-productive bots.
>
> The duplicate nodes map is not the best example of this. People have sometimes
> jumped in and done heavy edits not based on local knowledge.

I totally agree, which is why the about page of the duplicate nodes
map says: "This means that each point marked on the map isn't
definitely an error, but it could be one" and the wiki page says "It
is important to note that, despite the design of the map, these are
not all necessarily errors, and should not all be 'fixed' by merging
nodes". No-one should be doing edits, especially on a large scale,
without a healthy dose of common sense and, preferably, local
knowledge.

> A bot that joined
> roads to other roads (and only to other roads) at county boundaries would
> actually be more constructive as we could then add a notice that this issue
> with TIGER roads has been fixed completely.

I don't think so. It might be more constructive to have a map which
only displayed duplicates between roads and other roads at county
boundaries, so that US mappers could concentrate their efforts; in
fact, it's something I once ran and (time permitting) hope to develop
again. A bot, however, would always be less constructive than real
people looking at the data. See
http://matt.dev.openstreetmap.org/dupe_nodes/about.html#just_use_a_bot
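
As a rough illustration of the kind of filtering such a map would
need, here's a minimal sketch in Python. Everything about the input
is an assumption made for the example: the record layout, the field
names (way_tags, on_county_boundary) and the idea that boundary
membership has already been computed. None of it comes from the
dupe_nodes tool itself.

    # Sketch: keep only duplicate-node reports where two or more roads
    # share a location on a county boundary, so mappers can focus on
    # the TIGER county-join cases. Input format is hypothetical.

    def is_road(tags):
        """Treat any way carrying a highway tag as a road here."""
        return "highway" in tags

    def road_road_boundary_dupes(dupes):
        """Filter duplicate reports down to road/road cases on county
        boundaries.

        Each record is assumed to look like:
          {"node_ids": [...], "lat": ..., "lon": ...,
           "way_tags": [ {...}, {...} ],   # tags of each way using the node
           "on_county_boundary": True or False}
        """
        selected = []
        for dupe in dupes:
            if not dupe.get("on_county_boundary"):
                continue
            roads = [t for t in dupe["way_tags"] if is_road(t)]
            if len(roads) >= 2:
                selected.append(dupe)
        return selected

    if __name__ == "__main__":
        example = [
            {"node_ids": [1, 2], "lat": 38.0, "lon": -97.0,
             "way_tags": [{"highway": "residential"},
                          {"highway": "residential"}],
             "on_county_boundary": True},
            {"node_ids": [3, 4], "lat": 38.1, "lon": -97.1,
             "way_tags": [{"highway": "residential"},
                          {"waterway": "stream"}],
             "on_county_boundary": True},
        ]
        print(len(road_road_boundary_dupes(example)))  # -> 1

The point of the sketch is only that such a view narrows the queue
for human mappers; the judgement about whether to merge still rests
with people looking at the data.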

cheers,

matt
