[OSM-dev] Server-side data validation

Paweł Paprota ppawel at fastmail.fm
Fri Jul 13 18:27:25 BST 2012

Hi all,

Today I have encountered a lot of bad data in my area - duplicated
nodes/ways. These probably stem from an inexperienced user or faulty
editor software used when drawing buildings. I corrected a lot of this,
see changesets:


As you can see, these changesets remove thousands of nodes/ways. I did
this using the JOSM validator and its "Fix it" option, which
automatically merges/deletes duplicated nodes.

That is all fine of course, but it sparked a thought... why is garbage
data like this allowed into the database in the first place? Of course
it can always be fixed client-side (JOSM, even some autobots), but why
accept unconnected untagged nodes, duplicated nodes, duplicated ways,
etc.?

I understand (though don't wholly agree with...) the concept of having a
very generic data model where anyone can push anything into the
database, but it would be trivial to implement server-side validation
for these cases (so that the API throws errors and does not accept such
data) and thus reduce client-side work by a very significant margin -
i.e. I could have spent that time working on something more useful than
removing garbage data.
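To sketch the idea: a check like this could run on the upload endpoint
before the data is committed. This is a minimal illustration in Python,
not the actual OSM API code; the node/way structures and function names
are simplified assumptions.

```python
# Hypothetical server-side pre-commit checks for an OSM-style upload.
# Nodes are dicts with "id", "lat", "lon", "tags"; ways have "id", "nds"
# (an ordered list of node ids). These structures are assumptions for
# illustration, not the real API's internal representation.

def find_duplicate_nodes(nodes, precision=7):
    """Return groups of node ids sharing identical (rounded) coordinates."""
    seen = {}
    for node in nodes:
        key = (round(node["lat"], precision), round(node["lon"], precision))
        seen.setdefault(key, []).append(node["id"])
    return [ids for ids in seen.values() if len(ids) > 1]

def validate_upload(nodes, ways):
    """Collect errors the API could return instead of accepting the data."""
    errors = []

    # Duplicated nodes: several nodes at the same position.
    for group in find_duplicate_nodes(nodes):
        errors.append(f"duplicate nodes at same position: {group}")

    # Unconnected untagged nodes: not referenced by any way, no tags.
    referenced = {ref for way in ways for ref in way["nds"]}
    for node in nodes:
        if node["id"] not in referenced and not node.get("tags"):
            errors.append(f"untagged node {node['id']} not used by any way")

    # Duplicated ways: identical ordered node lists.
    seen_ways = {}
    for way in ways:
        seen_ways.setdefault(tuple(way["nds"]), []).append(way["id"])
    for ids in seen_ways.values():
        if len(ids) > 1:
            errors.append(f"duplicate ways: {ids}")

    return errors
```

If `validate_upload` returned a non-empty list, the API could reject the
changeset with those messages rather than storing the elements.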

Server-side validation could of course be taken even further - the OSM
server could reject meaningless tag combinations etc. Basically, the
JOSM validator's "error"-level checks should be implemented as
server-side validators, and possibly some "warning"-level checks as
well.
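Such tag checks could be expressed as a simple rule table evaluated per
element. The two rules below are purely illustrative assumptions - they
are not JOSM's actual rule set:

```python
# Hypothetical "error"-level tag rules, evaluated against an element's
# tag dict. The rules themselves are made-up examples for illustration.

ERROR_RULES = [
    # (description, predicate returning True when the tags are invalid)
    ("empty tag value",
     lambda tags: any(v == "" for v in tags.values())),
    ("empty tag key",
     lambda tags: any(k == "" for k in tags)),
]

def tag_errors(tags):
    """Return descriptions of every rule the tag set violates."""
    return [desc for desc, is_bad in ERROR_RULES if is_bad(tags)]
```

A real deployment would need the rule list kept in sync with the
client-side validators, which is part of why this is a policy question
rather than just a coding one.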

This would ensure at least some data consistency and integrity... (of
course, the existing bad data would first have to be pruned from the
database so that it is consistent with the validation logic, but that's
for another discussion).

What is the current consensus within OSM dev community on this aspect of
OSM architecture?

