[HOT] Validating & TM2 - providing feedback to new mappers
Mikel Maron
mikel.maron at gmail.com
Sun Aug 31 14:21:52 UTC 2014
Severin
I like the ideas here. Essentially, we're talking about calculating and
recording mappers' "reputation", and then incorporating it into the handling
of validation.
"How did you contribute" http://hdyc.neis-one.org/ is one example of
reputation calculation. It's entirely automated. Your suggesting to add
specific, manually recorded feedback on a users specific edits. A
prerequisite would be to categorize OSMTM taks by the general kind of work
required, which could have other benefits (like standardizing instructions).
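Very roughly, and only as a sketch (the category names and statuses below are
made up, not an existing OSMTM schema), the per-contributor record could look
something like this:

    # Rough sketch only -- category names and statuses are illustrative.
    # One record per contributor, one entry per kind of work.
    contributor_status = {
        "some_mapper": {
            "buildings": "validated",        # drawn correctly in past checks
            "roads_geometry": "needs_work",  # validator left feedback + LearnOSM link
            "roads_tagging": None,           # not yet reviewed
        },
    }

    def needs_review(username, skill):
        """Queue a validation task until the skill has been signed off."""
        return contributor_status.get(username, {}).get(skill) != "validated"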
I wonder if we could have the same result, to focus validator efforts and
give feedback to mappers, by simply using the HDYC reputation calculation
("Type of Mapper"). Are most cases of mistakes in HOT jobs from relatively
new mappers to OSM? Or do we also see experienced OSM-ers making mistakes
in unfamiliar terrain?
In any case, the work on the OSMTM2 API could help. We could then more easily
analyse active jobs by querying, and maybe even setting, status through the
API.
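Since that API is still being worked on, the URL and JSON fields below are
placeholders rather than a documented interface, but a query for tasks that
are done and waiting on a validator might look roughly like:

    # Hypothetical sketch -- the OSMTM2 API is in progress, so the base URL
    # and field names here are assumptions, not a documented interface.
    import requests

    TM2_API = "http://tasks.hotosm.org"  # assumed base URL

    def tasks_awaiting_validation(project_id):
        """Return IDs of tasks marked done but not yet validated."""
        resp = requests.get("{}/project/{}/tasks.json".format(TM2_API, project_id))
        resp.raise_for_status()
        return [t["id"] for t in resp.json()["tasks"] if t.get("state") == "done"]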
Another idea I've been mulling over is "microtasking" validation. The aim is
to make in-depth validation easier; it seems that validation is not so
popular or easy with our current tools.
Split up the validation of, say, buildings over a task square into individual
building microtasks. A contributor simply marks whether each building was
drawn correctly or not. This should only take a few seconds, and could
involve large numbers of people in a very simple task. The collective
microtask results for a task square could then be used to guide selective
in-depth validation.
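As a quick sketch of how the microtask answers might be rolled up (the vote
format and thresholds are assumptions, just to show the shape of it):

    # Aggregate per-building yes/no microtask answers into a list of task
    # squares that deserve in-depth validation. Thresholds are illustrative.
    def squares_needing_review(votes, max_bad_ratio=0.1, min_votes=3):
        """votes: {square_id: {building_id: [True/False answers]}}."""
        flagged = []
        for square_id, buildings in votes.items():
            checked = [a for a in buildings.values() if len(a) >= min_votes]
            if not checked:
                continue
            bad = sum(1 for a in checked if sum(a) / len(a) < 0.5)  # majority says "wrong"
            if bad / len(checked) > max_bad_ratio:
                flagged.append(square_id)
        return flagged

    votes = {"square-42": {"w123": [True, True, False], "w124": [False, False, False]}}
    print(squares_needing_review(votes))  # -> ['square-42']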
-Mikel
On Sunday, August 31, 2014 3:09 AM, Severin Menard <severin.menard at gmail.com>
wrote:
> Hi all,
>
> Thinking aloud, would it not be possible (actually asking tech people) to
> have a tool allowing the following:
> - detecting changesets with hotosm hashtags and picking up the username
> - comparing the username to a list of HOTOSM contributors and stating
> whether it is new or not and already validated for a certain number of
> quality items (like drawing buildings correctly, drawing roads correctly,
> tagging roads correctly, etc.)
> - when contributors have not been validated yet, a task is sent to a
> validation team
> - one team member picks up the task, checks and validates the work of the
> contributor, and contacts her/him if there are mistakes. A form would be
> great, with checkboxes for typical errors, and if making a typical one, the
> contributor would receive in the answer a link to the learning point (there
> are already quite a few in LearnOSM) dedicated to this error
> - once done, the hotosm contributor quality status for this contributor
> would be: good for such and such aspects, bad for such and such others, and
> the latter would then be tasked to validators in the future as soon as this
> contributor submits a new changeset
>
> Thoughts?
>
> Sincerely,
>
> Severin
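(To make the first detection step above concrete: something along these lines
could find hotosm-hashtagged changesets and separate known contributors from
new ones. The known-contributor list is assumed to live elsewhere, and the
changeset query returns at most 100 results, so a real tool would poll by
time window.)

    # Rough sketch of the detection step, using the public OSM changeset API.
    import requests
    import xml.etree.ElementTree as ET

    def hotosm_changesets(since, known_contributors):
        """Yield (username, changeset_id, is_new_contributor) for matching changesets."""
        resp = requests.get("https://api.openstreetmap.org/api/0.6/changesets",
                            params={"time": since, "closed": "true"})
        resp.raise_for_status()
        for cs in ET.fromstring(resp.content).findall("changeset"):
            comment = next((t.get("v") for t in cs.findall("tag")
                            if t.get("k") == "comment"), "")
            if "#hotosm" in comment.lower():
                user = cs.get("user")
                yield (user, cs.get("id"), user not in known_contributors)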