<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<div class="moz-cite-prefix">On 21.06.2016 15:18, Martin
Koppenhoefer wrote:<br>
</div>
<blockquote
cite="mid:CABPTjTCXPwU3BQ9sH+HUofNQa1rVZrjgXb2D=XNwKeuuCZFiNw@mail.gmail.com"
type="cite">
<div dir="ltr">
<div class="gmail_extra"><br>
<div class="gmail_quote">2016-06-21 14:40 GMT+02:00 Andy
Mabbett <span dir="ltr"><<a moz-do-not-send="true"
href="mailto:andy@pigsonthewing.org.uk" target="_blank">andy@pigsonthewing.org.uk</a>></span>:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex"><span
class="">
<p dir="ltr">On 20 Jun 2016 5:31 pm, "Martin
Koppenhoefer" <<a moz-do-not-send="true"
href="mailto:dieterdreist@gmail.com" target="_blank">dieterdreist@gmail.com</a>>
wrote:</p>
<p dir="ltr">> I have just discovered another type of
problem:</p>
</span>
<p dir="ltr">> people adding full wikipedia urls into
the website tag. In all cases there was already a
wikipedia tag present.</p>
<p dir="ltr">This is precisely the sort of thing a bot
could clean up, daily or weekly say.</p>
</blockquote>
</div>
<br>
Actually, it is not that simple. Since we have not just one but, for good
reason, several methods of storing references to Wikipedia, this bot would
have to check whether the full URL in the website tag is already covered by
the wikipedia tag's interlanguage links or not. That is not impossible, but
it is not completely trivial either. The bot should also check whether the
previous version had a different website value and restore it where that
makes sense, or flag the object for human review.<br>
</div>
<div class="gmail_extra"><br>
</div>
<div class="gmail_extra">Cheers,<br>
</div>
<div class="gmail_extra">Martin<br>
</div>
</div>
<br>
</blockquote>
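<p>A minimal sketch of the interlanguage-link check described above might
look like the following. This is an assumption-laden illustration, not an
actual bot: the OSM keys (website, wikipedia) and the MediaWiki langlinks
API are real, but all function names and the overall structure are
hypothetical.</p>
<pre>
# Sketch: is a website=* value that points to Wikipedia already covered
# by the object's wikipedia=* tag, directly or via interlanguage links?
# Hypothetical helper names; only the OSM keys and the MediaWiki API are real.
import json
import re
import urllib.parse
import urllib.request

WIKI_URL = re.compile(r"https?://([a-z-]+)\.wikipedia\.org/wiki/(.+)$")

def parse_wikipedia_url(url):
    """Return (lang, title) if url is a Wikipedia article link, else None."""
    m = WIKI_URL.match(url)
    if not m:
        return None
    return m.group(1), urllib.parse.unquote(m.group(2)).replace("_", " ")

def interlanguage_links(lang, title):
    """All (lang, title) variants of an article, via the langlinks API."""
    query = urllib.parse.urlencode({
        "action": "query", "prop": "langlinks", "titles": title,
        "lllimit": "max", "format": "json",
    })
    url = f"https://{lang}.wikipedia.org/w/api.php?{query}"
    with urllib.request.urlopen(url) as resp:
        pages = json.load(resp)["query"]["pages"]
    links = {(lang, title)}
    for page in pages.values():
        for ll in page.get("langlinks", []):
            links.add((ll["lang"], ll["*"]))
    return links

def website_is_redundant(website_value, wikipedia_value):
    """True if website=* merely repeats the wikipedia=* tag in any language."""
    linked = parse_wikipedia_url(website_value)
    tagged_lang, _, tagged_title = wikipedia_value.partition(":")
    if linked is None or not tagged_title:
        return False
    return (tagged_lang, tagged_title) in interlanguage_links(*linked)
</pre>
<p>Only where such a check succeeds could a bot safely drop the website
value; everything else would still need human review, as Martin says.</p>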
<p>I wrote a program, <a class="moz-txt-link-freetext" href="http://ausleuchtung.ch/geo_wiki/">http://ausleuchtung.ch/geo_wiki/</a>, which finds
the locations of all Wikipedia articles, either by the coordinates in the
articles themselves or by the OpenStreetMap tags (wikipedia,
wikimedia_commons, wikidata), within a radius of ten kilometers around a
click.</p>
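<p>For anyone curious how such a lookup can work, the two queries below are a
rough, hypothetical sketch, not the actual geo_wiki code: the MediaWiki
geosearch API returns articles with coordinates near a point, and the
Overpass API returns OSM objects carrying a wikipedia tag around the same
point.</p>
<pre>
# Illustrative sketch only, not the geo_wiki implementation.
import json
import urllib.parse
import urllib.request

def articles_near(lat, lon, lang="en", radius_m=10000):
    """Wikipedia articles geocoded within radius_m of (lat, lon)."""
    query = urllib.parse.urlencode({
        "action": "query", "list": "geosearch",
        "gscoord": f"{lat}|{lon}", "gsradius": radius_m,
        "gslimit": "max", "format": "json",
    })
    url = f"https://{lang}.wikipedia.org/w/api.php?{query}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["query"]["geosearch"]

def osm_objects_with_wikipedia_tag(lat, lon, radius_m=10000):
    """OSM nodes/ways/relations tagged wikipedia=* within radius_m."""
    overpass = (f'[out:json];nwr(around:{radius_m},{lat},{lon})'
                f'["wikipedia"];out center;')
    req = urllib.request.Request(
        "https://overpass-api.de/api/interpreter",
        data=urllib.parse.urlencode({"data": overpass}).encode(),
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["elements"]

# Comparing both result sets around a point makes mismatched coordinates
# and wrong wikipedia tags stand out.
print(len(articles_near(46.948, 7.447, lang="de")))
print(len(osm_objects_with_wikipedia_tag(46.948, 7.447)))
</pre>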
<p>It works for all language versions of Wikipedia; just change the
Wikipedia language field from en to fr, de, it, ru, etc. A search by an
OSM tag may take 2-3 seconds.</p>
<p>So it is possible to visually check the locations of Wikipedia articles
and the OSM tags in an area. With this tool I have already noticed and
corrected quite a few Wikipedia articles with wrong geographical
coordinates. Presumably, people who possess encyclopedic knowledge and
create articles are not always that good at cartography.</p>
<p>Best regards,<br>
</p>
<p>Oleksiy<br>
</p>
</body>
</html>