[Tagging] Hotel dataset import? / Re: Baby-sitting

Tom Pfeifer t.pfeifer at computer.org
Wed Mar 13 11:15:29 UTC 2019

On 13.03.2019 10:50, Cascafico Giovanni wrote:

> The latter is great and leads to grotesque consequences: when a colleague hands me 12 guidepost 
> locations to geocode, I must message the planet import ML and the local ML, write an import page, 
> and ask my colleague for a consent form with certified signature. Is this what we aim for?
> I think setting such binding requirements on small-size "imports" puts a gravestone on many 
> potential small, local sources.

I think you misunderstand. OSM is based on locally sourced, handcrafted data. That is what creates 
the high quality of the data.
Import means data are copied from another database into OSM. That means somebody else collected the 
data, and that collector made their own mistakes. All databases contain errors or outdated items, 
even if they are labelled official, governmental, etc.

IIRC the dataset you are using is several years old, last updated in 2017. The items in your uMap do 
not say when each database entry was last checked.

Thus when these data are copied into OSM, they appear as freshly updated, March 2019, although the 
information is much older; the hotel might have closed or changed ownership, pet policies, or wifi in 
2018. In particular with a "stealth" import, one node at a time, I would assume the OSM data to be 
individually checked and current.

Thus your task is to discuss with the Italian community whether your import, i.e. copying the other 
data, improves or decreases OSM data quality. The changeset size is less important, though I would 
prefer a community-approved import to be traceable in a small number of changesets.
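As an aside, traceability is usually achieved with changeset tags. A minimal sketch, following the 
common convention from the OSM import guidelines (the specific values below are placeholders, not 
from this thread):

```xml
<!-- Changeset tags often used to make an approved import identifiable.
     Values here are illustrative placeholders. -->
<changeset>
  <tag k="comment" v="Import of guidepost locations, approved by local community"/>
  <tag k="source"  v="name of the original dataset"/>
  <tag k="import"  v="yes"/>
  <tag k="url"     v="link to the import's wiki documentation page"/>
</changeset>
```

Uploading from a dedicated import account (rather than a personal one) is part of the same 
convention, so the imported edits can be reviewed or reverted as a unit.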

Your colleague's data of 12 guideposts are fine if they are freshly collected on the ground.

I hope that helps you understand why we are careful with imports.
