[Imports] Best practices for address imports
cquest at openstreetmap.fr
Sun Nov 24 19:04:01 UTC 2013
2013/11/24 Martin Raifer <tyr.asd at gmail.com>
> Randal Hale <rjhale at northrivergeographic.com> wrote:
>> I ask because I will (hopefully) be faced with a comparable problem in
> I'm also involved in a similar address import which is still in an early
> phase. And I see more similar projects upcoming (e.g. Italian housenumbers
> by ISTAT: ).
> I think that now would be a good time to discuss some good practices for
> address imports in general, wouldn't?
> Here are some concrete questions we are currently facing:
> * If you had an address dataset that positions the addresses at or near to
> the respective building entrances: Wouldn't it make sense to not conflate
> the address data with building outlines? (Because by conflating the
> usefulness of the data would actually be reduced.)
It depends on your definition of "an address".
In France, an address is related to a piece of land, not to a building.
In some cases (not the usual one), the building occupies the whole parcel.
An address may also be seen as related to an entrance on large buildings.
This is the (simplified) situation in France, and I'm sure it is country
and even region dependent.
I prefer to manually add addresses on nodes instead of building polygons
because of the above.
When the building does not start at the street/sidewalk, I put the address
node at the limit between public space and private space, more or less
where you find the property entrance and the post box. Otherwise, the best
location is a node on the building outline at the corresponding building
entrance. This provides more information in the end.
I wouldn't recommend having some automatic import tool to add addresses on
polygons for the above reasons at least in France, but the situation may be
completely different in other places.
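As a concrete illustration of mapping an address on a stand-alone node rather
than on the building polygon, such a node carries the usual addr:* tags
(a minimal sketch; the coordinates, street name and housenumber below are
made up for illustration):

```python
# Illustrative only: an address mapped as a separate node, placed at the
# public/private limit near the property entrance (all values made up).
address_node = {
    'lat': 47.2184,   # roughly where the property entrance is
    'lon': -1.5536,
    'tags': {
        'addr:housenumber': '12',
        'addr:street': 'Rue de la Paix',
        'addr:postcode': '44000',
        'addr:city': 'Nantes',
    },
}
```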
> * What are the pros/cons of a manual community import versus an automated
> import versus some hybrid form?
I would say it depends on the quality of the original data... if the quality
is very high, reviewing before import will be more a waste of time than
anything. If the data quality is not so high, you indeed need a human to
decide how to fix what needs to be fixed before upload.
> * What are good tools to help with such an import?
Here again it will depend a lot on the dataset. It's quite hard to have a
> * What are good ways to manage quality assurance after the import?
Have you looked at existing address-related quality assurance tools?
For example, Osmose runs several analyses on addresses, especially with
> * Is there a practicable way to keep the imported data in sync with the
> source data?
Like with most external sources of data, this depends a lot on the
dataset... no unique answer here again, I'm afraid.
Addresses are usually node-based, so doing a conflation is not too hard
(compared to conflating linestrings or polygons, due to their nature).
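The node-based conflation mentioned above can be sketched roughly as
follows (a minimal illustration, not any actual import tool; the data
structures and the 50 m matching threshold are assumptions):

```python
# Sketch of node-based address conflation: match imported address nodes
# to existing ones by street + housenumber plus proximity.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * asin(sqrt(a))

def conflate(imported, existing, max_dist_m=50):
    """Split imported nodes into genuinely new ones and matched pairs.

    Each node is a dict with 'lat', 'lon' and addr:* keys.  Two nodes
    match when street and housenumber agree and they lie close together.
    """
    def key(n):
        return (n.get('addr:street', '').lower(), n.get('addr:housenumber', ''))

    index = {}
    for e in existing:
        index.setdefault(key(e), []).append(e)

    new, matched = [], []
    for imp in imported:
        candidates = [e for e in index.get(key(imp), [])
                      if haversine_m(imp['lat'], imp['lon'],
                                     e['lat'], e['lon']) <= max_dist_m]
        if candidates:
            # keep the closest existing node as the match
            best = min(candidates, key=lambda e: haversine_m(
                imp['lat'], imp['lon'], e['lat'], e['lon']))
            matched.append((imp, best))
        else:
            new.append(imp)
    return new, matched
```

A real import would then review the matched pairs by hand and upload only
the reviewed "new" nodes.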
> * Can one produce a feedback loop for data providers?
We did that during the Nantes address import. The local community sent back
the errors they detected, which left the local authority impressed by our quality.
> It would also be good to hear from people who have already done such
> imports to learn from their experiences.
> Eventually we could summarize this into a "best practice" wiki page for
> address imports.
That would be useful.
The tool that was used in Nantes has since been set up to handle other
address datasets published as open data in France.
You can look at it for example on
http://addr.openstreetmap.fr/bordeaux/ for Bordeaux addresses.
Christian Quest - OpenStreetMap France
Un nouveau serveur pour OSM... http://donate.osm.org/server2013/