[Talk-us] Address Node Import for San Francisco
gregory at arenius.com
Fri Dec 10 01:06:00 GMT 2010
A few comments...
> 1) San Francisco explicitly says they do not have building outline data.
> :( So, I suppose we get to add buildings ourselves. I do see that SF does
> have parcels.
> For DC, we are attaching addresses to buildings when there is a one-to-one
> relation between them. When there are multiple address nodes for a single
> building, then we keep them as nodes. In vast majority of cases, we do not
> have apartment numbers but in some cases we have things like 1120a, 1120b,
> 1120c that can be imported. Obviously, without a buildings dataset, our
> approach won't quite apply for SF.
We mostly only have building shapes drawn downtown, where it's unlikely
there will be many one-to-one matches. I wish we did have a building shape
file, though; that would be great. I have thought about using the parcel
data, but I'm not sure that's as useful.
> 2) I don't consider the addresses as noise. The data is very helpful for
> geocoding. If the renderer does a sloppy job making noise out of addresses,
> the renderings should be improved.
> 3) Having looked at the data catalogue page, I do have concerns about the
> terms to use the data.
What terms in particular caused you concern? I'll need to know if I'm going
to ask for explicit permission. A while back I posted the previous terms to
talk legal and they pointed out problems. The city changed the license when
I pointed out that it caused problems for open projects (apparently that was
in the works anyway), and I thought those problems were removed. Prior to
those changes I had a conference call with one of the datasf.org people who
helps make city datasets available, along with an assistant city attorney,
and I was told that unless specifically noted otherwise in a dataset, the
data is public domain. I do understand that that isn't in writing, though.
If there is a problem with the terms, there is still a good chance the
city would give us explicit permission to use the data; they seemed excited
about the prospect of some of it ending up in OSM.
> 4) If you can get explicit permission, then I suggest breaking up the
> address nodes into smaller chunks (e.g. by census block group), convert them
> to osm format with Ian's shp-to-osm tool, and check them for quality and
> against existing OSM data (e.g. existing pois w/ addresses) in JOSM before
> importing. QGIS and/or PostGIS can be useful for chopping up the data into
> geographic chunks. This approach gives opportunity to apply due diligence,
> to check things, and keep chunks small enough that it's reasonably possible
> to deal with any mistakes or glitches.
I had been planning on using shp-to-osm to break it into chunks by number
of nodes, but doing it geographically makes more sense. Do you think census
block size is best, or maybe by neighborhood, or should we aim for an
approximate number of nodes in each geographic chunk?
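For what it's worth, the two approaches can be combined: split geographically, but keep splitting any region that holds too many nodes. A minimal sketch of that idea (the function name `chunk_points` and the `max_nodes` threshold are illustrative, not part of shp-to-osm) would quarter a bounding box recursively until every chunk is small enough:

```python
# Hypothetical sketch: recursively quarter a set of (lon, lat) address
# points so that no geographic chunk exceeds max_nodes points.

def chunk_points(points, max_nodes=500):
    """Return a list of point lists, each holding at most max_nodes points
    (except in the degenerate case where points coincide)."""
    if len(points) <= max_nodes:
        return [points]
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    mid_lon = (min(lons) + max(lons)) / 2.0
    mid_lat = (min(lats) + max(lats)) / 2.0
    # Assign each point to one of four quadrants around the midpoint.
    quads = [[], [], [], []]
    for p in points:
        idx = (p[0] > mid_lon) + 2 * (p[1] > mid_lat)
        quads[idx].append(p)
    # Degenerate case: all points fell into one quadrant (e.g. coincident
    # coordinates), so further splitting would recurse forever.
    if any(len(q) == len(points) for q in quads):
        return [points]
    chunks = []
    for q in quads:
        if q:
            chunks.extend(chunk_points(q, max_nodes))
    return chunks
```

In practice one would read the coordinates out of the shapefile first (e.g. via QGIS or PostGIS, as suggested above) and write each chunk back out for shp-to-osm; cutting along census block group or neighborhood polygons instead would give chunks with more reviewable boundaries.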