[OSM-dev] No new planet.osm files?

Nick Whitelegg nick at hogweed.org
Mon Oct 9 21:52:27 BST 2006


On Monday 09 Oct 2006 14:23, Immanuel Scholz wrote:
> Hi,
>
> > Swap cache: add 776211, delete 776211, find 2065432/2097947, race 0+0
> > Out of Memory: Killed process 28432 (planet.rb).
> >
> > I think planet.rb is loading almost the entire node table in to memory
> > before spitting it out.
> >
> > Nick you hacked on this right? Any way to make it row seek or something
> > so it doesn't die?
>
> I coded the reading-from-db part.
>
> The probably offending rows are:
>
> def all_nodes
>   $mysql.query "..." do |rows|
>     rows.each do |row|
> ...
>
> If 'rows' is an Array, and this completely reads the data into 'rows'
> before iteration starts, this is the place to change. If rows is of some
> "cursor-like" type, this should be fine.
> Can someone check this ("puts rows.class" before the rows.each...)? I have
> no ruby environment here at hand.. :(

This gives Mysql::Result. This is broadly the same as PHP: you get a result 
set back from the MySQL query. If there are huge numbers of nodes I can see 
this using up a lot of memory.
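The general fix is to stream rows instead of materialising the whole result set: with the classic ruby-mysql binding that means setting query_with_result = false and reading through Mysql#use_result / fetch_row, which hands rows over one at a time rather than buffering them all first. A minimal sketch of the pattern, with a stubbed row source standing in for MySQL (no database is assumed here, and the node id/lat/lon layout is illustrative only):

```ruby
# A stand-in for the node table: yields rows one at a time, the way
# use_result + fetch_row would, rather than returning an Array of
# every row (which is what the buffered path effectively does).
def each_node_row(n)
  return enum_for(:each_node_row, n) unless block_given?
  n.times { |i| yield [i + 1, 51.0 + i * 0.001, -1.3 + i * 0.001] }
end

# planet.rb-style export loop, consuming rows as a stream: each row
# is formatted and emitted, then becomes garbage before the next row
# is fetched, so memory use stays flat regardless of table size.
out = []
each_node_row(3) do |id, lat, lon|
  out << format('<node id="%d" lat="%f" lon="%f" />', id, lat, lon)
end
puts out
```

The trade-off with a streamed (use_result-style) read is that the table stays locked for the whole iteration, so the export loop has to keep up.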

I'm not sure how best to deal with this, to be honest - a huge result 
set is a problem I haven't had to deal with before. Fetch nodes from the 
database one at a time (I can see this being very slow)? Do tiled queries, 
say 10x10 degrees?  Shut down osm while the export is done? Anyone?
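The tiled-queries idea could look something like the sketch below: walk the world in 10x10 degree tiles and issue one bounded query per tile, so no single result set ever covers the whole planet. The column names (latitude, longitude) and table name are assumptions for illustration, not the real schema:

```ruby
# Enumerate 10x10 degree tiles covering the whole globe, yielding
# each tile's bounding box as (south, west, north, east).
def each_tile(step = 10)
  return enum_for(:each_tile, step) unless block_given?
  (-90...90).step(step) do |lat|
    (-180...180).step(step) do |lon|
      yield lat, lon, lat + step, lon + step
    end
  end
end

# Build one small bounded query per tile (hypothetical schema).
queries = each_tile.map do |s, w, n, e|
  "SELECT id, latitude, longitude FROM nodes " \
  "WHERE latitude >= #{s} AND latitude < #{n} " \
  "AND longitude >= #{w} AND longitude < #{e}"
end

puts queries.length  # 18 rows of tiles x 36 columns = 648 queries
```

Each tile's result set is small enough to buffer safely, at the cost of 648 round trips and some care at tile edges if ways crossing boundaries matter.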

Nick




More information about the dev mailing list