[OSM-dev] Can SQLite3 handle OSM 150G data file?
kolda at web2net.cz
Tue Apr 28 08:43:21 BST 2009
There is no reason for SQLite to be slow. Are you using transactions
on your local DB? What are you using for the import? I'm importing the
Czech Republic (50 MB bzip2-compressed) in one or two minutes, so 5.4 GB
should take about 108 minutes (roughly two hours). But I'm importing into
a modified OSM schema (something like the Mapnik schema in PostgreSQL),
because I'm playing with SQLite and Mapnik. SQLite has an R-tree spatial
index, and with a small modification to Mapnik (the SQLite datasource)
it runs fine. If you render 8x8 tiles per pass, it is quite fast.
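To illustrate why transactions matter for import speed: by default SQLite
commits (and fsyncs) every INSERT separately, so batching rows into one
transaction is usually the difference between hours and minutes. A minimal
sketch with Python's stdlib sqlite3 module, using a hypothetical `nodes`
table (the real OSM import schema is richer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nodes (id INTEGER PRIMARY KEY, lat REAL, lon REAL)")

# Fake node data standing in for parsed planet.osm rows.
nodes = [(i, 50.0 + i * 1e-6, 14.4 + i * 1e-6) for i in range(100_000)]

# One transaction for the whole batch: a single commit instead of
# one journal sync per row.
with conn:
    conn.executemany("INSERT INTO nodes VALUES (?, ?, ?)", nodes)

count, = conn.execute("SELECT COUNT(*) FROM nodes").fetchone()
print(count)  # 100000
```

With a file-backed database the same pattern applies; committing every few
hundred thousand rows keeps memory bounded while staying fast.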
The R-tree in SQLite is fast, but there is overhead when you use one
bounding rect per entity, and also with Mapnik (which queries the same
area many times per render). But if you do not need top speed, it works.
Shaun McDonald wrote:
> On 28 Apr 2009, at 00:16, Kelly Jones wrote:
>> I've seen many posts saying that SQLite2 can't handle OpenStreetMap's
>> large planet.osm data file:
>> which is 5.4G bzip2 compressed, about 150G uncompressed.
>> Can SQLite3 handle this? Has anyone tried?
>> I tried to do this myself, but I'm on a slow machine and it's taking
>> too long (several days so far). If someone's done this, can I get a
> SQLite isn't designed for huge databases like OpenStreetMap. You could
> get away with a city or small region, but more than that, you will get
> the slowness that you are seeing.
> http://www.sqlite.org/faq.html#q19 and some other FAQs on that page.
> You are much better off using Postgres, or even MySQL.