2006/10/27, Aaron Griffin <aaronmgriffin@gmail.com>:
> No no no.... you're missing the point. Because reading individual files is slow, we need a new backend. This does not imply a database backend is a good idea. It implies a single-file solution is ideal, yes. What I've been trying to show is that, since the slow-down comes from reading hundreds of files, that does not imply a database. ANY single-file solution would work.
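The point about "hundreds of files" can be illustrated with a minimal sketch (Python here for brevity; the file names and record format are invented for the example). Reading N separate files costs N open/read/close round trips, while any single-file layout needs only one open, regardless of its internal format:

```python
import os
import tempfile

# Hypothetical package metadata, one record per package.
records = {f"pkg{i}": f"name=pkg{i}\nversion=1.0\n" for i in range(500)}

tmp = tempfile.mkdtemp()

# Layout A: one file per package (the old-backend style).
for name, body in records.items():
    with open(os.path.join(tmp, name), "w") as f:
        f.write(body)

# Layout B: all records concatenated into a single file.
single = os.path.join(tmp, "db.single")
with open(single, "w") as f:
    for name, body in records.items():
        f.write(f"%{name}%\n{body}")

def read_many():
    # One open/read/close round trip per package.
    out = {}
    for name in records:
        with open(os.path.join(tmp, name)) as f:
            out[name] = f.read()
    return out

def read_one():
    # A single open, then one sequential read.
    with open(single) as f:
        return f.read()
```

Both layouts deliver the same data; the difference Aaron is pointing at is purely the per-file syscall and seek overhead, which exists for any many-small-files backend and disappears with any single-file one.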
Yes, I know this. You just took the words out of my mouth! :) But a single file will need some structure anyway. So either we'll have our own format, use something simple like gdbm, or something more complex like sqlite.
> That is why I was on about string matching and things of that nature - because if we take "single file solution A" and compare it to "DB solution A", there are negligible differences, and the DB one adds much more complexity and security flaws (e.g. embedded SQL statement strings).
OK, so what do you propose? Use a simple tar.gz and process it in memory, or what? Still, I think sqlite or gdbm could be tried, but that won't happen in the near future anyway. There are more important things now. :-) -- Roman Kyrylych (Роман Кирилич)
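The tar.gz idea Roman floats can be sketched in a few lines (Python's tarfile module here; the entry names and record format are hypothetical). The whole archive is read once from disk and every record is then parsed from memory, so it behaves as a single-file backend without needing a database engine:

```python
import io
import tarfile

# Hypothetical package metadata; names and layout are illustrative only.
entries = {"pkg-a/desc": b"name=pkg-a\n", "pkg-b/desc": b"name=pkg-b\n"}

# Build a tar.gz entirely in memory (stands in for the on-disk db file).
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    for name, body in entries.items():
        info = tarfile.TarInfo(name)
        info.size = len(body)
        tar.addfile(info, io.BytesIO(body))

# One read of the archive, then all records are available in memory.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    loaded = {
        m.name: tar.extractfile(m).read()
        for m in tar.getmembers()
        if m.isfile()
    }
```

The trade-off versus gdbm or sqlite is that a plain archive has no indexed lookup: finding one package means scanning members, whereas a keyed store can seek directly. For a database that is read mostly in bulk, that may not matter.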