HACKER Q&A
📣 ddxxdd

SQLite, MySQL, or PostgreSQL?


My application is storing ~12GB of federal election data for personal research purposes. I simply want to be able to run queries without waiting several hours for the results (in contrast with my previous solution, which searched the data line-by-line with Python).


  👤 boshomi Accepted Answer ✓
The R programming language[1] has RData files built in. With data.table[2] you get a fast in-memory database on your local machine.

[1] https://bookdown.org/ndphillips/YaRrr/rdata-files.html

[2] https://cran.r-project.org/web/packages/data.table/vignettes...


👤 pwg
Provided you can index the data properly, all three should work to speed up your queries.

SQLite has the advantage of not needing a server install, which makes it easier to bring up as your datastore.
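
A minimal sketch of that workflow with Python's built-in sqlite3 module (the file, table, and column names here are made up; substitute your own schema):

    import sqlite3

    # A single local file, no server to install or run.
    conn = sqlite3.connect("elections.db")

    # An index on the column you search by is what turns a full scan
    # into a fast lookup.
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_contributor "
        "ON contributions (contributor_name)"
    )

    # The kind of search that previously meant reading every line:
    rows = conn.execute(
        "SELECT * FROM contributions WHERE contributor_name = ?",
        ("SMITH, JOHN",),
    ).fetchall()
    conn.close()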


👤 simplecto
SQLite is perhaps a little easier because there's only one file to think about for backups, etc.
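
For instance (assuming a hypothetical elections.db), a backup is either a plain file copy or, if the database is in use, the sqlite3 backup API:

    import shutil
    import sqlite3

    # Cold backup: the whole database is one file, so a copy is a backup.
    shutil.copy("elections.db", "elections-backup.db")

    # Or take a consistent backup while the database is open:
    src = sqlite3.connect("elections.db")
    dst = sqlite3.connect("elections-backup.db")
    src.backup(dst)
    dst.close()
    src.close()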

I use both but give preference to Postgres/PostGIS if I have to do mapping things.

MySQL is never a consideration for me.


👤 bockris
SQLite and Datasette are a killer combo imo: https://datasette.readthedocs.io/en/stable/

12GB is not a lot of data; I'm not sure what was happening with your Python that it took several hours. Don't get me wrong though, SQL seems like it could be the right tool for this use case.
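
A rough sketch of the one-time load step (the CSV filename and columns are hypothetical), after which running "datasette elections.db" gives you a browsable web UI over the data:

    import csv
    import sqlite3

    conn = sqlite3.connect("elections.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS contributions "
        "(contributor_name TEXT, amount REAL, date TEXT)"
    )

    # Bulk-load the raw CSV once; afterwards every search hits the
    # database instead of re-reading the whole file.
    with open("contributions.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        conn.executemany(
            "INSERT INTO contributions VALUES (?, ?, ?)", reader
        )
    conn.commit()
    conn.close()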


👤 e2le
If it's for local use, I would go with SQLite.

👤 gshdg
For production or for local use?