HACKER Q&A
📣 herodoturtle

Best way to store thousands of ~10MB files via browser upload interface?


We've got decent (albeit old-school) full-stack in-house dev experience, so re-inventing the wheel is an option given our tailored needs.

Some solutions we've come up with:

1. Store the files as MEDIUMBLOBs in our MySQL database - but we're not sure how responsive this database would be once we hit 100TB+ of files.

2. Store the files on a standard Linux file system, and limit the MySQL database to tracking values linked to each file (but not storing the actual files themselves). The data tracked in this database would thus be limited, and therefore scale better. We'd track things such as the date the file was created, a customer's name, and around 100 other variable text fields (a rough sketch of this approach follows below).
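To make option 2 concrete, here's a minimal Node/TypeScript sketch of what we have in mind, assuming a hypothetical `files` table and a content-addressed directory layout (table, column, and path names are just illustrative, not a prescription):

    // Option 2 sketch: file bytes on the filesystem, metadata row in MySQL.
    // Assumes `npm install mysql2` and a pre-created `files` table, e.g.
    //   CREATE TABLE files (
    //     id BIGINT AUTO_INCREMENT PRIMARY KEY,
    //     sha256 CHAR(64) NOT NULL,
    //     path VARCHAR(255) NOT NULL,
    //     customer_name VARCHAR(255),
    //     created_at DATETIME NOT NULL
    //   );
    import { createHash } from "node:crypto";
    import { mkdir, writeFile } from "node:fs/promises";
    import path from "node:path";
    import mysql from "mysql2/promise";

    const STORAGE_ROOT = "/srv/uploads"; // hypothetical mount point

    export async function storeUpload(buf: Buffer, customerName: string) {
      // Content-address the file: the first bytes of the hash fan files out
      // across subdirectories, avoiding millions of entries in one directory.
      const sha256 = createHash("sha256").update(buf).digest("hex");
      const dir = path.join(STORAGE_ROOT, sha256.slice(0, 2), sha256.slice(2, 4));
      const filePath = path.join(dir, sha256);
      await mkdir(dir, { recursive: true });
      await writeFile(filePath, buf);

      // Only metadata goes into MySQL; the file bytes never touch the database.
      const db = await mysql.createConnection({
        host: "localhost", user: "app", database: "uploads",
      });
      await db.execute(
        "INSERT INTO files (sha256, path, customer_name, created_at) VALUES (?, ?, ?, NOW())",
        [sha256, filePath, customerName]
      );
      await db.end();
      return filePath;
    }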

What are your thoughts on the above two options?

And are there other good/better options?

Please note: we're a bunch of old guys that still tinker in LAMP, so kindly take it easy on us if you're responding with modern framework solutions. But we're open to learning new things!

Cheers from South Africa.


  👤 neximo64 Accepted Answer ✓
On S3, in the South Africa region, using a JavaScript library that uploads the files via pre-signed URLs for you and returns the final URLs, which you can save in your MySQL database.
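A rough sketch of that flow with the AWS SDK for JavaScript v3 (af-south-1 is the Cape Town region; the bucket name and expiry here are placeholders):

    // Server side (Node): hand the browser a short-lived pre-signed PUT URL.
    // Assumes `npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner`.
    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
    import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

    const s3 = new S3Client({ region: "af-south-1" }); // Cape Town region

    export async function presignUpload(key: string): Promise<string> {
      // Bucket name is a placeholder.
      const cmd = new PutObjectCommand({ Bucket: "my-upload-bucket", Key: key });
      return getSignedUrl(s3, cmd, { expiresIn: 300 }); // URL valid for 5 minutes
    }

    // Browser side: PUT the file straight to S3 with the pre-signed URL, then
    // report the final key/URL back to your backend to save in the MySQL row.
    export async function uploadViaPresignedUrl(file: File, presignedUrl: string) {
      const res = await fetch(presignedUrl, { method: "PUT", body: file });
      if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
    }

The nice part of this approach is that the 10MB payloads go directly from the browser to S3, so your LAMP server only ever handles small JSON requests.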

👤 epc
I’d compute a couple of checksums on the files and store those plus the metadata in a database, storing the files either on multiple RAID arrays under your control, or on one or more geographically dispersed cloud storage systems.
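A rough Node/TypeScript sketch of that checksum step, streaming each file through two digests so a 10MB upload never needs to sit in memory (which digests you pick, and where you store them, is up to you):

    // Stream a stored file through SHA-256 and MD5 at once; persist the
    // resulting hex digests alongside the file's metadata row.
    import { createHash } from "node:crypto";
    import { createReadStream } from "node:fs";
    import { pipeline } from "node:stream/promises";
    import { Writable } from "node:stream";

    export async function checksums(filePath: string) {
      const sha256 = createHash("sha256");
      const md5 = createHash("md5"); // second, cheaper digest for cross-checking
      await pipeline(
        createReadStream(filePath),
        new Writable({
          write(chunk, _enc, cb) {
            sha256.update(chunk);
            md5.update(chunk);
            cb();
          },
        })
      );
      return { sha256: sha256.digest("hex"), md5: md5.digest("hex") };
    }

Having the checksums in the database also lets you periodically re-verify the files on disk (or in cloud storage) against what you recorded at upload time.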

Whether local storage or cloud storage is best is a function of how often the files are going to be downloaded or otherwise manipulated by your service.