HACKER Q&A
📣 SMAAART

Life after abusing Google Sheets as a database


At my company we're very good with Google Sheets: we download CSV files from our systems, dump them into Google Sheets, and let all sorts of formulas and filters do the magic.

And then we import / export / filter to give different stakeholders access to selective data.

It works fine, except that we've created a monster that often hits technical constraints on download/upload/import that require hacks to work around, AND - most importantly - it's often too slow.

I looked into Airtable, and unfortunately the "50,000 records per base" limit is not enough, and Enterprise is too expensive.

Is there a no-code solution that HN recommends?

Maybe Ragic?

Thanks in advance.


  👤 countrygent Accepted Answer ✓
I use Jupyter Notebook with a dash of Python code to ingest, clean, and analyze fairly large CSV files, whose size is limited mostly by the available compute. You can also use AWS Athena, but that isn't cheap either.
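
A minimal sketch of the kind of notebook cell I mean, assuming pandas; the folder path and column names ("exports/*.csv", "region", "amount") are made up for illustration:

    import glob

    import pandas as pd

    # Ingest: read every exported CSV in a folder into one DataFrame.
    frames = [pd.read_csv(path) for path in glob.glob("exports/*.csv")]
    df = pd.concat(frames, ignore_index=True)

    # Clean: drop exact duplicates and rows missing the key column.
    df = df.drop_duplicates().dropna(subset=["region"])

    # Analyze: the kind of pivot/filter work the Sheets formulas were doing.
    summary = df.groupby("region")["amount"].agg(["count", "sum", "mean"])
    print(summary)

Once the data is in a DataFrame you can slice out per-stakeholder views and write them back out with to_csv, instead of juggling import/export limits in Sheets.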