1) SELECT [fields I care about] FROM [Table] > dump.csv
2) Upload CSV to Google Cloud Storage
3) Drop/truncate/whatever old table, replace with the new dump
Don't need anything fancy like incremental loads, just dump every record every night.
The above would be easy to write as some sort of nightly job, but I want my data analysts to be able to self-serve - I want a tool with a web UI where they can configure everything themselves, picking and choosing the tables/fields they need.
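Concretely, the whole job is roughly the script below (table/field/bucket/dataset names are made up, and it assumes psql, gsutil and bq are already installed and authenticated):

    # Nightly full-dump job: Postgres -> CSV -> GCS -> BigQuery table replace.
    # Table, field, bucket and dataset names are placeholders.
    set -euo pipefail

    TABLE="orders"
    BUCKET="gs://my-export-bucket"
    DATASET="analytics"

    # 1) Dump the fields I care about to a CSV
    psql "$DATABASE_URL" -c "\copy (SELECT id, customer_id, total, created_at FROM ${TABLE}) TO '${TABLE}.csv' CSV HEADER"

    # 2) Upload the CSV to Google Cloud Storage
    gsutil cp "${TABLE}.csv" "${BUCKET}/${TABLE}.csv"

    # 3) Replace the old table with the new dump
    bq load --replace --source_format=CSV --skip_leading_rows=1 --autodetect \
      "${DATASET}.${TABLE}" "${BUCKET}/${TABLE}.csv"

The bq load --replace is the drop/truncate-and-replace step in one shot, since BigQuery load jobs either fully succeed or leave the table untouched.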
Does anyone know of a tool that does this, that's simple/cheap? I looked at Fivetran, but their pricing is very high, just soooo much higher than a simple tool like this would cost (to be fair, they don't do a simple dump like this; they incrementally sync changes using the Postgres Write-Ahead Log). Fivetran's approach also incurs significant BigQuery costs - copying a full table into BigQuery from GCS is cheap, but lots of incremental upserts are quite a bit more expensive (BigQuery has to do lots of expensive scans to check whether the records already exist). I also looked at Airbyte, but it also does incremental loads via the Postgres WAL, and it seems pretty immature/flaky.
I basically want a tool with a UI sort of like Fivetran/Airbyte, but only for Postgres > BigQuery, that just does simple full-table dumps/replacements from Postgres to BigQuery via GCS.
If you just need a CSV dump pushed to GCS, you can use clickhouse-local, which has all the pieces built in: a Postgres integration, S3/GCS output, and format conversion.
The only catch is that it's not a service with a UI where your data analysts can click around and pick what they want to export. But the SQL is simple enough for them to write, and you need nothing more than a trivial cron-job equivalent to run it.
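Roughly, the whole export can be a single invocation like the sketch below (hosts, names, bucket and keys are placeholders; it assumes you write through GCS's S3-compatible interoperability endpoint with HMAC keys):

    # Pull selected fields straight out of Postgres and write them to a GCS
    # bucket as CSV in one clickhouse-local call. Everything quoted below
    # (hosts, credentials, names) is a placeholder.
    clickhouse-local --query "
      INSERT INTO FUNCTION s3(
        'https://storage.googleapis.com/my-export-bucket/orders.csv',
        'GCS_HMAC_ACCESS_KEY', 'GCS_HMAC_SECRET', 'CSV')
      SELECT id, customer_id, total, created_at
      FROM postgresql('pg-host:5432', 'mydb', 'orders', 'pg_user', 'pg_password')
    "

From there a plain bq load --replace can pick the file up, same as in the script earlier in the thread.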