I want to spin up a simple API that I plan to eventually monetize. The backend data is about 1M rows with 3 columns, and it will grow by roughly one order of magnitude.
API consumers are likely to be bursty, sending tens of thousands of requests in short windows.
I tried D1, but it's not quite there yet and I couldn't find a reliable way to import this many rows. I did like how fast it was to develop with Workers.
Any recs?
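
For reference, this is roughly the kind of chunked import I was hoping would work reliably on D1, using the prepare/bind/batch API from a Worker (a sketch only: the table name, columns, and the `DB` binding name are placeholders, not my real schema):

```ts
// Sketch of a chunked bulk import into D1 from a Worker.
// Assumes a D1 binding named DB in wrangler.toml and a table
// `data(a TEXT, b TEXT, c REAL)` -- both are placeholders.
export interface Env {
  DB: D1Database;
}

export default {
  async fetch(req: Request, env: Env): Promise<Response> {
    // Rows arrive as a JSON array of 3-column tuples.
    const rows = (await req.json()) as [string, string, number][];
    const stmt = env.DB.prepare("INSERT INTO data (a, b, c) VALUES (?1, ?2, ?3)");

    // Each batch() call runs its statements together; chunking keeps
    // individual batches a manageable size for a multi-million-row load.
    const CHUNK = 1000;
    for (let i = 0; i < rows.length; i += CHUNK) {
      const chunk = rows
        .slice(i, i + CHUNK)
        .map(([a, b, c]) => stmt.bind(a, b, c));
      await env.DB.batch(chunk);
    }
    return new Response("ok");
  },
};
```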
chatmasta (Accepted Answer)
This is the vision of what we're building at Splitgraph. [0] You might be most interested in our recent project Seafowl [1] which is an open-source analytical database optimized for running "at the edge," with cache-friendly semantics making it ideal for querying from Web applications. It's built in Rust using DataFusion and incorporates many of the lessons we've learned building the Data Delivery Network [2] for Splitgraph.
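
To give a feel for the cache-friendly part, here's roughly what querying a Seafowl instance over HTTP from a Worker or browser looks like (a sketch from memory: the hostname is made up, and you should check the Seafowl docs for the exact endpoint and header shapes before relying on them):

```ts
// Illustrative sketch: issuing a SQL query to a Seafowl instance as a
// plain GET so the response can be cached by a CDN. The endpoint and
// header names here are from memory and should be verified against the
// Seafowl documentation.
const SEAFOWL_URL = "https://seafowl.example.com"; // placeholder hostname

async function querySeafowl(sql: string): Promise<unknown[]> {
  // The query text is hashed so it can be addressed in the URL,
  // which is what makes the request cacheable.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(sql),
  );
  const hash = [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");

  const resp = await fetch(`${SEAFOWL_URL}/q/${hash}`, {
    headers: { "X-Seafowl-Query": encodeURIComponent(sql) },
  });
  if (!resp.ok) throw new Error(`query failed: ${resp.status}`);

  // Results come back as newline-delimited JSON, one object per row.
  const text = await resp.text();
  return text
    .trim()
    .split("\n")
    .filter(Boolean)
    .map((line) => JSON.parse(line));
}

// e.g. querySeafowl("SELECT count(*) AS n FROM my_table;")
```

Because identical queries map to identical GET URLs, repeated bursty reads from your API consumers can be served from an HTTP cache in front of the database rather than hitting it every time.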