I'm curious what I should use for the fastest response experience (<300ms response time for loading data from Postgres/APIs).
- Language
- Framework
Snarky responses with detailed opinion welcome
I recently wrote my very first API servers (front-end, back-end, and Redis & RabbitMQ in between) using .NET Core with very little real-world knowledge, and I easily get <100ms responses, and I know I'm not following best practices.
The biggest issue you may have, TBH, is going to be your backend database queries, which is why I use Redis for caching: the same request coming in within the same n minutes gets served directly from cache by the front-end, which is a huge time win for the F5'ers amongst our users.
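The "serve repeats from cache for n minutes" idea is just a TTL'd lookup keyed on the request. A minimal sketch, using an in-memory Map as a stand-in for Redis (all names here are made up for illustration):

```typescript
// TTL cache sketch: identical requests within TTL_MS skip the backend.
type Entry = { value: string; expiresAt: number };

const cache = new Map<string, Entry>();
const TTL_MS = 5 * 60 * 1000; // "n minutes" = 5 here

// Hypothetical slow backend call (Postgres query, upstream API, etc.)
async function fetchFromBackend(key: string): Promise<string> {
  return `result for ${key}`;
}

async function cachedFetch(key: string): Promise<string> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value; // cache hit: no DB round trip
  }
  const value = await fetchFromBackend(key);
  cache.set(key, { value, expiresAt: Date.now() + TTL_MS });
  return value;
}
```

With actual Redis you'd do the same thing with a `GET`, and a `SET` with an expiry, so every app instance shares the cache.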
I'm not really sure what you're building, but generally speaking your time-to-first-byte is bottlenecked by geography more than anything, unless your server is constantly under really heavy load.
Generally (and vaguely) speaking, because there isn't much detail in your post, IMO this is less a question of language & framework and more a question of caching and CDNs (unless every API request you respond with is totally dynamic). Depending on what you are trying to build, maybe some of that data can be cached/stored in edge nodes in a cloud KV of some sort (which will be duplicated to data centers around the world if you pay for it) so that each local user can get a fast response from their nearest edge.
Even if you just have one VM sitting in one single data center running Postgres, with some basic indexing, any normal web stack should get you well under 300 ms response time even from across the world. You can cache write-rarely-read-often DB/API content in Redis or similar, and you can cache HTML output in Varnish or similar.
And even if you do have a slow GPT in there that takes some time to respond, that should probably be a different API endpoint anyway (maybe on a different architecture) so the rest of the app isn't slowed down by it. (If by "task management" you mean something like Linear/Jira/kanban board style thing, basic UI states and such shouldn't have to go through a GPT).
If you need fast UI interactions (for things like dragging between columns), you can probably update optimistically clientside and then send the state updates serverside in the background, validating that they succeeded. Depending on your frontend framework this can be relatively easy (such as for react, in rtk-query: https://redux-toolkit.js.org/rtk-query/usage/manual-cache-up... or swr: https://swr.vercel.app/examples/optimistic-ui or tanstack: https://tanstack.com/query/v4/docs/framework/react/guides/op...).
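Stripped of any framework, the optimistic pattern those libraries implement is: snapshot, mutate locally, persist in the background, roll back on rejection. A hypothetical sketch (the task/kanban names are illustrative, not anyone's real API):

```typescript
// Optimistic update with rollback, framework-agnostic.
type Task = { id: number; column: string };

let tasks: Task[] = [{ id: 1, column: "todo" }];

// Hypothetical server call; resolves true if the server accepted the move.
async function persistMove(id: number, column: string): Promise<boolean> {
  return true; // pretend success for this sketch
}

async function moveTask(id: number, toColumn: string): Promise<void> {
  const snapshot = tasks.map(t => ({ ...t })); // keep a copy for rollback
  // 1. Update local state immediately, so the drag feels instant.
  tasks = tasks.map(t => (t.id === id ? { ...t, column: toColumn } : t));
  // 2. Persist in the background; roll back if the server rejects it.
  const ok = await persistMove(id, toColumn);
  if (!ok) tasks = snapshot;
}
```

The libraries linked above mostly just manage the snapshot/rollback bookkeeping for you inside their cache layer.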
If your state model is especially complex, you can maybe also consider saving changes to some local datastore and then batch syncing them to the server every so often. But then having to deal with two-way sync can get really annoying, like if the user logs in on different machines... avoid that if you can; it's a lot simpler when everything is server-authoritative with no client being more than a second or two behind.
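For the batch-sync variant, the local side is basically a change queue drained on a timer. A rough sketch under assumed names (there's a hypothetical bulk endpoint behind `sendBatch`); note this deliberately ignores the hard part, the two-way conflict resolution the comment warns about:

```typescript
// Local change queue with periodic batch flush to the server.
type Change = { taskId: number; field: string; value: string };

const pending: Change[] = [];

function recordChange(c: Change): void {
  pending.push(c); // stored locally first, synced later
}

// Hypothetical bulk endpoint: one round trip for many changes.
async function sendBatch(batch: Change[]): Promise<void> {}

async function flush(): Promise<number> {
  if (pending.length === 0) return 0;
  const batch = pending.splice(0, pending.length); // drain the queue
  await sendBatch(batch);
  return batch.length;
}
// In a real app you'd call flush() on an interval and on page hide,
// and re-queue the batch if sendBatch fails.
```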
Sorry if I'm misunderstanding anything...