Our service is event sourced and follows CQRS, so our Redshift dataset is just another projection. This got me thinking: can other services simply use this projection directly instead of going through a GraphQL or REST API?
In general, services could provide an API for mutations that updates a projection, which other services can then query directly using SQL. Although services are accessing a database directly, this doesn't break encapsulation, because the projection is data intended for consumption by other services, just like an API.
There are a few benefits I see here. Firstly, if all services were modelled like this, querying multiple "services" could be handled with simple joins. Latency is reduced because data doesn't pass through an intermediate service. Scaling reads is straightforward. The same "API" works for both transactional and analytical use cases. And finally, low-write services could be scaled down, and no-write services could just be data pipelines.
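To sketch the cross-service join idea: suppose two services each publish a projection (hypothetical `customer_projection` and `order_projection` tables here) into a shared warehouse, and a consumer queries both with one join. This uses sqlite3 purely for illustration; in practice the warehouse would be Redshift or similar.

```python
import sqlite3

# Two hypothetical projections, each maintained by a different service.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_projection (customer_id INTEGER, name TEXT);
CREATE TABLE order_projection (order_id INTEGER, customer_id INTEGER, total REAL);
INSERT INTO customer_projection VALUES (1, 'ada');
INSERT INTO order_projection VALUES (10, 1, 25.0), (11, 1, 15.0);
""")

# A consumer "queries two services" with a plain SQL join -- no
# intermediate API hop, no client-side stitching of responses.
rows = conn.execute("""
    SELECT c.name, SUM(o.total) AS spend
    FROM customer_projection c
    JOIN order_projection o ON o.customer_id = c.customer_id
    GROUP BY c.name
""").fetchall()
```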
So HN, what do you think about a database as a service API?
However, once both the frontend and backend begin to mutate data, you're going to have problems doing so consistently; it's an anti-pattern I'd recommend against. Better to have an API handle everything.
Lots of REST services are just URL params -> SQL -> result set -> JSON.
https://postgrest.org/en/stable/
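A minimal sketch of that params -> SQL -> result set -> JSON pipeline, with sqlite3 standing in for the database and a hypothetical `handle_request` function standing in for the HTTP handler:

```python
import json
import sqlite3

def handle_request(params: dict) -> str:
    # In-memory fixture data; a real service would hold a connection pool.
    conn = sqlite3.connect(":memory:")
    conn.row_factory = sqlite3.Row
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'ada'), (2, 'bob')")
    # URL params map straight onto a parameterized WHERE clause.
    rows = conn.execute(
        "SELECT id, name FROM users WHERE id = ?", (params["id"],)
    ).fetchall()
    # Result set serialized straight to JSON.
    return json.dumps([dict(r) for r in rows])
```

PostgREST (linked above) automates exactly this mapping for PostgreSQL, generating the endpoints from the schema instead of hand-writing handlers.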
Isn't this what graphql was supposed to offer?
Web API calls are converted into MySQL stored procedure calls, and the result set is returned. I am interested in extending this to other databases.
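The mapping step described here could be as small as translating an endpoint name and its arguments into a `CALL` statement. A hypothetical sketch (the function name is illustrative; real code would hand the result to a driver such as mysql-connector's cursor):

```python
def to_call_statement(proc: str, args: list) -> tuple:
    """Map an API endpoint name plus its arguments onto a MySQL
    stored-procedure CALL statement with driver placeholders."""
    placeholders = ", ".join(["%s"] * len(args))
    return f"CALL {proc}({placeholders})", tuple(args)

# e.g. GET /get_user?id=42 becomes:
stmt, params = to_call_statement("get_user", [42])
```

Because only the procedure name and placeholder syntax vary, extending this to other databases would mostly mean swapping the placeholder style and the driver.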