What are your thoughts? And are you aware of any framework/library using this approach?
It became too expensive to keep up with evolving business requirements, and I was laid off as a result, even though I was a late hire who did not plan any of that madness. Other people were laid off too, but I am not sure how many, because I was the first person released from that small team.
My suggestion is to impose a tight separation of concerns and let data be just data. If you are planning architecture, remember that software is always a cost center, so automate the shit out of it at all stages. Tools and frameworks will not save you from a lack of vision.
But it sounds like a lot of overhead to maintain, presuming it's one insert/update/delete proc per table.
IMO stored procedures help where you want to make some reasonably complex logic easier to call into from the application, OR where you want to trigger said reasonably complex logic from multiple DB events.
Here's an injection-vulnerable stored proc call:
string badSql = "exec fooProc " + arg;
User input needs to be parameterized whether you are calling a stored proc or not.
I'd only use stored procs for specific niche use cases, and only if it was really performance critical. If you want to avoid SQL injection, you use prepared statements (or a framework/ORM that uses them).
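To make the concatenation-vs-parameterization point concrete: the snippet above is ADO.NET-style C#, but here is a minimal, self-contained sketch of the same idea using Python's stdlib sqlite3 (table and data are made up for illustration; sqlite3 has no stored procedures, but the binding mechanism is the same one prepared statements rely on):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

hostile = "alice' OR '1'='1"  # attacker-controlled input

# Vulnerable: the input is spliced into the SQL text, so the OR clause executes.
unsafe_rows = conn.execute(
    f"SELECT id FROM users WHERE name = '{hostile}'"
).fetchall()
# -> every row comes back

# Safe: the driver binds the value as a literal string; no SQL is interpreted.
safe_rows = conn.execute(
    "SELECT id FROM users WHERE name = ?", (hostile,)
).fetchall()
# -> no rows match, since nobody is literally named "alice' OR '1'='1"
```

The same distinction applies to `exec fooProc @arg` in SQL Server: pass `arg` via `SqlParameter`, never by string concatenation.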
Snowflake introduced their marketplace and those apps / data shares necessitate sprocs and database functions. I made a couple of those apps in my last role and versioning is definitely a challenge. Rolling back is easy enough but what do you do with lost/corrupted data?
I have worked on a small CRUD app where everything was done with sprocs behind a thin Laravel layer, and that worked well enough.
That said, I have had use recently to split out which tables can be written by which applications and then abstracting the selects into views.
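As a sketch of that read-abstraction idea (schema and names are hypothetical, and sqlite3 is used here only because it is self-contained; in a real DBMS you would pair the view with per-application GRANTs, which SQLite lacks):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical schema: only the billing app writes to invoices directly.
    CREATE TABLE invoices (
        id INTEGER PRIMARY KEY,
        customer TEXT,
        total REAL,
        internal_notes TEXT
    );
    INSERT INTO invoices VALUES (1, 'acme', 99.0, 'do not expose');

    -- Read-only applications query a view that hides internal columns,
    -- so the base table can change without breaking their selects.
    CREATE VIEW invoice_summary AS
        SELECT id, customer, total FROM invoices;
""")

row = conn.execute("SELECT * FROM invoice_summary").fetchone()
# row exposes only (id, customer, total)
```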
It's a pain to debug and document.