Response speed and compute time aren't really an issue, and I know that most self-hosted solutions are obviously nowhere near on par with services like ChatGPT or Stable Diffusion.
My resources are somewhat modest: 16 GB of RAM and an NVIDIA GPU with 4 GB of VRAM.
Are there any options that would let me run something like this self-hosted?