Otherwise you need to selectively allow certain bots. However, as with any web crawler, respecting robots.txt is optional.
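As a minimal sketch (assuming the bots honor robots.txt at all), you can disallow known AI crawler user agents while leaving everything else open. GPTBot (OpenAI), CCBot (Common Crawl), and Google-Extended (Google's AI training opt-out token) are documented tokens, but the list changes constantly:

    # Block known AI training crawlers
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    # Everyone else may crawl normally
    User-agent: *
    Disallow: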
What's insidious about AI models is that it's difficult, if not practically impossible, to prove that one was trained on your data.
It's difficult to establish a standard like robots.txt. There was also the .well-known/security.txt proposal; some sites serve it, but it hasn't really become a widely adopted standard.
But if you're concerned, there's a good resource for blocking them: https://darkvisitors.com/