I’m curious whether the new iPad Pro M4 might be worth getting. I’d guess it should be pretty fast at inference, but it’s obviously a very locked-down system, so I’m not sure it’s viable.
It really doesn’t make sense to pay for a screen and form factor you won’t use, though. You could build a $500 headless inference server with a few used 3060s and buy a used iPad Pro with the savings.