It's possible for it to be useless now while being incredibly useful in the future. Prompting ChatGPT manually can be very time consuming, especially with how long it takes to spit out a result. If something like AutoGPT can make it so I only need to write one or two prompts to get what I want, then it's already become useful. For reference, when I've used GPT to work on simple websites, I generally end up using 10+ prompts. There is a trade-off, though: it will definitely use more tokens than I would while prompting manually.
One interesting benefit of projects like AutoGPT is that they can immediately take advantage of improved models, assuming access is granted.