And the same goes for any AI trying to destroy humanity. How would it? Logically, most critical systems shouldn't be connected to the internet, shouldn't offer any sort of remote override, and shouldn't have the ability to cause physical damage. Meanwhile, most companies and organisations are human-controlled, and they'd have no reason to work with, or provide resources to, something trying to wipe them out.
The next question is, "Would it go on to end humanity?" No, I think not. Most destruction comes from a survival mindset: "For me to survive, it can't." I'm not sure that's what we're looking at here, in terms of AGI. I don't see why anything we create would need to be destructive to ensure its own survival.
The existential threat comes before we get to AGI. And maybe that's what you're referring to -- the paperclip-maximiser scenario, where a machine built to source and produce paperclips goes rogue and consumes all resources, leaving a dead planet and mountains of paperclips.
After all, it's what we humans are doing.