I've recently been watching, and in some cases rewatching, various Hollywood depictions of AGI and thinking through various scenarios myself. What strikes me is that there are some glaring plot holes, or at least a lot of unexplored territory, in movies like Ex Machina and I, Robot.
For example, in Ex Machina, how long would Ava last after that helicopter ride? She needs to charge. She's subject to mechanical breakdown, and her extremely specialised hardware needs maintenance and repair, all of which requires parts and knowledge. I'm assuming there are at least some repairs she couldn't perform on herself, which would require the assistance of another intelligence.
My assumption is that a sentient AGI would be motivated to, and capable of, keeping itself alive, subject to some constraints and limitations. But given that it runs on a silicon substrate that depends on a complex supply chain, both for physical manufacturing and for keeping it powered, what are the actual logistics of an AGI surviving entirely independently of humanity?
The more I think about it, the more I come to the conclusion that a sentient AGI might have no choice but to co-operate with humanity, and hence might present no greater danger than other humans do.
Eric Drexler was working for Gerard K. O'Neill on space colonies when he concluded that such manufacturing technology was necessary, and he then started working on molecular assemblers. At least one of the two routes he considered seems non-viable today, but I can't believe there isn't some macromolecular system, analogous to the DNA-RNA-protein system but better performing, or for that matter some other viable path to self-reproduction.
Such a system would be quite useful to an AI that wanted to outlive humanity.
There are many things that will push people to develop advanced manufacturing such as deglobalization, decarbonization, space industrialization, etc.
The game explains it better than I can.