Ok, I get that there are significant monetary gains, maybe even prestige, behind leading this research down new pathways. But consider aspirations like this from the perspective of the US: there are real geopolitical risks in a system like this falling into the hands of an adversary (great article about this: https://situational-awareness.ai/). They wouldn't want open research on this, and they'd probably infiltrate the whole effort before anything could get out. Furthermore, with humanity in mind and some long-sightedness, it just seems unreasonable to me to push something like this forward. There has probably never been anything that brought such varied (potentially existential) issues with it, and we're going to just dive in headfirst without thinking about it?
But beyond that, I'm trying to understand what could motivate someone, and what you all think about this. Excited to hear your answers!
And since capabilities are currently developing at an exponential pace, it's hard to predict future configurations clearly even a few weeks out. In fact, we're already hearing from some quarters that we have AGI now (with many caveats). So "AGI" currently seems like not much more than a click-baity heuristic.
AGI could help solve a lot of the problems we currently have. As long as it doesn't solve them in a way that boils down to "destroy all humans", it could prevent a huge number of possible disasters and drastically improve our situation. For at least some of our problems we're working against the clock as it is, so the sooner new solutions can be found, the better. Generally, major leaps in technology have been beneficial for the world, and AGI could be no exception.
If you're very wealthy and believe that some combination of climate change, food/water shortages, another pandemic, or violence could cause a large percentage of the population to die off in the near future, while your wealth will likely allow you and/or your children to survive, then AGI will be important for replacing a lot of the poor people you've previously depended on.
AGI means corporations can stop paying many if not most of their employees, which means their profits will go sky high. AGI will likely mean a return to legalized slavery. Computers don't take vacations, sleep, get sick, show up late, steal, insist on basic human rights, cause expensive sexual harassment lawsuits or workplace violence, etc. (at least not to the extent that humans do), so the sooner you can get rid of those pesky human employees, the sooner you can increase the rate at which you've been stuffing your pockets with cash.
Whoever gets to AGI first can use it to rule over and oppress everyone else in order to make sure they can't create it or take advantage of it to pursue their own agendas.
If you think that humanity is just a stepping stone that will create the machines which will in turn create something superior to replace all of humanity, then it'd be good to get that going as quickly as possible, before inferior humans destroy or waste the resources those superior machines could put to better use, or before humans cause some crisis that sets humanity back and delays our ability to fulfill our destiny and bring about our replacements.
I think most people pushing for AGI believe either that it will be controllable by humans or that it will be like a benevolent deity. Those who believe it will be uncontrollable and malevolent obviously don't want it to exist, unless they are doomsday cultists.