It's never been explained to me why a god-like AI would care, one way or another, whether people tried to bring it into being. Once the AGI exists, hurting the people who didn't work hard enough to bring it into existence won't benefit it in any way.
"God-like AI" feels like we're really overestimating how tough the problem "destroy humans" would be. We've imagined this very impressive way it could happen, when actually we're fragile and might more or less get blown away by mistake.
It seems we're much more likely to accidentally build something dumb that kills us via an unanticipated side effect, but at the scale of a whole society rather than one business. Say AI helps Coca-Cola develop a new beverage that initially seems very popular and cheap, so it quickly becomes the world's top-selling drink - and then we realise, too late, that it's actually extremely addictive and withdrawal induces violent rage. Oh dear. That sort of thing.
That's why I like using the paperclip maximizer as an example. It's not out to get you; you getting got is just an unintentional side effect. Kind of like global warming: nobody wants the earth to burn up, they just want to keep their transportation/heating/whatever else a lot more than they want to prevent it.