Where I think Hinton’s views fall down is that we have zero idea what an AGI smarter than us might want or what it might do. People always talk about it as if an entity like that would just hang around and bully our species. It might evolve into a ball of light and leave the planet. I don’t know, but we seem to assign a lot of human traits to something that would likely be completely unrecognisable to us within probably twenty minutes of its birth.
What's the difference between desires and goals in this context, really? You could say he is worried about a reasoning machine "relentlessly programmed" to achieve some goal, but a reasoning machine might just reason itself out of everything you've told it to do. Something so creative, so capable, so beyond us, yet it's going to... assassinate other people for you? Why?
When something goes from being a computer program to a self-aware, conscious being with agency, things change a lot.
Hinton is a major paradox of a human: he spent his life building the very thing he now says will likely doom us, and spends what's left of it warning us against his own work. So much of this AI doomerism just seems like a "Chinese finger trap" for the ultra-logical thinker.
It's a fucking weird time to be alive. The 90s felt much less weird and dystopian to me.