Watched the Bankless podcast with Yudkowsky. I feel that if he is wrong, it may be because AGI has no drive to physically expand, or because it needs our electricity generation, so our destruction could not be swift.
I saw him lose a debate to Robin Hanson on this, ironically at Jane Street Capital, when I was in college, ~2012.
Not sure I understand your argument correctly; isn’t the “drive” of an AI dependent on its reward system? Based on the definition used in the Bankless podcast, an AGI is superior in every decision-making process in every area. I have a hard time believing that electricity generation will be an issue. 🤔
He is wrong. There are already eight billion AGIs on this planet (humans), and they are doing just fine.
Expanding is an instrumentally convergent strategy. No matter what it "actually" wants, that goal is easier to achieve if it takes over the world first.
I became paralyzed/depressed after reading Yudkowsky’s article about AI lethalities. Is anyone else feeling the same?