Can we survive AI?

2025-09-16

The outlook is not good. Money as a motivator for humans is too strong. We're too easily hacked. AI is never going to stay in a box. We can't build an Asimov-style Foundation elsewhere in the stars. It's like we're taking all the existential risk we've been staving off with regard to nuclear annihilation and letting it ride.

People are stupid. That's becoming clearer and clearer as I age. COVID has shown that we can't count on a big event to shock us into our senses. Rather, any misalignment between the initially estimated threat and the early harm will splinter us. We're easily divisible. I'm easily distractible. Even someone like me, who understands a lot of these things earlier than others, is easily defeated. The spectacle of professional-wrestling-style politics has kept me in check for a while.

Losing harder is always an option. I could choose my own eudaimonia over resistance. I'll probably do that, if I can manage it without serving an obvious form of Mammon. One interesting thing about meditation is that you can conceivably realize enlightenment in the throes of an AI apocalypse. If AI alignment is metaphorically like holding a tiger cub that will eventually become a tiger, then, as in the old Zen parable, you can still jump off the cliff, grab the branch, and taste the fruit.