Longtermism — The Stupid Idea with the Stupid Name
This is another way to say “I’m an arrogant asshole”
A long time ago, over a decade now, I signed up for Singularity University’s mailing list, and I’ve never bothered to unsubscribe despite the list devolving ever since.
The concept of “The Singularity” is very appealing in many ways, and has some merit, but it has also been abused to create apocalyptic ideologies that drive people to harm themselves or others.
The basic idea, popularized by Ray Kurzweil, a Director of Engineering at Google, is that with computing power rising exponentially, at a certain moment in time technology will become better at improving itself than we are at improving it, and humans will, in essence, become either irrelevant or so integrated with AI that there is no way to see past that point.
Kurzweil estimates this moment — The Singularity — will come at some point around 2045. Now, this is conceptually interesting, but the problem is that if people believe that human life as we know it only has another 20 years or so before we turn into androids or something, what’s the point of fussing about anything before then? The idea becomes a sort of Nerd End Times, an excuse for doing whatever you want until the big event. A lot of Silicon Valley cult members are into this.
Which makes the fact that I got this dumbass article from Singularity University doubly ironic.
Longtermism is the kind of idea you have on mushrooms that feels like a revelation at the time until you realize you’ve just created new words for being an arrogant asshole.
Basically, it’s the opposite of The Singularity. The idea is that since humans are going to be around for SO AMAZINGLY LONG, the only thing that really matters is the far future of humanity.
This idea creates a quasi-moral framework for instituting systems and policies in the present that may seem… distasteful, but are really for our own collective good. No pain no gain, see?
The trouble is that the people pushing this idea are the ones who want to decide what our glorious future should be, and are also the ones who benefit from the people in the present “sacrificing for future generations.”
Elon Musk is the ultimate metastasis of this idea. A guy with basically unlimited money who genuinely believes that the “long term” future of humanity is in space. Musk believes that any action he takes to get us closer to that reality is justified, including propping up genocidal dictator Vladimir Putin because Musk sees him as destructive to the status quo, and therefore a positive force in the world, regardless of whatever atrocities Putin commits in the process.
Longtermism is just intellectualized sociopathy. It’s a way to justify hurting people and destroying lives because it’s really for the good of humanity, and because well, you just know better. It’s paternalistic, arrogant asshattery that should be seen as a giant red flag on anyone who promotes it.
The only longtermism I’m worried about is my grandkids. And right now they’re in danger from people I can see right in front of me.
#ArrestMikeFlynn
I wonder how much Silicon Valley’s (and others’) coming up with these shitty ideas gets spurred on by their microdosing habits. Add in sociopathy and arrogance, et voilà!
Yes. And it's a way to set women up as 16-year-old forced breeders. Must ensure more "value containers" for the glorious future.