I need some relationship advice. I suggested 125% but my wife won’t budge from 10%. Is this normal? How did it go when you had this conversation with your romantic partner?
I honestly enjoy seeing people like this with batshit insane but logically consistent views. Makes things much more fun
This guy essentially founded modern “rationalism.” He has millions of literal followers, not just the Twitter kind. His dumbass is the one that spawned the Effective Altruism cult that has become extremely popular with tech bros. Sam Bankman-Fried, Sam Altman, and Elon Musk all subscribe to this “philosophy.” It’s all batshit insane and incredibly stupid.
When your ego is large enough to fantasize that a malevolent AI will create a simulation of yourself to torture for eternity simply because you didn’t spend all your money trying to bring it into being.
As an autistic dude, I know that it’s weird to say, but I also feel like it makes sense. It’s hard to quantify x% better, but I’m sure there is a number, for me at least, where if someone were that much better and would date me, I’d do it. It’s not romantic to say, but it’s true. And I’ve been dumped for other people twice, so the same must have been true for them.
It just feels like one of the thousands of unspoken rules you’re not allowed to talk about out of politeness. But honestly I would like to know that number for my SO.
If you’re curious about an alternative view, I suggest The Art of Loving by Erich Fromm. Relationships are about growing your own and the other’s natural abilities, something you do, not about trading something you have. The OP post is a materialistic view and a belief in inequality. YMMV.
He’s also the psycho who founded a movement designed to let insane billionaires justify spending their money however they want, no matter whom they hurt now, as long as it’s ‘for the greater good’ long term.
The OOP needs to kiss the business end of a wood chipper if you ask me
Interesting, I hadn’t heard his name. I do like Nick Bostrom though. I started reading about this Effective Altruism. On paper it sounds all very nice, but this OOP materialist nonsense bodes very badly for any ethical AI lol. It also seems to be focused on donating and solving everything with billionaire money instead of on governance.
Do you have a link to some good critique of this EA stuff? EDIT: Never mind, found lots of it lol !sneerclub@awful.systems. These extremes growing out of longtermism and TESCREAL should be a laughing matter, but apparently they are well funded and gaining access. A good article summing this up.
I’m very much aligned with these sci-fi ideas, except the first thing we should teach an AGI is to love (see my book recommendation). Which seems to be something OOP has little capability for. Extinction might not be the worst-case scenario with these guys at the helm lol.