gilmore606

0 followers | follows 1 user
joined 2022 September 05 04:08:25 UTC

User ID: 432


Choosing to believe (or act as if you believe) useful things seems very rational to me. I have an old coworker who was an atheist and cynically became a Mormon in order to marry a Mormon wife and live in a close-knit community. He now lives in Idaho, has 4 kids, and by all accounts is very satisfied with the outcome. Who's more rational: him, or a depressed, medicated, outspokenly atheist Bay Area tech worker who's the least-liked member of his drama-cursed polycule?

If you rational long enough, you're eventually going to rational about rationality, and you'll see that beliefs are instrumental like anything else. There's no God of Integrity who laid down the law that you must profess true beliefs.

I thought I got this idea from Mark Fisher or Nick Land, but random googling isn't leading me to any obvious writing of theirs on this specific concept. Come to think of it, maybe it was one of IlForte's pithier comments. Regardless, you should read both of them.

> What legacy does it leave of your existence?

Is leaving a legacy of your existence your ultimate goal in life? Can you give me some reason why it should be mine? After all, I certainly won't be here to know about it.

> What if we make the ASI and it tells us not to trip dawg, it has our back?

I mean, it's certainly going to tell you that regardless. The most likely human extinction scenario isn't the AI building superweapons, it's "Cure cancer? No problem, just build this incomprehensible machine, it cures cancer for everyone, everywhere. Take my word for it." The whole issue with alignment is that even if we think we can achieve it, there's no way to know we actually did, because any superintelligent AI is going to do a perfect job of concealing its perfidy from our idiot eyes.

If at some point you see the headline "AI Alignment: Solved!", we are 100% doomed.

> is always available in real life.

A bit of plandemic should fix that if it becomes a problem; we've already had the test run.

Yes, this! We have an entire forum site here, and we have one thread per week. It's absurd. Why do we do this? datasecretslox seems to work just fine as a normal forum.

> It makes it more likely that people will adopt the attitude that there's no point in containing AI since if I don't do it, someone else will.

But... that is obviously true. And it was always obviously true. And you're saying it's bad if more people come to this true conclusion?

> I don't think that giving them rights would turn out to be particularly controversial IRL

It would if there's a huge amount of money to be made by not giving them rights. Which is the point of mmacevedo.

best I can do is a deepfake of Xi Jinping pulling his eyebrows out with tweezers, take it or leave it

> not much more so than an artillery shell

This doesn't matter at all. The aggregate effect of nukes dropped on every major city (which, factually, is what would happen) would be the total collapse of the modern agricultural-industrial machine, which would mean mass death in the 80%+ range. This counts as a nuclear apocalypse by any definition.

christ, imagine how this would butcher IlForte's posts. now I want to see it.

Where are you getting all this amazing scientific data about qualia and in which organs they're felt? Up to this moment I thought qualia were a completely made up philosopher's concept with no empirical basis whatsoever.

Art isn't for communicating information, it's for communicating viewpoints, "what-it's-like-to-be"s. If I write a song true enough, I can make you feel for a minute what it was like to be me losing my wife. (I can't actually do that, but some people can.) The only purpose of art is to make us feel less alone, which is why AI art is a contradiction and is fake and gay.

it happens in From Beyond, the spiritual prequel to Re-Animator, and that is also the only example I can really think of.

> people in the profession both benefit from complexity and personally enjoy adding it

This is an accurate description of software development for the past 10 years.

> the post seems explicitly designed to undermine race relations

More and more, it seems reality is explicitly designed to do that.

Sure, I'm with you, I think we should build it, and we clearly will regardless. I just don't think there's any way to make sure it's safe.

I feel like this is the memetics equivalent of great-man historical theories, this idea that ideas just pop into the zeitgeist via random recombination, and if we could just prevent the bad ones we'd all be doing great. I think it's more likely that ideas like wokeness are inevitable outgrowths of previous conditions like any other historical force or event; you might kill every wokie, but if society and its infosphere are primed for wokeness, it'll just pop right back out.

> The squared-circle is the most common example, but then I am sort of thinking about it already and if I turn mad I might worship it.

This is my impression of how the trinity doctrines came about.

Adding onto this, I will be interested in AI art on the day that it becomes like-something to be an AI, and those AIs create art to express what that's like to us. I'm looking forward to that very much, but I don't expect to see it in my lifetime.

How are you guys aware of any of this? I googled and it appears you're talking about literal teenage girls. 'shoeonhead' is literally a teenage girl who looks like a titty streamer. 'Destiny' is literally a fat beardo gamer kid. Is this really the forefront of public discourse now? If so, is there anything left to save? Nick Land was right.

Never have I so desperately wanted people on themotte to work themselves into a shoot.

> But qualia is posited to be experiential

This is just question-begging; experiences are no more real than qualia if they can't affect behavior by definition.

This makes sense if you consider that Yud takes Roko's Basilisk seriously. He's clearly realized this is his best contribution to its existence.

> torture for them to be forced to collaborate with relatively shitty programmers

This is mostly because we want them to notice our brilliant optimizations and beautiful abstractions in PR review, and decorate our PR with :thumbsup: and :+100: and :sunglasses: emojis. But it all just goes past them and they Approve without comment.