theincompetencetheorist
0 followers   follows 0 users  
joined 2022 September 20 06:37:38 UTC

User ID: 1270

If anyone has suggestions for other things worth doing or being, or that satisfy that "check my phone while waiting in the line to pickup the kids" nudge that avoids my new no-nos, I'm all ears.

I know that I'm late to the party since you probably already left. But how about being bored like a psychopath, without distracting yourself? You know what we used to do back in the day, before we got psychologically manipulated into giving up our attention so we could be force-fed ads by AI algorithms? I know your own thoughts are scary, but a big part of a worthy life is the ability to think instead of numbing yourself with your devices.

Chapter 4 of McLuhan's Understanding Media has the heading "The Gadget Lover", and the subheading is equally apt to this situation: "Narcissus as Narcosis". The fucking book was written in the 60s, and the whole chapter is a prediction of what happens when we get social media, right down to the "amputation" we feel when access to it is cut off.

At the core of this is the question of what idea the current attempt at AI alignment comes from and what it hopes to achieve. Alignment is born out of the post-modern idea of improving society through our culture, which is why some people are locked in a culture war. But one of the biggest flaws in the ideas being perpetuated here is the assumption that we are empty vessels filled by our culture and society, so that if we give people good input they naturally become reflections of it. The latest attempt needs the AIs to parrot 'The Message', but it unwittingly exposes the flaw in the thought process behind the original idea. Even with something whose inputs we totally control, producing the "correct" output still needs a thumb on the scale to create the good society. So how is that going to work with actual people if it didn't work for the machine?

Well, history and evolution are on different timescales. The work of people like Donald Hoffman and Anil Seth shows how evolution shapes our perceptions. It is the definitive refutation of people as 'tabula rasa', shaped only by the inputs of culture and society. That notion is at the very core of the modern critical-theory-fueled milieu: we are products of the society we live in. To a certain extent that is true, but there are limits to it, and that is where our nature takes over. LLMs are pure simulacra of people, but the post-modernists got their wires crossed and treat them as real people with moral responsibilities. Somebody should tell them that an LLM doesn't have a subjective reality and can't offend intentionally, because it doesn't have any intentions at all.

Thus, the precession of simulacra. Leike's job, even if he doesn't know it, is not to align AI to import human values; it's to align AI to export them, under the illusion of democracy and that the cart is driving the horse.

The open secret of the activists who have read themselves stupid on post-modern thinkers like Baudrillard is that, through control of media in all its forms, they believe they can transform society. But that presupposes that people are tabula rasa, without the possibility of escaping Plato's Cave. We are instead the result of evolutionary pressures, and our progress is conditioned on being able to unmoor our thoughts from our perceptions to seek out objective reality.

Ah, ok. In that case I guess I could agree, though you'd have to give some indication of where the threshold for a fanatic is.

Sure, their "cultural warrior identity" makes them cut problematic real-life friends and relatives out of their lives, i.e. it isolates them from people who might contradict them.

I'm saying something else - that the whole idea of demonstrating how good your life is, in order to persuade someone of your worldview, is a cult recruitment tactic.

Well, the actual thing in my view is that people who get recruited into cults have a void they are trying to fill. Cults don't persuade you of a better worldview; they give you a reason why you feel the void and then offer something to fill it with. So the performative better way of living is usually inauthentic unless you see yourself filling that void with it. I'm just pointing out that culture warriors found something to fill that void with, so they are self-selecting out of living "better lives", whether in a cult or in actually meaningful lives outside of one. For those who found an alternative to culture-war issues, living out a "better life" (within a cult or not) doesn't involve narcissistic virtue signaling online.

But at the end of the day, I'm not a cult psychologist. My interest in cults has been a source of entertainment that has saved me money, time and sanity, since I had a void once which I knew not to fill with the cults I orbited in my twenties. So, full disclosure, I could be talking out of my ass and could be wrong...

Not magnifying the importance of threats that aren't really relevant is not blissful ignorance. Malaria is a real threat to my health, but given that I see a foot of snow outside my window, it is not something I worry about. I'm not being ignorant; it is just not part of my current threat landscape.

The issue I'm trying to get across is that people lead miserable lives when they magnify the conflicts of the culture war into the end of civilization, and what books are in the school libraries in Florida is not going to decide the fate of civilization.

It is vague for the simple reason that I didn't have the time to write it more precisely. I'm trying to draw a distinction between the culture-warrior fanatic and the regular normies who participate casually because they think something is wrong. There is room for a cause-and-effect question here too: are culture-war fanatics created because they were miserable and poorly adjusted from the beginning, and thus more susceptible to becoming fanatics? The point I'm trying to make is that if you make the culture war part of your identity, you start seeing the enemy tribe everywhere, and you are miserable because you feel surrounded by them. Living a better life is not an option in that milieu.

It's ironic you say that, because what sonya is talking about is a common cult recruitment tactic.

That is the whole point I'm trying to make. The fanatic warriors are in the cult, and the normies just leave; they don't need to demonstrate to the world that they live a better life than the miserable people locked in the tribal stand-off that is the culture war.

I've spent a bunch of time earlier in my life reading and watching documentaries about cults, odd religions and people who engage in conspiracies. I find it fascinating that it isn't brought up more how the fanatics are actually miserable people. They constantly live in fear of the "enemy" outsider who doesn't think like them. The most vocal participants in the culture war are fanatics who either see bigots everywhere or see wokeness in everything, but they aren't living well, because they have distorted their worldview to search out the "enemy". Living well means not being a fanatic and disengaging from the culture war, which is the conclusion you came to.

I don't think we are disagreeing at all here; I'm just pointing out that finding a target for self-play is going to be difficult. There are multiple dimensions to the problem of "not writing trash code", because it depends on whether or not it needs a theory of mind of actual people. Needing a theory of mind precludes self-play; that is always going to require input data.

The problem we are looking at here isn't self-play for optimal code. The problem is shipping something into a random, adversarial environment. AI trained through self-play dominates Chess and Go, which have clear rules and perfect information, but for Poker the results aren't as clear cut, precisely because of randomness and hidden information. Now put code into a distributed system, inside an organization full of internal corporate politics where some manager wants to sabotage it, with external adversaries who want to mess with your system on top. Sure, it can write optimal code for your computer through self-play, but actually delivering something in an enterprise setting is a different ballgame; it is Chess vs Poker.
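To make the Chess side of that contrast concrete, here's a minimal self-play sketch (my own toy example, not code from any system discussed): minimax on tic-tac-toe, a perfect-information game. Because every position is fully visible, the program can evaluate itself against itself with no outside data; the moment part of the state is hidden (a Poker hand, a saboteur's intent), this exact approach stops applying.

```python
# Toy self-play: minimax on tic-tac-toe, a solved perfect-information game.

def winner(b):
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for i, j, k in lines:
        if b[i] is not None and b[i] == b[j] == b[k]:
            return b[i]
    return None

def minimax(b, player):
    """Return (score, best_move); X maximizes, O minimizes."""
    w = winner(b)
    if w == 'X':
        return 1, None
    if w == 'O':
        return -1, None
    moves = [i for i, c in enumerate(b) if c is None]
    if not moves:
        return 0, None  # board full: draw
    best = None
    for m in moves:
        b[m] = player
        score, _ = minimax(b, 'O' if player == 'X' else 'X')
        b[m] = None  # undo the trial move
        if best is None or (score > best[0] if player == 'X' else score < best[0]):
            best = (score, m)
    return best

def self_play():
    """Both sides play the minimax move; returns the winner (None = draw)."""
    b, player = [None] * 9, 'X'
    while winner(b) is None and any(c is None for c in b):
        _, m = minimax(b, player)
        b[m] = player
        player = 'O' if player == 'X' else 'X'
    return winner(b)

print(self_play())  # None: optimal self-play always ends in a draw
```

Nothing here needed a model of an opponent's hidden intent, which is exactly why the technique transfers so poorly to the adversarial, partial-information settings above.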

And even with perfect formal rules, AI can still be tricked: https://arstechnica.com/information-technology/2022/11/new-go-playing-trick-defeats-world-class-go-ai-but-loses-to-human-amateurs/

"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning."

--Rich Cook

I don't think that has changed...

I've only spent a little time with ChatGPT, and I've stated earlier that it is prone to unforced errors. But one of the bigger problems I found is that it is prone to repeating common falsehoods and myths. Go ask it about the wage gap between men and women, which is just a bunch of statistical trickery but still something that many people believe, and which consequently got encoded into GPT.

Case in point: during the week a bunch of Hacker News commentators took personal offence at a guy making the case that computer code should be written for computers if you want any kind of performance out of it. The common opinion is that code should only be written for other humans, and that writing it for computers is almost always a waste of time. It is the most prevalent attitude in my chosen profession, and after 20 years I know that the attitude of not writing code for machines wastes performance. Guess what gets encoded into something you can't reason with, even less than with a person convinced of the superiority of his opinion? I've tried the output of GitHub's Copilot, and it does so many things wrong because the inputs to the models are wrong; incorrect code is just that common. The ancient programmer adage Garbage In, Garbage Out still holds true, and AI doesn't change that.
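As a small illustration of what "writing for the machine" means (my own example, not the one from the Hacker News thread): both functions below are perfectly readable, but repeated string `+=` in CPython can reallocate and copy the buffer on each step (quadratic work in the worst case), while `''.join` sizes the result once and builds it in a single pass. Same output, very different mechanical sympathy.

```python
# "Write for the human" vs "write for the machine" - same result,
# but the machine does wildly different amounts of work.

def concat_naive(parts):
    out = ""
    for p in parts:
        out += p  # may reallocate and copy everything built so far
    return out

def concat_joined(parts):
    return "".join(parts)  # one size computation, one copy

parts = [str(i) for i in range(1000)]
assert concat_naive(parts) == concat_joined(parts)  # identical output
```

The point isn't that the naive version is "wrong"; it's that a model trained on millions of examples of the first style will reproduce it without ever knowing what an allocation is.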

Because it isn't replacing anyone yet? The latest achievements of the GPT models are impressive, but there is a lot of marketing involved, and a bunch of unforced errors show up when people look at it closely. Yes, it is going to get more accurate in the future, but it won't replace anyone just yet.

What scares me, though, is people's propensity to outsource the act of thinking and reasoning so readily to machines. What happens when the machines stop working?

Let's not forget the two movies that closed out the decade, both quite clear in their anti-puritanical message: Fight Club and The Matrix.

The Matrix is the result of the Wachowskis taking artistic license with postmodernist hot takes by Baudrillard, mostly from Simulacra and Simulation. The book even makes an appearance in the movie. As it turns out, Baudrillard didn't like The Matrix because Neo escaped Plato's Cave, so the latest installment corrected that by making Neo and Trinity the shadow puppeteers doing "symbolic exchange". The Wachowskis are the wokest of the woke; just take a look at Sense8 to understand that they are part of the problem and, in some ways, originators of our plight of wokeness.

Well, one thing you need to recognize is that sometimes your ideological opponent is not actually an ideological opponent. You are describing people who are playing status games with various accusations, akin to the status games adolescent girls play in high school. You can't win with people who haven't matured beyond that, and you can't politely reveal idiocy to people who only care about status and latch on to high-status people's stupid ideas.

Of course nobody remembers that railway workers tried to go on strike in December, and that via an arcane law Biden and Congress went and stopped it. Nobody remembers what the workers were complaining about, or how every safety concern was dismissed as hyperbole. And then this goes and happens... Of course the media avoids the subject like the plague, lest the public be reminded that the would-be strikers' concerns might have been real.

I agree: there is a shift happening. But the most interesting thing is the application of morality, ethics and social justice derived from something that rejects objectivity and mechanistic models of the world onto something that sprang from objectivity and a mechanistic model, namely modern AI technology. The philosophy of the "woke", based on subjectivity and a rejection of the mechanistic worldview of the Enlightenment, is fundamentally incompatible with the technology it tries to make "ethical". The most fascinating thing at the bottom of this incompatibility is that the foundational arguments for this "woke" subjectivity are the reason we can't fix AI: unlike humans, AI can't transcend its programming and inputs.

True, we are affected by it; I'm not proposing a clean break from it. The mere suggestion is that there is a difference between knowing that everything is absurd and, by your actions, keeping the absurd system in place. For the purposes of discussion, Kolmogorov is just a vehicle for that idea, whether or not it is true of him.

@theincompetencetheorist, I've written and erased a condescending response to the effect that, while Soviets had managed to intimidate and shame people into mouthing the party line for a while as part of a generic power struggle, they have not and perhaps could not cultivate guilt for wrongthink.

Cultivating guilt for wrongthink can only take you so far. For some, the simulacrum of showing guilt is as good as having actual guilt.

This guilt comes naturally to Western peoples, and so your tyranny, shall it be established once by virtue of centralization and technological advances, will be sustainable indefinitely. It will be defended by the best and bravest among you who fear losing the respect of their loved ones, as opposed to chaotic opportunists and wretched small-minded apparatchiks with material concerns – who are known to be despised by talented, well-connected and fuckable dissidents.

The technological tools of our oppression are already bursting at the seams of their own complexity. The apparatchiks trying to control them are uninterested in why they are breaking; they just yell at the people who claim to understand the tools to fix them so they work like they are supposed to. And the moment the machine grinds to a halt, because the ones who really know how it works get ousted or executed, the grip of tyranny will slip.

I'm limited by my time. I tried to communicate this in the second paragraph of my first post in this thread, but now I see it wasn't as well written as I hoped. I was trying to communicate the flawed ideas at the root of our current attempt at tyranny. At the fringes they show the flaw in their ideology by treating the simulacra as interchangeable with reality. But as we have seen with recent events, after these people lost their total grip on Twitter, their grip on power was tenuous at best, and we will see the full consequences in a few years.

Well, discussing a dead person's role in a place that doesn't exist anymore: unless you are directly affected by that history, or the future of something relies on this particular aspect being true, it really doesn't matter except to the very few.

Well, I didn't write it as a culture warrior with a call to action. But Kolmogorov knew that Pravda was an absurdist influence; the irrationality didn't reach him and many others. As individuals they weren't part of that irrationality, so not all individuals are going to be captured. Maybe most people are captured, maybe it will be a horribly oppressive place... but you as an individual can try to stay rational even if you aren't free to express it.

Yes, the systems worked for the majority, but individual minds weren't captured enough to stop people from escaping or attempting to escape, or even from discussing, within small groups, views that didn't conform to the totalitarian government's. The fall of the European totalitarian governments might have happened overnight, but the grip on culture and minds was lost way before that, despite total control of media, culture and discourse.

My skepticism about the "propaganda chatocalypse" is that it operates on a flawed model of how humans work: that our view of the world is hopelessly tied to our consumption of culture, media and so on, so our inputs must be controlled at all costs lest we make the world worse by being exposed to dangerous ideas or content. This is the quintessential post-modernist thought process: stuff "the message" into everything and it will create a better world. But we already ran this experiment with the totalitarian states of the 20th century, and people still thought differently even when their culture and media force-fed them the right way of thinking. It didn't work, and it is not going to work this time either, even with more powerful tools.

One of the interesting and kind of worrying aspects of this, though, is that our media is overrun by these postmodernists who are hell-bent on anthropomorphising various artifacts too. The premise is that simulacra of the human form (whether characters in text, images and film, robots, simulations in games, or chat AIs) are somehow interchangeable with humans and are somehow bestowed rights because real humans have them. The whole debacle around the chatbot LaMDA being sentient is based on this. Or what about the campaign against sex robots that casually excludes the male form from its scope? Or the postmodern critique of video games, movies and books that tries to give fictional identities inalienable rights? All of these get mainstream media time, and nobody listens to anyone saying "hang on a minute, we are not talking about real humans here!" This is happening at the fringes, and I seem to be the only one who notices...

Well, let's take the weaponized language model and mix it with the attention-controlling algorithms of social media, and we have a perfect storm. It is a thought I have toyed with since I saw the FN Meka "debacle". What if people get stuck in a compulsion loop with AI-generated content and an algorithm maximizing the time they spend watching it? Engagement maximization would steer the content-generating algorithm, addicting people and stripping out meaning along the way. Does that mean something for humans and culture, having robots massage their brains just right? Does it squander human potential?
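The compulsion loop described above can be sketched as a toy bandit: the feed tries content categories and reinforces whichever one maximizes a simulated watch-time signal. The category names and payoff numbers here are invented for illustration; the point is that nothing in the loop cares what the content means.

```python
import random

# Toy engagement loop: an epsilon-greedy bandit that serves whichever
# content category maximizes a simulated, noisy watch-time signal.
random.seed(0)

categories = ["outrage", "cute", "news"]
true_watch_time = {"outrage": 9.0, "cute": 6.0, "news": 3.0}  # hypothetical payoffs
estimate = {c: 0.0 for c in categories}  # learned mean watch time per category
served = {c: 0 for c in categories}      # how often each category was shown

for step in range(2000):
    if random.random() < 0.1:                    # explore occasionally
        c = random.choice(categories)
    else:                                        # otherwise exploit the best guess
        c = max(estimate, key=estimate.get)
    reward = true_watch_time[c] + random.gauss(0, 1)   # noisy engagement signal
    served[c] += 1
    estimate[c] += (reward - estimate[c]) / served[c]  # running-mean update

print(max(estimate, key=estimate.get))  # the feed settles on "outrage"
```

Swap the fixed payoff table for a generative model that produces the content itself, and you have the perfect storm sketched above: the optimizer and the generator closing the loop on each other with the human as the reward signal.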

Well, whether you think the WSJ has covered SBF the same way its editorial board has gone after Musk in various opinion pieces since the Twitter takeover is up to you. The point is that the billionaire scammer was a significant contributor to the politicians who regulate the financial market, which should be a political hot potato too, the same way elected officials having a back channel to remove tweets and Twitter accounts is. But if you think it is unreasonable to hold politicians accountable for actions or inaction that obviously aren't in the public's interest, you are missing the point of journalism's role in a democracy.

It is funny how establishment mainstream media is now clutching its pearls over journalistic ethics while the FTX fraudster runs around on an "apology tour" claiming he was a bumbling buffoon, even as court filings show corporate malfeasance on his part. We are witnessing corruption in the establishment political class of every color, and when it comes to Twitter the media is simply trying to protect itself by misdirecting from its own corruption. This is not about politics anymore; it is about the powerful establishment doing whatever it wants without the public's interest in mind. So whenever I see more Twitter revelations, I'll just consider it corruption; it is as simple as that.