FtttG

6 followers   follows 0 users
joined 2022 September 13 13:37:36 UTC

https://firsttoilthenthegrave.substack.com/

User ID: 1175


it can hardly be simultaneously true, as right-wingers typically believe, that trans women naturally evoke revulsion, and that any adverse social consequences that they experience are imaginary.

I don't think all the adverse social consequences trans women experience are imaginary. No doubt they attract a lot of funny looks, as does anyone who deliberately dresses in an unusual way (e.g. goths with loads of facial piercings and/or tattoos). But when trans activists complain about a genocide of trans people which is either imminent or currently ongoing — yes, that is imaginary.

Damn, I didn't know you could do colour formatting here.

Impossible to talk about this without spoilers.

Dudley and his men murder Stensland and the other customers in the Nite Owl coffee shop. Dudley's original plan was to frame three young black men for the murders, which he intended to do by having his men surreptitiously plant the shotguns used in the murders in their car, then shoot them dead in their apartment. Dudley reasoned that no one would bat an eyelid if three young black men were killed "resisting arrest". However, the plan goes awry when Exley and Vincennes arrive at the apartment building at the same time as Dudley's men and arrest the three men unharmed. During interrogation, Exley ignores the three men's consistent pleas of ignorance about the Nite Owl killings, and all three are eventually shot dead by the police anyway, as Dudley had originally planned.

Additionally, there's the opening of the film in which a squad of white LA police officers viciously beat up a group of Mexicans in their prison cell, which was directly based on a real event and which has obvious parallels to the beating of Rodney King.

A good article highlighting the deficiencies in the BBC's coverage of the war in Gaza. Choice quotes:

The Telegraph has since reported that BBC Arabic had to make 215 corrections in two years to its coverage of Israel and Gaza - that’s two per week. [emphasis mine]

One issue that was already well known involves BBC Arabic using journalists who had made viciously antisemitic comments... In one example, Samer Alzaenen, who had suggested that Jews should be burned “as Hitler did”, was used 244 times by BBC Arabic between November 2023 and April 2025. He was “consistently introduced as a journalist”, according to the report, as was Ahmed Alagha, who appeared 522 times on the BBC during a similar time period despite having called Jews “devils” and saying Israelis are less than human. When these cases were reported in the media, the BBC disingenuously called them “eyewitnesses”. But normally the same eyewitnesses don’t appear on TV over 500 times in less than two years.

The BBC News homepage has a series of news tabs in the red strip at the top that begins with Home News and In Depth, and then the first specific news topic is ‘Israel Gaza War’ - before Ukraine, UK, World, Business, Politics or anything else. Needless to say, the war and famine in Sudan, on a scale far worse than Gaza, doesn’t get a tab of its own at all.

If all you're doing is going by your gut, you shouldn't pretend otherwise. An "irrational prior" is indistinguishable from a "gut feeling", but it's draped in the language of dispassionate, disinterested analysis.

Ahhh. That's actually a valid point! "Literally" as in "there are no hidden steps I'm eliding, that's all there is to it". Damn. I've been wrong all these years, it's an entirely valid usage of the word as originally defined.

I think the typical way Millennials and Gen Z signal uncertainty in their declarative statements is by dropping punctuation entirely ("Trump is a Nazi." sounds a lot more definitive than "trump is a nazi"), peppering them with Internet initialisms (likewise "trump is literally a nazi lol, but w/e idk") or textual recreations of uptalking ("umm, did you miss that trump is literally a nazi??"). I agree that this is a phenomenon more closely associated with women and gay men than with other demographics. But in my experience, young people rarely use ellipses in this context: with only a handful of exceptions I can recall, they're always used by people significantly older than me.

You know, I haven't even seen that movie, and yet that clip popped into my head when I was writing the comment.

I want more neoliberal policy wonks in here.

Hello there.

If done about a specific individual, I think "emotional blackmail by proxy" would be a more accurate designation. "Distributed emotional blackmail" sounds more like "you need to do X for $Community, because think of how it'll hurt the feelings of the members of $Community if you don't".

I don't know if this has been mentioned in any of these discussions about AI-generated art, but on the off-chance it hasn't, it would be remiss of me not to mention that Roald Dahl (author of Charlie and the Chocolate Factory, Matilda, The BFG and just about every other non-Harry Potter book you loved as a child) predicted the use of generative AI to compose fiction in 1953 (probably a few years earlier, in fact).

I highly encourage you to read the linked short story, "The Great Automatic Grammatizator": like most of Dahl's numerous stories for adults it's wickedly funny and creepily unsettling. A short synopsis: a talented mechanical inventor named Knipe harbours frustrated literary ambitions, and after yet another rejection from a publisher, he has an epiphany in which he realises that the rules of grammar governing the English language are almost mathematical in their strictness — hence, it should be trivial to design a machine which, once assigned some input parameters, can "compute" a passage of text in much the same way that a calculator computes a mathematical formula or equation. As a tremendous act of revenge against the publishing industry, Knipe sets to work building the machine and presents it to his boss, Mr. Bohlen, who is initially sceptical but eventually converted when literary magazines buy the short stories (attributed to pen names) produced by the machine. Knipe later sets about modifying the machine so that it can compose novels in addition to stories.

The most ingenious touch, I thought, was the end of the story, when the narrator explains that Knipe and Bohlen eventually expanded their operation to buy the rights to use real authors' names and likenesses, attaching them to books composed by machine.

But on the whole, it was a satisfactory beginning. This last year—the first full year of the machine’s operation—it was estimated that at least one half of all the novels and stories published in the English language were produced by Adolph Knipe upon the Great Automatic Grammatizator.

People often talk about how AI-generated art is soulless and lacking the creative spark necessary for authentic human emotion. I won't pass comment on whether this is the case. If someone hasn't yet been moved to tears by an AI-generated passage of text or piece of music, I have little doubt that such a thing will happen before the end of the decade. "So what? People have been moved to tears by Twilight."

Fair enough. But my question is this. Right now, you can get an AI to generate a sample of Taylor Swift's voice which, to anyone who isn't a professional musician or trained sound engineer, sounds indistinguishable from the real thing. As the technology improves and with access to more and better training data (say, exclusive access to Swift's own master recordings, archive of unreleased songs, and isolated vocal takes), even sound engineers are likely to be taken in. The technology to generate musical instrumentals is likewise getting better every day.

In a world where Taylor Swift goes into business with OpenAI and grants them the right to use her name and likeness on musical releases generated by ChatGPT (or whatever dedicated music-generation software they use), do you really think you could tell the difference? Or what about AI-generated novels attributed to James Patterson, with his permission? I mean, it's only one step removed from using ghostwriters, something Patterson is open about doing. Of the novels published under the Tom Clancy banner, a majority were written by other writers, and a significant chunk of those were published posthumously, meaning Clancy could not have been involved in their composition even in principle (not even giving them a cursory once-over before typesetting).

Better yet — how do we know this hasn't already happened? Taylor Swift's latest album is her most poorly-received release since 2017's Reputation, with neither the album nor any of the songs from it receiving Grammy nominations (the 68th edition of the Grammys will be the first since 2017 at which Swift hasn't been nominated for anything), and many reviews calling it creatively exhausted, predictable and lacking in novelty or dynamism. Isn't a solid, pleasant but unremarkable and creatively sterile album exactly what we'd expect from an album generated using Taylor Swift's previous albums as training data? When Swift is rehearsing for the next album tour, isn't it possible she'll be learning to sing the songs on the album for the first time?

I'm not yet concerned about artists being supplanted entirely by AI-generated artwork: I think an author or musician's name recognition is still a vital part of what makes their releases commercially successful (which is why the names of James Patterson, Stephen King, Tom Clancy or Danielle Steel are always in much larger text on the covers of their novels than the title). But the ending of "The Great Automatic Grammatizator" sounds eerily plausible to me: a world in which an author writes and publishes one or two hit novels the traditional way, a major publisher takes notice, and gets the author to sign a contract granting the publisher exclusive rights to publish AI-generated books under the author's name, in perpetuity. There will inevitably be gaffes: an author interviewed about their latest book will make it glaringly obvious that they don't know what it's about, because they haven't read it, never mind written it. Or the author will be approached by a fan at a convention who'll ask them to sign a copy of a book published under their name, and the author won't even have heard of this book.

Contrary to your claim that Cambridge recognises no distinction between the two terms, the page for "jealous" acknowledges a secondary meaning: "upset and angry because someone that you love seems interested in another person". This secondary meaning is absent from the definition of "envious". This obviously implies that the page for "jealousy" is incomplete, as for consistency's sake it ought to include a secondary definition along the lines of "the state of feeling upset and angry because someone that you love seems interested in another person".

The Merriam-Webster article you linked cites no sources for its claim that the two words have always been used interchangeably, but quotes multiple scholars who argued that the two terms are not synonymous.

Wikipedia claims that "jealousy" has always been distinct from "envy", and notes that the original root of the word is the biblical "zeal" which at the time meant "tolerating no unfaithfulness". Another claimed root is the word "gelus" which likewise meant "possessive and suspicious".

"Arguing in bad faith" means arguing dishonestly and using arguments you don't really believe, in which the goal is to frustrate or antagonise your interlocutor rather than engage in earnest truth-seeking.

Based on the way the phrase is used on social media, you could be forgiven for thinking it means "you believe something that I don't" or "you expressed a non-woke opinion".

An extremely niche one I've complained about before is members of the rationalist community using rationalist lingo in contexts in which it is obviously inappropriate, as part of some kind of weird cargo-cult approach to in-group membership. I once saw a guy saying that he had an "irrational prior" on believing X over Y.

If it's irrational, it's not a prior. Stop it.

When I was in college, I noticed a lot of people using "excessive" to mean "a great deal". It means "too much". That one seems to have fallen out of favour, thankfully.

If the average person uses a word to mean X, then the word means X, surely?

DESCRIPTIVISTS, BEGONE

Many common expressions only make sense if "jealous" has a distinct meaning from "envious". "To guard something jealously" — how can you "guard" something if someone else has it? Likewise "a jealous husband".

Not that I recall, no. I went on two dates with a doctor a few years ago, but I don't remember her mentioning anything about a car accident.

Did you find them helpful? Asking for a friend, as the Redditors say.

In this case it's a colleague of mine in her fifties. We have a rota in which each department is responsible for buying milk for the office each week, and my colleague (who we'll call T) was offering to do it on behalf of our team next week. I don't think any passive-aggression was intended or implied, which makes the choice of punctuation all the more baffling.

No comment.

Less than a hundred pages from the end of The Story of a New Name. All Napoli men are bastards.

The female version of my real name is significantly more common in Ireland than the male, and is so common in the broader Anglosphere that I'm sure many Brits and Americans would actually be surprised to learn that it's an Irish name, whereas my name is practically unheard of outside of Ireland. As a consequence, I routinely get emails addressed "Hi [female version of my name]", even if they're direct replies to emails I sent in which my name is clearly indicated in the From field and the email signature, and the profile photo is of a tall, bearded man wearing a shirt and tie.

This is bad enough when it's Brits or Americans misgendering me: it's inexcusable when my fellow Irish do it.

For some reason, overuse of ellipses is as big a giveaway of someone's age as overuse of em-dashes is for ChatGPT.

Many people I know over the age of fifty seem constitutionally incapable of writing "I'll buy some milk on Monday" or "I'll buy some milk on Monday.", instead feeling this weird compulsion to jazz it up with "I'll buy some milk on Monday..."

I'm far from the first person to notice this peculiar generational touchstone. I have no idea what this is intended to convey but it creeps me out.

Even worse are the people who haven't realised that an ellipsis consists of exactly three full stops, no more, no less, so you end up with even weirder constructions like "I'll buy some milk on Monday.." or "I'll buy some milk on Monday............."