ThenElection
+1.
People often claim that tests are an ideal use case for AI, but that's not my experience at all. Mine is closer to the opposite: it will write plausible or even correct code for a bug fix or feature, but then write really bad tests.
I still use AI for them, but usually have to explicitly describe, sometimes in painstaking detail, what needs to be tested. I think that still saves time, but I'd guesstimate that, on average, using AI (prompting, re-prompting, etc.) takes about half the time it would take to just code it up myself.
My experience: I would estimate around 80% of the code I submit today is LLM generated. It's very useful for reducing the time I spend actually writing code, as a SWE. But 1) that is a minority of the time I actually spend at work, and 2) it's eliminating the part of the job that's most calming and pleasurable, even meditative, for me.
I have quite good results with a workflow where I painstakingly describe what I want done. I'll spend a lot of time understanding what needs to be done, then 15-30 minutes describing it in detail and supplying the necessary context. It's not quite literal NL-to-code--the LLM doesn't need to be told line by line what to do--but I do not give it space to make architectural or design decisions. It can then more or less one-shot the change, though not consistently enough that I can skip reviewing the code before sending it out for review. And when it comes to testing, it's surprisingly bad: when I do change code it's written, it's typically to add new tests or delete irrelevant ones (though admittedly by telling the agent that it's a retard and needs to write a specific different test).
So my velocity has increased. But, at least for me, it means less time spent on the most pleasurable part of the job, and more time spent on requirements gathering, navigating bureaucracy, and updating spreadsheets for leadership. I fucking hate it. Even though it's probably good for my employer, I have to shed a tear for the death of coding/implementation as an important job skill.
I can imagine LLMs, within the next two years or so, taking over design/architectural decisions as well. That would make the situation worse: I'd no longer be a software engineer, but an engineering manager supervising LLM agents. That's a deep loss to me, and I'm happy that my target retirement date is in 5 years or so.
And the correct answer is yeschad. They were the good writers.
They are disproportionately the best writers: no writers really compare with McCarthy or Pynchon, IMO. But it's a continuum, and there are non-white-male writers who are genuinely great, e.g. Didion and O'Connor. Still absolutely worth setting aside a couple of hours for (and far more worth your time than another round of Netflix slop or shitposting).
Though I do appreciate the idea of saying "fuck you" to people who say I'm morally flawed unless I implement affirmative action in my reading choices.
Close to the Machine, by Ellen Ullman.
Christianity goes a lot further than "be kind to the less fortunate," though. The last shall be first, the meek will inherit the Earth, God chose the weak things to shame the strong, etc. That does seem like a radical change from, well, the history of the universe, and it doesn't seem crazy to see a connection between that and Wokeness.
Not at all a gamer, but I am an avid reader. And contemporary literature has many of the same issues (with different inflections) as video games.
My solution: exit. For the past year, I've read only books written in the 20th century, and it's been such a breath of fresh air. Instead of endless variations on progressive morality tales adapted to different settings, you get genuine variety of perspectives. When I mention this elsewhere, the usual response is "oh, so you're just reading dead white men instead," but it's not that at all. You get writers of both sexes and all races bringing new perspectives to the table. Currently I'm reading an excellent memoir by a bisexual, Jewish, female software engineer, and you get none of the drivel that would be put to the page today.
This may have limited applicability to gamers: games are more social, require a much greater investment to produce, and the average game in 2025 is (I assume) better than the average game in 1995, despite wokeness. That points to the problem for people who want better video games today: so long as people are buying the ones being produced, that's what you're stuck with, and there's not much you can do besides quit altogether.
The Official Account (Bugliosi) is that Manson was trying to trigger a race war.
The full schizo account is that he was a subject of MKULTRA and CHAOS, and that the CIA nurtured him to discredit the hippie/anti-war movement.
What's interesting is that if you dig into his time in San Francisco, you do find some pretty weird stuff related to his parole officer (a grad student at Berkeley studying drugs and collective violence, who was supervising only a single parolee and who managed to keep Manson out of prison despite violations as diverse as grand theft, drug dealing, and rape) and to the clinic Manson and his followers went to.
In all likelihood, Manson wasn't some kind of intentional project but the result of a series of irresponsible fuck-ups by the government. And the killings weren't an attempt to start a race war, but some combination of an attempt to distract from a prior crime and a drug deal gone bad.
Per context window (though for projects I have a template). Psychologically, the process of re-prompting after a failure is intolerable to me; I'm probably overdoing it, but it makes the interaction more pleasant.
I agree about the creativity it enables: every week or two, my wife asks me for some browser extension or script for a narrow use case, and it's incredibly gratifying to be able to send her a solution in ten or fifteen minutes.