
Small-Scale Question Sunday for January 18, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


Does anyone have their own equivalent of a personal "antimeme", a concept you familiarize yourself with (potentially with difficulty) and then inevitably forget unless you make an intentional effort to look it up?

In no particular order:

  • I often have to look up whether I need an x86 or x64 executable when I need to download a program
  • ECGs. Fucking ECGs. I get good at understanding them when I absolutely have to (before exams), but guess what, by the time the next one rolls around, it's all out of my head.
  • Fourier transforms (how they actually work, not just the broad conceptual strokes)
  • And many more, all of which stubbornly refuse to come to mind, because of course they do.

I generally have an extremely good memory that I've been semi-coasting on for most of my life. It made tests in school trivial, but it also drove a lot of boredom: I'd read the entire book for a class immediately, then be bored a couple of weeks later.

Anyway, I can potentially have an issue with any piece of information if I initially learn it wrong. The factually incorrect memory sticks just as strongly as the accurate one. I have to "patch" over the bad memory with a new, distinct "correcting" memory. Forever. The best example of this: I learned east and west backwards when I was 6 years old. I realized around 9 or 10 that I'd done this; they painted a giant map on the school playground with the directions labeled. 43 years later, west and east are still backwards, but I have a second memory that overrides the first one: "you learned this wrong, you have them reversed". There are probably 10-15 things like this. I've got Slovenia's and Slovakia's locations switched (I grew up around both groups of immigrants and have some Slovak ancestors, so it wasn't just trivia).

I also can't remember names. They're too arbitrary; a series of mouth sounds/written symbols that represent a person with no real connection to them. People's names might as well be 4-digit numbers for all the connection they have to the person and my ability to remember them. I have to make flashcards when I move teams at work or meet new people it's important I remember. I have spent probably 50+ hours of my life over the last 40 years intentionally drilling myself to remember important names. I can also learn these wrong, like in the first example; I try to just stop using those people's names if I can. I also have trouble telling people apart. It's not full-blown prosopagnosia; I can generally tell different people apart OK. But if they are the same age, gender, race, and culture, I will struggle. I actually do worse with members of my own race/culture. I also tend to find the overwhelming majority of people rather interchangeable, non-unique, and frankly boring at a personal level, which doesn't help.

On a side note, if someone's name is in a song, I can reliably get that song to start playing in my head when I see them. Infinitely easier than remembering their name.

I can never reliably spell "manoeuvre" ("manoeuver" for the Yanks) without looking it up.

("manoeuver" for the Yanks)

Maneuver. Americans dropped the o.

Thereby further demonstrating my point.

I felt like a lot of my job as a programmer was always a few months of mental atrophy away from being totally forgotten. And now that I've not had a real programming job for a while, I'm pretty sure that is exactly what happened. I remember in broad strokes what I've done. But the specifics have to be relearned almost every time. The only skill I ever had as a programmer was being fast at looking up how to program things. Now that coding AIs exist that skill isn't even unique.

Finance and tax related things are also things I have to constantly relearn or look up. The only financial things I'm good at remembering are things that have zero personal utility.

Embarrassing is embarrassing for me to spell.

This effects? affects? effects? IMPACTS my willingness to use the word in writing.

If X changes Y, then X has affected Y.

The change that X wrought on Y can be described as the effect that X had on Y.

Additionally, if X pretends to be Y, we can say that X affects Y (normally used in the noun form "affectation").

But although "affect" is usually used as a verb and "effect" as a noun, both can be used as verbs and nouns.

If X sets out to change Y, then it can be said that X effects change in Y.

And the emotional state of X is also known as X's emotional affect. One of the diagnostic criteria for psychopathy is a "blunted" or "flat" affect.

Yeah I can sit and puzzle it out, but when I'm writing quickly I stall thinking about it and then just use a different word entirely to keep momentum going.

jq

as in the programming language.

whenever I need to work with some JSON files (often messy) I have to take 30 minutes to re-learn it from the ground up. Then I am off and running. But the retention after more than a day not using it is zero.

This is notable because for work in general purpose programming languages, especially the venerable old python, all of the bread and butter things come back to me even if I haven't written anything in months.
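As a concrete instance of the kind of thing that evaporates: a bread-and-butter jq filter like `.[] | select(.active) | .name` has a plain-Python equivalent that comes back much more readily. A minimal sketch (data and field names hypothetical):

```python
import json

# Hypothetical messy JSON standing in for a real file
raw = '[{"name": "a", "active": true}, {"name": "b", "active": false}]'
records = json.loads(raw)

# Equivalent of the jq filter: .[] | select(.active) | .name
names = [r["name"] for r in records if r.get("active")]
print(names)  # ['a']
```

Same filter, but in the syntax that sticks.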


Tiny little end note: this is all now no longer relevant because of claude code

Tiny little end note: this is all now no longer relevant because of claude code

I need to get a subscription. I spent an hour this morning trying to get Gemini to write some GCP MQL queries, and each time it either mangled the syntax, hallucinated functions that don't exist, or changed my endpoint names from camelCase to lowercase (wtf). I hope Claude Code is better because I hate writing monitor queries.

Do it! You don't need the $200/month out of the gate. Pro is $20 (I think?) and it's hard to burn through a day's limit unless you're hammering on it constantly or using subagents.

Take half a Saturday or Sunday and read a good prompt engineering guide. "Spec Driven Development" is one of the good keywords to use on Google or X.

Spec Driven Development

Awesome, I had not heard of this. Looking into it this week.

Plenty. Most of the program/systems at my previous job had a bunch of automation I setup/modified and left alone for years at a time, only for something to change and I'd have to go back and re-trace my steps to get to where I was previously.

(Or they suddenly require a new input/ID number that I have no clue where it's pulled from and have to go on a deep-dive. Cripes.)

I'm sure if I was doing it day-in-day-out I'd be able to knock it out in 5 minutes, but...

I can do TIA 568B in my sleep (orange-white orange, green-white blue, blue-white green, brown-white brown), and only get TIA 568A right without looking it up 50% of the time. That's not a good combination.

Constantly get df and du commands mixed up. RAID 0 versus 1. Vi keyboard shortcuts.
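For df vs. du specifically, one hook that may help: df is about a filesystem's capacity, du is about a directory tree's contents. Python's stdlib mirrors the df side; a rough sketch:

```python
import shutil

# df answers "how full is this filesystem?" -- shutil.disk_usage is the
# same stat-the-filesystem call. du, by contrast, walks a directory tree
# and sums file sizes. Capacity vs. contents.
total, used, free = shutil.disk_usage("/")
print(total, used, free)
```

If you can remember which one `disk_usage` is, you can back out the other.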

For historical reasons, a lot of FAA regulations go by the name 'Part XXX'. I can tell you the differences between Part 61 and Part 141 flight schools at length, so long as you don't ask me which is which. Sometimes I'll even throw in Part 151, which isn't even a relevant thing. Likewise, tach time versus hobbs time drives me up the wall, and I have to derive tach->tachometer->turning->engine time every time.

Sin and cos. And I use them a lot.

I can do TIA 568B in my sleep (orange-white orange, green-white blue, blue-white green, brown-white brown), and only get TIA 568A right without looking it up 50% of the time. That's not a good combination.

Okay, funnily enough, while I do have both schemes straight in my head (we had one building that for whatever reason seemed to have constant network issues that we largely solved by going from 568B to 568A, so I got used to the difference pretty quickly as we reworked the wiring), I swear I had 568B as 568A in my head and vice versa until seeing this post and reminding myself that no, we were using A in that clinic and not B. Doh! facepalm

Vi keyboard shortcuts.

The "best" way is to train literal muscle memory. vi/m "grammar" is kind of a brain breaker if you try to front of mind process it. But if you just let your fingers fly to move line 38 to mark b, it usually works.

Sin and cos. And I use them a lot.

SOHCAHTOA coming in clutch for an entirely different kind of Indian.
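For what it's worth, the mnemonic checks out numerically; a quick sketch using an arbitrary 3-4-5 right triangle:

```python
import math

# SOH-CAH-TOA, checked for a 3-4-5 right triangle
opp, adj, hyp = 3.0, 4.0, 5.0
theta = math.atan2(opp, adj)  # the angle whose opposite side is 3

assert math.isclose(math.sin(theta), opp / hyp)  # SOH: sin = opposite/hypotenuse
assert math.isclose(math.cos(theta), adj / hyp)  # CAH: cos = adjacent/hypotenuse
assert math.isclose(math.tan(theta), opp / adj)  # TOA: tan = opposite/adjacent
```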

ECGs. Fucking ECGs. I get good at understanding them when I absolutely have to (before exams), but guess what, by the time the next one rolls around, it's all out of my head.

I am in this comment and I do not like it.

Et tu, Brute? I keep imagining there's some mysterious phase change where repetition makes it stick.

I had a very awkward referral once, for a patient with a TCA overdose. I looked at it, knew what it was, but when the person taking the referral asked me to describe it, all I could manage was "uh... those T waves look tented?"

Once you get something down you realize you have some new bizarre shit to memorize. It is endless.

How many days are in each month. I was once 'taught' how to do it using the knuckles on my hands, but it didn't make an ounce of sense to me at the time. I remember that December has 31 (due to NYE), and January also has 31 and February has 28 (unless it's a leap year). I'll struggle with the rest.

Mnemonic: thirty days have September, April, June, and November. And February is fucked. Have to really emphasize the rhyme, but it's the only way I've gotten it to stick.

Thirty days have September, April, May, and December. All the rest have thirty one, save February, which is "fun".

December has 31 days.

This does point out a slight problem with the mnemonic though, which is that every month mentioned has multiple other months that read the same way.

Thirty days have December, August, May, and September. All the rest have thirty-one, except for January, which has 26.

There's no way to correct from something wildly wrong like this to the correct rhyme (other than looking them up to check, which defeats the point of the exercise), because this rhymes and scans just as well. At the end of the day you're just memorizing the right months and numbers.

But the rest of the rhyme is correct.

No, because May also has 31 days lol. The rhyme goes "September, April, June, and November".

Thirty days hath September/April, June, and November/All the rest have thirty-one/Except Januarifebruary, which has seventy-eight.

I didn't bother to memorize the number of days in each month until I started playing Paradox games, which encouraged me to constantly keep an eye on the calendar for the "new-month tick". So try spending a few hundred hours playing Paradox games.

@sun_the_second

I still have to do the knuckles every time.

How it works is you count knuckles and the valleys between them. Knuckle = 31, valley = not 31. Once you reach the little-finger knuckle, go straight back to the index-finger knuckle (July-August).
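The knuckle rule, written out as a sketch and sanity-checked against the standard library (the function name is my own):

```python
import calendar

def knuckle_days(month: int, leap: bool = False) -> int:
    """Knuckle trick: knuckle months have 31 days, valley months don't.
    After the little-finger knuckle (July) you restart at the index-finger
    knuckle, which is why August is also a 31-day knuckle."""
    knuckles = {1, 3, 5, 7, 8, 10, 12}  # Jan, Mar, May, Jul, Aug, Oct, Dec
    if month == 2:  # the one valley that isn't 30
        return 29 if leap else 28
    return 31 if month in knuckles else 30

# Verify against the standard library (2025 is not a leap year)
for m in range(1, 13):
    assert knuckle_days(m) == calendar.monthrange(2025, m)[1]
```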

Aha! Thanks!

Whenever I read about the various generations that have followed after my own Boomer cohort, I have trouble remembering which one my children are in. I must have looked it up a dozen times and I still can't retain it.

I often have to look up whether I need an x86 or x64 executable

It doesn't help that sometimes they refer to x86-64 as just x86 (as if 32-bit address spaces are obviously deprecated) or AMD64 (as if that tells you anything about the instruction set). I suppose this is also a product of age and the computer market you grew up in. There was a time in the US when the IBM PC running an 8086 was the personal computer, and the fact that all the other x86s descend from there feels natural. There was also a time when a 64-bit CPU felt like you were living in the future, e.g. the Nintendo 64.
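The naming mess is easy to demonstrate: Python's stdlib reports the very same 64-bit x86 machine under different names depending on the OS. A minimal sketch (the normalization set is my own convention, not any standard):

```python
import platform

# The same 64-bit x86 hardware goes by several names depending on who's
# reporting: Linux says 'x86_64', Windows says 'AMD64', Debian package
# repos say 'amd64', and installer download pages often say 'x64'.
arch = platform.machine().lower()
is_x86_64 = arch in {"x86_64", "amd64", "x64"}
print(arch, is_x86_64)
```

Which is exactly why "do I want the x86 or the x64 build?" never sticks: the ecosystem itself can't settle on one name.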

ECGs

I leave this one to the physicians. Small related story, though. In an effort to get me to stop bothering them, I once had a circuits lab TA tell me to go off and build an ECG. I did at some point succeed at "building" the world's shittiest ECG; at least it made an appropriately squiggly-looking line (relying on the oscilloscope for 98% of the work, of course). I'm pretty sure that experience has only left me more mystified about what an ECG is supposed to do.

Fourier transforms

Two useful notes here.

  1. The vast, vast majority of applied math at this level is just linear algebra with a Scooby-Doo mask on.
  2. If you're looking for a 'picture' to hold in your head, this 3Blue1Brown video is a classic. Surprisingly appropriate for a huge range of mathematical sophistication.
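To make point 1 concrete: the discrete Fourier transform really is just a matrix-vector product with the DFT matrix. A minimal numpy sketch (size N chosen arbitrarily):

```python
import numpy as np

# The DFT is linear algebra: build the NxN DFT matrix explicitly and
# check it against the library FFT routine.
N = 8
n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)  # F[k, m] = e^(-2*pi*i*k*m/N)
x = np.sin(2 * np.pi * 2 * n / N)             # a 2-cycle sine over N samples

assert np.allclose(F @ x, np.fft.fft(x))      # same answer, Scooby-Doo mask off
```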

And many more, all of which stubbornly refuse to come to mind, because of course they do.

This happens to me all the time, which calls back to my annoyance with LLMs sometimes. I'm sure it's partially a problem of imprecise prompting.

Often I will ask: "I'm trying to recall the name for something that is like X, Y, and Z. Can you help me determine what concept I'm looking for?"

Reply: "The concept is called XYZ and it works by X, Y, and Z." Entirely a hallucination when you then go to search for XYZ.

AMD64 (as if that tells you anything about the instruction set)

Was there a second 64-bit instruction set invented by AMD?

I was being imprecise here, and I don't have all that extensive a knowledge of the landscape of instruction sets and architectures out in the wild.

That being said, if you already know what you want, AMD64 is unambiguous and interchangeable with x86-64. As a name, it is less legibly part of the x86 lineage than e.g. 8086, iAPX 286, i386, etc.

I am not aware of a second 64-bit instruction set invented by AMD. It is plausible that some highly specialized 64-bit instruction set invented by AMD exists out in the wild, but no one would reasonably assume you were talking about that if you referenced AMD64. AMD the manufacturer does, or has, produced other 64-bit processors, e.g. the AMD Opteron A1100, which uses the ARMv8-A instruction set.

Yeah, that last bit is a more common point of confusion than you might expect for normies: Linux software supporting both AArch64 and x86-64 took off with single-board computers, and a surprising number of people saw AMD64 and thought it meant the former.

I did at some point succeed at "building" the world's shittiest ECG; at least it made an appropriately squiggly-looking line (relying on the oscilloscope for 98% of the work, of course). I'm pretty sure that experience has only left me more mystified about what an ECG is supposed to do.

The heart goes through sequential contraction and relaxation phases, with the upper atria and lower ventricle being out of phase. This is governed by electrical waves propagating roughly top down. Since we're talking about a chemical process (ions crossing membranes), there's noticeable conduction delay.

Roughly speaking, it kicks off near the top of the heart, and has a "highway" of rapid conduction down the middle. There's increased latency the further you go.

We place multiple electrodes on the limbs and chest:

  • The leads placed on the chest measure changes in voltage propagating perpendicular to the skin (front and lateral).
  • The axial leads measure the projection of the heart's electrical axis onto the vector connecting the leads, going roughly left to right and top to bottom.

You draw a chart. Leads V1 and V2 focus on the anterior-right of the heart; V3 and V4 sit a bit lower, right over the heart, so you get the anterior picture; V5 and V6 show you what's going on in the sides. The limb leads help figure out the inferior bit.

Once we have established a baseline, then we look at a patient's ECG for deviations from the norm. Too much or too little voltage, or an unusual delay between phases, these can all point to cardiac pathology, and we can localize based on which views are aberrant. For example, in a heart attack, the leads reading anteriorly will, badum-tss, be the ones most out of whack if the damage is on the anterior aspect of the heart (anterior myocardium/muscles), and so on. And those delays in conduction point towards something wrong with the inbuilt cardiac pacemakers or that highway I mentioned.

In effect, an ECG isn't just a single image, it's closer to tomography. The additional leads provide clear advantages over just attaching a potentiometer to someone's toes and fingers.

Of course, it gets much more complicated in practice. Especially when a patient has multiple heart conditions at once, I start sweating when I have to interpret those even when I'm fully up to speed. And it's all the worse in psychiatry, because you can't rely on the patients to be particularly cooperative. And it hurts when you pull off the adhesive on the cups and it takes chest hair with it.

If you're looking for a 'picture' to hold in your head, this 3Blue1Brown is a classic. Surprisingly appropriate for a huge range of mathematical sophistication.

But Pagliacci, I've tried clown therapy :(

3B1B is excellent, and his video on the FT is my go to. It's just that I forget the details beyond "you can decompose arbitrary analog signals into a sum of sine waves".
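That one-line summary can be checked numerically by summing the sinusoids back up by hand; a quick numpy sketch (signal and length arbitrary):

```python
import numpy as np

# "Any signal is a sum of sinusoids": the DFT coefficients are the
# weights, and summing the N weighted complex sinusoids recovers the
# original signal exactly.
N = 16
rng = np.random.default_rng(0)
x = rng.standard_normal(N)            # arbitrary real signal
X = np.fft.fft(x)                     # complex amplitudes of the sinusoids
n = np.arange(N)
recon = sum(X[k] * np.exp(2j * np.pi * k * n / N) for k in range(N)) / N

assert np.allclose(recon.real, x)             # decomposition round-trips
assert np.allclose(recon.imag, 0, atol=1e-9)  # imaginary parts cancel
```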

Reply: "The concept is called XYZ and it works by X, Y, and Z." Entirely a hallucination when you then go to search for XYZ.

Which model? Hallucinations have become quite rare on the SOTA models, especially the ones with internet search enabled. It's not like they never happen, but I'm surprised that they're happening "all the time".

Yes, this does look like what you read when you look up what it's supposed to do.

And it hurts when you pull off the adhesive on the cups and it takes chest hair with it.

My electrodes may or may not have been bare stranded copper wire duct-taped to myself. I still can't believe what we used to get away with.

You can decompose arbitrary [X] into a sum of [Y basis]

It is in fact all linear-algebra all the way down.

SOTA models

I don't think it's so much a problem with the power of the model as my own vagueness in recollection and prompting: I get back out what I put in.

  • Every year, without fail, I'm surprised to learn that there are 31 days in January. It always "feels" like a month with only 30 days in it.
  • I can reliably state the birthdays of my brother and older sister without checking. For my mother, father and younger sister, if they weren't in my calendar I wouldn't have a clue.

There’s a bunch of Japanese words that will not stick. I’d list some but I don’t remember them…