
Small-Scale Question Sunday for January 18, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Does anyone have their own equivalent of a personal "antimeme", a concept you familiarize yourself with (potentially with difficulty) and then inevitably forget unless you make an intentional effort to look it up?

In no particular order:

  • I often have to look up whether I need an x86 or x64 executable when downloading a program
  • ECGs. Fucking ECGs. I get good at understanding them when I absolutely have to (before exams), but guess what, by the time the next one rolls around, it's all out of my head.
  • Fourier transforms (how they actually work, not just the broad conceptual strokes)
  • And many more, all of which stubbornly refuse to come to mind, because of course they do.

I can do TIA 568B in my sleep (orange-white orange, green-white blue, blue-white green, brown-white brown), and only get TIA 568A right without looking it up about 50% of the time. That's not a good combination.
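
The one thing that does help: 568A is just 568B with the orange and green pairs swapped. A quick Python sketch, purely illustrative (pin orders as I remember them, so double-check before punching anything down):

    # T568B pin order 1-8, and T568A derived from it: the only difference
    # between the two terminations is that the orange and green pairs swap.
    T568B = ["white/orange", "orange", "white/green", "blue",
             "white/blue", "green", "white/brown", "brown"]

    SWAP = {"orange": "green", "green": "orange"}

    def swap_pair(wire):
        # swap orange <-> green, leave blue and brown untouched
        return "/".join(SWAP.get(part, part) for part in wire.split("/"))

    T568A = [swap_pair(w) for w in T568B]
    print(T568A)
    # ['white/green', 'green', 'white/orange', 'blue',
    #  'white/blue', 'orange', 'white/brown', 'brown']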

Constantly get df and du commands mixed up. RAID 0 versus 1. Vi keyboard shortcuts.

For historical reasons, a lot of FAA regulations go by the name 'Part XXX'. I can tell you the differences between Part 61 and Part 141 flight schools at length, so long as you don't ask me which is which. Sometimes I'll even throw in Part 151, which isn't even a relevant thing. Likewise, tach time versus Hobbs time drives me up the wall, and I have to derive tach->tachometer->turning->engine time every time.

Sin and cos. And I use them a lot.

Sin and cos. And I use them a lot.

SOHCAHTOA coming in clutch for an entirely different kind of Indian.

ECGs. Fucking ECGs. I get good at understanding them when I absolutely have to (before exams), but guess what, by the time the next one rolls around, it's all out of my head.

I am in this comment and I do not like it.

Et tu, Brute? I keep imagining there's some mysterious phase change where repetition makes it stick.

I had a very awkward referral once, for a patient with a TCA overdose. I looked at it, knew what it was, but when the person taking the referral asked me to describe it, all I managed was "uh... Those T waves look tented?"

Once you get something down you realize you have some new bizarre shit to memorize. It is endless.

How many days are in each month. I was once 'taught' how to do it using the knuckles on my hands, but it didn't make an ounce of sense to me at the time. I remember that December has 31 (due to NYE), and January also has 31 and February has 28 (unless it's a leap year). I'll struggle with the rest.

Mnemonic: thirty days have September, April, June, and November. And February is fucked. Have to really emphasize the rhyme, but it's the only way I've gotten it to stick.

I didn't bother to memorize the number of days in each month until I started playing Paradox games that encouraged me to constantly keep an eye on the calendar for the "new-month tick". So try spending a few hundred hours playing Paradox games.

@sun_the_second

I still have to do the knuckles every time.

How it works is you count knuckles and the valleys between them: knuckle = 31, valley = not 31. Once you reach the little-finger knuckle, go straight back to the index-finger knuckle (July-August).
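
If it helps, here's the same rule as a small, purely illustrative Python sketch (the "knuckle" months are exactly the 31-day ones):

    # The knuckle rule as code: knuckle months have 31 days, valley months
    # don't; July and August are both knuckles because you jump from the
    # little-finger knuckle straight back to the index-finger knuckle.
    def days_in_month(month, leap=False):
        if month == 2:                          # February, the odd one out
            return 29 if leap else 28
        if month in (1, 3, 5, 7, 8, 10, 12):    # the "knuckle" months
            return 31
        return 30                               # the "valley" months

    print([days_in_month(m) for m in range(1, 13)])
    # [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]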

Aha! Thanks!

Whenever I read about the various generations that have followed after my own Boomer cohort, I have trouble remembering which one my children are in. I must have looked it up a dozen times and I still can't retain it.

I often have to look up whether I need an x86 or x64 executable

It doesn't help that sometimes they refer to x86-64 as just x86 (assuming 32-bit address spaces are obviously deprecated) or AMD64 (as if that tells you anything about the instruction set). I suppose this is also a product of age and the computer market you grew up in. There was a time in the US when the IBM PC running an 8086 was the personal computer, and the fact that all the other x86s descend from there feels natural. There was also a time when a 64-bit CPU felt like you were living in the future, e.g. the Nintendo 64.
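
If you just want an answer for the machine in front of you, naming aside, Python's standard platform module will tell you (illustrative; the exact strings it returns vary by OS):

    # Report the CPU architecture the OS sees, and the bitness of this
    # particular Python interpreter (the two can differ, e.g. a 32-bit
    # interpreter on a 64-bit OS).
    import platform

    print(platform.machine())       # e.g. 'AMD64' on 64-bit Windows, 'x86_64' on Linux/macOS
    print(platform.architecture())  # e.g. ('64bit', 'WindowsPE') -- the interpreter's bitness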

ECGs

I leave this one to the physicians. Small related story, though. In an effort to get me to stop bothering them, I once had a circuits lab TA tell me to go off and build an ECG. I did at some point succeed at "building" the world's shittiest ECG; at least it made an appropriately squiggly-looking line (relying on the oscilloscope for 98% of the work, of course). I'm pretty sure that experience has only left me more mystified about what an ECG is supposed to do.

Fourier transforms

Two useful notes here.

  1. The vast, vast majority of applied math at this level is just linear algebra with a Scooby-Doo mask on (see the sketch after this list).
  2. If you're looking for a 'picture' to hold in your head, this 3Blue1Brown video is a classic. Surprisingly appropriate for a huge range of mathematical sophistication.
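
To make point 1 literal: the discrete Fourier transform is just a matrix-vector product. A minimal numpy sketch (illustrative only), building the DFT matrix by hand and checking it against the library FFT:

    # The DFT as plain linear algebra: build the N x N matrix whose (j, k)
    # entry is exp(-2*pi*i*j*k/N), multiply it by a signal, and compare
    # with numpy's FFT (same answer, the FFT is just a faster algorithm).
    import numpy as np

    N = 8
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    F = np.exp(-2j * np.pi * j * k / N)                  # the DFT matrix

    signal = np.random.default_rng(0).standard_normal(N)
    print(np.allclose(F @ signal, np.fft.fft(signal)))   # True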

And many more, all of which stubbornly refuse to come to mind, because of course they do.

This happens to me all the time, which ties back to my occasional annoyance with LLMs. I'm sure it's partially a problem of imprecise prompting.

Often I will ask: "I'm trying to recall the name for something that is like X, Y, and Z. Can you help me determine what concept I'm looking for?"

Reply: "The concept is called XYZ and it works by X, Y, and Z." Entirely a hallucination when you then go to search for XYZ.

AMD64 (as if that tells you anything about the instruction set)

Was there a second 64-bit instruction set invented by AMD?

I was being imprecise here, and I do not have particularly extensive knowledge of the landscape of instruction sets and architectures out in the wild.

That being said, if you already know what you want, AMD64 is unambiguous and interchangeable with x86-64. As a name it is less legibly part of the x86 lineage than e.g. 8086, iAPX 286, i386, etc.

I am not aware of a second 64-bit instruction set invented by AMD. It is plausible there exists some highly specialized 64-bit instruction set out in the wild invented by AMD, but no one would reasonably assume you were talking about it if you referenced AMD64. AMD the manufacturer does or has produced other 64-bit processors, e.g. the AMD Opteron A1100, which uses the ARMv8-A instruction set.

I did at some point succeed at "building" the world's shittiest ECG; at least it made an appropriately squiggly-looking line (relying on the oscilloscope for 98% of the work, of course). I'm pretty sure that experience has only left me more mystified about what an ECG is supposed to do.

The heart goes through sequential contraction and relaxation phases, with the upper atria and lower ventricles being out of phase. This is governed by electrical waves propagating roughly top down. Since we're talking about a chemical process (ions crossing membranes), there's noticeable conduction delay.

Roughly speaking, it kicks off near the top of the heart, and has a "highway" of rapid conduction down the middle. There's increased latency the further you go.

We place multiple electrodes on the limbs and chest:

  • The leads placed on the chest measure changes in voltage propagating perpendicular to the skin (front and lateral).
  • The axial leads measure the projection of the heart's electrical axis onto the vector connecting the leads, going roughly left-to-right and top-to-bottom.

You draw a chart. Leads V1 and V2 focus on the anterior-right of the heart; V3 and V4 are a bit lower, right above the heart, so you get the anterior picture; V5 and V6 show you what's going on at the sides. The limb leads help figure out the inferior bit.

Once we have established a baseline, we look at a patient's ECG for deviations from the norm. Too much or too little voltage, or an unusual delay between phases, can all point to cardiac pathology, and we can localize based on which views are aberrant. For example, in a heart attack, the leads reading anteriorly will, badum-tss, be the ones most out of whack if the damage is on the anterior aspect of the heart (the anterior myocardium), and so on. And delays in conduction point towards something wrong with the inbuilt cardiac pacemakers or that highway I mentioned.

In effect, an ECG isn't just a single image, it's closer to tomography. The additional leads provide clear advantages over just attaching a potentiometer to someone's toes and fingers.

Of course, it gets much more complicated in practice, especially when a patient has multiple heart conditions at once; I start sweating when I have to interpret those even when I'm fully up to speed. And it's all the worse in psychiatry, because you can't rely on the patients to be particularly cooperative. And it hurts when you pull off the adhesive on the cups and it takes chest hair with it.

If you're looking for a 'picture' to hold in your head, this 3Blue1Brown video is a classic. Surprisingly appropriate for a huge range of mathematical sophistication.

But Pagliacci, I've tried clown therapy :(

3B1B is excellent, and his video on the FT is my go-to. It's just that I forget the details beyond "you can decompose arbitrary analog signals into a sum of sine waves".

Reply: "The concept is called XYZ and it works by X, Y, and Z." Entirely a hallucination when you then go to search for XYZ.

Which model? Hallucinations have become quite rare on the SOTA models, especially the ones with internet search enabled. It's not like they never happen, but I'm surprised that they're happening "all the time".

Yes, this does look like what you read when you look up what it's supposed to do.

And it hurts when you pull off the adhesive on the cups and it takes chest hair with it.

My electrodes may or may not have been bare stranded copper wire duct-taped to myself. I still can't believe what we used to get away with.

You can decompose arbitrary [X] into a sum of [Y basis]

It is in fact all linear algebra, all the way down.

SOTA models

I don't think it's so much a problem with the power of the model as my own vagueness of recollection and prompting; I get back out what I put in.

  • Every year, without fail, I'm surprised to learn that there are 31 days in January. It always "feels" like a month with only 30 days in it.
  • I can reliably state the birthdays of my brother and older sister without checking. For my mother, father and younger sister, if they weren't in my calendar I wouldn't have a clue.

There’s a bunch of Japanese words that will not stick. I’d list some but I don’t remember them…

Do you eat raw garlic by itself?

Yep, seems to help mitigate colds.

Sometimes but I chop it up first. Feels cleansing. Also ginger.

I once ate a raw lemon and quite enjoyed it. It’s… bracing.

It's a bit of a bumpkin thing here, dipping a raw clove into a salt box and eating it.

Are you trying to root out the vampires on the forum?

I munched on some and realized I don't know if it's an ordinary snack or something weird to eat on its own.

Certainly weird. Shame on you.

AI is business class. What I figured out recently is that while AI is making my development more comfortable, it doesn't make me more productive. For me, dev work has always been primarily problem solving, with writing the code secondary. So I spend more time on logic, understanding, and occasionally debugging. And the human brain does seem to have a limited capacity for decision making and understanding per day. It's like a long-haul flight: you arrive at the same time either way, but the seats are nicer and you can nap in the fully flat ones.

If I could pay $20 for an upgrade to business class, you bet I would.