
Small-Scale Question Sunday for August 10, 2025

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


It's a laptop CPU...? Do people buy expensive laptops in order to run local LLMs on them? Just curious.

It's a workstation laptop CPU that is faster than my 5600X, and a bunch of Chinese manufacturers (plus Framework) are making mini-desktops around it.

Heading off on a tangent, mini-desktops are pretty good now. I wouldn't want one as my daily driver, but they're completely capable of running a web browser, and therefore 95% of everything most people do.

Even an Intel N100 is good enough for Chrome/Office/Netflix, and you can build a pocketable fanless mini-PC around one.

You see it on /r/LocalLLaMA a bit. It's usually slow, but for async tasks that may not matter as much, and being able to run a higher-bpw (bits per weight) quantization helps quality a lot (rough sketch at the end of this comment).

If you can use online services, they'll absolutely paste most local LLMs at this scale, but there are a lot of use cases where online services aren't an option, or are philosophically unpalatable.
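For the bpw point, here's a minimal sketch using llama-cpp-python; the model filename and prompt are hypothetical, and Q8_0 (~8 bits per weight) stands in for "higher bpw" versus a typical ~4-bpw Q4 quant of the same model:

```python
# Minimal sketch: loading a local GGUF model at a higher-bit quantization
# with llama-cpp-python. The model path below is hypothetical; a Q8_0 file
# (~8 bits per weight) is slower and uses more RAM than a Q4 quant of the
# same model, but loses noticeably less quality.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q8_0.gguf",  # hypothetical path
    n_ctx=8192,    # context window in tokens
    n_threads=8,   # CPU threads; tune to your core count
)

# An async-style batch task where tokens/sec matters less than quality:
out = llm.create_completion(
    "Summarize the following report in three bullet points:",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```

On a big-unified-memory machine like these mini-desktops, the extra RAM a Q8_0 needs is exactly the resource you have to spare, which is why the tradeoff tilts toward higher bpw there.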