
Friday Fun Thread for October 31, 2025

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

Not OP, but depending on what you're doing and your config, you could almost certainly run glm-4.5-air or openai/gpt-oss-120b, which are roughly 100B-class models.

A name-brand box with an AMD Ryzen AI MAX+ 395 and 128GB of unified LPDDR5 RAM would probably be just over that now, but you can probably find a no-name box on sale for around $1.5K from time to time. Performance would obviously be worse than dual RTX 4000 Adas or something, but a lot cheaper.
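Both models are easy to serve locally, too; llama.cpp's llama-server and Ollama each expose an OpenAI-compatible endpoint, so client code is just a base-URL swap. A minimal sketch, assuming a server is already running on localhost:8080 and that the model name matches whatever you loaded (the port and name here are placeholders, adjust to your setup):

```python
# Minimal sketch: point the standard OpenAI client at a local
# OpenAI-compatible server (llama.cpp llama-server, Ollama, etc.).
# Assumes the server is already running on localhost:8080 and that
# "gpt-oss-120b" is the model name you loaded; both are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="gpt-oss-120b",
    messages=[{"role": "user", "content": "Summarize the tradeoffs of unified memory for LLM inference."}],
)
print(resp.choices[0].message.content)
```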

The use case I'm imagining is something like a background task doing code review, or auditing a highly sensitive code base for potential vulnerabilities, intentional or accidental. I could also imagine using something like that to slowly scrub through health, financial, or other sensitive files, either for auditing purposes or for converting them to structured data.
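A rough sketch of what that background audit loop could look like against the same kind of local endpoint; the directory, model name, and prompt below are all placeholders rather than a real tool:

```python
# Rough sketch: walk a sensitive code base and ask a locally hosted
# model to flag potential vulnerabilities, writing findings to disk.
# Nothing leaves the machine. Paths, model name, and prompt are
# placeholders; assumes the local server from the previous example.
from pathlib import Path
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

AUDIT_PROMPT = (
    "Review the following source file for potential security "
    "vulnerabilities, intentional or accidental. List findings with "
    "line references, or reply 'no findings'.\n\n"
)

reports_dir = Path("audit_reports")
reports_dir.mkdir(exist_ok=True)

for path in Path("./sensitive_codebase").rglob("*.py"):
    source = path.read_text(errors="ignore")
    resp = client.chat.completions.create(
        model="gpt-oss-120b",
        messages=[{"role": "user", "content": AUDIT_PROMPT + source}],
    )
    (reports_dir / f"{path.stem}.md").write_text(resp.choices[0].message.content)
```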

It would probably be a bit slow, but for anyone who has to work in an air-gapped environment it seems like it would actually be super useful. It saves you having to send a query to the public internet most of the times you need to look something up: just replacing Google searching, or (bleh) having to look something up in a paper book. It doesn't take many uses by an engineer making $200k a year, each saving a few minutes, to make it ROI-positive for a business, even just counting the time it takes to transcribe something you looked up on the internet-facing machine over to your offline machine. Whether it would be more efficient to have some beefier centrally hosted machine on the intranet probably depends on how many people are in the working group.
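Back-of-envelope, with made-up but plausible numbers:

```python
# Illustrative ROI math only: a fully loaded engineer at ~$200k/yr
# over ~2,000 working hours is ~$100/hour. If each avoided round-trip
# to an internet-facing machine saves a few minutes, a ~$2k box pays
# for itself after a few hundred lookups.
hourly_cost = 200_000 / 2_000               # ~$100/hour
minutes_saved_per_lookup = 3                # assumed
cost_saved_per_lookup = hourly_cost * minutes_saved_per_lookup / 60
box_cost = 2_000                            # assumed
lookups_to_break_even = box_cost / cost_saved_per_lookup
print(f"~{lookups_to_break_even:.0f} lookups to break even")   # ~400
```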

Even if it doesn't have to be air-gapped, I imagine that with 100+ employees, dropping ~$20k might still be cheaper over a year than paying an API provider, especially if there are a bunch of compliance problems with sending things off-site.
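Again back-of-envelope, with invented numbers just to show the shape of the comparison:

```python
# Illustrative comparison only; the API spend, headcount, and hardware
# cost are made-up round numbers, not quotes from any provider.
employees = 100
monthly_api_spend_per_employee = 30   # hypothetical $/month on a hosted API
hardware_cost = 20_000                # beefier shared on-prem box

annual_api_cost = employees * monthly_api_spend_per_employee * 12
print(f"API:      ${annual_api_cost:,}/yr")                      # $36,000/yr
print(f"Hardware: ${hardware_cost:,} one-time (plus power/admin)")
```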