Friday Fun Thread for January 30, 2026

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it's not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


If you're interested in running LLMs locally, but you don't care about playing ultra-graphics-heavy games, then you can buy a special AI-focused graphics card that has gigantic VRAM but a weaker processor. According to PassMark and Newegg:

| GPU | 3D graphics rating | Computation rating | VRAM (GiB) | Price (k$) |
|---|---|---|---|---|
| Nvidia GeForce RTX 5060 Ti | 23 | 11 | 16 | 0.6 |
| Nvidia GeForce RTX 5090 | 39 | 25 | 32 | 3.7 |
| AMD Radeon RX 9070 | 25 | 15 | 16 | 0.6 |
| AMD Radeon AI Pro R9700 | 27 | 15 | 32 | 1.3 |

The R9700 is just an RX 9070 with extra VRAM stapled on! (Alternatively, if your computer has a full-size motherboard, you can combine two RTX 5060 Ti cards. This cannot be done with two RX 9070 cards.)


The World Wide Web Consortium has just published a very interesting note on how text-to-speech programs should handle ruby/furigana text (the tiny kana/pinyin/bopomofo characters that sometimes are written alongside kanji/hanzi as pronunciation aids).


Premise: You want to make a personal copy of an online story that you're enjoying reading.

Problem: Story-hosting websites are likely to have anti-bot measures—or you just don't want to spend time figuring out the intricacies of automatic downloading tools.

Solution: Just manually smash that Ctrl-S key combination in your Web browser before reading every chapter! Once the story has ended (or has become boring), the rest is the work of less than an hour: manually open your several dozen HTML files in your favorite plaintext editor, delete all the irrelevant cruft (JavaScript, website infrastructure, reader comments) above and below the text of each chapter, add proper heading elements, and assemble the results into a single unified file. (Optionally, compress the HTML file into an EPUB file; note that this does require XML compliance, which can be a mild hassle for certain websites that disgustingly fail to properly close their p and hr elements.)
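(If the manual editing ever gets tedious, the extract-and-assemble step can be half-automated with a few lines of Python. This is only a sketch: it assumes each saved page wraps the chapter text in a hypothetical `<div class="chapter-content">` block, which will differ from site to site, so inspect your own saved files and adjust the marker accordingly.)

```python
import re

def extract_chapter(html: str) -> str:
    # Hypothetical marker: adjust to whatever wrapper your target site
    # actually uses around the chapter body.
    match = re.search(
        r'<div class="chapter-content">(.*?)</div>',
        html,
        flags=re.DOTALL,
    )
    return match.group(1).strip() if match else ""

def assemble(pages: list[str], titles: list[str]) -> str:
    # Give each chapter a proper heading element and join everything
    # into one unified HTML file, dropping the surrounding cruft.
    parts = ["<html><body>"]
    for title, page in zip(titles, pages):
        parts.append(f"<h2>{title}</h2>")
        parts.append(extract_chapter(page))
    parts.append("</body></html>")
    return "\n".join(parts)
```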

(RoyalRoad does insert into each chapter an anti-piracy warning ("Ensure your favorite authors get the support they deserve. Read this novel on the original website.", "Unauthorized use of content: if you find this story on Amazon, report the violation.", "Royal Road is the home of this novel. Visit there to read the original and support the author.", et cetera) that is hidden by the site's CSS and therefore is revealed in your CSS-free local copy. But, since you aren't a bot, this warning is fairly easy to remove with a simple Ctrl-F for the warning's telltale HTML pattern (e.g., </p><span or </p><p), which appears nowhere else in the files. And, if you're so extremely lazy that you don't care about seeing an immersion-breaking sentence or two in every chapter, then there's really no need to remove the warning at all: you aren't uploading the file to Amazon (whose own automated systems presumably would see the warning and reject the upload) for the purpose of the industrial-scale copyright infringement that I've seen authors complain about.)
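(The Ctrl-F approach works fine by hand; for completeness, here's a hedged one-liner version of the same idea in Python. It assumes the warning is a span glued directly onto a closing </p> tag, matching the telltale pattern described above; verify against your own files before trusting it.)

```python
import re

def strip_warning(html: str) -> str:
    # Drop any <span>...</span> that immediately follows a closing </p>,
    # which is where the hidden anti-piracy warning tends to sit.
    return re.sub(r'(</p>)<span[^>]*>.*?</span>', r'\1', html, flags=re.DOTALL)
```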

FanFicFare works pretty well for creating an EPUB from most popular story-hosting sites. I believe it also has a Calibre plugin. It has worked pretty well for me in the past to download from AO3, FF.net, etc.

I started using this manual procedure many years ago, after Fanfiction.net blocked the automatic-download website that I was using at the time and then blocked the automated-download program that I clumsily wrote for myself in Baby's First Java.

Archive of Our Own already has built-in download capabilities.