
Small-Scale Question Sunday for February 15, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


The upcoming AI coding languages. Okay - what do you think the next gen of languages will be? After all, the history of computer languages is one of taking freedom away from programmers: from assembler, through C, through protected memory, then shitty OO (C++), even shittier OO (Java), and on top of those all kinds of straitjackets - Hibernate, DI, scripting languages drifting toward ever stronger typing. And now we have AI that excels at producing workable, unmaintainable code. The only thing that could rein in AI at the moment is Haskell. Probably. So we urgently need new ways to limit the chaos.
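To make the "straitjacket" point concrete: the value of a strong type system is that it turns a human convention into a machine-checked constraint, which is exactly the kind of thing that could rein in generated code. A minimal sketch in Python (the `UserId`/`OrderId` names are illustrative, and the check is static - enforced by a tool like mypy, not at runtime):

```python
from typing import NewType

# Two distinct ID types that are both plain ints at runtime.
UserId = NewType("UserId", int)
OrderId = NewType("OrderId", int)

def cancel_order(order: OrderId) -> str:
    # The annotation means only OrderId is accepted under static checking.
    return f"cancelled order {int(order)}"

# Fine under a type checker:
print(cancel_order(OrderId(42)))  # prints "cancelled order 42"
# A checker like mypy rejects cancel_order(UserId(42)),
# even though both wrap the same int at runtime.
```

Languages like Haskell take this much further (the compiler itself refuses the program), but the principle is the same: the more invariants the type system can express, the less room a sloppy generator has to produce code that merely looks right.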

The latest generation of languages had to provide a dependency manager, a build system, a language server, and comprehensive documentation just to get people to look at them.

I'm afraid the next generation will have to provide trained LLMs that can write it as well, which means only the biggest AI players will be able to innovate. No more BDFLs or university professors coming up with brilliant ideas and implementing them in a new programming language.

No reason people couldn't just ingest language documentation into the context, or even fine tune an existing base model for their language.
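A docs-in-context setup doesn't need anything exotic; it's mostly prompt assembly. A minimal sketch, using a hypothetical mini-language as the payload (the prompt layout and the `emit` keyword are invented for illustration):

```python
def build_prompt(language_docs: str, task: str) -> str:
    # Prepend the new language's reference material so the model
    # can use syntax it never saw during pretraining.
    return (
        "You are writing code in a language described below.\n\n"
        f"=== LANGUAGE REFERENCE ===\n{language_docs}\n\n"
        f"=== TASK ===\n{task}\n"
    )

# Hypothetical one-keyword language, standing in for real docs.
docs = "The keyword `emit` prints its argument to stdout."
prompt = build_prompt(docs, "Print hello world.")
print(prompt)
```

Whether a few pages of documentation in context can really compete with gigabytes of pretraining exposure is an open question, but it is the cheap option available to a lone language designer, with fine-tuning as the heavier next step.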

It's still an uphill struggle against literal gigabytes of source code in Python, Java, JS and so on that LLMs have been trained on.

This is a fully general argument against any new APIs of any kind - after all, existing APIs are already in the training set.

Nevertheless, LLMs can learn to use APIs they haven't seen in pretraining.