This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

In the beginning, the C programming language was created, and there was much rejoicing. C is perhaps the single most influential language in the history of computing. It was "close to the hardware"*, it was fast*, it could do literally everything*. (*Yes, I am simplifying a lot here.)
But there were a few flaws. The programmer had to manage all the memory by himself, and that led to numerous security vulnerabilities in applications everywhere. Sometimes hackers exploited these vulnerabilities to the tune of several million dollars. This was bad.
But it's not like managing memory is particularly hard. It's just that with complex codebases, it's easy to miss a pointer dereference, or forget that you freed something, somewhere in potentially a million lines of code. So the greybeards said "lol git gud, just don't make mistakes."
The enlightened ones did not take this for an answer. They knew that the programmer shouldn't be burdened with micromanaging the details of memory, especially when security is at stake. Why is he allowed to call `malloc` without calling `free`?* The compiler should force him to do so. Better yet, the compiler can check the entire program for memory errors and refuse to compile, before a single unsafe line of code is ever run. (*Actually memory leaks aren't usually security issues, but I'm glossing over this because this post is already long.)

They had discovered something profound: absent external forces, the programmer will be lazy and choose the path of least resistance. And they created a language based on this principle. In C, you may get away with not checking the return value of a function that could error. In Rust, that is completely unacceptable and will make the compiler cry. The path of least resistance in C is to do nothing, while the path of least resistance in Rust is to handle the error.
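The contrast can be sketched in a few lines. `parse_port` here is a made-up example, not code from any real incident; the point is that `Result` is marked `#[must_use]` in the standard library, so silently discarding it draws a compiler warning, and actually using the value forces you through the error case:

```rust
use std::num::ParseIntError;

// A fallible operation. Its Result is #[must_use]: discarding it
// silently is a compiler warning, so "do nothing" is no longer the
// path of least resistance the way an ignored C return code is.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    s.parse::<u16>()
}

fn main() {
    // parse_port("8080"); // warning: unused `Result` that must be used

    // To get at the value at all, you must say what happens on error.
    match parse_port("8080") {
        Ok(port) => println!("listening on {port}"),
        Err(e) => eprintln!("bad port: {e}"),
    }
}
```

In C the equivalent `int port = atoi(s);` compiles cleanly no matter how malformed the input is.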
That's what makes Rust a better programming language. And I have to agree with the zealots, they are right on this.
...So I have to be disappointed when they're not.
Rust seems to keep popping up in the news in the past couple of months. In November, a bug in Rust code deployed by Cloudflare took down their infrastructure, and half the Internet with it. (Why Cloudflare even has a monopoly on half the Internet is a controversial topic for another time.) The cause? A programmer didn't handle the error from a function.
Well, that's technically not true; they did. It's just that calling `.unwrap()`, a function which will immediately abort the application on error, counts as "handling" the error. In other words, the path of least resistance is not to actually handle the error, but to crash. I argue that this isn't a better outcome than what would have happened in C, which would also be to crash. Sure, the crash won't be a segfault in Rust, but that doesn't matter if half the Internet dies.

This month, a CVE was filed in the Rust part of the Linux kernel, and it turned out to be a memory corruption vulnerability, ironically enough. "But how could this happen?" Rust has these things called
`unsafe` blocks that let you do unsafe memory operations, closer to what you would be allowed to do in C (though granted, I have heard convincing arguments that unsafe Rust is still generally safer than C). So the path of least resistance is not to do things the safest way, but to just surround everything in `unsafe` if you get tired of fighting the borrow checker.

I hear the same pitch all the time from Rust advocates. "C is unsafe, programmers are too fallible, we must use a language that forces good code." They consistently blame the language, and don't blame the programmer. So how did they react to the above incidents? Did they blame the programmer, or the language?
"Duh, don't use `unwrap` like that." "Duh, don't use `unsafe`, it's obviously unsafe."

If I were one of them, I would throw my hands up and admit that the language didn't have guardrails to prevent this, so if I would blame C in a universe where the incidents happened in equivalent C code, then I should blame Rust here. But then, I wouldn't be a Rust zealot. I'd just be a Rust kinda-supporter. I'd have to carefully consider the nuances of the language and take into account various factors before forming an opinion. Oh no, the horror! And if I went the other way and blamed the programmer, it wouldn't be long before I'd have this nagging feeling that I'm just like a C-nile greybeard, telling the programmers to git gud, and at that point, there seems to be little point in using Rust if we just assume that programmers are infallible.
It's a Catch-22, in other words.
To be clear, I'm not saying that these incidents alone mean Rust is a bad choice for anything, ever. I'm not saying Cloudflare or Linux shouldn't use Rust. I'm not telling people what they should or shouldn't use. I'm just pointing out the double standards. Rust people can attack C all day using one set of (IMO, entirely justified) standards, but when they are confronted with these incidents, they suddenly switch to another set of standards. Or to put it more clearly, they have a motte and bailey. Motte: "Rust can't prevent shitty programmers from writing shitty code." Bailey: "C is unsafe, because of all the memory unsafe code people have written, and we should rewrite everything in Rust to fix all of it!"
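A minimal sketch of the `.unwrap()` complaint above (a toy parse, nothing like Cloudflare's actual code): the one-character escape hatch compiles just as happily as real error handling, it just moves the failure to runtime.

```rust
fn main() {
    let parsed: Result<u16, _> = "not a number".parse::<u16>();

    // The path of least resistance: "handle" the error by aborting.
    // This compiles without complaint and panics at runtime --
    // roughly the failure mode described in the Cloudflare incident.
    // let n = parsed.unwrap();

    // Actually handling it costs a few more lines:
    let n = match parsed {
        Ok(n) => n,
        Err(e) => {
            eprintln!("parse failed ({e}), falling back to 0");
            0
        }
    };
    println!("{n}");
}
```

The compiler forces you to acknowledge the `Result` exists, but it cannot force you to do anything thoughtful with it.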
In defense of Rust:
- All hardware is `unsafe` from the perspective of Rust. Like even if all of technology is built from the ground up in only safe Rust, well, pesky cosmic rays get in the way and flip a bit somewhere. Somebody has to write the `unsafe` stuff, and like always, it's git gud time.
- It's trivial to grep a codebase for `unsafe`/`unwrap`. This bumps up the chances of bugs being caught before the code is even introduced, during review; and even when one gets through, it's easier to find out where.

I find criticisms that Rust is not good for exploratory work (data analysis, game development, scripting, etc.) much more persuasive, but then that just goes back to "find the right tools for the right job".
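For concreteness, here is about the smallest possible `unsafe` block (a contrived raw-pointer read, purely illustrative): the keyword doesn't make the operation safe, it marks the spot where the programmer rather than the compiler vouches for soundness, which is exactly what makes the grep audit above possible.

```rust
fn main() {
    let x: u64 = 42;
    let p: *const u64 = &x;

    // Dereferencing a raw pointer is rejected in safe code; it is
    // only legal inside an `unsafe` block, so every such spot is
    // findable with a plain `grep -rn unsafe`.
    let y = unsafe { *p };
    assert_eq!(y, 42);
    println!("read {y} through a raw pointer");
}
```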
If there is a culture war element to this, I think broadly people are yet again conflating their distaste for the tool (Rust/gun) with their distaste for its users (Rust community/gun owners).
And maybe there's a greater technology story here: the usual one where people think !new_thing will solve all their problems, but actually !new_thing only solves most problems, and the remaining ones are the really complex ones (leading to a paradox of automation). Then certain people become cynical and disappointed and retreat to their old tools, while other, younger people just adopt and proliferate the use of the !new_thing, and someday the cynical people wake up and find they missed the boat.
Preemptively: garbage collection is a collection of garbage and we would do well to rid ourselves of it. I do not consider garbage collected languages a viable option for anything that even vaguely cares about performance, and they are objectively not a viable option for kernel or firmware spaces.
That said... yes, a safer-C would be useful, and it would be nice if Rust could be that, but I don't think it can. C has too much inertia and there are too many places Rust made seemingly-arbitrary-from-the-perspective-of-C-programmers decisions that grind against C-like intuitions for a comfortable swap, and so since the "pain" of C is actually pretty darn low on a per-developer basis (even if the occasional memory safety CVE is a big problem for society) nothing short of an official Software Engineer Licensing system is going to get them to move. Sort of a tragedy of the commons problem. Try again, but be more like C. Maybe then.
There have been plenty of hard real time systems and operating systems using GC.
Common Lisp can dominate benchmarks (over C and Fortran) but often gets kicked out, because they say e.g. in-lining assembly doesn't count even though the CL programmer generates and optimizes that assembly from the REPL (emitting it via compile time macros or such). I've worked on CL HFT systems (n.b. since ~2017 the field's not looked anything like the popular world things, because of regulatory and policy changes.)
APL or BQN are also great and can write compilers at competitive performance.
Various Forths offer memory management paradigms different from C's, with more safety and reliability (e.g. the `ALLOT` word). Indeed, the preferred way is for everything to run on the stack alone.

There has been better than C for longer than we've lived. That e.g. Lisp required a dozen MB of RAM caused cost issues some decades ago, but now that memory is cheap...
Anyway, modern C++ memory management's closer to Rust than C, Swift has some nice innovations too. Many things can be done - the OS could even manage it for the program. Research has shown how GC can theoretically surpass manual memory management - and today GCs are faster already, just look at runtime and wall clock time. The developer today chooses when to trade latency for throughput and wall clock speed.
A bit hyperbolic, no?
I'd say the opposite. GC languages are only unviable for systems that care about exceptional performance.
Quant trading works with GCs. ML & gaming have a unique preference for C++ because of the ecosystem, so I'll treat them as exceptions. Google uses Go for large scale systems (not the core, but pretty much everything else). Clearly it's good enough for most systems work.
I was writing some code to optimise within constraints - basically just a massive pile of nested loops and if statements. It did well so we ported it to production, rewriting everything in C++.
The result was literally hundreds, maybe thousands of times faster. It went from being something that ran with a visible delay to something I could run in real-time.
I worked at a startup that was having huge problems with their server responsiveness due to Go's GC. They unfortunately believed the hype that it was a fancy concurrent GC with very small stop-the-world pauses. What they didn't realize was that while the GC was running, many common memory-accessing operations slowed down 10x (due to e.g. memory fences to avoid screwing with the GC). The slow performance would snowball, and you'd end up with the server spending most of its time in an unacceptable high-latency state.
We did manage to get some good performance out of Go eventually, but it was by explicitly writing our code to bypass the GC (including managing our own memory arena). TBH I like Go in general, but I think you underestimate just how costly a GCed language, even with a modern fancy GC, can be.
Quants care about latency, yes, but they're more than happy to throw a bit more hardware at their problems.
I can see I'll have to be more specific about what I take "performance" to mean. Performance is... efficiency. How much time, how many CPU clock cycles, how much memory, how many watts do you use while performing your task? Latency is one slice of it - a poorly written program will have poor performance on multiple dimensions, including latency - but low-latency alone is not the whole picture. A data center would likely not be happy to know you've reduced their latency at the cost of a large increase in power draw - power and cooling are a major factor in their operations! For game consoles, the hardware is fixed. If you take more compute than the console has to give you to get the next frame ready, your performance is poor. On any platform, if you use more memory than is available, everything suffers as you swap out to disk.
If your overriding concern is latency, to the exclusion of other performance concerns, I guess I can soften to say that GC may be workable.
Echoing @Imaginary_Knowledge but on a different tangent: in terms of garbage collection and high performance, the exception is obviously Jane Street with OCaml. Now, is this the exception that breaks or proves the rule? I think only the long arm of history will be able to discern.
Obsolete take.
Have you looked at a modern GC like ZGC? We're talking sub-millisecond pause times now. GC performance isn't a practical problem anymore. You're repeating obsolete 20 year old memes.
Ever use an Android phone? Plenty fast UI. Android is built on Java, and it has a GC. Works fine, even at pretty low levels of the framework stack.
I'm convinced we could push a modern GC to the kernel and it would work fine there too. (RCU is already a beast of complexity and nondeterministic timing and nobody complains about that.)
Please update your prejudices to reflect the current state of technology.
Also look at the performance improvements that Microsoft announces with every new version of .NET. Where speed is absolutely critical, there is still usually no beating C/C++/Rust, but C# has become blazing fast compared to how it used to be, and is actually competitive with lower level languages in some cases.
Gladly!
But more seriously, low latency isn't the whole picture. If I care about performance, why would I have so much spare CPU time lying around that I can essentially pin an entire core to be the GC manager?
Rust is an interesting programming language, because it perfected the nanny-state compiler. Rust is infamously difficult to get to compile if you don't know what you're doing. You can spam `.unwrap()` and `unsafe` and write unsafe code, but it requires you to at least actively choose to accept these flaws, as opposed to passively letting them in accidentally.

If AI is going to write code, I think Rust is actually going to point the way toward the future. AI can make writing code very easy, but it introduces all sorts of potential zero-day bugs and faults. Rust actually solves much of this, because many bugs the AI could write in other languages are not even valid Rust. The future of programming languages belongs to whoever develops an even more restrictive and advanced compiler that eliminates whole categories of AI errors from ever running. (A superset of Python or TypeScript would be very appealing here.)
You mean the language that is de facto completely untyped in the real world and does next to no checks on the code before trying to execute it?
Python is going through a devx revolution right now. Pydantic, Astral and Mojo are the main contributors.
Mojo is typed, compiled, and a (claimed) superset of Python. It hasn't seen as much adoption, but it is led by systems Jesus, Chris Lattner. I'm hopeful it will get there eventually.
Astral, on the other hand, has transformed the Python dev workflow. 'uv' solved Python packaging, 'ruff' solved linting and formatting, and now 'ty' is solving Python type-checking. Separately, Pydantic allows data objects to be strictly typed and is pretty much a Python built-in.
And I know it's customary to throw a bunch of half-baked tools at someone to silence criticism about a language. For years, that was true for python. But no, these tools have genuinely become ubiquitous. The python code-base at my current job is pretty much strictly typed.
In a few years, I'm betting python will become a pleasant language to use.
Pydantic is regularly used, but what about Astral? Are you using Astral's tools yourself? Are they in any major open-source projects?
I’ve never seen anyone do package management that wasn’t pip (or conda/apt depending on environment).
Open to it, I’ve just never seen it in the wild.
Uv has only been available for a year or two, but it is being adopted extremely quickly (because pip was just that bad): https://wagtail.org/blog/uv-overtakes-pip-in-ci/
Python also lets you commit crimes against humanity and good taste like this with ease: https://www.hillelwayne.com/post/python-abc/
Worth reading the post just for this pun.
I don't know. I find this a topic that it's pretty easy to be nuanced about. Different languages attempt to provide different guarantees to the programmer during their operation. To provide those guarantees they have to be able to understand the code and prove the code satisfies those guarantees. Most such languages provide ways to disable checking those guarantees for particular code sections on the assumption that you, the programmer, have information the compiler lacks that things will work without the compiler having to check. If you, the programmer, tell the compiler you know better and then turn out to be wrong I think it's fine to blame the programmer.
I think everyone has, in their mind, a different idea about the extent to which buggy code should be caught by the compiler and these ideas are what inform what side of the blame the programmer/blame the compiler distinction you fall on. As an example: In college a friend and I had to write some networking libraries in C. At the time we didn't use any fancy editors or anything, just good old gedit and gcc. My friend was writing a function that was supposed to perform an arithmetic operation and return the output but every time he ran it he got a different (implausible) result, even with the same inputs. What was happening is that he had accidentally omitted the
`return` statement for his function, so he was getting back some random garbage from memory on every run. Should the C compiler let you declare a function that returns a value and then let you omit the `return` statement? Is that mistake your fault or the language's fault? Formally, doing this is undefined behavior, but that does not always mean a crash!

In this case I find the behavior of
`Option<T>.unwrap()` unintuitive, but I am also coming from the perspective of exception-based error handling. As an analogy, C#'s `Nullable<T>.Value` will throw an exception if the nullable is actually null. That option obviously isn't available in a no-exception world. Maybe the default behavior should be more like the behavior with the try trait, such that it returns the error instead of panicking? Then let the programmer panic if the value is an error, although that introduces another layer of error checking!

I'm a little unsure of the criticism here of Rust as a language. Is it that `unsafe` exists? Presumably all the code that is not in an `unsafe` block has guarantees that equivalent C code would not. Is that not a benefit? Is the worst case here that you wrap all your Rust code in `unsafe` and then you end up... as good as C?

I think there is a more productive discussion here about how language features and guarantees can help protect against writing buggy code and potentially make it easier to review code for bugs. I suppose I think of it by analogy to TypeScript and JavaScript. All JavaScript is valid TypeScript, but TypeScript needs to be compiled to JavaScript. That compilation, in my experience, helps avoid whole classes of errors due to the lack of typing in JavaScript. Sure, you can write JavaScript that just doesn't have those errors, and most people do, but TypeScript renders them inexpressible. Similarly so for C and (non-`unsafe`) Rust.
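The "return the error instead of panicking" behavior the comment above asks for does exist in Rust as the `?` operator; `first_char` is a made-up illustration. `ok_or` converts the absence into an error value, and `?` hands it back to the caller instead of aborting on the spot:

```rust
// Hypothetical helper: fails with an error value rather than a panic.
fn first_char(s: &str) -> Result<char, String> {
    // `ok_or` turns Option<char> into Result<char, String>;
    // `?` returns the Err to the caller instead of panicking
    // the way `.unwrap()` would.
    let c = s.chars().next().ok_or("empty string".to_string())?;
    Ok(c)
}

fn main() {
    assert_eq!(first_char("rust"), Ok('r'));
    assert!(first_char("").is_err()); // caller decides what to do; no panic
    println!("ok");
}
```

So the extra "layer of error checking" is real, but the language does make propagating the error a one-character operation.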
It's the language's fault (that probably should never have been allowed by the standard, and if it wasn't then the compiler could catch it by default) and it's your fault (you shouldn't have written that), and it's other language users' fault.
That third one might take a bit of explanation.
Any decent compiler these days will warn you about that error at compile time, and will stop the compilation if you use a flag like `-Werror` to turn warnings into compile-time errors. So just always use `-Werror`, right? We could all be writing a safer version of C without even having to change the C standard! Well, "look for functions that declared a return value but didn't return one" is an especially easy error for a compiler to catch, but there are others that are trickier and more subtle. Maybe you add `-Wall` to get another batch of warnings, and `-Wextra` for another batch, and you throw in `-Wshadow` and `-Wunused-value` and `-Wcast-qual` and `-Wlogical-op` and ... well, that's a great way to write your code, right up until you have to `#include` someone else's code. At some point your OCD attention to detail will exceed that of the third-party authors who wrote one of your libraries, and you can't always fault them for it (these warnings are often for code that looks wrong, whether or not it is wrong - even omitting a return statement could probably save one CPU cycle in cases where you knew the return value wasn't going to be used!). So, I have special headers now: one to throw a bunch of compiler pragmas before the `#include` of certain third-party headers, to turn off my more paranoid warning settings before they can hit false positives, then another to turn all the warnings back on again for my own code, like a primitive version of `unsafe`.

I was once paid to port C code from a system that allowed code to dereference null pointers (by just making the MMU allow that memory page and filling it with zeroes). And so the C code written for that system used that behavior, depending on
`foo = *bar;` to set foo to 0 in cases where they should have written `foo = bar ? *bar : 0;` instead. As soon as you give people too much leeway, someone will use it, and from that point onward you're a bit stuck, unable to take back that leeway without breaking things for those users. I like the "nasal demons" joke about what a compiler is allowed to do when you write Undefined Behavior, but really the worst thing a compiler is allowed to do with UB is to do exactly what you expected it to, because then you think you're fine right up until the point where suddenly you're not.

This is getting off topic, but I thoroughly enjoy reading Raymond Chen's blog The Old New Thing for its many stories of Windows bugs, implementation details, and programmer misuses that later became compatibility constraints. When you upgrade your operating system and your Favorite Program stops working, people rarely blame their Favorite Program, even if it is the thing that was doing something unsupported!
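As an aside, the Rust analogue of that `foo = bar ? *bar : 0;` pattern makes the "what if it's null?" decision unskippable; `read_or_zero` is a contrived illustration, not code from the port described above. A possibly-absent reference is an `Option`, and there is simply no way to read through it without answering the question:

```rust
// Hypothetical function: the "nullable pointer" is an Option, so the
// null case must be handled explicitly -- the zero-filled-null-page
// trick has no equivalent to lean on.
fn read_or_zero(bar: Option<&i32>) -> i32 {
    bar.copied().unwrap_or(0)
}

fn main() {
    let x = 7;
    assert_eq!(read_or_zero(Some(&x)), 7);
    assert_eq!(read_or_zero(None), 0);
}
```

The leeway that trapped those users was never granted, so there is nothing to take back later.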
I wonder what fraction of The Motte is software people.
I'd guess 30 to 50 percent
AIX did this. I think the first three values were 0, 0xdeadbeef, 0xbadfca11. C programmers weren't supposed to depend on it -- the compiler would use it to avoid short circuiting expressions like:
myptr == NULL || (*myptr == whatever)
which would save branch overhead. And the very common
myptr == NULL || *myptr == 0
could skip the null test entirely.
But I'm sure some programmers did depend on it.
This is both normatively and positively wrong.
Positively: in C, Undefined Behavior often leads to a crash, but is not actually required by the C standard to lead to a crash. The outcome is literally undefined.
Normatively: If you write code that leads to Undefined Behavior, the C compiler is allowed to and often will emit code that will crash; this is the same outcome as the Rust case, but is still a worse situation because
`grep unwrap` is a thing and `grep some_regex_catching_all_C_UB` is (despite linter developers trying their best) only a dream. The C compiler is allowed to emit code that will make demons fly out of your nose. The C compiler is allowed to, and often will, emit code that will hand control of your computer to the botnet of whichever attacker first discovered how to trigger the UB, at which point if you're lucky your computer is now laundering your electric bill into some mafioso's bitcoin wallet at pennies on the dollar, and if you're unlucky your computer is now an accessory to DDoS attacks or blackmail or financial scams. These are much worse outcomes. Even Cloudflare crashing is much better than Cloudflare being compromised would have been.

The second clause here is false IMHO (though bias makes MO very H: I've been writing a little C and a lot of C++ for 3 decades and have no current plans to stop), but the first clause is simply theoretically and empirically true and belongs in the motte.
I do wish the second clause was true, for some language if not necessarily Rust, because I have about a hundred other gripes with C/C++ that can probably only be fixed by someone starting from scratch ... but whenever I investigate a new language that I'm excited to see fixes flaw X, they seem to do it at the same time as they omit all possible support for features Y and Z and end up with something worse (for some of my purposes; there are three other languages I write in for different use cases) overall.
Yeah, my ideal modern language would be a curated version of C++. It'd have a modern package manager, cut out a ton of the language features that are outdated or dangerous or both, and rewrite some existing features (e.g. lambda functions) to be less clunky now that backwards compatibility isn't a problem.
But making something like this wouldn't be very sexy.
I'm a SWE that's never worked with Rust (I've mostly been in R/Python, then SQL/Java/C#). I feel like with the advent of LLMs, the choice of programming languages will be so much less important in the near future. Before LLMs, having to learn a new language imposed a lot of costs in how to do the basic stuff, as having 10+ years of experience in a language means you can bust out features much more quickly than someone who has to constantly go to StackOverflow to figure out how to do boilerplate stuff. I feel like a lot of the debates over languages was really just "please don't make me learn this new crap", with people having their preferred language and then actively searching for reasons to defend it. Now you can just have Claude Code easily do boilerplate in any language for you, and focus on testing things instead. I'm converting old SQR code into SQL now, and pre-LLM this would have required me to have at least a basic knowledge of SQR, but that's no longer really the case.
At least today, LLMs can't produce anything which runs in any of the languages I use at work or leisure. An AI should be able to reason from a spec etc. but they're currently slaves to training data alone.
If you've got text-format spec to give them, you can kinda get them to handle esolangs to solve problems that don't exist in the normal corpus for those languages. But I haven't seen one great at it yet.
Using it in that direction is fine because you can check the output; I'm not sure it's going to work so well in an "I'm used to language X and they're making me write in language Y" scenario.
I would have agreed with you last year, but it's getting easier and easier to ignore learning the language you're working with too. It's obviously still useful to have at least a basic understanding, but I feel we're like <10 years from just trusting LLM code output as much as we trust compiler output. Nobody reads compiler stuff any more.
Clearly you've never worked in a field that cares about performance. People absolutely do read compiler output to see if it did anything too stupid and work around such issues.
We even have fancy new tools for this like Compiler Explorer, which is great for answering "does clang vectorize this like I want it to?".
This reminds me of learning BASIC on a Timex/Sinclair 1000, AKA the US release of the ZX81. It had this stupid gel-tab keyboard that made real typing impossible, but it would turn single keystrokes into full commands, even multiple commands per key which it would select contextually depending on where you were in the line you were typing. That and it called out syntax errors on the spot and wouldn't accept them until they were fixed.
This is all relevant to nothing, I'm just waxing nostalgic. For a machine with 2k native RAM (expandable to 16) it was an awesome kid's first computer to learn how to program out of a book.
Every single time. This happens every time where you try to engineer around the existence of the human soul, and it will continue to happen, forever. There is a war going on between the ensouled and the enslaved, and you can see it playing out here. The enslaved, who occupy places like HR departments, CPS field agencies, reddit moderation discords, city ordinance compliance departments, HOA boards, and Rust governance bodies, fight against the idea that an ensouled human being might have their own ideas about how to live their life, or how to manage the memory on their own computer.
C is god's language, and as counterintuitive as it may sound: so is python. All other languages exist only to build a path towards enslavement.
Yes, some of the ensouled will find a way around whatever barriers you put in place, but at some point you still need to at least try to bend incentive structures to reduce, if not outright eliminate, murders and the like.
Can you explain why for the non-CS-minded (me)?
Python doesn't ever error because it thinks you've made a mistake; it only stops you if it can't figure out what you are asking it to do. It does force garbage collection on you, and the language's features lean heavily on hashmaps, but it is generally very unopinionated about anything else.
If C and python are God's languages, then God asks us to live in caves in the desert while the sinful inherit the Earth. Yes it's holy and virtuous, but you can't build the tower of babel with it.
Hey, C combines the power of assembly with the elegance of assembly, as the joke goes.
Python has completely different problems. On the one hand, the duck typing means that erroneous assumptions about types may go undetected for a long time before blowing up in a completely innocent part of the code. (As far as a weakly typed piece of code can be innocent, that is.)
More critically, it is slow. Reading a field of an object, or calling a function defined in some global scope, both require a lookup in a hash map, where in C the former would be pure pointer arithmetic and the latter would be resolved by the linker (or earlier) into a constant address.
You can build anything with C if you're not a coward.
You can build anything out of toothpicks and tissue paper. Doesn't mean it's a good idea.
Every language in which you can tell the compiler/runtime "just trust me bro, I know what I am doing" will devolve into a "just trust me bro" language. This is why you don't have "trust me bro" sections. So it may just be me, but I never really understood the point of Rust, as in why it exists. To me it seems to combine the clumsiness of golang with the unsafety of C.
Btw - both C and C++ are quite memory safe if you don't try to be clever.
There is a name for a language which does not have 'just trust me bro' sections. It is Java.
If you want to do anything interesting with hardware or squeeze out optimal performance, you will sometimes end up in situations where you are making assumptions which can not be verified by the compiler, which generally is ill-equipped to verify arbitrary mathematical proofs or parse hardware specifications.
Ideally, a language would allow you to specify hardware behavior and include a theorem verifier which you can use to prove that because two variables are co-prime per your precondition, your divisor can indeed not be zero in the next line. Instead, you have unsafe blocks.
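A minimal sketch of that trade-off (the function and its invariant are hypothetical): the caller's precondition cannot be machine-checked, so instead of a proof it is discharged with an unsafe block and a SAFETY comment:

```rust
// The compiler cannot verify that `i` stays in bounds from the
// precondition alone, so the programmer asserts it in prose and
// uses an unsafe block to skip the runtime check.
fn sum_first_half(data: &[u32]) -> u32 {
    let half = data.len() / 2;
    let mut total = 0;
    for i in 0..half {
        // SAFETY: i < half <= data.len(), so the access is in bounds.
        total += unsafe { *data.get_unchecked(i) };
    }
    total
}

fn main() {
    assert_eq!(sum_first_half(&[1, 2, 3, 4]), 3);
}
```

In a language with a built-in theorem verifier, the SAFETY comment would instead be a checked proof obligation; in Rust it is just a convention the auditor has to trust.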
Of course, some lazy programmers will decide that unsafe blocks are the path of least resistance. Probably when C came out, some asm programmers decided that they could code "C" by just using inline assembly for everything. If you want to protect a programmer from harming themselves, you need to place them in a safe padded cell like Java does.
The use case of Rust is when you have someone who is actually willing to work with the borrow-checker and only use unsafe in the places where that is not possible. This will make it much easier to audit the code. Imagine having to verify the stories of two suspects. Suspect "Rust" provides you ironclad, notarized evidence for 90% of his claims, while 10% (the unsafe stuff) is unsupported by evidence. Suspect "C" provides you no evidence for any claims. To make sure that their story checks out on a similar level of confidence, you would likely spend 10x as much work on suspect "C" (or possibly more, because the unsafe code blocks can interact).
For C, that is a ridiculous claim. You might as well say that the Taliban regime is great for women's rights as long as the woman is willing to submit to her husband and not voice controversial opinions.
Sure, there are plenty of programs in C which are obviously sound. But not every problem is easily transformable into such a program. "Don't be clever about memory management" is not actionable advice if you need to share data with indeterminate lifetime between multiple threads any more than "try to be straight and submissive" is actionable advice for an Afghan butch lesbian.
Array accesses in C are memory unsafe as fuck. Unlike for C++ (_GLIBCXX_ASSERTIONS), the best way to do safe array indexing in C boils down to "wait for clang to implement -fbounds-safety".
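For contrast, the Rust default is the opposite of C's: plain indexing is bounds-checked at runtime, and the checked accessor makes the out-of-range case a value rather than undefined behavior. A small sketch:

```rust
fn main() {
    let a = [10, 20, 30];
    // Plain indexing is bounds-checked: `a[5]` would panic at
    // runtime instead of reading out of bounds.
    assert_eq!(a[2], 30);
    // `get` makes the failure case explicit via Option.
    assert_eq!(a.get(2), Some(&30));
    assert_eq!(a.get(5), None); // out of range: no UB, just None
}
```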
If I had a nickel for every java.lang.NullPointerException I've seen in the wild, I'd probably have at least a few more bucks, and I don't use that many Java applications.

C# added nullable reference type annotations a few years ago, and if you're strict about actually using them (my Indian coworkers are not, sadly) then you can reliably eliminate 99% of null reference exceptions. When C# finally gets around to implementing discriminated unions (in the next decade or three) and we finally have Option<T>, there will be actual null safety in C#.
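For reference, this is roughly what Option<T> buys in Rust terms (the `find_user` function is a made-up example): absence lives in the type, so the compiler forces the missing case to be handled before use.

```rust
// Instead of a nullable reference, absence is encoded in the
// return type; callers cannot touch the value without first
// acknowledging the None case.
fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        _ => None, // no null to forget about
    }
}

fn main() {
    // You cannot use the result directly; you must unwrap the
    // Option, so the "not found" path can't be silently ignored.
    assert_eq!(find_user(1).unwrap_or("unknown"), "alice");
    assert_eq!(find_user(2).unwrap_or("unknown"), "unknown");
}
```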
Java does have a trust-me-bro mode. See https://developer.android.com/reference/sun/misc/Unsafe
That's Android documentation, but regular Java has the same facility.
Granted, Unsafe is being deprecated, but we'll have equivalently powerful FFI stuff.
C++ now has smart pointers, and once you get the hang of them, you don't want to go back to the old way of managing memory manually. It's not about the language being intrinsically "safe" or "unsafe", but rather that it enables you to automate memory management so you don't have to think about it unless you absolutely need to. You can just have a small "just trust me bro" section instead of carrying the cognitive load of having to double-check the entire codebase.
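The same automation exists in Rust terms, for what it's worth: Rc is roughly the analogue of C++'s shared_ptr, counting owners and freeing the allocation when the last one goes away. A small sketch:

```rust
use std::rc::Rc;

fn main() {
    let shared = Rc::new(vec![1, 2, 3]);
    let alias = Rc::clone(&shared);       // second owner, no deep copy
    assert_eq!(Rc::strong_count(&shared), 2);
    drop(alias);                          // count falls back to 1
    assert_eq!(Rc::strong_count(&shared), 1);
    // The allocation is freed automatically when `shared` drops.
}
```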
I spent the last 2 years of my career fighting with Rust, and I detest the language. Unfortunately, I'm aware that I'm in the minority, and a lot of people buy into the hype that it's the wave of the future. In my opinion, it's a language that was a bad idea from the start.
Most programs are not the Linux kernel and do not need 100% safety guarantees. It's honestly fine if a game segfaults once in a while due to C++'s scaaaary "undefined behavior" (which is a crash 99.99% of the time).
And Rust kinda fails at the safety guarantees in practice, anyway. People who aren't experts bash their heads against the language, trying to do simple tasks that in any other language would be done in 15 minutes, and then they finally give up and do one of two things: wrap things in an unnecessary "unsafe" block, or copy a large object that didn't need copying. It turns mediocre programmers into bad programmers. (This isn't rhetoric; in our very real project, I saw how the "unsafe" blocks kept adding up, because we had to actually deliver a product instead of spending our time justifying ourselves to the borrow checker.)
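The "copy a large object that didn't need copying" pattern looks something like this in miniature (a contrived sketch, not code from the project described above): the borrow the programmer actually wanted won't compile, so they clone instead of restructuring.

```rust
fn main() {
    let mut names = vec![String::from("a"), String::from("b")];

    // What the programmer wanted:
    // let first = &names[0];          // immutable borrow...
    // names.push(String::from("c"));  // ...error: cannot mutate while borrowed

    // What the deadline produces: an unnecessary copy that
    // silences the borrow checker.
    let first = names[0].clone();
    names.push(String::from("c"));
    assert_eq!(first, "a");
    assert_eq!(names.len(), 3);
}
```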
The cost of C++ is all the crashes that we very visibly see everywhere. The cost of garbage-collected languages is sluggishness and, often, software literally just pausing for multiple seconds to free up memory. (When you're playing a game and you see a multi-second pause in the middle of gameplay, that's why.) The cost of Rust is in code that just doesn't get written, because it's 3-5x harder to get anything practical done. And unlike the first two, this is a subtle, invisible cost that is easily overlooked.
The designers of Rust really wanted to make a programming language with formally-verified correctness; there are other programming languages that go all the way on this, but they're all impossible to use. In fairness, Rust is merely difficult to use rather than impossible, but that's because the borrow checker is trying to do most of the proof for you. Unfortunately, the borrow checker isn't very good at it, and fails in many common actually-safe scenarios, and they didn't include any way for you to, well, help it along. (Ok, they do have lifetime specifiers, but those are insanely ugly constructs that provide no actual value to the programmer. They're only there for you to assist the borrow checker - often in ways it should be smart enough to do itself.) Now, Rust does keep improving the borrow checker, so fortunately this improves each year, but I doubt the problem will ever really go away.
The thing that really, really sticks in my craw is that the language is built around a horrible "mutable" keyword that is both ugly and a lie. C++ has "const", which is fantastic and they should have just copied it and been done with it. I actually think their choice not to do that is political, so they could pretend they were innovating over C++. But now, lots of objects have to have "internal mutability" to work, which means that there's no actual way to get, say, a reference to a mutex with the guarantee that you will not modify the object it points to. And you also can't have "const" member variables, so for instance, you can't initialize a temp directory as an object's member and add an in-language guarantee that it won't change until the object is destroyed.
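The interior-mutability complaint can be shown in a few lines: even through a shared (&) reference, a Mutex permits mutation, so a shared reference is not the read-only guarantee that C++'s const can express. A minimal sketch:

```rust
use std::sync::Mutex;

// Mutation through a *shared* reference: the signature gives the
// caller no way to hand out a Mutex that cannot be modified.
fn bump(counter: &Mutex<i32>) {
    *counter.lock().unwrap() += 1;
}

fn main() {
    let c = Mutex::new(0);
    bump(&c);
    bump(&c);
    assert_eq!(*c.lock().unwrap(), 2);
}
```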
One thing that Rust does really, really, amazingly well is type inference. And it's nice in a lot of situations, but like the "auto" keyword in C++, I think people abuse it far too much. Types should be explicitly written out in code! They're a very important part of the logic! Libraries should be encouraged to be written so that types are easily readable (not the 100-character-long nested monstrosities that both C++ and Rust are guilty of). And Rust uses its type inference to hide just how ugly its design actually is, so you don't have to stare at crap like MutexGuard<Option<Arc<Map<int, Result<Option<String>, MyError>>>>> all the time.

But oh well. I'm aware that some of this is just my biased old-man get-off-my-lawn outlook. (Also, I'm not a Rust master, so I'm sure that some of what I'm saying above is flat-out incorrect.) But I don't think I'm wrong that Rust has a lot of downsides and is not the right tool for many (or even most) purposes. Ironically, one saving grace is that it's getting popular at the same time as LLMs, which are actually pretty good at navigating its human-unfriendly intricacies. Without ChatGPT I would have been a lot less productive.
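The two styles side by side, as a neutral sketch of what's being argued about: the first binding's type never appears in the source, while the second spells it out the way the author would prefer for anything non-trivial.

```rust
use std::collections::HashMap;

fn main() {
    // Fully inferred: the reader has to reconstruct the type.
    let mut scores = HashMap::new();
    scores.insert("alice", vec![1, 2, 3]);

    // Explicitly annotated: the type is documentation.
    let scores2: HashMap<&str, Vec<i32>> = scores.clone();
    assert_eq!(scores2["alice"].len(), 3);
}
```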
Sometimes types shouldn't be explicitly written out in code because they're a very important part of the logic. If I write generic (templated) code that returns the heat capacity of a gas mixture at a given temperature, sometimes I just want that temperature to be a double so I can get a quick answer for a point's heat capacity, and other times I want it to be a Vector<DualNumber<double, SparseVector<int, double>>> so I can get SIMD or GPU code that gives me a hundred points' heat capacities as well as their derivatives with respect to the input data that was used to calculate temperature. There's basically no way I'm writing intermediate data types for such a calculation as anything but auto.

When designing even simpler library methods I'm also sadly kind of a fan of users writing auto out of laziness, too. If I ever accidentally expose too much of my internal data structures, use too small of a data type, etc. and have to change the API later, often I can change it in such a way that lazy auto users are still fully compatible with the upgraded version, but users who explicitly wrote foo::iterator can't compile after my switch to bar, and users who explicitly wrote int are now slicing my beautiful new size_t and are going to be unhappy years later when they run a problem big enough to overflow 2^31.

You make some good points, and I don't hate all uses of auto (I'll usually use it for iterators, too). Though making it so your code silently compiles when a library API changes isn't something I really approve of; it's not hard to see how that can go bad. Note that one of the things Rust does way, way better than C++ is package management: you don't have to update a library until you want to, and you can handle any compile errors at that time. As for your last scenario, another advantage Rust has over C++ is that it won't silently allow it to compile, since Rust has almost no implicit type conversions. I approve: numbers changing types lossily is something that should be done explicitly.
Good lord, this, and it's become endemic in C# and Java, too, where it makes some of the absolute least sense.
I agree completely, I get incredibly frustrated every time I see this.
Most of the time I see "var" in C#, it's because the dev didn't want to have to track down the exact details of the type they're making a variable of because they understand the general shape of it but the wrappers and type details are annoying.
But the details of that type are important, and if they can't trivially figure it out when they're first writing the code, the other dev who comes in 2 years later to fix a bug with the code is going to have a way harder time.
In theory it's only supposed to be used for trivially-inferrable types like an int, but I very rarely see that because... if it's obvious it's an int, it takes exactly as much time to type "int" as to type "var".
Is your job all Rust, or does it have other redeeming features you stick around for?
If the former, what are your plans?
If none of the above, are you a masochist?
Well, there's the paycheck of course! And I did like my coworkers, even if they were a lot more gung-ho about Rust than I was. But I didn't actually stick around - as of October, I've moved back to Canada (from Silicon Valley) and am giving retirement a try...
So, I woodwork, and to keep the fear of merciless spinning metal teeth in me, I frequently watch woodworking safety videos. It's humbling listening to these woodworkers with 30 years experience talk about the time they accidentally fed their thumb through the tablesaw, or the heel of their hand through the jointer, or the tip of their pinky through the router.
99% of the stories start the same way: "I just had to make one more cut." They were in a rush, they were tired, they'd ticked off nearly every item on their todo list for the long day, and suddenly they realize they need to rip one more piece. There is a safe way to do it, but they are so tired, and so exhausted, and they've done this so many times, that they do it the unsafe way, figuring it won't be a problem.
It turns out to be a big fucking problem.
Now woodworking is a physical as well as a mental task. There will be physical signs to your exhaustion and wavering judgement. If it's a hobby, you can just decide it's time to hang it up that day. If it's not, I guess that's why people lose thumbs.
A lot of these bugs in Rust code that keep going viral are stupid. But they also stink of a programmer utterly and thoroughly tapped out fighting the borrow checker. So mentally exhausted at the endless walls put up between the simple task they have to do that they've been working through one by one, that at the end of 8 hours their judgement is so impaired they decide "Fuck it, it's just one line of code, it doesn't need to be safe."
I do wonder if we'll start to see more and more problems with Rust code. Not problems caused by negligence, but problems caused by sheer exhaustion. Negligence can be fixed. Exhaustion I'm less certain about.
This isn't the way safety people think. They think the problem is not that people are tired of fighting their burdensome safety measures and so bypass them; they think the problem is that it's possible to bypass their safety measures, and so see this as reason to put in more controls.
That's not my objection to Rust -- Rust was created by and is controlled by my Culture War enemies, who inject their beliefs into their actions a lot more than even Richard Stallman ever did. But it is an objection to Rust.
Sounds almost like you're advocating SafeStart: Coding Edition.
I’m not sure this is culture war, beyond the degree to which Emacs vs. vim is culture war. That said, Rust has never really appealed to me. It strikes me more as a B&D C++ alternative than a C alternative, and I was somewhat surprised when Linus decided to allow it. I think that there is room for a systems language with the spirit of C but fewer undefined behaviors and better ergonomics around things like array bounds and bit bashing, but I don’t think any of C’s would-be successors has quite found the niche yet.
More on topic, I think that @FistfullOfCrows’ observation about Rust’s leadership is apropos. I usually take a code of conduct in open source projects as a statement that this is a self-consciously progressive space and that even relatively tactful (for programmers) dissent is unacceptable. Consequently I assume that I am not wanted there; I may still use the software depending on what it does, but I am not likely to provide bug reports, patches, or donations.
How seriously I take that depends on context. In Python I think that Guido, while a progressive sort of guy, is a restraining force; but since he has resigned the role of benevolent dictator, things have gotten messier. The FreeBSD CoC generated enough backlash and reassurances that I still take its implications with a grain of salt. SQLite’s code of ethics, by contrast, countersignals the code of conduct trend quite strongly, and it even managed to get lots of positive comments on HackerNews doing so.
For Rust, though? It’s not lost on me that Rust used to be a Mozilla project, and everything I see suggests that the culture that pushed out Brendan Eich lives on there. They’re not hiding their power level.
Yeah unfortunately this is very much the case. I like the Rust programming language, but the Rust community is incredibly toxic. One of the worst communities online imo.
If you are really into this sort of thing, you should consider Ada/SPARK: Rust is cavalier enough to let the programmer engage in potential integer overflows (in default production mode) and doesn't support specifying custom valid ranges (type my_integer is range -3 to 11).

In actuality, I like what they're aiming for, but I expect most of the benefit I'll personally see will be from upping the safety game of C++ (and C, to a lesser extent) via language extensions, automated tooling, and general best practices. I reflexively write tests, use C++11 pointer types, check pointers for nullptr before dereferencing, and use the .at() bounds-checked methods for container elements unless performance is impacted. That said, I do occasionally cause segfaults, still.
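On the overflow point: Rust does at least offer explicit opt-ins when you don't want the default wrap-in-release behavior. A small sketch of the choices:

```rust
fn main() {
    let x: u8 = 250;
    // `x + 10` would panic in debug builds and wrap silently in
    // release builds. The explicit methods pin down the semantics:
    assert_eq!(x.checked_add(10), None);       // reports the overflow
    assert_eq!(x.wrapping_add(10), 4);         // wraps on purpose
    assert_eq!(x.saturating_add(10), u8::MAX); // clamps at 255
}
```

None of this gives you Ada-style ranged types, of course; the valid range of a value still lives only in the programmer's head.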
Cynical take about the open source programming languages world
The best systems engineers are trans and mentally unwell. Appearing progressive is how you keep them productive instead of spiralling. The 2nd best systems engineers are virgin gooners. Appearing progressive gives them a chance to be around women. The 3rd best systems engineers are m-lady neck beards. Appearing progressive is how they simp.
Everyone else who's good enough to be developing the rust lang is getting paid millions at a quant firm or millions at an llm frontier lab.
Surface level progressivism is win-win stable state for open source PL.
I think being mentally off is the cause of both the systems engineering skill and the trans.
Objection: Fabrice Bellard is quite obviously not trans and nothing suggests he's mentally unwell either. The guy makes John Carmack look like a noob.
It is by no means the path of least resistance to unwrap errors. It's just as easy to write if let Ok(foo) = bar and handle the error in a non-panic way. The simple fact of the matter here is that the Cloudflare programmers went out of their way to crash if the program got to an invalid state. What is the language supposed to do to prevent that, not allow unwrap? That would hardly be an acceptable solution, as unwrap is genuinely useful in some circumstances. People are quick to pick on Rust for this, but I don't think there's anything Rust, or for that matter any programming language, could have done to prevent that outage.

This is a terrible argument. First of all, surrounding things in unsafe blocks doesn't do a damn thing to get rid of errors with the borrow checker. The borrow checker still applies inside unsafe blocks! And the CVE in Linux wasn't caused by "just surrounding everything in unsafe", but by a logic error inside the unsafe blocks they needed to use for their purpose. Again, what is Rust supposed to do? Not allow unsafe? It would be useless for its target audience then.
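The two styles side by side, as a sketch (the `parse_port` function and its default are invented for illustration): unwrap chooses to crash on Err, while the if-let form handles it without panicking.

```rust
fn parse_port(s: &str) -> u16 {
    // s.parse::<u16>().unwrap()  // would panic on bad input: a choice

    // Handling the error instead of crashing:
    if let Ok(port) = s.parse::<u16>() {
        port
    } else {
        8080 // fall back to a default
    }
}

fn main() {
    assert_eq!(parse_port("443"), 443);
    assert_eq!(parse_port("not-a-port"), 8080);
}
```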
Lots of Rust people will admit the language has problems. I'm one of them; I think that the language has plenty of flaws. But what they aren't going to do is accept bad arguments that ask impossible tasks that no language can do, or that would render Rust unfit for its domain. Nor should they.
Which is? You were supposed to get a list of features, it's always supposed to be there, what do you do? The person writing that code is thinking corrupted file or broken hard drive; panic is the correct thing to do. People are pointing fingers at that piece of code, but to be honest I would point fingers upstream, at either the optimization that introduced the fixed-size memory area or the pipeline producing the features file, which didn't check that the size of the file was under the limit.
Speaking of the optimization, the right thing to do there would have been to reallocate the struct to a bigger size if the preallocated one is insufficient. But I bet that would have introduced a bunch of ownership problems, hence we can circle back to this being a rust problem.
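The suggested fix in miniature (a simplified sketch, not the actual Cloudflare code): a Vec starts at the preallocated capacity but reallocates transparently rather than failing when the input exceeds the expected limit.

```rust
fn main() {
    let mut features: Vec<u32> = Vec::with_capacity(16); // preallocated
    for i in 0..64 {
        features.push(i); // grows past 16 by reallocating, no panic
    }
    assert_eq!(features.len(), 64);
    assert!(features.capacity() >= 64);
}
```

Whether that reallocation would in fact have raised the ownership problems the commenter suspects depends on how the struct was shared, which the sketch doesn't capture.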
This is a problem caused by updating a linked list, a data structure that rust has historically struggled to implement without using unsafe.
I'm not saying "panic is the wrong thing to do" in the situation they had. It might be the right thing to do! I'm just saying that blaming the language for the programmer choosing to panic is a bad line of argument.
An engineering organization doesn't have to accept the default compiler behavior for a language. They can use linters and other tools to restrict (or expand!) the kind of code that's acceptable. And they can have a culture that values thoroughness or that values moving fast and breaking things.
I think the best argument for something like Rust is that it makes it easier to guarantee quality where it matters to the organization. If quality doesn't matter to the organization, whether explicitly through tooling and coding standards or implicitly through seeing what gets people promoted or fired, then people will circumvent safeguards whatever language it is.
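One concrete form of "restricting the kind of code that's acceptable" in a Rust shop would be a crate-level Clippy lint (the lint name is real; the snippet is an assumed example of such a policy, not anything from the posts above):

```rust
// Turns every `unwrap()` into a lint error under `cargo clippy`,
// forcing the panic-on-Err choice to be an explicit `expect` with
// a written justification.
#![deny(clippy::unwrap_used)]

fn main() {
    let n: Result<i32, std::num::ParseIntError> = "7".parse();
    // n.unwrap();  // would now fail the clippy check
    let n = n.expect("hard-coded literal always parses");
    assert_eq!(n, 7);
}
```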
There lies the rub, though: the way Rust is being introduced defeats the best (and possibly only) argument for it. If they wanted to move fast and break things, they could just stick to C. Hell, in some of these cases the C code is even the thorough option, as it's already been in use for many years, so it's well tested. Rewriting it in a completely new language, marketed entirely on memory safety, only to disable the safety features throughout the codebase is supposed to achieve what, exactly?
As a former haskell dev this reminds me why pure functional languages are uncommon in production. Pure functional languages are amazing 90% of the time but are a disaster 10% of the time. Since the 10% can derail a project people don't want to use them. The solution has been integrating functional features into multiparadigm languages so that devs can write 90% functional style code and then use imperative code where functional code just doesn't work well.
Rust's memory safety is great 90% of the time and becomes a blocker 10% of the time. A combination of using rust's memory features and unsafe operations allows for high flexibility and relatively high memory safety.
I don't know if the split is going to be 90/10 when you're messing around with the kernel. Also, when you're rewriting old code from scratch, the risk of introducing new bugs is pretty high. When you want to replace something that's been in production for years, if not decades, you'll need a better argument than "it's perfectly safe 90% of the time".
You haven't mentioned that to do anything even slightly kernel-y or touching hardware, you would need to have unsafes sprinkled throughout your code. I find Rust to be churn for churn's sake, additionally infused with woke shit all throughout its governance body.
I'm with you there. I didn't mention any of those issues in my post for the sake of time, but the behaviour of activists in the Rust community (granted, like all activists they are a vocal minority) is very off-putting.
Are they, though? It seems like Rust has more of 'em than many other languages and they're at every level from solo projects to people involved with the foundation.