
Gillitrut

Reading from the golden book under bright red stars

1 follower   follows 0 users  
joined 2022 September 06 14:49:23 UTC


User ID: 863


I'm seeing some reports (ex: here) that files in the initial dump that contain unredacted pictures of Trump with Epstein are being removed. Just incredible. The picture itself (as best I can tell) is not even particularly incriminating. It looks to me like it's a copy of this picture of Trump, Melania, Epstein, and Maxwell posing together, under some other photos in a desk drawer. Perhaps the image was taken down for unrelated reasons but I'm seeing a lot of speculation that it's because Trump was in the picture, in whatever capacity.

It's a little interesting to contrast this with my perception of Chen's attitude. He was clearly dedicated to making sure software that used to work would continue to work for users. It is basically never the user's fault that the program they bought did things wrong. On the other hand, he has palpable contempt for the developers of user-mode software who took a dependency on some undefined or non-contractual behavior and created these compatibility constraints. Ex: Application compatibility layers are there for the customer, not for the program

Some time ago, a customer asked this curious question (paraphrased, as always):

Hi, we have a program that was originally designed for Windows XP and Windows Server 2003, but we found that it runs into difficulties on Windows Vista. We’ve found that if we set the program into Windows XP compatibility mode, then the program runs fine on Windows Vista. What changes do we need to make to our installer so that when the user runs it on Windows Vista, it automatically runs in Windows XP compatibility mode?

Don’t touch that knob; the knob is there for the customer, not for the program. And it’s there to clean up after your mistakes, not to let you hide behind them.

It’s like saying, “I normally toss my garbage on the sidewalk in front of the pet store, and every morning, when they open up, somebody sweeps up the garbage and tosses it into the trash. But the pet store isn’t open on Sundays, so on Sundays, the garbage just sits there. How can I get the pet store to open on Sundays, too?”

The correct thing to do is to figure out what your program is doing wrong and fix it. You can use the Application Compatibility Toolkit to see all of the fixes that go into the Windows XP compatibility layer, then apply them one at a time until you find the one that gets your program running again. For example, if you find that your program runs fine once you apply the VersionLie shim, then go and fix your program’s operating system version checks.

But don’t keep throwing garbage on the street.
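Chen's VersionLie example usually boils down to a version check written as an equality test. A minimal sketch of the broken pattern and the fix (illustrative function names and plain integers, not actual Windows API calls):

```rust
// Hypothetical version gate: the program was tested on major version 5
// (the XP era) and the author wrote an equality check instead of a
// minimum-version check.
fn broken_supports(major: u32) -> bool {
    major == 5 // breaks the moment the OS reports 6 (Vista)
}

// The fix: check for "at least the version I need", not "exactly the
// version I tested on".
fn fixed_supports(major: u32) -> bool {
    major >= 5
}

fn main() {
    // On a newer OS (major version 6) the broken check refuses to run,
    // which is exactly the mistake a VersionLie-style shim papers over.
    assert!(!broken_supports(6));
    assert!(fixed_supports(6));
    println!("version checks ok");
}
```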

My hot take is that too many programmers use garbage collection as a crutch. GCs free you from some very specific work having to do with allocating and freeing memory, but they are not a "get out of jail free" card that excuses you from ever thinking about memory management or object lifetime again. I can think of a lot of examples from my own work in C# where people wrote inefficient code in hot paths without worrying about it because they let the garbage collector clean up after them.
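The hot-path concern isn't specific to GC'd languages; whether a collector or an allocator cleans up after you, a fresh heap allocation per iteration is work you could avoid. A minimal sketch (illustrative functions, in Rust since that's the thread's other language) of allocating per iteration versus reusing one buffer:

```rust
// Allocation churn in a hot path: a fresh Vec per iteration means a
// fresh heap allocation per iteration.
fn churny(lines: &[&str]) -> usize {
    let mut total = 0;
    for line in lines {
        let words: Vec<&str> = line.split(' ').collect(); // allocates each time
        total += words.len();
    }
    total
}

// Reusing one buffer keeps the hot loop allocation-free once the
// buffer has grown to its working size.
fn reusing(lines: &[&str]) -> usize {
    let mut words: Vec<&str> = Vec::new();
    let mut total = 0;
    for line in lines {
        words.clear(); // drops the contents, keeps the capacity
        words.extend(line.split(' '));
        total += words.len();
    }
    total
}

fn main() {
    let input = ["a b c", "d e"];
    assert_eq!(churny(&input), 5);
    assert_eq!(reusing(&input), 5);
    println!("both count 5 words");
}
```

The same shape applies in C#: a `List<T>` cleared and reused beats a `new List<T>()` per iteration, GC or not.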

This is getting off topic, but I thoroughly enjoy reading Raymond Chen's blog The Old New Thing for its many stories of Windows bugs, implementation details, and programmer misuses that later became compatibility constraints. When you upgrade your operating system and your Favorite Program stops working, people rarely blame their Favorite Program, even if it is the thing that was doing something unsupported!

I don't know. I find this a topic it's pretty easy to be nuanced about. Different languages attempt to provide different guarantees to the programmer during their operation. To provide those guarantees they have to be able to understand the code and prove the code satisfies them. Most such languages provide ways to disable checking those guarantees for particular code sections, on the assumption that you, the programmer, have information the compiler lacks and things will work without the compiler having to check. If you, the programmer, tell the compiler you know better and then turn out to be wrong, I think it's fine to blame the programmer.

I think everyone has in mind a different idea about the extent to which buggy code should be caught by the compiler, and these ideas inform which side of the blame-the-programmer/blame-the-compiler distinction you fall on.

As an example: in college a friend and I had to write some networking libraries in C. At the time we didn't use any fancy editors or anything, just good old gedit and gcc. My friend was writing a function that was supposed to perform an arithmetic operation and return the output, but every time he ran it he got a different (implausible) result, even with the same inputs. What had happened was that he accidentally omitted the return statement from his function, so he was getting back some random garbage from memory on every run. Should the C compiler let you declare a function that returns a value and then let you omit the return statement? Is that mistake your fault or the language's fault? Formally, doing this is undefined behavior, but undefined behavior does not always mean a crash!
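For contrast, the equivalent mistake won't compile in Rust: a function whose signature promises a value is rejected unless every path produces one. A minimal sketch (illustrative function; the commented-out version is what the compiler refuses):

```rust
// In C, omitting the return in a value-returning function typically
// compiles (with a warning at best), and using the result is undefined
// behavior. The Rust equivalent is rejected outright:
//
//     fn add(a: i32, b: i32) -> i32 {
//         let _sum = a + b;
//     } // error[E0308]: mismatched types: expected `i32`, found `()`
//
// The version that compiles must actually yield the value:
fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    assert_eq!(add(2, 3), 5);
    println!("add(2, 3) = {}", add(2, 3));
}
```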

Well that's technically not true, they did. It's just that calling .unwrap(), a function which will immediately abort the application on error, counts as "handling" the error. In other words, the path of least resistance is not to actually handle the error, but to crash. I argue that this isn't a better outcome than what would have happened in C, which would also be to crash. Sure, the crash won't be a segfault in Rust, but that doesn't matter if half the Internet dies.

In this case I find the behavior of Option<T>.unwrap() unintuitive, but I am also coming from the perspective of exception-based error handling. As an analogy, C#'s Nullable<T>.Value will throw an exception if the nullable is actually null. That option obviously isn't available in a no-exception world. Maybe the default behavior should be more like the Try trait (which powers the ? operator), such that it returns the error instead of panicking? Then let the programmer panic if the value is an error, although that introduces another layer of error checking!
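To make the alternatives concrete, a minimal Rust sketch (illustrative functions) of the panicking shortcut, explicit handling, and ?-style propagation:

```rust
fn find_even(values: &[i32]) -> Option<i32> {
    values.iter().copied().find(|v| v % 2 == 0)
}

// Propagating with `?`: absence flows to the caller instead of
// panicking, which is roughly the "return the error instead of
// panic" default discussed above.
fn double_first_even(values: &[i32]) -> Option<i32> {
    let even = find_even(values)?; // early-returns None if nothing matched
    Some(even * 2)
}

fn main() {
    let values = [1, 3, 4];

    // The path of least resistance: panics if no even number exists.
    let shortcut = find_even(&values).unwrap();
    assert_eq!(shortcut, 4);

    // Explicit handling: the caller decides what absence means.
    let handled = match find_even(&[1, 3]) {
        Some(v) => v,
        None => 0, // a chosen fallback, not a crash
    };
    assert_eq!(handled, 0);

    assert_eq!(double_first_even(&values), Some(8));
    assert_eq!(double_first_even(&[1, 3]), None);
    println!("ok");
}
```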

This month, a CVE was filed in the Rust part of the Linux kernel, and it turned out to be a memory corruption vulnerability, ironically enough. "But how could this happen?" Rust has these things called unsafe blocks that let you do unsafe memory operations, closer to what you would be allowed to do in C (though granted, I have heard convincing arguments that unsafe Rust is still generally safer than C). So the path of least resistance is not to do things the safest way, but to just surround everything in unsafe if you get tired of fighting the borrow checker.

I'm a little unsure of the criticism here of Rust as a language. Is it that unsafe exists? Presumably all the code that is not in an unsafe block has guarantees that equivalent C code would not. Is that not a benefit? Is the worst case here that you wrap all your Rust code in unsafe and end up... as good as C?
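It's also worth noting that unsafe narrows the compiler's checks rather than switching them off wholesale (the borrow checker still runs inside an unsafe block), and the idiomatic pattern is a small unsafe core behind a safe signature. A minimal sketch, with an illustrative function:

```rust
// A safe wrapper around one unsafe operation: the unsafe block is the
// only place where the "I know better than the checker" claim is made,
// and the function's signature keeps every caller in safe code.
fn first_or_zero(values: &[u8]) -> u8 {
    if values.is_empty() {
        return 0;
    }
    // SAFETY: the emptiness check above guarantees index 0 is in bounds.
    unsafe { *values.get_unchecked(0) }
}

fn main() {
    assert_eq!(first_or_zero(&[7, 8]), 7);
    assert_eq!(first_or_zero(&[]), 0);
    println!("unsafe stayed contained");
}
```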

To be clear, I'm not saying that these incidents alone mean Rust is a bad choice for anything, ever. I'm not saying Cloudflare or Linux shouldn't use Rust. I'm not telling people what they should or shouldn't use. I'm just pointing out the double standards. Rust people can attack C all day using one set of (IMO, entirely justified) standards, but when they are confronted with these incidents, they suddenly switch to another set of standards. Or to put it more clearly, they have a motte and bailey. Motte: "Rust can't prevent shitty programmers from writing shitty code." Bailey: "C is unsafe, because of all the memory unsafe code people have written, and we should rewrite everything in Rust to fix all of it!"

I think there is a more productive discussion here about how language features and guarantees can help protect against writing buggy code, and can potentially make it easier to review code for bugs. I suppose I think of it by analogy to TypeScript and JavaScript. All JavaScript is valid TypeScript, but TypeScript needs to be compiled to JavaScript. That compilation, in my experience, helps avoid whole classes of errors that stem from the lack of typing in JavaScript. Sure, you can write JavaScript that just doesn't have those errors, and most people do, but TypeScript renders them inexpressible. Similarly so for C and (non-unsafe) Rust.
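A minimal sketch of "renders them inexpressible" on the Rust side (illustrative functions; the commented-out version is what the compiler rejects):

```rust
// A classic C bug that safe Rust makes inexpressible: returning a
// pointer into storage that is freed when the function returns.
// The Rust analogue does not compile:
//
//     fn dangling() -> &'static str {
//         let s = String::from("gone");
//         &s // error[E0515]: cannot return reference to local variable `s`
//     }
//
// The version that compiles transfers ownership instead, so the
// string outlives the function that created it:
fn owned() -> String {
    String::from("still here")
}

fn main() {
    assert_eq!(owned(), "still here");
    println!("{}", owned());
}
```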

I cannot speak for all managers, but this description does not sound like any manager I have worked for. My managers have done key work in prioritizing the work for me and other team members, coordinating work across teams, and translating high-level strategy shifts from executives into concrete terms for people like me. Their role has been very obviously valuable.

Previously, if someone took the time out of their day to physically travel to you and tell you something, you could reasonably expect it to be important.

It is kind of funny to read this in a world where Office Space exists. It's a satire but I am under the impression the phenomenon it satirizes was real. Was it important to put the cover sheets on the TPS reports?

Sure, but what fraction of jobs does that describe? The examples in the Wikipedia article are things like store greeters, lobbyists, academic administrators, and managers. I think it would be pretty hard, depending on the specifics, to characterize those jobs as being like digging a hole and filling it back in!

Perhaps I am too economics-pilled, but at a sufficiently high level your outcomes (2) and (3) seem like the same thing to me. Or, to the extent they aren't, (3) seems to contain a contradiction. On the one hand, there are still going to be unmet human wants and desires. On the other hand, I am supposed to believe there is no scalable use to which human labor could be put in order to satisfy those desires. I am skeptical that both of these facts can obtain.


I guess I'm also skeptical of the concept of "bullshit jobs" more generally. I have not read Graeber's book, but browsing the Wikipedia article for some examples does not give me confidence. For basically all the listed jobs it does not seem difficult to describe how the people doing them provide value for the people paying them. Maybe "bullshit" is supposed to mean something in a broader societal sense, but then you are just saying you value things other than what market participants value. That's fine, but you shouldn't expect the market to produce outcomes as if it valued something else!