Small-Scale Question Sunday for April 21, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Apropos of a conversation I had with my running partner this weekend, how close are we to a “pixel” of smell and taste? I.e., is there any way to break down scent and flavor the same way sight and sound are broken down with pixels and the Fourier transform?

You’re kind of touching on two questions.

The thing about images is that the map is not the territory. Concerns like pixels—resolution—only sneak in to quantify the limits of that map.

A mathematical construct like the Fourier transform doesn’t have that problem. The transform of a pure sine wave is the Platonic ideal of a pair of points. But you can’t make such a pair out of samples. You’re forced to approximate, which gives you a resolution.

So question 1 is “do we have a map to quantify smell?” The answer is yes, but no one can agree which is best. Here’s a more recent study which has a bunch of cool charts showing the perceptual space. There’s also the classic OChem Smells Chart.

Question 2 is how good the resolution is for any of these models. For sound and sight, we’ve done experiments to identify how small of a difference can be recognized. Presumably, something similar has been tried in the smell literature. In theory, you could use one of the Question 1 schema to choose several components of smell. Say “edibility,” “temperature,” and “irritation.” Then test different substances on each axis to estimate resolution. That’d give you a map of possible, distinguishable smells.

I’m going to be lazy and assume the same is true for taste.

I’m also lazy. I agree with everything you said; my argument with him was that the difficulty of simulating taste and smell meant that anything short of direct neurostimulation was likely to be uneconomical.

It appears that humans have between 350 and 400 olfactory receptors, so I suppose once we fully describe them we'll have as good a model for smell as we can get. Taste seems to be a lot simpler, and yet people are still finding new receptors there as well (though having tasted salt licorice I would say that's one better left unstimulated).

I would assume taste is much easier than smell, as there is only a handful of things tastebuds can detect. But then you need to combine that with smell…

I would guess that smell would have to be embedded within a higher-dimensional space than sight or sound? But I'm not certain.

There are languages that have fairly developed abilities to describe smell; it's just that English isn't one of them.

I don't know the answer, but I think the first step would be to try to quantify smell and taste as precisely as sight and sound can be quantified.

Sight and sound are relatively easy to quantify. You can quantify sight as a function that maps (x, y, time) tuples to (r, g, b) color value tuples for example. You can quantify sound as a function that maps (time) tuples to (amplitude) tuples.*

As far as I know, no-one has managed to quantify smell and taste in such a way. However, I could be wrong about that.

*(time) and (amplitude) are tuples with only one item each in them, but I am calling them tuples for the sake of consistency. In mathematical parlance, it's still a tuple even if it has 0 or 1 items.
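
To make that framing concrete, here is a minimal sketch in Python with type hints; the function names and the constant return values are purely illustrative placeholders, not any standard API:

def image_intensity(x: float, y: float, t: float) -> tuple[float, float, float]:
    # sight: a point in space and time maps to an (r, g, b) colour value
    return (0.0, 0.0, 0.0)  # trivially, a black screen

def sound_pressure(t: float) -> float:
    # sound: a point in time maps to a single amplitude
    return 0.0  # trivially, silence

A smell analogue would need some agreed-upon output space to play the role of (r, g, b), and as the replies above note, nobody agrees yet on what that space should be.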

Phone apps are getting really aggressive lately about constantly sending useless notifications. They make it extremely difficult to figure out how to turn them off if they even let you. Threads is especially bad about this. Is there a guide somewhere on how to disable notifications?

The more general problem I've been having lately is that settings menus are now totally unintuitive. I used to be able to find something by just looking through the menu, but now, no matter what I'm looking for, I almost always have to Google it, and half the time the instructions will be wrong because the app developers seem to reorganize their menus at least once a year.

Why do they do this? I get far more utility out of the layout of an app staying the same than I do out of any design changes, usually.

You aren’t the typical user, and A/B testing probably shows the typical user totally does order takeout food when the takeout app sends a push notification.

Android or iPhone?

Haven't had much trouble disabling Android notifications. It's all handled in a central area, and Android routinely asks me if I want to remove permissions for apps I don't use.

Android

Are you getting phone game 🎮 recommendations from Samsung?

No, it's a Pixel 6 Pro.

Hold down or slide on the notification to go to the notification controls for that app. There, you should ideally have an itemized list of notification categories and you can switch them or just disable all of them for that app.

You can look for the app in the notification manager of the android settings too.

I'm confused; notification settings aren't controlled within the app on Android, they're on the app info settings page which is identical for every app and allows you to disable types of notifications or all notifications for an app. Of course, the app might have overly coarse categories or lie about its notification categories, so you might have to disable more notifications than you intended, I guess.

This is not correct. Where are you getting your information?

That is correct. I'm not sure why you are saying it isn't. For at least a couple of Android versions, you can control notifications on a system level. Go into the system settings for the app (where you would go to control the permissions it has, or to force stop it), and one of the options will be for "Notifications". You will get a screen that looks like the attached image, and you can turn notifications off wholesale or by category.

As @token_progressive says, it's not perfect because it's up to the app to accurately categorize the notifications it can send. But this functionality does exist at the OS level.

[attached screenshot of the per-app notification settings]

I'm saying it isn't because you can control notifications within apps on Android. For example, on Instagram, I can click my face on the bottom left, then the three lines at the top left, and then notifications. Every app has something like this.

In Instagram's case, controlling notifications through the OS doesn't work because they're labelled too vaguely. They don't even mention Threads.

It does "work" although it might not be at the granularity you want it to be as SF said.

On my Android phone, I can long-press any app notification, which changes that notification to a menu of options on how to treat that kind of notification, as well as a button to go into the settings to control all notifications for that particular app. Perhaps you have an older version of Android?

I have Android 14, but I didn't know it had this ability.

I agree with your general sentiment, but I think he's right? Source: Settings -> Notifications -> App Settings?

I'm not disputing that you can control notifications from within Android's settings. I'm disputing the claim that you can't control them from within apps.

comment deleted because I posted in wrong thread.

This is not a small scale question...

The small answer is to apply state force to the defectors regardless of any sympathy inducing specifics. Crush them into dust under the massive boot of Leviathan instead of shaming others for complaining about them. Let some people starve to death. Reinstate an earnest belief in hell. The question of why that's not possible is large.

Caleb Hammer interviews people (in a fairly obnoxious and click-baity style) in significant loan and credit card debt, breaks down their finances, and tries to get them on a budget with a varying amount of success. The most common factor among the guests he has on his show is eating out: for most of his guests, almost 33% of their monthly income goes to eating out at various establishments and to other spending that does not significantly increase their quality of life.

Well at least this involves individual choice, not massive government bureaucracy. There are probably people who do in fact spend a lot of money eating out. There seem to be a surprising number of rather pricy restaurants, even in not terribly well off towns, so I suppose people are going there. It's doubtless more entertaining to find and talk to those people than the ones who are in debt because of health problems, or because they want to live in a big city, and are paying 60% of their income in rent. It would be quite the downer to have to tell someone to move to a much cheaper, duller city far away, choose a small apartment near public transportation, sell their car, and get rid of their pet.

So, if one, uh, wanted to get some LSD IVF with polygenic embryo screening, where would you actually go for that? Anyone know a guy?

There's lots of talk about it, but is this something currently available for couples looking to conceive? Assume cost is no impediment.

This article seems to have details about going about it.

In addition to Genomic Prediction/LifeView, https://www.orchidhealth.com has also recently entered the consumer market, afaik. I have little personal info on them, though.

This is the only company that I'm aware of, but I haven't been paying attention recently so there may be newer better options. The tests are still limited, but better than nothing I suppose.

Does anybody like programming?

I have been hired as the sole (and lead) Python developer at a company, but my Python experience is mostly with NumPy. If anybody has some tips, they would be very appreciated!

Always freeze your dependencies to ensure your system is reproducible and won't randomly break when deployed.
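
For example, under a plain pip workflow (tools like Poetry have their own equivalents), pip freeze > requirements.txt records the exact versions you developed against, and pip install -r requirements.txt recreates that environment on the machine you deploy to.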

Consider reading Effective Python 2nd edition for an overview of Python's features, though as other commenters have pointed out, you probably don't want to use every feature of Python at once.

Use @dataclass for simple data.
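
A minimal sketch of what that buys you; the Point class and its fields are just illustrative:

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    label: str = "unnamed"

p = Point(1.0, 2.0)
print(p)                      # Point(x=1.0, y=2.0, label='unnamed')
print(p == Point(1.0, 2.0))   # True; __init__, __repr__ and __eq__ are generated for you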

I'm in a similar position to you: I used to mostly program the ML/data science stack in Python, but nowadays I also write a lot of Ruby on Rails and JavaScript because my workplace's product backend is in Rails.

Well, what kind of Python development are we talking about? If it's web, you'll probably have to learn Flask or Django well enough that you're fighting the business logic and not the syntax of the framework.

If it's not that, and it's just general scripting and automation, just do what needs to be done. GPT can help you a fair bit with this.

Nevertheless, I am surprised you got hired at all in this market. No "lead developer" would be asking this question.

No "lead developer" would be asking this question.

Note "sole" part. In this case "lead developer" is puffery that costs them nothing and worker can put it on CV.

You can practice by working on some github issues on some repo like this random one: https://github.com/themotte/rDrama/issues

https://codecrafters.io/ is a bit pricey but fun. You should be able to burn through the Python "build your own HTTP server" one pretty quickly.

I like to manage all of my language versions with asdf since it's one tool for versioning python/go/javascript/elixir/whatever.

Burning through a bunch of leetcode easy problems is a good way to get comfortable with a new language. Or if you've done a bunch of problems already try to translate those into python.

Partly a response, partly hijacking this to ask a question of my own to everyone else: what are you using as an editor/compiler?

I programmed exclusively in Java for years, but my new boss wanted programs in Python so I've been doing that this past year. Using Eclipse, which is wonderful as an editor, since it lets me organize everything and highlights typos that I make and stuff.

Aside from a whole lot of friction involving different conventions and abilities, I was annoyed that all of the Python editors people recommended seemed way less functional, until I discovered that I can program Python in Eclipse if I do the right stuff. So I've been doing that.

I'm not sure what the general consensus is, because I'm mostly self-taught and program on my own, making mathematical models for research purposes that nobody else has to use or collaborate with, so I've probably got all sorts of weird habits that would make more sophisticated programmers cringe. So I can't tell how much of this is objective and how much is just me being used to Eclipse for so many years and having little experience with anything else. But I tentatively recommend looking into PyDev for Eclipse, because in my opinion it's nice.

Hmm. I’ll have to keep eclipse in mind.

My day job is mostly MATLAB, so on the rare occasion I need to do Python, I use Spyder or one of those similar wrappers. The read/execute/print window is the important bit.

VSCode. Reason: flexibility.

Sublime Text is top tier. VSCode is good but a memory hog, same for the Jetbrains suite. Nano is fine for making quick edits in a terminal. Vim pales in comparison to the aforementioned GUI editors for editing large files (or large amounts of files), and it's too obtuse compared to nano for making quick edits. I don't think it has a good use case.

Oh, there's no general consensus; to non-nerds the original perpetual internet flame war may have been Kirk-vs-Picard, but to nerds it was vi-vs-emacs.

I'm a happy vim user, but I would recommend it if and only if you expect to spend a significant portion of your life editing text; it's great to use but time consuming to learn.

I have a lot of coworkers, including the ones who wouldn't touch Windows with a 10 foot pole, who are big fans of Visual Studio for C/C++ development, but I don't know how well it works for Python.

While I also love Vim, I want to push back on the claim that it's time consuming to learn. I think you can expect to lose ten or twenty hours of productivity to getting used to it, after which it's a net positive (but a continuing learning curve).

Even more practical is to just use a Vim plugin for a worse more normal editor. This is also a good gateway drug to Vim.

Oh, there's no general consensus; to non-nerds the original perpetual internet flame war may have been Kirk-vs-Picard, but to nerds it was vi-vs-emacs.

I mean... you say that like anyone except nerds was participating in Kirk-vs-Picard flame wars during the early days of the Internet (or ever, for that matter).

Once there were enough non-nerds there, it wasn't the early days of the Internet anymore.

Yeah, seconding both prongs here: a) IDEs are important and b) Python IDEs near-universally suck. If you were in the Java sphere before, PyCharm is kinda the IntelliJ-for-Python, for better and worse, and there's a large faction that loves VSCode for eating all of their RAM handling multi-language projects reasonably, but for the love of god don't try to build class-ful Python in IDLE.

((I'll generally advocate PyCharm for new programmers, as annoying as some of the IntelliJisms can be, but if you're already acclimatized to and have set up Eclipse, it's definitely not worth swapping.))

VSCode for eating all of their RAM

I don't know where this myth came from - usually bad extensions are the memory hogs.

This is my VSCode at the moment: 3 GB of RAM, way less than my browsers. And 32 GB of RAM was a baseline dev computer 8 years ago.

Image Commit (KB) Working Set (KB)
Code.exe 229,192 203,380
Code.exe 196,436 181,052
Code.exe 181,540 158,272
Code.exe 146,848 143,044
Code.exe 170,172 146,452
Code.exe 158,840 142,484
Code.exe 116,608 114,484
Code.exe 149,196 117,328
Code.exe 112,392 98,688
Code.exe 90,580 92,056
Code.exe 86,820 97,276
Code.exe 1,423,064 86,692
Code.exe 73,020 76,104
Code.exe 73,356 75,808
Code.exe 56,140 61,656
Code.exe 56,864 59,304
Code.exe 50,748 41,668
Code.exe 39,788 44,236
Code.exe 37,548 43,608
Code.exe 23,656 22,300
Code.exe 22,832 21,832
Code.exe 21,308 20,340
Code.exe 21,208 20,296
Code.exe 20,956 20,192
Code.exe 20,924 22,388
Code.exe 21,160 23,428
Code.exe 17,980 16,276
Code.exe 17,992 16,244
Code.exe 18,004 15,824
Code.exe 15,096 21,176
Code.exe 11,004 9,376

Dude, 3 GB of RAM usage is in no way acceptable. You're saying "I don't know where this myth came from" while providing evidence that it's not a myth at all. VSCode is a memory hog, like all Electron apps.

Since when is 3GB memory hogging?

Since always. Even in the modern day when a system will easily have 16-32 GB of memory, that's 10% (or 20%) of the entire system! It's not remotely acceptable for a single app to take up that much memory.

By comparison, Sublime Text (which is very much in the same ballpark in terms of features) takes up 998 MB including memory shared with other processes. It uses just 210 MB discounting the shared memory!! That's the sort of performance you can get when software is written by people who give a shit, not lazy devs who go "eh Electron is fine, people have lots of RAM these days".

Since always. Even in the modern day when a system will easily have 16-32 GB of memory, that's 10% (or 20%) of the entire system! It's not remotely acceptable for a single app to take up that much memory.

Disagree. RAM exists to be used. There are lots of performance reasons for trading off memory utilization with CPU processing and storage IO, and a complex program which is a primary use case for a PC should make those tradeoffs in favor of more RAM utilization unless operating in a memory-constrained environment.

RAM exists to be used, and the app developer should humbly realize that the user (this is about the user, right?) may have a use for that RAM and therefore optimize the software.

Since always. Even in the modern day when a system will easily have 16-32 GB of memory, that's 10% (or 20%) of the entire system! It's not remotely acceptable for a single app to take up that much memory.

Except VSCode and Brave and DBeaver are roughly 100% of what I do on a machine while developing.

So it follows that the app being developed eats roughly 0% of memory?

Look, if you're content for apps to hog memory because you use them exclusively I can't really stop you. Go nuts. But to me it's not an acceptable level of performance, because I use my computer for many things and I expect it to be able to support them all at once.

Ten years ago a brand-new processor would have been the Haswell- or Broadwell-era, and while you could get machines that could hold 32GB RAM, the H81 chipset only supported up to 16GB, going to 32GB would not have been standard, and it'd probably cost you upwards of 250 USD in RAM alone.

But more centrally, VSCode's linter and intellisense implementation is perfectly fine for mid-sized projects without a boatload of dependencies in certain languages. Get outside of those bounds, and its RAM usage can skyrocket. Python tends to get it hard (as does Java, tbf) because of popular libraries with massive and somewhat circular dependency graphs, but I've seen large C++ projects go absolutely tango uniform, with upwards of 10GB.

Yes, it is usually an extension problem, but given that you'll end up needing to install a few extensions for almost every language you work with just to get them compiling (nevermind debugging!), and that it's often even Microsoft-provided extensions (both vscode-cpptools and vscode-python have bitten me, personally), that doesn't actually help a lot. Yes, you can solve it by finding the extension and disabling it, and sometimes there's even alternative extensions for the same task that do work.

The normal case isn't much worse, and sometimes is better, than alternatives like IntelliJ/PyCharm. But the worst cases are atrocious, and they're not just things hitting some rando on a github issue with some weird outlier use case.

going to 32GB would not have been standard, and it'd probably cost you upwards of 250 USD in RAM alone.

My PC built in 2016 with Skylake (2015) had 64GB of RAM. The one I assembled in 2010 had 32. And with developer salaries being what they are, it was always affordable even in Eastern Europe.

32GB was possible on Sandy Bridge processors (technically 2011), but mid-range Westmere and Nehalem processors only supported 16GB(ish) for most of the consumer market, and even the high-end Bloomfield capped at 24GB. I'm not saying you didn't do it -- I've got a couple Xeon systems from that era floating around that could have -- but it was absolutely not a standard use case.

A more normal midrange system would be closer to 4GB, with 8GB as the splurge. You'd probably end up spending over 400 USD in RAM alone, plus needing to spec up your motherboard to support it (thanks, Intel for the fucky memory controller decision).

32GB was possible on Sandy Bridge processors (technically 2011),

So it probably was Sandy Bridge. It definitely wasn't a Xeon. Too many years. I remember having a Core 2 Duo in 2006 or '07 with 8GB; I remember that the PC I built in 2016 had 64 (which I still haven't changed, since the performance growth in everything but GPUs has been pathetic); and I remember that it replaced a PC with 32, so it probably was early 2011. It's also possible I built one in 2010 with 16 and then one in 2012 with 32.

Anyway, RAM was peanuts compared to the payroll for developers, so it didn't make any sense not to pump their workstations.

Wth, my 2018 pc only had 16 until I recently upgraded to 32. I think you were in the top fraction of a percent of users.

Do you know if there's a way to.... I'm not even sure what the right language is here.... put different classes in different .py files, or at least different tabs, without running into recursive dependency issues.

Like, in Java, I can make a World class that contains a population from the Agent class, and models an epidemic going through them, and the Agents have a bunch of methods internally regarding how they function as they get infected and recover and stuff. And if I pass a copy of the main World to each Agent when it's created, then when they do stuff in their methods they can call back up to the World, usually for counting purposes, they say "hey I got infected, increment the total infection counter" or "hey someone was going to infect me but I'm already infected, increment the redundant infection counter".

As far as I can tell, in Python I can't do that nicely. If the World class imports Agent, then the Agent class can't import World. I can resolve this by defining both classes in the same .py file, but then all my code is arranged 1-dimensionally and I have to scroll through tons of stuff to find what I'm looking for (or use ctrl+F). Whereas in Java each class has its own tab that I can open or close or switch to, so well-behaved ones that I'm not working on don't take up space or get in my way. I'm not sure if this is a Python issue or just an Eclipse issue. Is there a way to split a .py file into multiple tabs so I can organize better?

This sounds less like a Python problem and more like a "you need to learn how to architect projects and write clean maintainable code" problem. You know.. the Engineering part of Software Engineering..

Also, why are you importing Agent or World into each other at all? The World needs to be a Singleton that has-many agents. They should be declared in different files and a third file should manage both of their interactions.

I'd caution that :

  1. Python's support for the singleton pattern is kinda jank, due to lack of first-class support for private constructors or access modifiers (see the sketch below).
  2. While there's a lot of arguments in favor of the singleton pattern with an interaction controller for bigcorp work, in small businesses it can be a temptation with serious tradeoffs. Refactoring (whether to add an intermediate object between World and Agent, or if you end up needing multiple World objects such as for a fictional context) can be nightmarish in Python, even if all the interaction logic is properly contained. And it probably won't be properly contained: marketing and customers can end up demanding bizarre requirements on near-zero notice that can require information from multiple different singletons, and if you end up hiring (or taking interns!) as a small business rather than at the FAANG level, those people (and I was one of them once!) will often break around the interaction controller unless aggressively managed.
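
To illustrate point 1, a minimal sketch of the usual module-level workaround; the module name and fields are made up for the example:

worldRegistry.py:

class _World:
    def __init__(self):
        self.knownAgents = list()
        self.totalInfections = 0

world = _World()  # "the" singleton: everyone is supposed to import this one instance
# elsewhere: from worldRegistry import world
# ...but nothing stops a caller from constructing _World() again, which is the jank in question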

The World needs to be a Singleton

Eppur si muove!

I'm... not very good with Python, but to my understanding, a toy example would be :

main.py:

import agent
import world

agentCount = 20
infectionCount = 25
world = world.World()  # note: this rebinds the name 'world' from the module to the instance just created
print("Starting...")
for i in range(agentCount):
    world.addAgent(agent.Agent(world))

for i in range(infectionCount):
    world.infectRandomAgent()

print("Total Infections :" + str(world.totalInfections))
print("Total Redundant Infections :" + str(world.redundantInfections))
for i in range(agentCount):
    print("Agent #" + str(i) + " Infections:" + str(world.knownAgents[i].countedInfections))

world.py:

import random

class World:
    knownAgents = list()  # caution: a class attribute like this is shared across all World instances; fine here with a single World
    totalInfections = 0
    redundantInfections = 0

    def addAgent(self, newAgent):
        self.knownAgents.append(newAgent)

    def infectRandomAgent(self):
        random.choice(self.knownAgents).incrementInfection()

agent.py:

class Agent:
    wasInfected = False
    countedInfections = 0

    def __init__(self, ownerWorld):
        self.world = ownerWorld

    def incrementInfection(self):
        self.world.totalInfections += 1
        if self.wasInfected:
            self.world.redundantInfections += 1
        self.wasInfected = True
        self.countedInfections += 1

Note that if you're using raw python3.exe or a basic IDE like IDLE, all three files will need to be in the same folder, or you have to treat them like modules. Better IDEs like PyCharm will handle most of this for you, though I'd recommend experimenting before futzing with it a lot.

__init__ is a python builtin capability that's pretty equivalent to Java Constructors. The first argument for any class function will act as a reference to the instance of that class being called for that function, regardless of name -- do be careful getting a convention for that early and often, or it'll drive you up the walls. self is popular in pythonic circles, but I've seen a surprisingly large project that took the convention of this<className>, probably downstream of java or C# devs.

Only your main simulation file really should need to import the files that make up the actual objects. The class objects themselves don't need to know about each other, even if they're calling methods or fields specific to the other class, because that gets looked up during live runtime operations.

(edit: specifically, the class calling the constructor for an instance of an object needs to import that object. You could instead, and it would probably be cleaner, import Agent within world.py and not from within main.py, and do the agent constructor in the form :

    def addAgent(self):
        self.knownAgents.append(agent.Agent(self))

But I've been burned before in Python environments where I ended up with my class imports spread across a hundred places and it being a nightmare to refactor or rename or handle versioning, so my preference for non-giant projects is to centralize imports; for giant Python projects you probably should be breaking things into modules.)

I've been doing it like that, where they're all together and reference each other. It's just that Agent ends up with 15 methods, because some of them are experimental variations on each other or niche things I wanted to do to see what would happen; then I make another class for graphing scatter plots; then I've got a bunch of methods for (make a world, then modify the parameters according to X, then execute Y, then graph the results, then repeat that N times) that would be nice to stick in their own class somewhere; and then I've got a bunch of useful static methods that do stuff like load and save data to CSVs that would be nice to have in their own class for organization purposes. And if I just lay them out linearly (which I mostly have, with a few rare exceptions that definitely have 0 recursive dependencies and that I actually have moved to their own .py files), then I have literally 2000 lines of code to scroll up and down through just to find the right class whenever I want to check what the name of the method I want to call is or something, and then scroll back down to find the spot I'm working on.

There's nothing like the partial class concept from C#, though I agree it would be really nice if there were.

You can kinda fake it by exploiting the heck out of inheritance, in a couple different ways, depending on what level of composition you're aiming to be able to do. If you want selective import of behaviors (and to avoid the diamond inheritance problem, mostly), you can do something like :

agentInfectionLogic.py:

wasInfected = False
countedInfections = 0

def incrementInfection(self):
    self.world.totalInfections += 1
    if self.wasInfected:
        self.world.redundantInfections += 1
    self.wasInfected = True
    self.countedInfections += 1

def infectedCount(self):
    return self.countedInfections

agentFileLogic.py:

def loadInfectionInfo(self):
    temploadInfections = 20
    for x in range(temploadInfections):
        self.incrementInfection()
    # do an actual file load here.

def saveInfectionInfo(self):
    tempfile = self.infectedCount()  # call the method; without parentheses this would just grab the bound method
    # save an actual file here.

agent.py:

class Agent:
    from agentInfectionLogic import infectedCount, incrementInfection, countedInfections, wasInfected
    from agentFileLogic import saveInfectionInfo, loadInfectionInfo

    def __init__(self, ownerWorld):
        self.world = ownerWorld

And then calls like world.knownAgents[0].loadInfectionInfo() or world.infectRandomAgent() would work as normal, and you can even swap between different experimental forms by having from agentInfectionLogic import infectedCount, incrementInfection, countedInfections, wasInfected or from testAgentInfectionLogic import infectedCount, incrementInfection, countedInfections, wasInfected (or even a mix-and-match between the two).

Agent.py has to know about what's going on, but to everywhere else, anything imported into agent.py looks identical to as if it were coded into that file or class. Eventually this turns into a full module, where the __init__.py file holds the glue and then you have better names for your actual logic .pys, but when that makes sense depends a lot on the scale of your project.

Bump. Please, someone answer this. I have the exact same issue, and neither GPT-4 nor Google is helping.

The term you’re looking for is circular dependency. That should hopefully help you on your Google quest.

Read the book "Clean Code" by Robert Martin.

  1. Use type annotations (see the sketch after this list).
  2. Don't get clever. Python is incredibly flexible, and you should use a very small subset of what's available. See e.g. the Google Python style guide.
  3. Have good test coverage.
  4. Have good version control/dependency management.
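
A minimal sketch of points 1 and 3 together, assuming pytest as the test runner; the stats module and mean function are just placeholders:

# stats.py
def mean(values: list[float]) -> float:
    """Return the arithmetic mean of a non-empty list."""
    if not values:
        raise ValueError("mean() of an empty list")
    return sum(values) / len(values)

# test_stats.py -- pytest collects any test_* function in a test_*.py file
import pytest
from stats import mean

def test_mean_basic():
    assert mean([1.0, 2.0, 3.0]) == 2.0

def test_mean_empty_raises():
    with pytest.raises(ValueError):
        mean([])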

I do Python (and could use a job, if you want to get your forum-nepotism on). Python comes with a bunch of footguns, in that you can make the language behave in unexpected ways by, for instance, executing arbitrary code at places like member or index accesses, having completely divergent function behavior depending on argument count and type, or changing the behavior of existing objects (almost) arbitrarily at runtime. The art of Python programming is to use these features, with documentation, when appropriate but no more. These issues probably play out a bit differently depending on team and codebase size.
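
A minimal, contrived sketch of the first and third footguns; the class and names are made up:

class Sneaky:
    def __getattr__(self, name):
        # runs on any attribute access that isn't found the normal way
        print(f"side effect while looking up {name!r}")
        return 42

    def __getitem__(self, key):
        # runs on any s[key] index access
        return key * 2

s = Sneaky()
print(s.anything)   # attribute access executes arbitrary code, then returns 42
print(s[10])        # indexing too; prints 20

# existing objects can also be rewired at runtime ("monkey-patching"):
import math
math.sqrt = lambda x: -1.0   # every later caller of math.sqrt now gets -1.0
print(math.sqrt(9))          # -1.0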

All the usual advice about factoring code into small pieces through narrow interfaces stands in any language.

I have no experience being a professional developer, I just do side projects, but YouTube's a great resource for learning. I find you already need some familiarity with the topic before official documentation is understandable, and a YouTube tutorial shows you every step, including ones that are just implied in written tutorials, e.g. installations.

How did you get the job, let alone a lead position, if, from what you say, your only experience with Python is with Numpy?

In my country (even more so in my local area) there are not a lot of people working in the software sector. During the job interview, I told them I am most familiar with NumPy, but they probably assumed all Python is the same. Not really nice of me, but the job pays very well!

I'm not much of a Python guy in particular (though I think it's fantastic that the same language is useful for both teaching kids and writing cutting-edge software; when I was a kid we had various forms of BASIC, which were used for and useful for neither).

But my most useful tips are language-agnostic:

Write and comment and document (three separate things!) all your code so thoroughly that even a complete stranger doesn't need to ask you questions to understand it all. This doesn't sound so important for a "sole developer" role, but at some point you'll have to extend some of your own code that you haven't looked at in years and you'll be the complete stranger who can't ask your past self questions.

Cover your code with tests. Set something up to automatically run tests before allowing any new merge (I'm assuming you're using a version control system; if not then let's call that tip #0). You will write bugs, but it won't matter so much as long as you're the first person who's hit by them, because then you have a chance to make sure you're the only person who's hit by them.

So, what are you reading?

I'm on Meyer's In Defense of Freedom. It's an effective statement of right-libertarian ideas, and surprisingly critical of Kirkian conservatism. Meyer's defense of freedom and reason is in large part against "New Conservatism's" defining of freedom as the freedom to do one's duty. It's surprising considering that the system associated with his name is "fusionism." I'll have to dust off my Kirk sometime.

Allow me to bring down the intellectual quality with the various sci fi audio book series I've listened to recently. Light spoilers ahead, but I tried not to include anything too major.

The Murderbot Diaries.

The protagonist is a Sentry Bot. Made out of cloned tissue and cybernetics, he's born into servitude as corporate property rented out for security on planetary survey missions. He has recently managed to hack his governor module and has freed himself from control. With his newfound freedom he quietly does his job but spends all of his free time binge watching serials.

A mission goes awry and adventure ensues. 7 books, I listened to them all. Some fun characters. The later books seem a bit padded; there was an arc over the last few books, so the endings of each weren't as satisfying. I probably just needed a break from the series.

Bobiverse

In 2016 a Silicon Valley CEO of a mid-sized company signs up to have his head frozen in case of death. He dies. The cryogenic company promised to use his funds to build him a new body, but that didn't work out so well. In 2133 his mind is brought back online as a digital replicant because his personality is seen as a good match for becoming a von Neumann probe.

4 books, I made it through them all. Some very imaginative world building and explorative sci fi. There's some Reddit tier atheism stuff at the beginning but it quickly moves on to more interesting things.

Expeditionary Force

In 2030 bipedal hamster-like aliens launch a surprise assault on Earth. The protagonist, Joe, helps defend a small town at the initial invasion site. Later, some lizard-like aliens recruit them to launch a counterattack. Joe ends up as part of an occupying force on one of the hamster worlds. Joe is a blue collar grunt, and a bit boring. The early parts drag. He later discovers an ancient alien AI housed in something similar to a talking beer can. Things pick up after that.

I listened to 3 books of 16. It initially had promise but the alien world building was a bit weak. The origin of Skippy, the beer-can AI, is the most interesting plot thread, but apparently there's not much progress on that until book 9. It had its moments but I gave up on it.

Starship's Mage

This one was interesting because I didn't think I'd like it. It's essentially hard sci-fi with magic.

A brutal eugenics program on Mars led to the creation of Magi. They are the key to FTL travel, and through them the galaxy opened up to colonization. Some alien ruins have been discovered, but no sign of living aliens. The Mage King controls FTL and thus all trade and travel between worlds.

I really enjoyed the first few books. I made it through 7 of 14. The sci-fi magic just being magic meant that everything not involving a wizard is easy to understand. There are Newtonian space battles similar to the Expanse.

For me, it peaked at book 4, Alien Arcana. The first 4 books had more investigation and mystery. After that the series shifted more to fleet space battles and interstellar politics with anti-mage separatists. The author doesn't have the grasp he needs on things like large scale military production and what the military advisors would be saying for those plotlines to work well.

The first two get recommended on /r/rational periodically, but I’m sad to say I still haven’t gotten around to them.

Is Expeditionary Force Craig Alanson? That one’s on the list. Eyes are open for a hard copy.

Never heard of Starship’s Mage. It does sound rad.

Is Expeditionary Force Craig Alanson?

Yes, Craig Alanson. The first book is Columbus Day.

Say Nothing: A True Story of Murder and Memory in Northern Ireland, which is a narrative history of the Troubles following a lot of the big and small players in the (mostly Provisional) IRA throughout the duration of the conflict. I know next to nothing about the Troubles but it has thus far been a riveting and accessible introduction.

On an entirely different note, I decided to pick up a light fantasy read, and ended up with Kushiel's Dart, a...racy political intrigue set in alternate history France. The lead character is charming and while it is pretty schlocky in general, the plot and character interactions are a lot of fun. Some interesting worldbuilding at work as well.
I have a feeling this is a book that would have a rather harder time getting published today on account of the contents of the first hundred pages or so alone, but maybe I'm simply naive about the nature of the industry. The ringing endorsement from Robert Jordan on the cover gave me a good chuckle in any case.

I'm currently reading Iron Gold by Pierce Brown. I am not really enjoying it, but people have assured me that the next book gets back to the excellence of the first three, so I'm trying to slog through it.

Can you post an update if it picks up? I couldn't get through it and gave up on the series. It'd be good to know if it's worth the slog.

Sure!

Burning Wheel, a roleplaying game manual. It’s incredibly pretentious. At the same time, though, there’s legitimately a lot of good material there? Notes about common pitfalls from RPGs. Systems which look like commentary on familiar games. I get the impression that this was created after a lot of long forum arguments and table experience.

Whether that actually makes a functional game…I’m not sure. There are lots of play-examples, but I’ve never heard of any random person playing it. The provided setting is an archetypal fantasy world which works fine to contextualize the rules, but leaves me cold. Burning Empires is better on that front.

Then again, I don’t usually play games like these. The theory is more fun than the practice. Which makes experimental, abstract books like this one more appropriate.

I'm still working my way through War and Peace, notating it as I go. It's such a tremendous work.

In between I listened to some graphic novel recommendations and read From Hell on my tablet. Really fun work, and fascinating that it is based on a pseudo-legitimate Ripper conspiracy.

I took a beach trip and grabbed a book my wife had bought and that had been well reviewed, R.F. Kuang's Yellowface. The best thing I can say about it is that it was shorter than I thought it was going to be: it was a 200-page book with extra-large margins and line spacing to stretch it to 300 pages, so that it seems like a real book but is really an overgrown novella. Even in 200 pages, it runs out of ideas midway through. A blank space and a power fantasy where I was told a literary work would be.

I read Chris Jesu Lee's review of Yellowface and thought it sounded like hot garbage.

He was absolutely correct and the hip bookstore employee who recommended it to my wife should get the other half of her hair shaved off in public for this.

Oof, I read through Kuang's Poppy War trilogy and had no desire to read more of her. The first half of the first book is Kung Fu Harry Potter, then it shifts hard into Chinese nationalist fever dream, complete with a few chapters dedicated to the rape of not-Nanking so you don't feel as bad when the main character commits genocide on the not-Japanese. Then she spends two books ping ponging between ruthless sociopathy and helplessness as the plot demands. People called the details of the setting (food, clothing, etc) really well researched, but then the author described the not-Mongols as using huge longbows on horseback and that kinda brought everything into question for me.

The whole book just felt like a thesis length version of "But I have already drawn you as the Soyjak and me as the Chad..."

And I've read Babel, so between us we've got the whole bibliography.

Spoiler: It was bad.

I recently read Записки из подполья/Notes from Underground on a whim and was amazed at how perfectly it describes the POV of an average chud over 150 years later, down to the thought processes. It was actually hard to read at times because the protag is an incorrigible edgelord - which to be fair is easy for me to say because of modern over-exposure to nihilism and contrarian shit - but at the same time his schtick hits pretty close to home sometimes:

  • he's a shut-in who stopped interacting with society, and cannot stop himself from taking petty offenses over minor shit when occasionally forced to interact
  • he's a self-made philosopher and an irredeemable contrarian, opposing some things for nothing but the fuck of it and unironically considering himself oppressed by the laws of reality (e.g 2 + 2 = 4) that prevent him from freely expressing himself
  • he's thoroughly poisoned by the ennui of his existence, at some point admitting that even just being extremely, cripplingly lazy would be better than being inactive out of sheer apathy
  • later sections are dedicated to his encounter with a prostitute, which was very uncomfortable to read (despite having zero lewd details) purely because of how viscerally cringe the underground man's posturing is
  • the last few pages consist of quite literal cope and seethe by the underground man after the girl leaves, featuring gems like "insulting somebody is good actually, it helps them grow" and "at least I pushed boundaries and took things to extremes, you cowards would never dare go even halfway"
  • he admits that he hates the real/"live" life (живая жизнь), was unprepared to handle it when Liza came, and wants nothing more than to return to his "underground"

Good writing really is timeless, I'm not much of a reader but I really should've paid attention in school at least.

It's absolutely brilliant. One of the most uncomfortable reads I've ever had. Resonates so completely through the ages.

The only thing I recall really resonating with me from school was Gore ot uma/Woe from Wit. Naturally, back then I thought Chatsky was a based sigma, whereas now his antics reveal him as a cringelord who can't read the room.

Fish's Clinical Psychopathology, and the Oxford Handbook of Clinical Psychiatry.

The latter, while still quite dry, has informed me that the piccolo gene is implicated in depression, which given what I remember from watching DBZ as a kid, is quite accurate.

The former is indeed about humans; my concerns about how to apply an MSE to a fish are dispelled, though it took a while. I'd be very concerned unless it was a talking bass, or a very particular kind of sushi place, but then again, I don't eat fish.

Just finished Feynman’s autobiography. What a guy! You know a guy is being honest in his autobiography when half the book is filled with memories of blond babes and tits, and the other half is about him solving this or that very difficult physics problem.

Surely You're Joking? Funny story about this book. In high school physics, my teacher would offer students the opportunity for extra credit once a quarter. To get the credit, you had to read a book from her pre-approved list and then have a 30 minute conversation about the book. I, as a bookworm, took advantage of this and read Surely You're Joking. I thought it was awesome as well. In fact, I enjoyed it so much that I read his second autobiography What Do You Care What Other People Think, which, while not as good as his first, was still worth reading.

I didn't learn much physics in that class, but her booklist stuck with me.

I also highly recommend this book. It's hilarious and fascinating.

You should also pretend that the movie doesn't exist.

It was easy to pretend that until you told me that it does :P

Seconded. It really is a hoot. If the guy had been alive today his YouTube channel would be popping.

I got the impression that if he was alive today he would be fired from academia promptly and make a fortune as a quant

Undoubtedly, but that would only make his YT spicier.

Seconded. It really is a hoot. If the guy had been alive today his YouTube channel would be popping.

Not really - he wasn't a self-promoter in that way. SYJ happened because Leighton and Sands made it happen, not because Feynman wanted to do the work of writing a memoir.

I'm currently reading Molecular Biology Of The Cell. It's a big biochemistry text that's over 1,700 pages, a topic which I've long planned to cover in full but have never managed to get the time to do so. I plan to finish it by the end of next month, and have been making notes when I read so as to aid in memorisation of the concepts covered.

In conjunction with this, because deep time is fascinating, I have been reading a large variety of papers on biospheric evolution during the Precambrian while drawing up a timeline of events - there's a long, complex fuse that led up to the explosion of animal life at the beginning of the Phanerozoic, and as far as I can tell it is still poorly understood. There is so much from back then that would've been like nothing the world has seen since (the Snowball Earth(s), the Ediacaran biota, and so on - there is even some evidence showing incipient multicellular life all the way back in the early Proterozoic that went nowhere, a dead branch on the evolutionary tree which featured relatively complex lifeforms large enough to be visible to the naked eye). I've been including links in my notes so I don't lose the original sources; I might put it up on TheMotte at some point once I'm happy with it.

Done with I, Claudius and onto A Thread Across The Ocean.

A couple thoughts on I, Claudius

  • Historical fiction is a very cool concept and I would like to read more of it. It gives the author a nice structure to work with and he can then just make up interesting stories to fill in the unknown. It's fun to read the Wikipedia entries on all the Roman emperors/politicians after finishing the book.
  • In the same vein, I'm currently rewatching The Sopranos, and I kept thinking how much the palace intrigue and murder in I, Claudius reminded me of the show. There's a scene in the The Sopranos where a couple mafia guys are torturing a Jewish man who refuses to submit and he says "900 Jews held their own against 15,000 Roman soldiers...and the Romans, where are they now?" Tony Soprano answers "You're looking at them asshole." Great bit of writing from The Sopranos, and I like the idea that the mafia are the descendants of these debauched and violent Roman emperors.
  • Overall, I thought the book was very good, though it did waver a bit at the end when Caligula became emperor. It felt rushed and not fully fleshed out, especially in comparison to the reigns of Augustus and Tiberius.

I'm about a quarter of the way through A Thread. It's about the construction of the first telegraph line across the Atlantic Ocean in the 1860s. It's a pretty interesting bit of trivia.

I just finished I, Claudius as well. I was somewhat surprised by how little of a part Claudius plays in the grand scheme of the book, but I suppose the whole point is that he is both a distant and an all-too-close observer of the sordid goings-on of the Roman elite. I can see how the Caligula parts might feel rushed, but I got the sense that at that point a lot of the big players that Claudius was closely following were just dead, leaving only Caligula's shenanigans to describe. You might know this already, but apparently there is a sequel, Claudius the God.

I agree. I kept waiting and waiting for Claudius to finally become emperor and realized about 50 pages from the end that the book would likely end right at the point where he did become emperor.

I think my issue with the book is that once Livia died, the intrigue and backstabbing and villainy became far less subtle and interesting. Caligula was indeed a villain, but a far less interesting one than Livia. His villainy was right up in your face while Livia’s villainy was in the shadows.

I have yet to read I, Claudius, but I did see the TV series in Latin class. I recommend Robert Harris’ Cicero trilogy, it’s a phenomenal series of historical novels which cover Rome in the waning days of the Republic. It’s from the perspective of Cicero’s head of IT scribe/slave.

Thanks for the rec, I've got it saved in my cart.

I'm reading the final book in the Hyperion Cantos series.

As a recovering voracious reader with lots of free time, I've found I'm pretty picky. Almost no sci-fi in the past 10 years has captured me at all, so I'm combing back through the few classics I've missed out on.

Overall I'd give Hyperion an 8/10, which to me is a "Definitely read if you like the genre". However I have to vent about its shortcomings:

  • The author clearly loves historical literature and so has pushed it into many core parts of the story. It doesn't fit in a sci-fi setting as neatly as he imagined it.
  • The world building between ~1990 (when he wrote it) and 2732 (When it's set) is unbelievably sparse. There's maybe one or two wars and authors mentioned for those 700 years, no major advances in religion, etc.
  • The plot armor of the cast after the first book is obvious and impenetrable. I'm not looking for Game of Thrones here, but there has to be a middle ground between that and "I can tell after a single-page introduction that this guy is going to live through the whole series and definitely switch to being a good guy."
  • Speaking of good guys, the morality spectrum is pretty black and white here. The big bad is the big bad, and the good guys have very few rough edges. Plenty of Deus Ex Machina you can call a mile away.

All that being said, there's plenty of cool concepts and imagery. Each book in the series is pretty different, which you normally don't get in a series. May dig into the Battletech books next.

I'm rereading The Human Reach series by John Lumpkin. It's a hard sci-fi Tom Clancy-in-space series (the Atomic Rockets guy was illustrator and scientific/technical consultant) and is at the very least interesting and engaging, with no glaringly obvious scientific errors. Ships have heat radiators, and space battleship tactics are cognizant of Newton's laws and the tyranny of the rocket equation instead of trying to do Midway or Trafalgar in space.

One thing I think it could benefit from is introducing an economic logic driving space travel and colonization. The series takes place in a world of continued relatively low fertility and tries to use national pride as the logic behind space colonization, which in turn necessitates trade, creating enough of an infrastructure to set space battles and spy stuff against. But it's hard not to notice that the great powers could easily just... not. Not build antimatter factories or fusion fuel production operations or engage in long range exploration. Indeed, it's lampshaded in the books themselves; two of the belligerents are specifically noted for their populations being too low to fill out colonies and the sheer expense of colonization and maintaining a space navy is readily apparent. Every benefit to the homeland is drawn from things based in Earth orbit and not beyond. It's not implausible that the US and Russia and the like would maintain a single colony for national pride, but maintaining multiple and then going to war to own more of them seems to require an explanation, which is lacking in the book.

Isn't that similar to how European colonial empires were a net economic drain? And yet there was something, not measured by that economic equation, that made them want a "place in the sun".

Putting aside motivations like pride and competitiveness, there might be something similar to what Paul Graham wrote (I think), about allowing serendipity. Holing up and focusing on specializations may not be the best investment strategy. Maybe there's a place for trying a number of things that aren't likely to work, in case one of them takes off. (What would the world be like if the circa-1600 UK had decided that this "colonization" thing was economically inefficient?) Maybe it's like the social-capital version of an index fund, or hybridization? Possibly the increased scale provides more options in case something somewhere goes wrong, much like an insurance policy?

(There's room for a counter-argument here, about guaranteeing exposure to disease, political instability, and other problems of heterogeneity.)

I think there were profitable episodes and individual people who made a lot of money throughout, but for the most part it was a financial drain after the mid-18th century. Even in the Caribbean you had in many cases the classic situation in which profits were privatized but ‘losses’ (paying for defenses like building forts, the various extremely expensive colonial wars, compensating slaveowners) were funded by government borrowing and in most cases taxes on the metropole. While England was much richer than the rest of Europe for almost all of the 19th century, that was probably more to do with the Industrial Revolution than the Empire, and at the height of empire in the early 1920s the UK wasn’t substantially (or at all in some cases) richer than other northwest European countries.

Even where states made a lot of money early on (again, more of an Iberian thing than an Anglo-French one) it was squandered pretty quickly. The Spanish obviously lost it all fighting the Dutch and French. The scale of the public losses are sometimes overstated because a lot of failed Anglo-French investment (eg. the colossal amount of money the British wasted in Mexico, Argentina, Brazil, Chile, Peru and Latin America generally) was private, but that was still a big economic drag. Plus, imperial preference never really worked because the British were worried about another 1776 and so from the early 1880s allowed the colonies to opt out or circumvent a lot of protectionist policy, which meant that the whole system never brought much wealth back to London.

There's another element: discovering wealth does not necessarily make you wealthier. Within 100 years of discovering the Cerro Rico at Potosi which essentially doubled the world's silver supply, the Spanish crown was serially bankrupt.

All the major powers, except possibly the USA, have in-series declining populations who aren’t offered a better life in the colonies (far from it: colonies are shown to be poorer societies which degenerate into shitholes fast without massive subsidies), a major difference from Europe in 1850, and there’s also nothing at all anyone needs in the stars, unlike Victorian Britain, which colonized because it was dependent on trade.

Indeed, except for China and Korea, every major nation with colonies is explicitly said to have declined. There are vague hints at ideological reasoning, but an intelligence-focused account of a space war could surely round out the motivation behind a colonization effort which is explicitly noted to be an extremely expensive and excessive investment in creating elbow room for a declining population. The science qua physics is like a 10 on the Mohs scale; a lack of explanation for major states making highly irrational economic decisions stands out against that backdrop.

Two questions about American colleges:

  1. What are some societal roles universities are uniquely well-suited to fill but just… aren’t, for whatever reason? As someone in the arts, the committed development of new/avant-garde professional work comes to mind.

  2. Based on your moral values, where do you draw the line of how the various strata on a university campus (student, faculty, postgrad, admin, etc) can/should get romantically involved with each other? University dating policies have become vastly more restrictive/protective (based on your value system) in the last decade, especially those between the paying customers and the staff serving them. Is it simply a question of the power dynamic? Age of consent? Moral integrity?

Two important roles that universities successfully fulfilled in the past, still could, but don't:

  • The Liberal Arts College. Elite formation based on a combination of rigorous study of difficult subjects and directed socialisation with other young elites. The original reason why this stopped happening was grade inflation, but to bring it back you also need to fix wokestupid, and to end the rampant dishonesty about young elites imagining themselves as self-made meritocratic strivers. Potential gains: a more cohesive elite that knows important things and has a stronger sense of noblesse oblige.
  • The Research University. The type of curiosity-driven research which is too high-risk for professional (government or corporate) labs without tenure and too remote from practical application for VC-funded startups. Getting this back means fixing publish-or-perish incentives and the PhD overproduction which enables them. Potential gain: the base of pure science that makes spectacular applications low-hanging fruit.

wokestupid

Come on, that’s just lazy.

Anyway, I’d argue that colleges still pursue the latter goal, even for pie-in-the-sky pure science. But I suppose I’m rather biased, seeing as my sister and I both did our Master’s degrees in these kinds of labs. There are two media narratives about university research, and neither “breathless futurism” nor “absurd political sinecures” captures the quiet tide of NSF and corporate money.

I don’t fully understand the incentives. Grad students remain cheaper than full-time employees; employing them on tangential research is a popular way to scout talent. It also feeds into the reputation games of publishing, trendsetting, and attracting new students. Combine all these, and you get institutions which compete to be known for their pure science.

I have no idea what percentage of university research falls under this umbrella. My school probably had fuel for both media narratives somewhere on campus. But it is a lot closer to the ideal of a Research University than you might expect from a random state school.

I agree with you that there are plenty of people doing good research in hard science departments - in my foolish youth I wanted to join them* and I still have both the PhD and the physical and emotional scars of getting it. But even in the noughties, most of the good university scientists I worked with were complaining that the incentives were increasingly borked and were driving them towards running their research groups like Fordist paper-factories. There is a lot of useful work that can be done in Fordist paper-factories (the research group next to mine were generating multiple drug leads a year using sweated grad student and postdoc labour), but it is the comparative advantage of government and commercial labs, not universities.

The story I was told by my mentors was that in some unspecified pre-lapsarian golden age the academic career structure had given all scientists the level of academic freedom that (for example) Watson and Crick used to discover the structure of DNA even though Bragg would have preferred Crick to work on haemoglobin, but that this was no longer the case and the only way to get that level of research flexibility was to join one of a small number of special institutions like the Cambridge Laboratory of Molecular Biology (the famous LMB, aka the "Nobel Prize factory"). Based on what people are saying online, things have got significantly worse since then.

* Solid state physics - the area I worked in (although not the specific problem I was working on) was widely considered cool-but-useless at the time, but is now being used by multiple commercial fusion startups.

Universities should be vastly reduced in size and should mainly consist of men. Any loan system should be abolished, but tuition should be set at a level a regular middle-class family can pay for their children, or, without family support, one should be able to afford it with a part-time job plus a full-time summer job. The bureaucracy for choosing research subjects and getting funding should be almost entirely abolished.

Until these things happen, universities anywhere will simply be nothing more than rent-seekers selling a piece of paper that promises but rarely delivers upper-middle-class status, adult daycare, enforcers of bureaucratic power over elite minds, and speed bumps on actual science and technology development.

"vastly reduced in size" is antagonistic with preventing rent seeking

What's with the men-only part? I am sincerely asking for clarification because I am not sure of the rationale.

“Mainly consist” was the term I used. It’s a combination of several reasons:

  • Female-heavy degrees are almost always fake subjects polluting scientific integrity and draining resources. I have witnessed many times in my university life that the worst and most pointless courses and degree specialisations, even in the hardest science tracks, were devised with the expectation of attracting more females. Fields like sociology, psychology, art history, etc. speak for themselves. Closing down these faculties, or restricting them to rigorous academic work only, would immediately cut the female population at universities severely.
  • Keeping the intelligent female population of your nation in useless education until their late 20s, wasting the most fertile years of their lives, is just incredibly bad policy.
  • There are very few significant achievements in human history that don’t originate from a tight-knit group of competent men heavy on camaraderie and ambition. Adding females to the equation always erodes this spirit.
  • Endless education as a tool to escape “real life” is a problem afflicting both sexes, but women especially fall very hard into this trap. The female brain is much more sensitive to approval from authority figures, and the education system, with its clear reward feedback loops, seems almost addictive to a certain sort of high-achieving woman.
  • Women typically make a lot less use of their education even when they enroll in more sensible degrees. They are easily spooked by competitive environments, they are tricked by the social validation that comes with many low-value professions, and they take long maternity leaves and work part-time because they enjoy being with their family more than being at work.
  • The female style of office politics is absolutely poisonous to academia. When women take over administrative positions in sufficient numbers, academic research just gives way to conformism and group-think.
  • The fact that almost every above-average woman in society spends her prime pair-bonding years on a university campus, and afterwards refuses to date anyone below her education level, makes university de facto mandatory for any man with some ambition. There is no reason why someone needs to go through a 4-year degree to become a film director, computer programmer or sales manager; people historically didn’t go to uni for such jobs. But if you try this today you are very likely forfeiting your mate prospects. Even men who don’t want or need a uni education to be very successful have to enter a good one and drop out to gain enough social clout.

I could really just go on. All of these are obviously gross generalisations, and many of them apply to some degree to many men as well, but in the end these effects add up in a big way.

I can't speak for OP, but for my part I assume the rationale is something akin to this.

To sum up the article in a paragraph, women are less pro-free speech and more pro-censorship. In academia, female academics are less likely than male academics to place importance on objectivity and dispassionate inquiry, and more likely to place importance on the ability of their work to be used as a vehicle to deliver views considered "socially good". They are also more supportive of dismissal campaigns and more inclined toward activism. This roughly correlates with the increasing politicisation of the academy as a vehicle for activism, and while the author admits that it is certainly not the only factor contributing to the trend, it is also what you would expect to see when a group with a preference for emotional safety over academic freedom enters a space.

In other words, I don't think it's necessarily a prima facie ridiculous position if OP values academic freedom over censorship and thinks it carries more value for society than having women in academia does. Forcing a state of affairs where the academic environment is mostly composed of men would be conducive to this goal, and in similar fashion forcing an academic environment that's uncompromising in terms of freedom of speech would disproportionately cause women to self-select out of the academy. Whichever way this goal is reached, greater academic freedom likely entails fewer women in academia.

This kind of generalization-based quantitative thinking is, I think, the undoing of the Motte in some ways. I don't disagree with the idea of academic freedom over censorship, but this bean-counting, assumption-driven basis for policy strikes me as patently bad policy.

I was trying to provide a steelman, not necessarily to forward policy preferences of my own (for my part, I don't think a policy proposal that enforces a mostly-male academic environment is doable in the first place, not because I think it would be bad for society but because it's effectively useless, being too far out of the Overton Window - the second the share of women drops below a certain threshold, people will start taking umbrage at it regardless of the reasons for it). I also don't think there is any explanation I could have given that would have been satisfactory to your specific set of moral preferences.

In any case, I don't think there's anything wrong with applying quantitative thinking to social issues. Different groups of people are different on aggregate, and they shape societies in distinct ways aligned with their preferences. Trying to ignore that when policy-making is folly, in my opinion.

There is actually another reason to take a series of actions that just happens to make university mostly male, as opposed to doing it explicitly (which would be unfair, gauche, and impossible).

Actions are roughly as follows:

  • Get rid of mostly-female degrees; getting rid of degrees that are academically useless will solve this problem anyway.
  • Make school and college in general less hostile to boys.

Reasons:

  • Stops giving women the undeserved status of being "college educated" despite their not being any smarter or more useful. It's just status laundering. This will do a lot of work to fix the dating markets.