This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Diversity is our Strength. Us being whites
At the top of Marginal Revolution today: "How Cultural Diversity Drives Innovation"
I'm a tech development and "innovation" nerd. There's a small but growing online community, especially in recent years, of people who read organizational histories of places like Bell Labs and the original Lockheed Skunk Works to try to figure out the best ways to do real tech development. Not academic science projects, and not VC-backed bullshit, which is mostly business-model innovation (and fails even more often).
You don't have to read the whole study. The abstract itself is either a hilarious self-own or an even more hilarious playing-dumb post.
1850 to 1940. Bruh.
This paper shows that having big-time diversity - you know, mixing all those crazy Poles, Irish, French, Germans, English, Welsh, Czechs, Slovaks, Greeks, hell, even a few Italians and Spaniards in there - was a massive reason the USA was such a technologically innovative place!
The HBDers are going to love this one.
Side note on the hard-tech angle: patent issuance used to be a decent enough and standardized enough measure of "innovation." Since the rise of legalism post-WW2, however, it has become so much noisier that it's questionable whether it remains a valid "fungible currency" for studying innovation and tech development.
I tire of these posts and the kind of comments they elicit from others. The key words they're often looking for are "individualism" and the importance of initial conditions.
I’d be curious to know how much innovation is spontaneous compared with how much was planned. When Shockley, Bardeen, and Brattain invented the first transistor, they probably didn’t have the modern computer in mind - or the digitalization of the world, for that matter. A lot of these ideas are germs: some get built on and others don’t. Of those that receive work, some fail and some succeed due to timing effects, wrong approaches, lack of funding, all manner of different things. New developments always require, to some extent, the free play of ideas, but there’s no reason why an invention specifically ‘has’ to appear in one place rather than another. China first cast iron a thousand years before the Europeans did, and for centuries Europe was the technologically underdeveloped backwater of the world. There’s no reason why it ‘had’ to be that way. The Soviets originally had their own competitor to the ARPANET that ultimately went sideways due to their own ideological beliefs. You could argue there wasn’t enough independence of thought. Or perhaps they had the wrong ideological perspective.
Diversity isn’t a good for its own sake. It has both its upsides and downsides and whatever else your opinion of it, you still have to figure out a way to live with it.
Also, as a side note to your side note(!): a friend recently recommended a book to me and is eager to get me a copy so I can read it and give him my thoughts on it. He said the author specifically points to the patent system as one of the markers of a society’s relative decline in cultural and technological achievement. It’s an interesting barometer, and one I hadn’t thought of before. It probably does yield useful insights.
It's quite close to 0% planned and 100% spontaneous.
Kuhn's The Structure of Scientific Revolutions is probably still the best framework for how human knowledge (science and so, downstream, technology) develops. The long and short of it is that lots of happy accidents often build upon each other. Planning innovation is almost an oxymoron.
The problem then becomes, how do we 'cultivate the garden', so to speak, to make happy accidents more commonplace? Or to shorten the distance between related but unknown nodes that are working on the same problems? The University System and the various Bell Labs / PARC / DARPA orgs of the mid 20th century seem to have done this well. Both had different failure modes which roughly follow red and blue tribe cleavages.
The University System fell to ideological capture but also, more generally, to a total remove from practical problems. Instead of a bunch of really smart professors working with corporations, the Navy, or whomever on an actual problem, "pure" research began to win out. You'd get esoteric improvements in something like photonics that were utterly untenable in a production setting because the supply chain for the super-rare materials didn't exist or the apparatus involved couldn't function outside a clean lab.
The Bell Labs etc. failed because corporations stopped funding them. There's a debate as to why. Some simply gesture at "grrr, greedy capitalists," which has never been a satisfying answer for me. The better answer, though still not "a-ha!" level in my mind, is that actually novel and meaningful research is getting harder and taking longer. So while a corporation may not need its R&D department to come up with something new every quarter, it's hard not to want to cut its budget after 10 or 20 years of nothing new. Furthermore, there's a pretty good argument to be made that corporations shouldn't be trying to shoot the moon with totally novel ideas but, rather, should be solving the "last mile" problem of new technology: how to sustain it, scale it, and then make it by degrees cheaper and cheaper. The middle ground that's evolving is something like Focused Research Organizations.
The final players - DARPA and the FFRDCs (Federally Funded Research and Development Centers) - kept the spirit alive longer. DARPA has a very specific operating model that nowhere else in government replicates. But it fell victim to the GWOT funding strategy: let's make everything about terrorists instead of focusing on, I don't know, time travel and teleportation. The FFRDCs became some of the most egregious leeches of federal R&D welfare dollars. MITRE is quite literally make-work jobs for PhDs. If you can endure living in a Kafka novel every day, you can make $200k per year and enjoy Tysons Corner traffic on your commute.
The real "oh, we fucking suck" moment was ChatGPT in late 2022. Almost every other major American technology development since WW2 could be traced back to some sort of federal, academic, or corporate R&D lab. That the Attention Is All You Need paper came from some ML engineers at Google fucking around was, in my mind, kind of the tombstone of the "trad" R&D ecosystem.
In general, "grrr, greedy capitalists" is only ever a satisfying answer in the same sense that "grrr, Schrödinger equation" is. Technically both ideas explain a whole lot, but if you're looking for an explanation of why something changed, say, between 1980 and 1990, you can't just consult the laws of economics or physics.
In this case, ironically, "some simply gesture at 'grrr, greedy capitalists'" might be the explanation. Ma Bell was an enormous company with a quasi-governmental monopoly, so it could expect to capture most of the value of even relatively pure and fundamental research... and then antitrust action broke it up into a bunch of Baby Bell companies who could only capture the value of research that was sufficiently applied and peripheral to turn a profit before its patents expired. By what may have been a wacky coincidence, but of course wasn't, Bell Labs got a ton of funding before the breakup and not so much after.
Despite my snark, I believe it's possible that the loss to research was exceeded by the gains from breaking up the quasi-monopoly. I'm old enough to remember landlines, and adding a second phone to the same line by just adding a splitter and running a cable to another room; a little further back in time, that would have required a call to the phone company for permission, a technician, and an extra monthly surcharge. It's easy to imagine that an indefinite continuation of this state of affairs in the USA could have crippled the nascent internet, which for years was accessible to most residences only via modems piggybacking data over phone lines.
Ideally, handling the collective action problems of research without a giant monopoly (or, at least, with a giant monopoly we all get to control on election day) is what University research is supposed to be for; we try to give University researchers the proper incentives to try to come up with ideas that will be useful decades down the road, not just years. If we did that right, we should have been able to cut up the fabled goose here without losing out on all the golden eggs. To a great extent, University research works, even! I agree with your suspicions that we didn't entirely do that right, and with your explanations for why it doesn't work as well as it should, but I wouldn't want to come to any strong conclusions without trying to quantify those magnitudes somehow.