This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I often tire of these posts and the kind of comments they elicit from others. The key word and phrase they’re often looking for are “individualism” and “the importance of initial conditions.”
I’d be curious to know how much innovation is spontaneous compared with how much was planned. When William Shockley invented the first transistor, he probably didn’t have the modern computer in mind, or the digitalization of the world, for that matter. A lot of these ideas are germs: some get built on and others don’t. Of those that do receive work, some fail and some succeed due to timing effects, wrong approaches, lack of funding, all manner of different things. New developments to some extent always require the free play of ideas, but there’s no reason why they specifically ‘have’ to appear in one place or another. China first cast iron a thousand years before the Europeans did, and for centuries Europe was the technologically underdeveloped backwater of the rest of the world. There’s no reason why it ‘had’ to be that way. The Soviets originally had their own competitor to the ARPANET that ultimately went sideways due to their own ideological beliefs. You could argue there wasn’t enough independence of thought. Or perhaps they had the wrong ideological perspective.
Diversity isn’t a good for its own sake. It has both its upsides and downsides and whatever else your opinion of it, you still have to figure out a way to live with it.
Also as a side note to your side note(!) there was a book recently recommended to me by a friend who is eager to get me a copy and read it so I can give him my thoughts on it. In it, he said the author specifically mentions the patent system as one of the markers of a society’s relative decline in cultural and technological achievement. It’s an interesting barometer and one I hadn’t thought of originally. It probably does yield useful insights.
As an aside, it's a bit inaccurate, or at least incomplete, to say Shockley invented the first transistor. It's probably more accurate to say he "contributed to the invention of" the transistor, or that he "developed the bipolar junction transistor."
The 1956 Nobel Prize was awarded jointly to Shockley, Bardeen, and Brattain "for their researches on semiconductors and their discovery of the transistor effect."
Shockley's main contribution to the first transistor was suggesting the use of a field effect to control a junction, but this had already been proposed by Julius Lilienfeld. He probably does deserve much of the credit for the bipolar junction transistor.
This does emphasize the point that a given invention is a confluence of a variety of circumstances, such that, as you say:
> It is quite a testament to Bell Labs that they not only were able to recruit such a large stable of geniuses, but were able to harness that power in a synthesis of cooperation and competition. It can't have been easy to manage so many (justifiably) huge egos.
Yes. Yes, indeed.
I trust you understand my original point though, I hope.
It's quite close to 0% planned and 100% spontaneous.
Kuhn's The Structure of Scientific Revolutions is probably still the best framework for how human knowledge (science and so, downstream, technology) develops. The long and short of it is that lots of happy accidents often build upon each other. Planning innovation is almost an oxymoron.
The problem then becomes, how do we 'cultivate the garden', so to speak, to make happy accidents more commonplace? Or to shorten the distance between related but unknown nodes that are working on the same problems? The University System and the various Bell Labs / PARC / DARPA orgs of the mid 20th century seem to have done this well. Both had different failure modes which roughly follow red and blue tribe cleavages.
The University System lost to ideological capture but also, more generally, to a total remove from practical problems. Instead of a bunch of really smart professors working with corporations, the Navy, or whomever on an actual problem, "pure" research began to win out. You'd get esoteric improvements in something like photonics that were utterly untenable in a production setting, because the supply chain for the super-rare materials didn't exist or the apparatus involved couldn't function outside a clean lab.
The Bell Labs etc. failed because corporations stopped funding them. There's a debate as to why. Some simply gesture at "grrr greedy capitalists," which has never been a satisfying answer for me. The better answer, though still not "a-ha!" level in my mind, is that actually novel and meaningful research is getting harder and taking longer. So, while a corporation may not need its R&D department to come up with something new every quarter, it's hard not to want to cut its budget after 10 or 20 years of nothing new. Furthermore, there's a pretty good argument to be made that corporations shouldn't be trying to shoot the moon with totally novel ideas but, rather, should really be solving the "last mile" problem of new technology: how to sustain it, scale it, and then make it by degrees cheaper and cheaper. The middle ground that's evolving is something like Focused Research Organizations.
The final players - DARPA and the other FFRDCs (Federally Funded Research and Development Centers) - kind of kept the spirit alive longer. DARPA has a very specific operating model that nowhere else in government replicates. But they fell victim to GWOT funding strategy: let's make everything about terrorists instead of focusing on, I don't know, time travel and teleportation. The FFRDCs became some of the most egregious leeches of Federal R&D welfare dollars. MITRE is quite literally make-work jobs for PhDs. If you can endure living in a Kafka novel every day, you can make $200k per year and enjoy Tysons Corner traffic for your commute.
The real "oh, we fucking suck" moment was ChatGPT in late 2022. Almost every other major American technology development since WW2 could be traced back to some sort of federal, academic, or corporate R&D lab. That the Attention Is All You Need paper came out of some ML engineers at Google fucking around was, in my mind, kind of the tombstone on the "trad" R&D ecosystem.
I am not as knowledgeable about the other labs, but clearly Bell Labs was a particular cultural entity that was of its time, and the answer as to its decline is simple: the formalization of education, hiring, and workplace employee rules means that the kind of lightning they caught in a bottle at Bell can't be caught anymore. Shockley would be run out of a modern-day Bell; Bardeen would never have been admitted to Harvard/Princeton, and so would never have done his important grad work or been hired by Bell; Brattain's academics would not have gotten him hired; the kind of bouncing around between government and industry that Shannon did is now very difficult to do; Jewett is probably the person whose background makes it most likely he could actually hold his position at Bell in the modern day, but his protégé and successor Kelly would never even have gotten an interview now.
This is all painfully and utterly correct.
In general, "grrr greedy capitalists" is only ever a satisfying answer in the same sense that "grrr Schrödinger equation" is. Technically both ideas explain a whole lot, but if you're ever looking for an explanation of why something changed between, say, 1980 and 1990, you can't find it solely in the laws of economics or physics.
In this case, ironically, "grrr greedy capitalists" might be the explanation. Ma Bell was an enormous company with a quasi-governmental monopoly, so it could expect to capture most of the value of even relatively pure and fundamental research... and then anti-trust action broke it up into a bunch of Baby Bell companies that could only capture the value of research sufficiently applied and peripheral to turn a profit before its patent(s) expired. By what may have been a wacky coincidence, but of course wasn't, Bell Labs got a ton of funding before the breakup and not so much after.
Despite my snark, I believe it's possible that the loss to research was exceeded by the gains of breaking up the quasi-monopoly. I'm old enough to remember land lines, and adding a second phone to the same line by just adding a splitter and running one cable to another room; a little further back in time, this would have required a call to the phone company to get permission, plus a technician and an extra monthly surcharge. It's easy to imagine that an indefinite continuation of this state of affairs in the USA could have crippled the nascent internet, which for years was only accessible to most residences via modems piggy-backing data over phone lines.
Ideally, handling the collective action problems of research without a giant monopoly (or, at least, with a giant monopoly we all get to control on election day) is what University research is supposed to be for; we try to give University researchers the proper incentives to try to come up with ideas that will be useful decades down the road, not just years. If we did that right, we should have been able to cut up the fabled goose here without losing out on all the golden eggs. To a great extent, University research works, even! I agree with your suspicions that we didn't entirely do that right, and with your explanations for why it doesn't work as well as it should, but I wouldn't want to come to any strong conclusions without trying to quantify those magnitudes somehow.
I don’t think it’s anywhere near as lopsided as you might think. I don’t have access to my copy of the book at the moment, but the last I read indicated a much more complex picture than that. A ‘lot’ of innovation actually comes out of big business. The pharmaceutical industry alone is proof enough of that.
When I say “planned innovation” I don’t mean innovation by committee. What I’m saying is the idea may not have originated with them, but they were the ones who did something with it. Steve Jobs didn’t invent the transistor. He had a vision for the early uses of new technology. He had his plans for the tech that made him who he was.
If you want to remain focused on DARPA or Bell Labs, for example: those used to be among the favorite examples anarchists brought up in support of their philosophical aims. The boundaries and restrictions were somewhat wide, but while it’s true that many of the pioneers and innovators within those organizations didn’t invent things through top-down directive, their free exchange of ideas still had to be valuable and serve the mission statement of the institution. Meaning your work still had to be found useful by the bureaucrats; otherwise you were out.
Even very politically top heavy countries like China are producing an enormous amount of innovation.