This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Notes -
... to steelman the counter-Lincicome argument:
High-grade semiconductors have massive military and economic importance. There are a variety of things that either can't be done at all, or can only be done on glacial timescales, if you drop from <10nm to 100nm process scales, due to thermal, electrical, and latency issues. Losing the ability to produce or source those processors would set a country back literally twenty years; having them sourced from a supply chain controlled by an opposed nation risks serious threats to security and core functionality. AI is the current-day focus, but these tools are necessary for everything from telecommunications to hardware control to transportation in their 'conventional' programmed form.
((It's actually worse than it sounds, because in a lot of military and industry roles you can't just swap in a scavenged motherboard and CPU and call it good enough. In extreme cases, systems are qualified not just to a specific manufacturer or generation, but to one specific chip. On the upside, the smarter businesses buy in bulk. On the downside, there are a lot of buyers that aren't smarter, and things in development where that decision can't be undone.))
Lincicome describes government support for Intel as stifling third-party competitors, but that's a class that basically doesn't exist here. AMD and RISC-V and everyone else even remotely in the field depend either on Intel or Taiwan, or on Chinese manufacturers. There are no other <10nm foundries in the United States or even Western Europe, and the closest companies at a process level (STM, Micron, maybe Motorola if you squint; EDIT: definitely not, thanks Skooma) are neither capable of nor interested in microprocessor work at the desktop scale. Any attempts, even attempts using third-party foundries overseas, have ended poorly, as anyone who recognizes the name VIA can attest. Any model for how those players could respond to a complete collapse of Intel is necessarily guesswork, but I've seen credible estimates of 10+ year timelines to bring even mid-end devices to market, assuming everything works out perfectly the first go.
Meanwhile, the competition is not exactly operating from a free market. TSMC is the closest, and it's a national project for Taiwan, unsurprisingly given how much of its economy revolves around the business: Taiwan gave the company a giant pile of seed cash in exchange for just-shy-of-half of its ownership, along with widespread continuing indirect subsidies on its major material costs. The Chinese government doesn't exactly give out the most honest breakdowns of how it subsidizes its foundries, but even ignoring the !!fun!! question of industrial espionage with CCP characteristics, the official numbers are significant and come with very pre-2018-Jack-Ma-sized strings attached.
Which could be surmountable. Intel, as recently as 2015, was still on top of the world, to a point where people were worried AMD would go under.
But Intel has gone from merely slightly behind-the-curve in 2019 to consistently the less-good choice across entire fields, often by significant margins and with no or even negative price premium, along with a number of serious stability and reliability concerns due to manufacturing defects. And that's worse than it sounds: chips and foundry technologies are costly not just to produce, but also to fail to sell, both because of how the sales model works and because of rapid depreciation. Skipping a whole bunch of technical details, they're in a cash crunch at the same time that they need a lot of investment to avoid an even deeper cash crunch, unless they want to turn into a second- or even third-rate foundry.
((There's also some messiness involving Intel Arc, which is strategically very important to the Western world's military in ways that aren't obvious, and which has an entertainment business case that it's only barely beginning to credibly compete in, but which is a short investment away from being a really big deal.))
Which might just be the only achievable result, if we trusted Intel to be doing (or trying to do) the best thing. But there are a lot of reasons to be skeptical. The current CEO and board have been abandoning new development processes since December of last year. Critics have focused on said CEO's ties to the CCP, and to be fair those do exist! But even if Intel were making these decisions on a purely economic basis, they're overwhelmingly emphasizing whatever maintains the stock price over either the availability of next-gen onshore foundries or the company's long-term dominance or relevance as a first-tier manufacturer. The actions here are ways to credibly commit the US government to continuing (or starting) the funding it claimed it would provide, and Intel to actually running the things.
It's not that the conventional criticisms of crony capitalism stopped existing! There are significant risks to this sort of investment and (tbf, minimal) control. But there are tradeoffs and risks to non-action, and Lincicome seems neither willing nor able to even consider them.
Would you have been calling for the state taking ownership of the means of production before this had happened? I really doubt many conservatives would have.
Maybe it's motivated reasoning ex post facto trying to justify his behavior or maybe it's just that "conservativism" as a label has already been stolen by people who hate free market capitalism and small government, but it's the exact opposite of traditional conservative ideas.
Which traditional conservative ideas would those be? "Ensure the means to produce an absurdly vital strategic resource with a lead-up time measured in decades remains possible in $country" is something even minarchists believe the role of government should cover.
It's literally the best-case scenario; the US has only one manufacturer of space-magic technology that isn't within trivial striking distance of its enemies. Samsung is, TSMC is, even Intel's own fabs in Israel are.
This assumes that everyone agrees government ownership of companies is the correct approach, which people in fact do not, and historically have not, agreed on.
Maybe we can cut back regulations, support development and building of technology, etc instead of expanding government more in response to expanded government.
I'm not sure I'm calling for it now: the above post is a steelman, and one with a number of caveats, qualifications, and carveouts.
There are valid counterarguments, like to what extent smaller competitors licensing ARM chips might be able to pull an underdog reversal in a big hurry, or how much central infrastructure actually needs modern processing power rather than having grown like a goldfish to fill it, or whether a failing Intel might fracture such that its foundry side survived rather than got pulled down with the rest. There are some less credible but at least plausible ones: maybe China's Not That Bad after all, or it's going to collapse under its own inertia before any of this could be relevant, or military/economic considerations are a lot less important than social ones.
But these aren't new considerations, either; they're the sort of thing people were bringing up in response to the CHIPS Act itself, too. It's long been a point of controversy in even libertarian circles what tradeoffs exist between private and public management of matters like disaster response, military readiness, telecommunications, and core public welfare. I'd like if there were simple, easy, Big Head Press-style answers, but if they exist they're not self-demonstrating.
Motorola hasn't existed as a chipmaker for 17 years: they spun off their CPU/microcontroller business to Freescale back in 2004, which was in turn acquired by NXP in 2015.
And NXP is definitely not even close to a tier-one fab these days; I think they've capped out in the >50nm range. Thanks for the catch, not sure why I thought they still had a telecom branch.
... there's not really any good 'if you squint's left, then. TI has the process tech to leave 'guy in a shed' fully in the dust, but their ARM stuff is more at the embedded-systems level from my understanding, in addition to the TI tax. BAE is heavily focused on the defense sector, unsurprisingly, which is good for them but not helpful for the rest of us. Tower, I guess? I dunno much about their production output beyond some CMOS stuff.
That's a bit like saying Volvo isn't even close to being a tier-one supercar manufacturer these days...
NXP has never even aimed for the high end application processor market (not that Motorola was relevant in its historical equivalent since the late 80s either). They merged with Freescale for their microcontroller and automotive IC portfolio, not in an effort to compete with Intel or similar cpu manufacturers.
Fair; it's definitely not that they were trying to be the next desktop infrastructure, and it's not like what they're doing instead is easy. It makes sense for them to focus where they've focused. If I ever have the free time, I'd love to get some experience working with the i.MX8 stuff as an embedded Linux platform.
But even comparing where Freescale was in 2002 against the market segment NXP is aiming for in 2025, the difference seems bigger. Some of that's just that the top of the market has gotten much higher (Razr mattered, but it mattered pre-smartphone; some of the network equipment is in a similar boat), but it's something that separates the business from being meaningful competition for most of Intel's most important stuff.
They're in different industries and have been for decades. The last time Motorola had notions of competing with Intel was 30 years ago. By the time Freescale was spun out in 2004, they had no commonalities (ignoring Intel's doomed to fail attempt at pushing into embedded / mobile processor market with Atom & Galileo). Freescale and then NXP have always been purely in "deep" embedded market where computing performance just isn't that important and is behind many other considerations. You use an iMX8 because you need a large set of integrated peripherals in a small form factor and at low cost. It might run Linux because that simplifies the software development and allows better networking and simpler multimedia display (think map or spotify album art) but you really don't care about how it performs in benchmarks (as long as it passes some minimum bar). Using a larger manufacturing process is an outright positive thing as it allows lower idle power consumption.
Intel OTOH has always been about legacy software support and how many GHz you get in a package / per $$$, considerations that simply don't exist in NXP's market. You'd never put an Intel cpu in an embedded device because it'd be a nightmare to integrate, eat too much power and cost too much compared to an MCU that does that job much better.
Huh? I could swear they're still making shortwave radios.
That's Motorola Mobility (subsidiary of Lenovo, smartphones and stuff) or Motorola Solutions (safety and security products). Neither have anything to do with IC manufacturing.
This is the first I've heard of a significant military interest in Arc. Could you unpack that?
Certainly the whole computer gaming world has been begging Intel not to kill off Arc before it's reached maturity. Everybody expected it to lose money for the first couple generations, but Intel has been incredibly strapped for cash, so it wouldn't be a shock to see it sacrifice long-term interests for short-term ones.
This is tantamount to giving up its foundries, and I'm surprised not to have seen more analysis. I wonder if he thinks that that portion of the business is totally unsustainable in the long run, or if he's just playing chicken with the U.S. government hoping for more money.
Maybe that's what a government stake in Intel is supposed to resolve?
Uh... technically it's Intel's Flex line for the server side, and this is extrapolation rather than anything I know first-hand, so it's probably wrong, but:
At the higher confidence level, these boards can run inference comparable to mid-tier nVidia cards, and could potentially be made in Arizona rather than at TSMC. That's not going to get you massive AGI from LLMs, if such a thing is possible, but there's a lot of video and image data, signals analysis, and more esoteric stuff (HMDs!) that needs realtime or near-realtime processing. Yes, most applications would prefer a Jetson over either a normal nVidia card or an Intel one, but since even the Thor-sized Jetsons can't keep up with realtime workloads, they have to compromise. The nVidia boards aren't currently irreplaceable, but military procurement does consider whether there are alternative sources, and Intel is the only even potential alternative.
The... more speculative bit is that Intel has a number of design opportunities that it was starting to build around. Optimizing data transfer from network card to CPU to GPU is boring and unsexy, but it's actually a big deal for extremely realtime behaviors like streaming video operations, and something where nVidia's offering (an ARM 'superchip' called Grace) was far behind the competition... before Intel killed their better solution. There's also some weird messiness with nVidia's licensing and chip-to-chip interconnect solutions that is navigable for datacenters but a nightmare for US military procurement.
This isn't readily available to consumers (or even prosumers) yet, because the vast majority of extant AI/ML workloads don't work in Intel Arc environments to start with; those that do seldom get the full benefit of the GPU's hardware, and even fewer get any serious benefit from GPU/CPU integration. But it's at least a space that would be interesting if Intel could get its core crap together.
Officially, they're just doubling down on the 1.4nm stuff. But I don't think I'm the only person reading it that way. For motive, it's hard to tell. He could even be playing chicken with buyers, trying to pressure them to step up early rather than wait until after engineering samples have already gotten off the floor. But I'm not optimistic given the amount and variety of other cuts.