
Culture War Roundup for the week of November 28, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Is continuous service, or the lack of discrete, typically annualized product cycles, killing innovation in prevailing economic sectors? Can we spur growth by productizing more goods and services into annualized release cycles?

Seeing the recent headlines of large layoffs at tech companies, sometimes accompanied by snide comments that nothing of value was lost due to bloat at fat and inefficient firms that enable quiet quitting, I thought about just how much innovation actually takes place in corporations that employ large numbers of knowledge workers.

Certain consumer products prominently feature measurable progress. Smartphones constantly come out with improved displays (e.g., smaller bezels, higher brightness/resolution), processors (speed), memory (capacity), cameras (zoom, software magic), etc. Laptops can also boast longer battery life and lower weight, while gaming desktops get CPUs and GPUs with FPS improvements in major titles. EVs feature increasing range and acceleration.

But everything else in the economy just feels... stagnant?

Maybe a lot takes place behind the scenes and is invisible to laymen. But it's very unclear to me what all the white collar knowledge workers have created in most other large companies with all of their time. Some ad hoc examples that come to mind:

  • Banking: What has Chase, BofA, Wells Fargo, or Citi done in the last 5-10 years for the retail banking customer? Sure, everyone has a solid mobile app now. But how have core services improved? Is there anything measurably new or improved with checking, savings, or loans? Transfers still do not work over weekends or holidays. I guess maybe the public doesn't really expect innovation from the banks, but then what's with the massive payrolls? What does their R&D do all day/year/decade?

  • Tech services: Google search seems worse now thanks to SEO farms; Facebook is trying to innovate with Meta, but everyone seems skeptical; Instagram seems to be forever catching up to younger upstarts Snap or TikTok.

  • Shipping: Has FedEx/UPS/USPS improved in any measurable way in the last decade? Shipping fees and times for retail customers are as high as ever.

I think I could spend hours outlining all the sectors from the S&P 500. But I think my point applies broadly: when products are released in discrete, annual cycles, consumers can expect real improvements and even leaps. For everything else, mostly goods and services sold as subscriptions or offered free but supported by ads, real innovation seems intangible. The worst part is that not only has innovation been slow historically, but there is also no real expectation that anything will change in the coming years.

The cause seems fairly obvious--it's the lack of incentives: when you don't innovate on a phone or car, consumers have no reason to buy the new thing, and you make no money. So you are forced to invest in R&D to deliver actual, marketable improvements. Meanwhile, Chase, Google, and FedEx will continue to make the same amount of money delivering continuous service at last year's SLA, and it's far easier to avoid churn than to convince someone to give you more money. Unfortunately for consumers, traditionally innovative sectors are catching on to the fact that it's easier to deliver continuous service instead--see how car companies are now charging monthly subscriptions for heated seats and remote unlocks.

What if we could change this paradigm entirely? What if a sufficient number of customers displayed willingness to pay more for substantially and measurably improved performance, and upstarts or visionary new CEOs of existing companies catered to this? Could that accelerate society into the future?

I admit it's hard to map out how this would work. Though I would have also said it's not possible for car companies to generate subscription revenue either, and here we are. So maybe the Motte has fun ideas for how at least some sectors can deliver much faster innovation than today by being incentivized to do so by moving from continuous service to discrete product cycles.

When it comes to technological/product innovation, I think it's cars and gaming consoles that are the anomalies/outliers--it's probably normal for most technologies to improve in ways so incremental, irrespective of time intervals, that in some cases you'd have to check serial numbers to tell versions apart (I think Teslas are an example of this).

Speaking of serial numbers, firearms are a good example of a technological field that doesn't quite sort into generations (I mean, it kinda does (percussion, metallic cartridges, breechloaders, smokeless powder, automatic rifles), but it seems like there's very much an unevenly-distributed-future effect in many parts of the history, and wars were pretty much the only thing that pushed people to seek improvement there).

As a PC gamer, it upsets me a tiny bit that the history of PC gaming hardware is harder to divvy up like with consoles (after all, once you're into the era of DOS PCs (which already ran concurrently with more fixed-spec computers like the C64 and, later, the Amiga), it gets messy distinguishing between OG IBMs, Turbo XTs, Pentiums and so on), but that's just part of the price paid for being able to stay on the cutting edge on your own terms.

I think trying to revolutionize banking would result in a lot of headache/heartache; it's very constrained and regulated for good reasons. Shipping is also hard, because it requires fundamental technological revolutions that would change a hell of a lot of other things (better drones, easier supersonic flight, subterranean cross-continental hyperloop tunnels, teleportation??).

Sure, everyone has a solid mobile app now.

Banking is an important aspect of one's life that we are now 99% capable of doing w/o setting foot in a bank. Seems like a big win for everybody.

I will caveat this: it seems to have come at a cost to their in-person services. When I was applying for a credit card for the first time, I went into Chase cause I wanted an employee to walk me through it, but they kept "promoting" that I could just do it online, which I didn't feel comfortable doing cause this was a first time for me. So it took some real insistence from me before they just walked me through the process in person. Once they did, it was pretty smooth, but it really bothered me how stubborn I had to be to get the bankers there to help me in person.

Seeing the recent headlines of large layoffs at tech companies, sometimes accompanied by snide comments that nothing of value was lost due to bloat at fat and inefficient firms that enable quiet quitting, I thought about just how much innovation actually takes place in corporations that employ large numbers of knowledge workers.

It's hard to say. Twitter got rid of half its staff, yet nothing seems different in regard to the user experience. I think it's a situation in which 10% of the staff produce 80% of the value. The goal of public companies is foremost to generate profits for shareholders; innovation is secondary.

So maybe the Motte has fun ideas for how at least some sectors can deliver much faster innovation than today by being incentivized to do so by moving from continuous service to discrete product cycles.

Isn't this what Apple does with its product launches? Same for Tesla. The changes seem continuous year over year, but the latest iPhone is vastly superior to the original 2007 phone. It makes sense not to change things too dramatically in the short term, or else consumers may balk; it's too risky. For example, Microsoft dramatically changed the interface with Windows 8 in 2012, which consumers hated. Windows 10 restored some of the older interface.

Does anyone here actually get extra tangible use out of the new iPhone versions?

I have owned the 4, 5, 6, 8, and currently the 11. Upon recent reflection, I think that all the iPhones past the 5 have added nothing of meaningful value in terms of new features--only minor quality-of-life updates like a better battery and camera, but nothing that has actively changed how I use the product. For this reason I won't be buying any of the newer versions of the iPhone.

So I'm curious if anyone here has an experience where there WAS a new feature added (past the iPhone 5) that has actively changed how they use their iPhone in some tangible way.

that has actively changed how they use their iPhone in some tangible way.

The LiDAR scanner is niche but sufficiently transformative (the Face ID camera can be used in this way as well due to the way its hardware works--and yeah, it actually does require special hardware for the required depth mapping). I'll likely be upgrading to a model that has it, because sometimes I want to 3D print things I find difficult to measure directly (curved internal surfaces are a real bitch to get right the first time).

Of course, that generally requires third-party applications to create and export the mesh as opposed to a first-party "we designed the phone to do this" philosophy Apple has always had, so I'm not entirely sure that counts. And it's not something I do every day, either.

For this reason I won't be buying any of the newer versions of the iPhone.

The same conditions are true for most Android devices; the iPhone just has far more useful lifespan per dollar especially once you're in the sub-500 price range (I legitimately don't understand the point of buying a new phone before the hardware physically dies or the software is too old to run anything, but I'm probably in the minority here).

There is one exception to this, however, and that's the folding-screen phones: they're ludicrously expensive, but being able to have 2 apps open at the same time is a really big deal just like it was for PCs back in the 1980s.

I think Google is wise to go all-in on this tech, even though it's going to take a while to trickle down into the 400-dollar phone market, because it really does bridge the usefulness gap between phone and tablet. And since the screen physically can't hold up to being bent that many times, it's easier to stomach replacing a 200-dollar phone every 2 years than a 1000-dollar one.

It's also not something Apple is going to be able to match for a long time if ever; multi-tasking has always been an afterthought for them.

The Face ID thing is kind of nice. But yeah; marginal improvements. Nothing transformative.

Correct me if I'm wrong here, but isn't Face ID completely a software change? So theoretically, couldn't Apple have put that same software on the iPhone 5? Or would the specs of the iPhone 5 not cut it to run whatever ML model they're running?

Windows is an example of a product that is worse in all the visible ways now, but has substantially improved in the invisible security area.

And, in my experience, in terms of system stability. I can't remember my last blue screen of death experience, which was very commonplace even 20 years ago.

If you're saying that Windows XP was likely to BSOD, I'd disagree. XP was the first consumer version of Windows that wasn't based on DOS (NT was for businesses) and that made it a lot more stable.

I'd also add that Windows "security improvements" are often things that people with computer knowledge told them to do 20 years ago and they just didn't do it for a while.

I might be mixing up XP and 98/ME. It's been a while.

Continuous release is, if anything, better for innovation. It allows for constant updates, meaning continuous innovation. Long development cycles lead to slow feedback, and updates have to be planned long in advance, which means it takes years for innovation to reach production and makes taking risks with the next big development scarier. It is easier to roll back a release when continuously delivering; rolling back a product that has been in development for 18 months is harder. Hardware is more natural to do in larger cycles than software, but software doesn't really gain any benefit from long dev cycles.
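One common mechanism behind those easy rollbacks is the feature flag: new code ships dark behind a runtime switch, so "rolling back" a bad feature is a config flip rather than a redeploy. A minimal sketch, assuming a toy in-process flag store (the `FlagStore` class and the flag name are hypothetical, purely for illustration):

```python
# Toy feature-flag store: continuous delivery ships code behind flags,
# so disabling a bad feature is a config change, not a redeploy.
class FlagStore:
    def __init__(self):
        self._flags = {}

    def set(self, name, enabled):
        self._flags[name] = enabled

    def is_enabled(self, name):
        # Unknown flags default to off: new code ships dark.
        return self._flags.get(name, False)


def checkout_total(items, store):
    subtotal = sum(items)
    if store.is_enabled("new-discount-engine"):
        return subtotal * 0.9  # new behavior, deployed but gated
    return subtotal            # old behavior stays in place


flags = FlagStore()
# Launch the feature to everyone...
flags.set("new-discount-engine", True)
# ...and if metrics regress, "roll back" instantly without redeploying:
flags.set("new-discount-engine", False)
```

Real systems (LaunchDarkly-style services, percentage rollouts, per-user targeting) are far more elaborate, but the core idea is the same: decouple deploying code from enabling it.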

I think we are partially seeing diminishing returns on investment. It is simply much harder to make a freezer 10% more energy efficient today compared to in 1970. Expecting a radically new toaster every year isn't really feasible.

As for the big banks they have so much legacy code and so many entrenched interests that I think it is hard for them to make big changes. The fin-tech scene is moving fast and I think that consumers are going to be using other services soon.

2 claims here:

  1. Your argument makes sense theoretically, but can you name any MAJOR innovations by companies that are known to use this model? Cause I can't think of one.

  2. I understand your argument about diminishing returns on investment, but I'm now reminded of the Ford quote,

"If I had asked people what they wanted, they would have said faster horses."

And I think that continuous release cycles reemphasize feedback from people saying they want faster horses, so people make faster horses. You don't get the larger POV that there are fundamentally different ways of approaching a problem, which occasionally come with far better outcomes. I don't want to claim that continuous development and agile have no place, but it seems to me that they don't contribute to the larger innovations that can update the fundamentals of society the way the car did.

As for the big banks they have so much legacy code

Legacy code is one problem of the "continuous improvements" style.

I think we are partially seeing diminishing returns on investment. It is simply much harder to make a freezer 10% more energy efficient today compared to in 1970. Expecting a radically new toaster every year isn't really feasible.

That's a good point, but I remember a few years ago, people were lamenting that innovations like the self-driving car had been expected 15 years prior, but good software infrastructure hadn't existed, which slowed down development. Those people usually said that now that there had been two decades of software infra development, we were finally about to see some serious shit come out. To some degree they were right. The self-driving car, as an example, really is a reality now, but regulation is now the limiting factor. I'm not sure if this applies to other things as well, though.

I can say, at least as someone who's been in software for the last decade, that our work is significantly slowed down by regulatory adherence--more than you can possibly imagine. I'd venture to say that most major systems at big tech companies that I know about either spend greater than 50% of their time trying to prove they are compliant, or else are systems purely designed to help other systems prove their compliance. Right now, the latter are the systems that provide the most measurable value at big tech companies, by accelerating the development of other systems. So maybe that's the next big hurdle: now that we're past the development of good infra systems, we have the unforeseen task of creating good governance and compliance systems.

Edit: And I don't mean to imply that regulation and governance infra are wasting time and energy. I actually believe that these systems strongly and positively impact end-customer privacy and ensure that data is being used safely. It's just that it's not very evident to people outside the company exactly where all the time and effort is going. At best, customers can take big tech at their word on their statements and promises about customer privacy, and at worst they'll think they're talking out of their asses.

PG remarked on continuous release as early as 2003:

We knew Lisp was a really good language for writing software quickly, and server-based applications magnify the effect of rapid development, because you can release software the minute it's done. (...) But with Lisp our development cycle was so fast that we could sometimes duplicate a new feature within a day or two of a competitor announcing it in a press release. By the time journalists covering the press release got round to calling us, we would have the new feature too.

Development speed brings a quality of its own.

What has Chase, BofA, Wells Fargo, or Citi done in the last 5-10 years for the retail banking customer? Sure, everyone has a solid mobile app now. But how have core services improved? Is there anything measurably new or improved with checking, savings, or loans? Transfers still do not work over weekends or holidays. I guess maybe the public doesn't really expect innovation from the banks, but then what's with the massive payrolls? What does their R&D do all day/year/decade?

Banks do lots of things, and their largest R&D efforts aren't usually targeted at the retail front end. R&D projects in banks look more like creating a more direct path for newly originated home loans to be securitized, because different branches of the bank independently built siloed data sets on the originations side and the securitization side and don't have an established pipeline beyond treating each other as different companies. The direct retail customer has seen some improvements as well: upgraded card security, including the move to chip and then tap. I think cashing checks by picture came out this decade. Looking through my Chase app, it looks like you can buy stocks through the app and do instant direct transfers to other Chase customers. So new stuff is definitely coming out.

I'd also say that for banks, more thought than you might think goes into determining where exactly to build new branches and roll out ATMs, which themselves have seen improvement.

For consumer-facing banking stuff:

Machine learning for detection of credit card fraud, which is hard to notice, as you just don't tend to think to yourself, "oh huzzah, I didn't get my identity stolen today". But the banks put a lot of effort into this, because they sure don't want to be on the hook for a stolen card number.

I imagine there is a lot of money spent making transactions more secure. Customers might not see the effect.