This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes - Tyler Cowen has a Conversation with Jennifer Pahlka on Reforming Government
I will pull one little segment.
I want to pull on some threads in the vein of my previous comments on military research, development, and procurement. They talked about this some, but were also talking more broadly. I think the problem to be solved is perhaps most clearly cognizable in this domain. Reordering the discussion a bit, we'll start from the outcomes, the things that we're trying to achieve:
As I put it:
Look at the lead time for something like a modern fighter jet. What's the chance that the guy who originally greenlit the program is still around to be 'accountable' if/when it's actually used in a hot conflict, such that its performance can be assessed against the competition? Do you handicap that assessment at all? He made his decision a decade ago, seeing a certain set of problems that they were trying to solve. A decade or two later, your adversaries have also been developing their own systems. Should he be punished in some way for failing to completely predict how the operating environment would change over decades? Suppose he made the decision in Year X, and it came into service in Year X+10. It hypothetically would have performed perfectly well for a decade or two, but you just never had a hot war and never saw it. By the time Year X+25 rolls around and you do get into a hot war, it's now hot garbage in comparison to what else is out there. Is he blameworthy in some way? Should he be held 'accountable' in some way? There's a good chance he's now retired or even dead, so, uh, how are you going to do that?
Obviously, there is a spectrum here, but I would argue that a modern fighter jet is more toward the middle of the spectrum than at the far end. Yes, there are plenty of faster-turnaround things, but there are also lots of long-lead-time things. Just think about the components/subsystems of the fighter jet. By the time a decision is made to greenlight the larger project, most of these have to be relatively mature. The gov't and company involved can probably take some risk on a few of them, but they can't take on too many. They want a fair number of subsystems that they are confident can be integrated into the design and refined within their overall project schedule. That means that all of that investment had to be done even earlier.
Back to that guy who makes the decision. Who is that? Probably a general or a political appointee. Possibly a group of gov't stakeholders. How does he decide what to buy? Remember, he's trying to predict the future, and he doesn't actually know what his adversaries are going to do in the meantime. He has no direct outcomes by which to do this. He doesn't yet have some futarchy market to somehow predict the future. All he can really do is educate himself on what's out there, what's possible, what's at various stages of maturity, and where various people think stuff might be going. As I put it in the doubly-linked comment:
And so, I think Tyler would claim, this fundamentally drives these decisions to be focused on process rather than outcome. The outcome isn't accessible and likely isn't going to be. Instead, people basically just implement a process to ensure that the decisionmaker(s) are talking to the right stakeholders, getting a wide variety of input, not just shoveling contracts to their buddies, etc. Sure, these decisionmakers still have some leeway to put their mark on the whole thing, but what's the plan for adding more 'accountability' to them that isn't just, "Whelp, let's make sure they go through enough process that they don't have the obvious failure modes, and then sort of hope that their personal mark on the process is generally good, because we've built up some trust in the guy(s) over some years"?
Now, think like a company or research org that is considering investing in lower-maturity subsystems. It's a hell of a risk to do that with such an incredibly long lead time and, frankly, a pretty low chance of having your product selected. You're going to care a lot about what that process looks like, who the relevant stakeholders/decisionmakers are, and what their proclivities are. If you're pretty confident that the guy(s) in charge mostly don't give a shit about airplanes, you're even more unlikely to invest a bunch of money in developing them or their components. Will some crazy company spend thirty years developing a fully-formed system, getting no contracts anywhere along the way, just hoping that once the generals see it complete and in action (ish, because again, there's not a hot war and you can't really demonstrate the meaningfulness of having a thousand airplanes with your one prototype), they'll finally be 'forced' to acknowledge how effective it's going to be, finally unable to brush it off, and finally actually buy it for bazillions of dollars? I guess, maybe, sometimes. But probably not very often. Thus, I think it's pretty unlikely that the gov't can just completely wash its hands of any involvement in the research/development pipeline and just say, "Companies will always bring us fully-formed products, and we'll decide to buy the best ones." Pahlka touches on a need for the gov't to "insource" at least some parts of things:
Again, I think she's talking more broadly, but that bit about software and operations being very melded is quite pertinent when thinking about military applications.
Getting back to the problem of not knowing what's going to be effective in the future, the traditional solution is to just fund pretty broadly, through multiple mechanisms. Not sure about airplanes? Have one guy/group who seem to like airplanes go ahead and fund a variety of airplane-related stuff. Have some other guy who doesn't like airplanes fund some other stuff. There's obviously a bunch of risky capital allocation questions here, and decisions ultimately have to be made. Those are tough decisions, but how do you add 'accountability' to them? I don't know. I think the easy/lazy way is basically just looking at your 'guys' (your airplane guy, your submarine guy, etc.) and asking, "What have you done for me lately?" The obvious concern is that this pushes all your guys to tilt their portfolios much more heavily toward shorter timelines. But part of the point of the government being 'eternal' is that it should be able to think on longer time horizons, because that may end up being more valuable than just short-time-horizon things that can be more tightly linked to 'outcomes' or 'accountability'.
I started off being a bit taken aback by the idea Tyler proposed that we should almost just abandon accountability. I've generally been somewhat pro-accountability, and I know folks here have talked about it a lot. But upon reflection, with my pre-existing but not previously-connected thoughts on military procurement, it makes a bit more sense to me that there is a real tension here with no real easy solutions.
It’s not “accountability” in some nebulous sense. It’s accountability for having followed the right process, regardless of what happens. And this does skew things away from actually getting things done, because there’s always a chance that doing something will result in a bad outcome that the process supposedly would have prevented. So in order to avoid the consequences of being wrong and held to account for a potential failure, you do processes to cover your own ass, and who cares if the project gets done at all. The incentives are set up such that you avoid actual accountability by gaming the accountability system: generate lots of process, protect yourself, and never mind whether anything actually gets done.
The solution, to my mind, is to shift accountability to the results of the project. If you can’t get the job done, you’re accountable for that, and if you can’t do the project right, you’re accountable for that. If the project is building a road, the accountability should not be in filling out forms to authorize the road, or in quadruple-checking that the processes were followed to the letter. Instead, shift accountability to the correct, safe, and timely building of the road.
The issue, as they point out, is that outcomes are heterogeneous. If the outcome is a combination of your decision and of random noise and circumstances outside your control, then the outcome will be only weakly correlated with the actual value you provide. Half of the punishments and rewards will be deserved, and half will simply be responding to the whim of fate.
If your punishment/reward mechanism is long-term enough, like, say, the profits of a company that can accumulate over time and wash out the negatives with positives, then risky but positive-expectation behaviors will work out. If your mechanism is "fire any CEO who has a year with negative profit, no matter why it turned out negative," then you're instead going to incentivize conservative behavior that guarantees the bare minimum, at the cost of unlucky but smart people who take risks with positive expected value.
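To make this concrete, here is a minimal Monte Carlo sketch (all numbers are made up for illustration, not taken from the conversation) comparing a safe strategy against a riskier positive-expected-value one under the two evaluation rules: "fire anyone who has a single negative year" versus "judge cumulative results over the whole horizon."

```python
# A toy simulation (hypothetical numbers): a "safe" manager earns a sure small
# gain each year; a "risky but positive-EV" manager earns more in expectation
# but sometimes has a losing year. Rule (a) fires anyone with a negative year;
# rule (b) looks at cumulative results over the full horizon.
import random

random.seed(0)
YEARS, TRIALS = 10, 10_000

def safe_year():
    return 1.0                       # guaranteed small gain every year

def risky_year():
    # 60% chance of +5, 40% chance of -3  =>  expected value = +1.8 per year
    return 5.0 if random.random() < 0.6 else -3.0

def run(strategy):
    fired, total = 0, 0.0
    for _ in range(TRIALS):
        cumulative, survived = 0.0, True
        for _ in range(YEARS):
            r = strategy()
            cumulative += r
            if r < 0:
                survived = False     # rule (a): one bad year and you're out
        fired += not survived
        total += cumulative
    return fired / TRIALS, total / TRIALS

for name, strat in [("safe", safe_year), ("risky +EV", risky_year)]:
    fire_rate, avg_cum = run(strat)
    print(f"{name:10s} fired under rule (a): {fire_rate:6.1%}   "
          f"avg cumulative result, rule (b): {avg_cum:5.1f}")
```

With these made-up numbers, the risky manager ends roughly 80% ahead on cumulative results, yet gets fired in over 99% of runs under the one-bad-year rule, which is exactly the conservatism-inducing incentive described above.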
Accountability based on outcomes can also encourage behavior that increases tail risks. In the wake of the 2008 financial crisis, the popular metaphor for this was “picking up nickels in front of a steamroller.” It involves taking risks with a negative expected value, but where the downside is a costly but improbable occurrence. This can appear to work very well for a number of years, until the improbable happens.
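The steamroller dynamic can be put in the same toy terms. A minimal sketch, again with made-up numbers: a strategy that earns a small amount in most years but takes a rare catastrophic loss has a negative expected value, yet the large majority of short evaluation windows contain no crash at all and look like pure profit.

```python
# A toy simulation (hypothetical numbers): small steady gains ("nickels") with a
# rare huge loss (the "steamroller"). Expected value per year is negative, but
# most short evaluation windows never see the loss.
import random

random.seed(1)

P_CRASH, NICKEL, CRASH = 0.02, 1.0, -100.0
# expected value per year = 0.98 * 1.0 + 0.02 * (-100.0) = -1.02

def year():
    return CRASH if random.random() < P_CRASH else NICKEL

TRIALS, WINDOW = 100_000, 5
clean_windows, total = 0, 0.0
for _ in range(TRIALS):
    results = [year() for _ in range(WINDOW)]
    total += sum(results)
    clean_windows += all(r > 0 for r in results)

print(f"share of {WINDOW}-year windows with no crash: {clean_windows / TRIALS:.1%}")
print(f"average result per year: {total / (TRIALS * WINDOW):+.2f}")
```

Under these assumptions, about 90% of five-year windows show nothing but gains, even though the strategy loses about a dollar per year in expectation; judged purely on recent outcomes, the nickel-picker looks like the top performer right up until the steamroller arrives.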