Culture War Roundup for the week of February 20, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I just took another batch of IQ tests for some poor suffering undergrads and felt that they were hilariously swingy.

I've always scored around 130, but one or two questions can drop you right out of/into that bracket, and the pattern-recognition questions are easy to get wrong because of the test format. E.g., in this test there was a sequence of numbers displayed such that the question number looked like it was part of the sequence. I only noticed because it made the sequence all fucky.

This, on top of the other (crappier) test where you basically play word association half the time, means the whole enterprise of IQ testing fills me with doubt. I have no doubt I am "intelligent" enough to be somewhere near where I test on the bell curve, but 97th percentile feels way too high to me. I feel like a better test would be to present me with a piece of broken machinery I've never seen before and time me while I try to fix it, only that's kinda hard to feed into numpy for pretty graphs.
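
For what it's worth, here is a rough sketch of the percentile math (my own toy numbers, using scipy rather than numpy, and assuming the conventional mean-100, SD-15 normal scaling) showing how much a couple of points around 130 move you:

    # Toy sketch: IQ score -> percentile, assuming the usual mean-100, SD-15 scaling.
    from scipy.stats import norm

    def iq_to_percentile(iq, mean=100.0, sd=15.0):
        """Percentile rank implied by an IQ score on a normal curve."""
        return 100.0 * norm.cdf(iq, loc=mean, scale=sd)

    for score in (125, 128, 130, 132, 135):
        print(f"IQ {score}: {iq_to_percentile(score):.1f}th percentile")

    # 130 sits around the 97.7th percentile; a swing of a few points either way
    # (roughly one or two questions) moves you between about the 95th and 99th.

So one or two missed questions really can walk you across the "97th percentile" line, which is part of why a single sitting feels so swingy.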

Are the IQ tests used for actual funded trials different from what is used by starving students, are all of them shaky-as-hell BuzzFeed affairs, or am I just biased against them for some reason?

I just took another batch of IQ tests for some poor suffering undergrads and felt that they were hilariously swingy.

Were they IQ tests, or were they just researching cognition and making sloppy estimates? An actual IQ test is standardized and normed, and administered by professionals (not, generally, undergrads).

I have no doubt I am "intelligent" enough to be somewhere near where I test on the bell curve, but 97th percentile feels way too high to me.

In my experience, it is often incredibly difficult for smart people to truly grasp how stupid most people are. This is an extremely pesky fact because it's almost impossible to discuss without sounding hopelessly pretentious, but the fact remains--a good portion of American college students, who statistically represent the "best and brightest" ~40% of their generation, cannot follow or construct a formal argument without extensive coaching.

And I mean this in the most basic, technical sense of identifying premises and relating them to conclusions. For example, I once gave a presentation about LSAT success to a room of undergraduates aiming for law school. I grabbed some sample questions from the LSAC website that I figured would be easy enough to serve as a clear example of "what to expect" without scaring the audience too badly. In an audience of about 20 upperclassmen, there was one student (a physics major) who was able to answer the questions. Everyone else was totally mystified. For reference, here is a recent LSAC sample question:

The supernova event of 1987 is interesting in that there is still no evidence of the neutron star that current theory says should have remained after a supernova of that size. This is in spite of the fact that many of the most sensitive instruments ever developed have searched for the tell-tale pulse of radiation that neutron stars emit. Thus, current theory is wrong in claiming that supernovas of a certain size always produce neutron stars.

Which one of the following, if true, most strengthens the argument?

A. Most supernova remnants that astronomers have detected have a neutron star nearby.

B. Sensitive astronomical instruments have detected neutron stars much farther away than the location of the 1987 supernova.

C. The supernova of 1987 was the first that scientists were able to observe in progress.

D. Several important features of the 1987 supernova are correctly predicted by the current theory.

E. Some neutron stars are known to have come into existence by a cause other than a supernova explosion.

Now, the LSAT is not an IQ test, but the point is that I work with university students all the time, and I am actively aware of things like how many struggle to pass the logic classes, algebra classes, etc. But even then I constantly find myself overestimating their ability to just engage in basic reasoning tasks. And no, I'm not an Ivy League professor and I'm not teaching our nation's elite, but estimating from SAT ranges, most of my students are generally within the 110-130 IQ range. With a bit of coaching and regular study, they can be trained to do things like pass an algebra class, though most will forget what they've learned within a decade or two, especially if they don't put it to use in their professions.

In I Can Tolerate Anything Except the Outgroup, Scott Alexander writes a bit about the strength of "filter bubbles" that separate the "red tribe" from the "blue tribe." In my experience, there are also filter bubbles that separate people by IQ. I have red-tribe friends running a wide range of apparent intelligence levels. I also have many blue-tribe friends, but all of them are either lawyers or college professors or similar. Low-IQ blue tribers might as well not exist, as far as my social experiences go. I'm sure they do exist--I am reminded of their existence any time I accidentally read reddit without logging in!

...am I just biased against them for some reason?

If you're testing in the 97th percentile, I can't imagine why you'd feel biased against IQ tests, which seem to be telling you something most people want to hear: that you're really special! This may be muted because you don't feel special (possibly as a result of IQ filter bubbles), or because the way in which you are special doesn't amount to much if it isn't paired with some mixture of conscientiousness and luck. Finally, it is probably important to remember that, as a statistical measure, IQ is of much greater use when discussing populations than when discussing individuals. Yes, knowing an individual's professionally established IQ probably gives you some information about them, but IQ ranges within professions are nevertheless quite broad.
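
To put a toy number on that last point (the group means below are invented for illustration, not real occupational data): even a sizable gap in average IQ between two groups leaves a lot of overlap between individuals, assuming both groups are roughly normal with SD 15.

    # Toy calculation with invented means: chance that a random member of group A
    # outscores a random member of group B, when both are normal with SD 15.
    # (The difference of two such draws is normal with SD 15 * sqrt(2).)
    from scipy.stats import norm

    def p_a_beats_b(mean_a, mean_b, sd=15.0):
        return norm.cdf((mean_a - mean_b) / (sd * 2 ** 0.5))

    # Hypothetical "high-IQ profession" mean of 115 vs. a population mean of 100:
    print(p_a_beats_b(115, 100))  # ~0.76 -- the "upset" still happens about 1 time in 4

Which is just the quantitative version of "an individual's score tells you something, but not all that much."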

With a bit of coaching and regular study, they can be trained to do things like pass an algebra class,

Since you're talking about university students I have to ask: does this mean they've learned how to decompose an Abelian group, or that they've learned the quadratic formula?

though most will forget what they've learned within a decade or two, especially if they don't put it to use in their professions.

I still remember, long ago, being asked for geometry help by a neighbor, a smart man who owned his own white-collar business: his business math only required (grade-school) algebra and arithmetic, and by the time he was remodeling his sailboat he'd forgotten how to use the Pythagorean Theorem on a real problem. I'd feel more smug, except later there was an entire math class I forgot 90% of, during the several-year interlude between "I'll take this for fun" and "wait, this could actually be used for something?".

Since you're talking about university students I have to ask: does this mean they've learned how to decompose an Abelian group, or that they've learned the quadratic formula?

This depends to a significant extent on the school--and I don't mean "more prestigious schools will require more." Sometimes it is quite the opposite! There are a variety of contributing factors here.

For example, faculty at a community college are sometimes less demanding than those at a major university, given the demographics of their student body (which also vary dramatically between community colleges, tending to track the demographics of their respective communities). But community college faculty also typically emphasize teaching over research, and small class sizes over large lecture halls, and actual professors over grad students. As a consequence, a community college in a middle-class suburb is often a much better place to learn algebra than a major research university. This is something middle-class students often consider when weighing the benefits of doing two years of study at a more-affordable community college before transferring to a state university to finish a bachelor's degree.

Or for a different example, the "introductory" math and science classes at a STEM-focused university will often be substantially more rigorous than "equivalent" courses at large state universities. Accreditation processes are supposed to smooth this out somewhat, but in practice the "Algebra 101" class offered by Big State U has to be passable by large numbers of non-STEM students. This can sometimes lead to disputes over general education requirements, as STEM programs in large universities like to load up on specialized credit hours; they don't want their students wasting credit hours (as they see it) passing an "Algebra 101" class with content students "should" have already learned in high school (e.g., the quadratic formula). Indeed, it's not unheard of for STEM majors to be expected to pass derivative and integral calculus courses in their freshman year, or else to plan on spending at least five years completing their studies, including a semester or two of what is functionally remedial mathematics.

Or for yet another example, faced with the foregoing, some universities just outright offer "non-STEM-major" classes. I'm personally aware of this happening in the sciences rather than math--like, "BIO 101 for non-major general education credit, or BIO 110 for STEM majors"--but I have heard of similar things happening with math.

I'd feel more smug, except later there was an entire math class I forgot 90% of, during the several-year interlude between "I'll take this for fun" and "wait, this could actually be used for something?".

Yeah, I completed a semester of calculus in my university studies, but I didn't understand the use of calculus until many years later, by which point I had long since forgotten how to conduct operations any more complicated than getting the first derivative of x^2. The last time my mathematical abilities really counted for anything more complicated than household finances was probably when I took the GRE.

Indeed, it's not unheard of for STEM majors to be expected to pass derivative and integral calculus courses in their freshman year

My state-school alma mater still has Calculus I and II as freshman-level (100-series) classes, though it now has a pair of lower classes ("Precalculus" and "College Algebra and Trigonometry," which appear to cover the same ground) that are also freshman-level. Regular algebra is remedial (000-series).

The school has separate calculus curricula not only for non-STEM majors but also for life science majors. Not sure if this reflects on the calculus ability of life science majors or is just meant to make sure the women in STEM are kept safe from the bulk of the geeks.

The non-STEM calculus is usually horribly watered down, and in a weaker school may never deal with anything more complicated than polynomials. Life science calc, where it exists, cuts out trig in favor of early work with exponentials and logs, and is much more computational, but at least it doesn’t purge limits from the curriculum.