
Because it's just a clunky way of achieving the same end.

I just finished The NHL: 100 Years of On-Ice Action and Boardroom Battles, by D'Arcy Jenish. Last year I found a copy of David Harris's The League: The Rise and Decline of the NFL, and was captivated by it. It's a history of the power struggle among NFL ownership that culminated in Al Davis moving the Raiders to Los Angeles and left Pete Rozelle's authority as league commissioner severely weakened. But it's also a history of ownership and the business side of the league from roughly 1974 to 1982, with the first section covering the "status quo ante" as it had developed since 1960 and a final postscript covering the three years between the immediate aftermath of the move and the time the book went to press. It's a remarkable story, covering the entire history in great detail over its 640 or so pages.

I was looking for something in the same vein and The NHL seemed like it had promise. As a much shorter book (fewer than 400 pages) covering a much longer time period (1917 to 2011), I wasn't expecting the same level of detail. And boy, I did not get the same level of detail. I wasn't really expecting it for the early years of the league, as the author admits that the source material is thin, all the major figures are dead, and the NHL wouldn't give him access to what they had. So when the book seemed to be breezing through the Calder era and including a lot of padding, I sort of nodded along, figuring that by the time we got to, say, the 1960s and the expansion era things would start to pick up a bit. They did, but things were still moving at a pretty good clip, and without records or living witnesses, the task probably wasn't made much easier.

It was once we got to the John Ziegler era that the disappointment started to set in, especially since he interviewed Ziegler for the book. It seems as though once Ziegler put out all the fires Clarence Campbell left in his wake, very little happened for another decade. Once we get to the Bettman era, though, it takes even more of a nosedive; these are the years I remember paying attention to the league, and while he does a decent job of hitting all the high points (expansion, lockouts, franchise relocation, etc.), there's not much here that someone buying a book on the subject doesn't already know.

Take the 1994 lockout, for instance. It was the first major work stoppage in league history, it lasted 104 days, and 468 games were lost. It merits fewer than four pages. It is immediately followed by discussion of the Nordiques' relocation to Colorado, which doesn't even get one full page. Major stories of the 1990s, such as John Spano buying the Islanders despite having no money and the Penguins' 1998 bankruptcy (which resulted in Lemieux taking ownership of the team), are not discussed at all. I understand that you can't include everything due to space considerations, but when he spends three pages on the on-ice exploits of the 1980s Oilers, and elsewhere discusses the dynamics of various playoff series, it's disconcerting in a book ostensibly about the business side of the game.

And it gets even worse from there. Once we get past the 04–05 lockout, the final chapter is dedicated to what are evidently magazine articles copied and pasted into the book. There's a section where he discusses the state of the league circa 2012 that centers around an interview with Gary Bettman. This is followed by a detailed description of the War Room in Toronto and a discussion of what's available on the NHL website. Even in the early parts of the book, he leaves threads hanging. For instance, he talks about how competitive balance problems in the early 1950s led the league to institute a draft, but since the good teams wanted to protect their farm systems it was compromised so the losers didn't have access to the really good prospects. As Chekhov said, though, if you introduce a gun in the first act, you'd better fire it in the third, yet the draft is never mentioned again. Obviously, at some point the draft evolved into what it is today, where every team has its pick of junior players, but I have no idea how this actually came to be since Jenish forgets about it. This is especially maddening when he's talking about the '70s expansion teams trading draft picks or building through draft picks, and I'm left wondering what the system even is at that point. There's stuff like this throughout the book. He also makes one critical omission: when we get to Clarence Campbell's retirement, he chalks it up to his advanced age and inability to keep up with the crises the league was facing. What he doesn't mention is that Campbell announced he was stepping down shortly after he discovered he was under investigation for bribing a senator.

All in all, it's not a bad book by any means, especially if you're just looking for a breezy capsule history of the business end of the NHL, but I'm not sure who it is for. Anyone reading this book already knows 75% of everything that's covered after 1992. Anyone who doesn't probably isn't interested in a book about the business end of pro hockey. Once I read a book on a subject I'm usually ready to move on to something else unrelated, but I just started The Instigator: How Gary Bettman Remade the NHL and Changed the Game Forever because The NHL left me so unsatisfied. It seems promising, but at only 276 pages, I'm not expecting miracles.

This comes up every year around clock change time, and the perma-DST people and the "noon is noon" people are equally moronic. The mere existence of this debate is proof that time changes are needed. Seriously, if you can't handle two time changes a year, maximally coordinated to minimize inconvenience, then you should never be allowed to get on an airplane again in your life. Or stay up past your bedtime. Or sleep in. Or do anything else that results in any mild disruption to your precious sleep schedule.

Losing an hour of sleep on a weekend is something I can deal with once a year. But as a white-collar worker who gets up at normal o'clock, waking up in the dark is something I do not want to deal with on a regular basis, as it is noticeably harder to get going in the morning when it's still dark. I currently have to deal with this maybe a few weeks out of the year. Permanent DST would have me deal with it from the end of October until mid-March, and I really don't want to fucking deal with that. Conversely, if we eliminated DST altogether it would mean I'd forfeit the glorious hour between 8 and 9 in the summertime when it's warm and still light enough to do things outside in exchange for... it getting light at 4 am. To the early birds who think that it getting light at 4 is just as good as it staying light until 9: you do not have a job, a family, or other real-world obligations. The average person isn't getting up at 3:30 am to sneak in a round of golf before heading to the office. For those of us who don't get out of work until 5 pm or later, that extra hour in the evening is a godsend.

So can we stop this perpetual bitching? Time changes were implemented for a reason, and people who think we'd be better off without them have never actually lived in a world without them. The benefits are all theoretical. When permanent DST was implemented during the 1970s, the program was cancelled within a year because people couldn't abide the first winter. And very few people want to end summer evenings early. This has to be the stupidest debate in American political discourse; just leave things where they are.

To understand Kinkade you have to understand how the art world actually works in terms of tastemaking. In today's visual age, where images are easily reproduced in books and magazines, on television, on the internet, and everywhere else, we tend to forget that the kind of familiarity we have with art is a new phenomenon. For most of human history, the only way you knew what a painting looked like was if you actually saw it in person. And even that is an easier proposition now than it once was, since public museums that hold the great works are a relatively recent phenomenon. In our world, it's easy to ignore art precisely because we're bombarded with it, whether we like it or not. Yet it is he who pays the piper who calls the tune. Every man is entitled to his opinion, but unless you're a bona fide art consumer, your opinion doesn't count for anything.

To be a bona fide art consumer, you have to be the kind of person who is willing to peruse galleries in your area with the intention of dropping hundreds or even thousands of dollars on a painting, not because it will make a good investment, but purely because you like it. The gallery is an essential part of the system. I have a friend who has the rare distinction of being an art history major who actually works in her field. She worked on the staff of the Andy Warhol Museum and owned a gallery in Pittsburgh for a few years before moving to Texas (and managing a gallery there). The gallery serves as an essential middleman. With art schools graduating thousands every year, and many more people than that selling paintings, it's hard for someone looking to buy art who's not fully ensconced in the art world to know where to start. The gallery owner thus acts as an intermediary, able to identify pieces she can recommend to customers without hesitation, yet in touch enough with the economics and tastes of her customer base that she won't alienate them.

It's worth pointing out that there's no barrier to entering the world of the art consumer other than money and the willingness to use it. The whole concept of a gallery opening is to generate buzz that gets people in off the street. They're essentially parties with free booze and light appetizers, and the people throwing them don't care whether you're actually interested in buying anything or have any pull in the art world (though you should dress appropriately and be willing to mingle with the crowd). I tried to attend as many of my friend's openings as I could, and she was always appreciative, as a full house with no buyers is always better than a sparse turnout. Anyway, this is the way the system is. If you're an artist, you try to get noticed by a gallery owner who agrees to display your work and hopefully sell it. If you make enough sales, you'll get a one-man show, have your work displayed in better galleries, get overseas exposure, and eventually reach the rarefied air of having your work sell at Sotheby's for tens of thousands of dollars.

There are some artists, though, who can't cut it in this system. Most artists, in fact. Most of them just keep their day jobs and do art on the side and make an occasional sale; nothing wrong with that. But some of them want in so desperately that they open their own galleries. These are called "vanity galleries" and are frowned upon. An artist selling his own work through his own gallery is tacitly admitting that he's trying to buy his way into a world where he couldn't make it otherwise. From an economic perspective, Thomas Kinkade's work didn't appeal to the bona fide art consumers who bought paintings through galleries. It did, however, appeal to the kind of unsophisticated consumer who was willing to pay 40 bucks for a print and didn't even care whether the nameplate artist actually did the underlying painting. Kinkade took the vanity gallery to its logical conclusion by opening a chain of stores where you could buy reproductions of his work in between buying jeans and grabbing an Orange Julius.

Buying real art is an intimate act. You attend a gallery opening where you peruse what's available and probably talk to the artist. If you're interested in buying something you call to make an appointment to conduct business during the week. You get an original work that nobody else will have, that the artist put hours into. And you pay a price that demonstrates your appreciation for those efforts. Kinkade reduced it to a commodity that was as disposable as any other. Of course, some respected artists thought that art should be a commodity, most notably Andy Warhol. This would at first seem to absolve Kinkade, but two things need to be taken into consideration. The first is that Warhol only gets respect for this revelation because it was novel at the time. Other pop artists existed before him, but he was the first to take the ball and run with it, while still straddling the line of whether he was serious or not. Some thought his work was criticism of consumer culture; he insisted that he was dead serious that it was not, but his aloof public persona suggested a hint of irony.

Which leads into the second point about Warhol. By the 1980s it was clear that he was indeed serious, and his stature started to fade. The endless screen prints and commissioned portraits of celebrities may have caused his image to soar among the public, but he fell off with critics. Furthermore, a new generation of artists raised on Warhol took his beliefs seriously and began equating garishness with quality. He died unexpectedly after gallbladder surgery in 1987, which was bad for him but good for his image, as he couldn't spend the next twenty years sullying it even further. While the pop art of the 1980s was mass-produced and kitschy, it was at least popular kitsch. Art may be fashion, but fashion is at least contemporary. Kinkade was just as kitschy, but he didn't even try to be cool. He produced art for the kind of people who collect Precious Moments figurines. And as he got older and more famous his strategy became even more crass. If you go to his website today, the entire first page is licensed work. If his work wasn't kitschy enough already, you can always add a few Disney characters. What makes this especially egregious is that some of the characters, like Moana, didn't exist until after Kinkade's death, further emphasizing that none of his alleged work has anything to do with him personally.

Years ago, before his popular revival, I told my gallery-owning friend that I wanted to write a critical defense of Bob Ross. When I was in high school, art teachers hated Bob Ross, so I thought I was being edgy. She told me that Ross wasn't controversial and that if I really wanted to ruffle some feathers I should defend Thomas Kinkade. I knew little of his work, but, having since looked... I just can't. It's not even good in a technical sense since he obviously doesn't understand color theory. Everything looks garish. There is no sense of proportion. Robert Hughes of Time magazine was highly critical of contemporary art in the wake of Warhol, and he complained that everything seemed designed to make the biggest immediate impact but had no staying power. Kinkade is no exception; his paintings hit you like a dish where you just threw in a dash of every spice in your cupboard. And this is all in pursuit of nothing more than cloying sentimentality. His works don't have anything to say about life, liberty, and the pursuit of happiness. At least Norman Rockwell led one to consider the meaning of the American Dream, and Warhol sparked discussion of consumer culture and celebrity. But what does Kinkade do? Are his paintings meditations on false nostalgia? Maybe, but I doubt he would have agreed. Gallery owners recognized the vapidity of his work, so he had no credibility. He had commercial success but it was due more to marketing than craftsmanship. One can argue that millions of people find his work visually appealing, but millions more find pornography visually appealing. I'm not trying to argue that Kinkade isn't art, but I'm not trying to argue that pornography isn't, either.

The anti-arbitration memes have given the practice an unfair rap, which has in a perverse way contributed to a self-fulfilling prophecy that was ultimately bad for consumers. Arbitration clauses were added to consumer contracts primarily as a means of preventing class actions, not as a cynical way to rig outcomes. We can argue over whether limiting class actions is all that noble a goal, but I can assure you that they aren't initiated by aggrieved consumers but by lawyers who figured out that if consumers were being bilked out of 50 cents' worth of Cheerios for every box sold, 1/3 of the total payout would be beaucoup bucks. So they file a class action representing anyone who bought Cheerios during the year their scales were defective, and millions of consumers get dollar-off coupons while the lawyers take home a third of the total settlement value.

Anyway, studies came out showing that consumers lost a disproportionate percentage of arbitration cases as compared with regular court cases, and people concluded that this must be because the companies choose arbitrators they know will rule in their favor and who have a financial interest in not biting the hand that feeds them, and since proceedings are secret they don't even have to face public scrutiny. This was a convenient explanation, but someone looked harder at the numbers and found that the study showing arbitration was a raw deal was flawed. It included all cases heard under consumer arbitration clauses, not just consumer-initiated ones. And the bulk of these cases were debt collection claims filed by credit card companies against people who didn't pay their bills. In other words, the numbers were skewed by claims that were vastly different from what one thinks of as "consumer claims", and that would have had the same result in a regular court.

Actually, they would have had a worse result in regular court. In almost all of these cases the debtor has no real defense, so they don't bother to fight the charges. In regular court this results in a default judgment. In arbitration, however, the arbitrators actually made the credit card companies prove their case, and it turned out that arbitrators rarely awarded the companies the full amount they claimed. So even in cases that would normally seem hopeless, arbitration was better for the consumer. And it was better for the consumer in other cases as well. I forget the exact numbers, but assuming that the odds of a satisfactory outcome are 50% in normal court, they were something like 58% in arbitration. Not a slam dunk, but not exactly strong evidence that the deck is always stacked against the little guy. Nonetheless, companies started including arbitration clauses to guard against class actions. Eventually they became boilerplate, even in contracts that had little exposure to class actions. People like Ralph Nader took notice and published studies saying that this was bad for the consumer. Consumers responded by assuming that arbitration claims were unwinnable, and stopped filing them. Companies started including more of them because they became a surefire way of preventing claims. The fact that all of this was bullshit got lost along the way.

It is thus that I present my own personal experience with arbitration, to show you how the process goes. In the winter of 2022 I was driving from Pittsburgh to Colorado to ski, and my right rear wheel started making noise around Kansas City. Suspecting that this might be a bad bearing, and having a long drive to get home, I decided to have it looked at in Denver. I used a shop my cousin's husband recommended (though I found out later that he only named it because it was close to his house). I explained the situation and that I needed it done that day, and they quoted me $2,400. Not having much of a choice, I agreed to have the work done. When I was driving back I called Subaru in Pittsburgh to get a quote for the work. $1,200. From the dealer, with genuine Subaru parts. Needless to say, I felt ripped off.

But what to do? I had agreed to the price. But upon looking at my bill, I was only given a total without an itemized breakdown. Subaru had given me more information over the phone, without my even having to ask for it. So I began looking for something to use as leverage. According to the Colorado Auto Repair Code, the shop had committed several violations, for each of which I was entitled to statutory damages of $500. The most obvious one was that they didn't provide a breakdown of parts and labor costs. When I got home I called them, knowing it was futile. I told them that Subaru in Pittsburgh quoted me the job at half the price. They said things were more expensive in Denver. I asked them if they'd match a quote from Subaru in Denver. They said no. I asked them for the itemized breakdown. The labor costs were actually reasonable, but, for aftermarket parts, they charged me more than double the list price of the OEM equivalent and more than triple the list price of the parts they actually used. This markup over OEM plus the diagnostic fee (which is usually waived if you have the work done) was responsible for the difference between their price and the Subaru price. I explained the code violations. He said that every garage he ever worked at did it that way. I told him that the law is pretty clear and that they're in violation. He told me that if I was threatening legal action he had to end the call.

So I looked at my bill and immediately found the arbitration provision. You have to inform them in writing and wait 30 days before filing a claim. I sent a certified letter explaining the situation. I received no response. 45 days later I opened a claim with JAMS, an arbitration association. The advantage of JAMS over the American Arbitration Association is that JAMS requires an in-person hearing in the consumer's county. Within a week, I got a call from the owner, who was very apologetic. I think the reality hit him that it was going to cost him somewhere in the neighborhood of 5 grand to defend this arbitration claim (in consumer arbitration, the consumer is only responsible for the initial fee). He offered me $500 plus a waiver of the diagnostic fee. I told him I wanted the difference between what Subaru quoted me and his price, plus $250 to cover my filing fee. He bristled at having to cover the filing fee, and I told him I had tried to resolve the issue with the service manager weeks ago and it could have ended there. Now it could end here, or we could take it to the arbitrator. Another week's delay and he'd have to pay $1,500 for his side of the initiation fee, which is about what I was asking. He agreed to charge back the amount I was asking for.

If it hadn't been for that arbitration clause I'd have had to go to Colorado and file suit in small claims court there. He wouldn't have had to pay any fees, and it would have been really convenient for him to defend the suit. I don't know if I would have won. I don't know if I would have won the arbitration hearing either. I do know that a Pennsylvania arbitrator deciding a case involving a local tourist who feels he was swindled by an out-of-state mechanic who knew he was in a desperate situation is not going to feel too much sympathy for the mechanic. He's also not going to be familiar enough with Colorado law to offer a sophisticated analysis of the legal issues. I might not have gotten what I was asking for, but I would have gotten something. In any event, since the expense is borne by the merchant, there's a strong incentive on their end to resolve the matter quickly. It may not be great for malpractice cases involving hundreds of thousands of dollars, but for little shit like this it works much better than the court system.

Just as a preliminary matter, I looked up some statistics at work today on the issue, and they were surprising. The average malpractice settlement is around 350k, and the average verdict is around 1 million. I thought these numbers were low, but I saw them quoted in multiple sources, so I'm going to assume they're true. I practice product liability and toxic tort law, and while I don't know the total settlement average due to the number of defendants, we usually estimate verdicts in the 2 to 3 million range for someone with cancer, even if it's an older person in bad health. There are very few verdicts we can use for comparison, but they're almost all significantly larger than this. That being said, I saw another statistic suggesting that 80%–90% of cases with weak evidence resulted in defense verdicts, 70% of borderline cases did, and only 50% of good cases did. This suggests that juries really don't like awarding damages, but when they do, they go big. In my line of work a defense verdict is highly unlikely, so 1 million may be a reasonable amount if you consider that the modal jury award is zero.

I also learned that 29 states already have tort reform that limits non-economic damages, including some big ones like Texas and California. These caps range anywhere from 250k to 1 million, but they're still significantly smaller than what you'd expect from a jury. Without non-economic damages, it's pretty hard to get to these huge amounts, since they are by nature designed to put a dollar amount on what's priceless. For economic damages to get truly large you'd have to have something like a high-earning plaintiff who is totally disabled and needs to be in a skilled nursing facility for the rest of their life, and even then I can't see it getting above 20 million or so. To show you how we'd calculate that, say you have a 25-year-old who makes 100k a year and is permanently disabled. That gives you 4 million in lost earnings assuming retirement at 65. However, if he's entitled to disability payments totaling $1,500/month, you'd deduct that, leaving you with about 3.2 million. If the skilled nursing facility costs 10k/month and he's expected to live an additional 50 years, that's 6 million, except Medicaid is covering part of that cost so you have to deduct that. Add on the medical bills and other stuff and you might get to ten million, which is steep, but nothing like 70 million for pain and suffering alone. And this isn't something the plaintiff is just going to assert out of thin air; they have to show medical bills and hire an economic expert to estimate future earnings and costs.
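To make that arithmetic concrete, here's a minimal sketch of the calculation using the hypothetical figures above (the helper functions and their names are mine, and the Medicaid offset is left out because the actual amount would depend on the case):

```python
# Rough sketch of the economic-damages arithmetic described above.
# All figures (salary, retirement age, disability offset, facility cost,
# life expectancy) are the hypothetical ones from the example, not real data.

def net_lost_earnings(annual_salary, years_to_retirement, monthly_offset):
    """Gross lost earnings minus collateral payments like disability."""
    gross = annual_salary * years_to_retirement
    offset = monthly_offset * 12 * years_to_retirement
    return gross - offset

def future_care_cost(monthly_cost, years):
    """Cost of future care before deducting what another payer covers."""
    return monthly_cost * 12 * years

earnings = net_lost_earnings(100_000, 65 - 25, 1_500)  # 3,280,000 ("about 3.2 million")
care = future_care_cost(10_000, 50)                    # 6,000,000, before the Medicaid deduction

print(f"Net lost earnings: ${earnings:,}")
print(f"Future care (gross): ${care:,}")
```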

To address your points:

  • I read an NIH study discussing defensive healthcare, and the results were inconclusive. While surveys showed that something like 75% of doctors agreed that they practice it, there was no attempt to quantify it, and the NIH study conceded that relying on self-reported questionnaires has its limitations. A study in, I think, Florida that compared cardiologists who had been previously sued against those who hadn't showed that the ones who had been sued ordered 1% more tests, but that kind of comparison has its own limitations. From what I can tell, malpractice insurance premiums are as much as 50% lower in states with award caps than in those without, but whether this has any effect on the amount of defensive medicine practiced is anyone's guess. I certainly haven't seen any suggestion that doctors in California are less worried about malpractice claims just because the risk of real whoppers is limited.

Like I said earlier, trials are rare. Something has to go seriously off the rails for a case to go to trial. While caps eliminate some of the tail risk of claims, they don't seem to reduce the total number of claims. It's worth remembering that most claims are going to settle well within any reasonable award cap. Even in states without caps, while plaintiff's attorneys may dream of huge awards, they're really a mixed blessing. A settlement offers cash almost immediately; a jury verdict means waiting months for a shot at a large judgment that may get appealed, keeping the money out of your hands for years. If you get sued as a doctor, it's more likely to be in the kind of case that settles for 200k than the kind where a jury awards a multi-million dollar verdict. The only thing I can conclude is that even if you were to make large awards impossible, as long as you allow any kind of malpractice suit the insurance companies are going to want to limit the risk, and if that means defensive medicine, that's what you're going to get.

  • What you're describing here already exists, in a way. They're called arbitration panels. Arbitration is a form of alternative dispute resolution where an arbitrator or panel of arbitrators is selected by the parties to hear the case and make a determination. The arbitrators are attorneys who have experience in the relevant area of the law. The way it would work in a malpractice action is that if a neutral arbitrator is required, an independent agency like the American Arbitration Association would provide the parties with a randomly-selected list of 15 medical malpractice arbitrators. Each side would get to reject 5, and the arbitrator would be selected randomly from among the remaining names. If an arbitration panel is needed, each side would appoint its own arbitrator, and the two arbitrators would agree on a third neutral party. The procedures are much more informal than in court. Discovery is limited, the rules of evidence don't apply, and the arbitrators may limit what testimony they'll allow and even whether you can cross-examine witnesses. For instance, instead of taking depositions you'll get the relevant fact witnesses to submit written statements, and the expert witnesses will submit their reports and that will be the end of it. There is no right to appeal.
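Since the strike-and-select mechanics are easy to misread, here's a toy sketch of the process just described; the 15-name list and 5 strikes per side come from the example above, and everything else (the placeholder names, the helper function) is invented for illustration rather than drawn from any actual AAA procedure:

```python
# Toy illustration of neutral-arbitrator selection: a 15-name list,
# 5 strikes per side, random draw from whoever is left.
import random

def select_neutral_arbitrator(candidates, plaintiff_strikes, defense_strikes,
                              strikes_per_side=5):
    """Each side strikes up to `strikes_per_side` names; the arbitrator
    is then drawn at random from the remaining candidates."""
    assert len(plaintiff_strikes) <= strikes_per_side
    assert len(defense_strikes) <= strikes_per_side
    remaining = [c for c in candidates
                 if c not in plaintiff_strikes and c not in defense_strikes]
    return random.choice(remaining)

panel = [f"Arbitrator {i}" for i in range(1, 16)]  # randomly-selected list of 15
plaintiff_out = random.sample(panel, 5)            # each side rejects 5
defense_out = random.sample(panel, 5)
print(select_neutral_arbitrator(panel, plaintiff_out, defense_out))
```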

Mandatory arbitration isn't something you have to wait around for the state legislature to impose; doctors and hospitals can and have put arbitration provisions in their patient care agreements. If the patient doesn't like it, they can choose another doctor or go to another hospital. But these provisions are actually becoming less common than they were a couple of decades ago. Why? Because the average settlements end up higher.

For whatever reason, arbitrators (and judges) love splitting the baby. With juries it's all or nothing. With arbitrators, it's as though they calculate the damages and then scale the award based on how strong they think the case is. They aren't going to give out a bonanza under any circumstances, so the ceiling is lower compared to juries. But the floor is higher; a weak case that would result in a defense verdict at trial is going to result in at least some award in arbitration, even if the award is small. And since the process is significantly less expensive than litigation, the whole calculus changes. If I go to a traditional trial I'm going to spend a ton of my own money in exchange for, at best, a 50/50 chance of getting a favorable verdict. In arbitration, the marginal cost of going all the way is lower, and the chance of walking away with something is higher. There's less of an incentive to settle, so if the defendant wants to make the case go away he's going to have to offer something close to what he expects the award to be. Realistically, though, in arbitration the plaintiff has no real motivation to settle, so what you end up with is an arbitration award that's more than you would have paid in a traditional settlement, and since the process is so frictionless for plaintiffs, they're going to file more suits.

Now, you could say that you meant that this panel should include doctors and not lawyers, or maybe a combination of the two, or maybe that you didn't mean arbitration but a more formal system like trial but with an expert panel instead of a jury, or whatever. Just keep two things in mind. The first is that the system is designed to compensate people for injuries, not to make things easier for doctors. The effects of malpractice suits on medicine are unfortunate, but as long as we believe that people who are injured by malpractice are entitled to compensation, they will persist. You may think certain cases are bullshit, but the plaintiff is still suffering, and I'm saying this as part of the defense bar. The other thing to keep in mind is that there's no reason to believe that some alternative fact finder is going to do better than a jury. You can change things, but you may not like the result.

Tort reform is less low-hanging fruit than a buzzword that refers to a set of vague policy ideas that have only a tenuous relationship to actually reducing the number of lawsuits. The exception is when it's done really aggressively, in which case it pretty much bars all lawsuits except the few that meet stringent criteria. Most of this is based upon a myth that what's driving these costs isn't just lawsuits but frivolous lawsuits. And yeah, any News of the Weird type publication can show you all kinds of examples of clearly frivolous suits, but these are a distinct minority, especially in medical malpractice litigation.

The thing about medical malpractice and most other personal injury suits is that they're already expensive to litigate. Plaintiff's lawyers aren't going to take a case unless the damages are enough to make it worthwhile. Just to start, you're going to need medical records, which are going to cost hundreds of dollars and can easily run into the thousands if there was a lengthy hospital stay. Then you need attorney time to go through those records. You need to depose witnesses and order transcripts; you're looking at deposing at least the doctor you're suing, the plaintiff (or other damages witnesses if the plaintiff is deceased), and possibly other medical personnel. Once you've gathered this information, it's useless unless you have an expert who can explain to the jury why the doctor's conduct deviated from the standard of care, so add another 10 grand or so for an expert report.

And this is all just to get to the point where you can talk settlement with the defense. If the case actually goes to trial, tack on another $60,000–$100,000 in time and expenses to see the case through to verdict. The upshot is that very few plaintiff's attorneys are willing to take on "frivolous" cases. Ideas like imposing the English rule, where an unsuccessful plaintiff has to cover the defense costs, are ultimately irrelevant in a legal environment where 99.9% of cases are settled before trial. It may make some defendants more likely to take a chance on borderline cases, but there aren't many of those.

Not very, since bar associations are professional groups without any power. Attorney qualifications are usually set by the state supreme courts, and the Federal government only requires that their lawyers are barred in one state or DC, so unless everyone is on board it will only lead to inconvenience.

With gun ownership, I think the discrepancy can be traced more to the proliferation of hobbies that began in the late 1960s. For my grandfather's generation, if you were an outdoorsy person and wanted a hobby you were pretty much limited to hunting and fishing, as well as day hikes. These days we also have backpacking, mountain biking, whitewater paddling, rock climbing, xc skiing, and other stuff to choose from, which all require a significant investment in time and money. I'd hunt if I had unlimited time, but since I have to work for a living every day I spend hunting would be a day not spent hiking or on the bike, and when you add normal social obligations and chores into the mix that's not a lot of days to begin with.

In the meantime, gun ownership has turned into a hobby of its own. My grandfather owned a lot of guns, but they were all for hunting. I don't think he even owned any pistols. Hell, prior to the 1990s it was difficult, if not impossible, to carry concealed in most places. Now I have friends who own a lot of guns, and whose participation in the hobby seems to end there—they don't hunt and I never hear them talk about going to the range or anything like that. So diversification takes away a large part of the traditional base of gun ownership but adds a new base for whom acquisition is more important than having a specific use. If I'm a hunter in 1965 I probably only need a 12 gauge and a deer rifle and maybe a .22. Now it's de rigueur to own an AR even if the real-world applications are limited.

I find it hard to believe that the Steelers will win this game. In the past I talked about defensive strategy and how to contain Barkley, but I made the mistake of analyzing this game the way a normal person would analyze it. The Steelers had their trap game against Cleveland a few weeks ago, and it had all the classic trap storylines: short week, divisional opponent, road game, playing a team that's so bad that beating their principal rival at home is akin to winning the Super Bowl, dreaded Thursday game, coming off a hard-fought victory against your principal rival and main competitor for the division title (I'm not including the weather because it wasn't bad until the second half, and the Steelers outplayed the Browns then anyway). This isn't so much a trap game as it is a game where the benefits of winning don't necessarily outweigh the costs of losing.

First, we have to talk curse. The Steelers haven't won in Philadelphia since October 24, 1965, when they beat the Eagles 20–14 at Franklin Field. It was one of two Steelers victories that season. TV announcers like to point out that the Steelers have only had three coaches since 1969; what they don't tell you is that they were firing coaches all the time before that. The coach in 1965 was some guy named Mike Nixon, who was only head coach for a single 2–12 season. The quarterback for the Steelers was Bill Nelsen, who passed for 79 yards, a touchdown, and an interception. The whole team had a whopping 139 yards of total offense, and won thanks to two defensive touchdowns. The Steelers were somehow 8-point favorites in this game despite being 0–5 going into it, though Philly wasn't much better at 1–4. The Steelers would lose in Philly in each of the following two seasons, but that was when they were perennially terrible and expected to lose. In 1969 they hired Chuck Noll, drafted Joe Greene, and commenced the only true rebuild in franchise history.

It was that 1969 season when the curse began. The Steelers were getting pasted, and rookie Joe Greene was pissed that he was being held all game and not getting flags. In his frustration, he grabbed the ball from the center before it was snapped and threw it into the stands. In the ensuing decade the team would win four championships and go from Same Old Steelers to Super Steelers. But the cost of this was that they wouldn't win in Philly again. To be clear, this isn't some universally accepted curse, since I just made it up two days ago, but I like the idea that everything is controlled by curses and football gods, so I have quite the repertoire of phony curses that I've invented to explain nearly every misfortune in the sports world. The idea of a curse is bolstered, though, by the teams that actually lost in Philadelphia. When I first heard about this losing streak, I assumed that maybe it was just luck, that they happened to be scheduled in Philly during seasons when the team was having an off year. Not so.

The Steelers last played there in 2022, when they were starting Kenny Pickett and the Eagles were Super Bowl contenders, so that one makes sense. But others don't. In 2016 they suffered the biggest loss of Tomlin's career (31 points) in a season in which they made the AFC Championship game. In 2008 they won the Super Bowl but lost 15–6 in Philly while Ben got sacked nine times. The 1991 and 1970 teams that lost there weren't good, but the 1979 team also lost despite winning the Super Bowl. Since the Steelers won their first championship, 2 mediocre teams have lost there, 2 were AFC Championship game participants, and 2 were Super Bowl winners. At this point I'm almost hoping they lose, because it means a deep playoff run.

More seriously, you have to look at the Steelers' situation and consider what's at stake. They have a two-game lead over the Ravens for the division, and a win in Baltimore next week will likely clinch it. They probably aren't getting the bye, regardless of what happens. George Pickens is questionable with a hamstring injury, and it's hard to see them scoring enough points against the league's best defense without him. Why rush him back when it's more important that he play against the Ravens? Let Russ fling the ball around to Scotty Miller and MyCole Pruitt and see what happens. Tell Joey Porter Jr. to give AJ Brown the Ike Taylor cushion so he gets used to covering guys without tackling them. Run shit from the back of the playbook just to show other teams that you're willing to run it so they can waste time preparing for it. If they go balls out they might lose anyway and head into Baltimore on a short week tired and banged up (followed by Kansas City on an even shorter week). The Eagles, meanwhile, are looking for a bounce-back after almost losing to the lowly Panthers, and beating the Steelers would be a huge statement that they're for real. They've already got the division more or less wrapped up and they have a legitimate shot at the one seed, so this is a game they want to win. The rest of their schedule isn't difficult. Eagles should win this one easily.

Jazz doesn't have any cachet among the general public, I'll grant you that. But it does have cachet among critics and musicians, and I think that's where the problem lies.

From what I've read it looks like he was riding Greyhounds around and one happened to stop there.

Tagging @SteveKirk and @birb_cromble. Whether he has the ID and gun (and manifesto) on him absolutely matters. Remember Brian Laundrie? Every time a high-profile suspect is at large there are going to be sightings, and most of these are going to be spurious. They aren't going to be able to hold a guy for murder based on the ID of a McDonald's employee in another state who had never seen the guy before. I don't even know how they got PC for the search, though I wouldn't rule out that he consented, and he already misidentified himself to the police, so he's not exactly making all the right moves here. But even still, had they searched him and come up empty they wouldn't have been able to arrest him; he's being held on gun charges (and also misidentification and forgery, though those alone wouldn't be enough to hold him). He hasn't been charged with murder yet, but when he is, the ballistics evidence, fake IDs, and manifesto will all be key pieces of evidence against him. Without that, right now we're looking at "this guy looks kind of like the person wearing a mask in the videos" (and one without a mask), which is a much tougher case to make.

It's not so much that we don't want smart people or independent thinkers as it is that we don't want overly opinionated people who will fuck up the deliberation process. A jury full of relative simpletons isn't a good thing, because they won't want to pay attention, won't be able to understand the testimony or jury instructions, and will instead just rely on whatever biases they have. The Chauvin jury was composed almost entirely of people with professional or managerial backgrounds. What we're trying to avoid is the kind of person who is overly opinionated and unwilling to work with the other jurors. We need people who can deliberate, not just voice their opinions. If 1 juror gives the other 11 the impression that he isn't fully invested in deliberating and has already made an unchangeable decision, all it's going to do is piss off the other jurors and increase the chances of a hung jury.

That brings me to another aspect of your plan that was faulty: the presumption that you would be able to hang the jury on your own. Hung juries are almost always fairly evenly split. If you find yourself in a room with 11 people who are voting to convict after several days of deliberation, then it's unlikely that they're doing so purely for political reasons. If you haven't turned at least a few members around in that time, then you're probably wrong, and unless you're a total moron, you'll probably come around yourself. In a high-profile case such as this, there is going to be a lot of pressure for a verdict, and the judge isn't going to send everyone home just because you say you're deadlocked; the system is willing to keep you there a lot longer than you might think.

I think it would be fascinating to hear about how trial lawyers approach selection in a big case like this one.

I can't speak for big cases, and there are differing theories, but a few general truisms hold. Basically, I aim to have a discussion with prospective jurors, not an examination. In big trials they might interview the jurors individually, but most of the time they bring them in 10 or 20 at a time. I'll start by making a general statement that I expect most people to agree on, just to get people comfortable with raising their hands. There will inevitably be someone who doesn't raise their hand, so I'll pick on that person first to see why they don't agree with everyone else (it's usually because the person is incredibly shy). From there, I try to focus on open-ended questions that don't suggest an answer and give the prospective juror a chance to elaborate on their views. I try to avoid anything that can be answered with a simple yes or no.

For example, in this case I might ask "In the past several years there has been a lot of discussion about how people are increasingly feeling unsafe on public transit. What do you think about that?" And this is where @ArjinFerman's comment ties in. Most people will speak freely about controversial subjects during voir dire. Most people will offer opinions that have the potential to get them booted. You don't know what my trial strategy is or what evidence is going to be presented. You haven't read all of the other jury questionnaires. You don't know where I'm going with my questions. If you think that straddling the line between both sides is going to work, you'd better be sure that you know what the sides actually are. If I'm the prosecutor on this case, I'm not trying to get a bunch of woke-ass do-gooders on the jury, because that isn't going to happen. I've probably accepted the fact that the jury pool is frustrated about erratic behavior on the subway and is sick of having to deal with it. Yeah, some people are more liberal, but they're going to be outspoken and probably get the boot from the defense. I'm trying to craft an argument at trial that acknowledges Mr. Penny's right to intervene but that the problem was in the execution. The only question is whether I think you're willing to accept my argument, and you don't know my criteria for that.

Just because it came off as competent based on initial reporting doesn't mean it was competent. He committed murder in one of the most heavily surveilled parts of the country. His entire stay in New York was known and public within 48 hours of the murder, and he was caught within 4 days. The only thing competent about this murder was that he wore a mask and nondescript clothing and left the area fairly quickly. Just because he wasn't a complete moron doesn't make him a criminal mastermind.

This creates a conflict between the interests of the individual and the interests of the state, and it comes up much more often than you'd think and has probably affected you at some point. Consider the following: A runs a stop sign, causing an accident that totals B's car. A policeman on the scene finds A at fault and issues a ticket for running the stop sign, the penalty for which is a $100 fine and points on the license. A pleads not guilty because he wants to avoid the points, and it's customary for the state to agree to drop the points in exchange for a guilty plea where the defendant only pays the fine. A enters his plea a week after the accident, and the court schedules a hearing for two months after the accident.

Meanwhile, B is without a vehicle and files a claim with A's insurance company. She is relying on the insurance payout to buy a new car, which she needs to get to work. Since the civil claim is rolled into the criminal matter, however, the insurance company can't pay out until the ticket is resolved, which it won't be for two months. Furthermore, B now has to be ready to present evidence at trial, since she doesn't know whether A just intends to take a deal or plans to argue that he didn't actually run the stop sign. Plus, there's always the risk that the cop just doesn't show up and she's the only witness available to testify, so she has to show up lest the whole matter be dismissed.

So now B is stuck waiting months for an insurance payout that A's insurer would otherwise have just paid, and everything is far more complicated than it needs to be.

If over half the jury pool was rejected for spurious reasons then it doesn't sound like it was that easy to get on. I'm assuming you answered the voir dire questions honestly.

Wrongful death is a creature of statute, and as such the statute defines who has standing to sue. A rough approximation is that you'd have standing if you'd be entitled to inherit under the state's intestacy law.

My guess is that there were one or two people on the jury who saw Penny as a hero who wouldn't agree to any conviction no matter how minor.

It doesn't look like it played out that way. If 10 or 11 of 12 are willing to convict, then they aren't going to decide to acquit because of 1 or 2 people, especially not so quickly after the higher charge was dropped. If the jury was mostly willing to convict the guy of manslaughter after days of deliberation, I don't see 2 people turning around the other 10 in a couple of hours. This looks more like most of the jury wanted to acquit but one or two holdouts wanted a conviction. Dropping the manslaughter charge may have signaled to the jury that the prosecution didn't really believe in its case, which may have been enough to flip those holdouts.

I would have been one of those people, and I would have been more than willing to hide my power level during jury selection.

I'm genuinely curious; what makes you think you could hide your power level during jury selection? How do you think you could accomplish this?

I'm with @FiveHourMarathon on this. Trust me, a big house isn't all it's cracked up to be, especially if you don't have kids. From 2017 to 2023 I lived in a rather large house and used it thus:

  • The entryway led into a large combination living and dining area. The dining side had a sliding glass door that led out onto a patio. The living side was 12 feet from the wall to the back of the couch. While this is actually a little closer than at my current house, I don't think anything is gained by more distance. Actually, a bit is lost because now I have to keep my stereo speakers wider to maintain proper imaging in the sweet spot. The dining side was probably a tad smaller than a standalone dining room would have been, but it was still big enough to fit a sideboard and had the added advantage of making it easy to eat in front of the TV without feeling like a piece of shit.

  • There was a reasonably sized eat-in kitchen that the dining area made redundant. I seldom ate in there, and the kitchen table became a pile of junk mail and grocery bags. Having multiple dining areas is unnecessary; in houses that have both a formal dining room and an eat-in kitchen, the kitchen isn't really a place where you can have a dinner party, but I've noticed a trend toward just making a larger kitchen where the dining area can be as formal as it needs to be for most people. I'm not going to complain about having both, but if you have both, one or the other is rarely used for its intended purpose.

  • There was another, smaller living area with a small fireplace. I put my bookshelves in here along with a smaller stereo and used it for reading. I entertained people in here on exactly one occasion (excepting larger parties where people can go anywhere) when I invited some friends over for drinks after going out to dinner and I wanted to have a fire.

  • The master bedroom wasn't particularly large but since I only used it for sleeping it didn't matter. A king-size bed would have been cramped once you included dressers, BUT it had a walk-in closet the size of a small bedroom. The idea clearly was that all the clothing storage/laundry/dressing would be relegated to the closet, and that's how I used it. With a queen and a couple nightstands the bedroom was quite roomy. Now, some of these larger houses have master bedrooms that are big enough to have their own separate sitting areas with couches and televisions, but I don't really see the point in this. To cosplay living in a studio apartment?

  • Bedroom 2 was used as a home office. This was necessary since I was working almost exclusively from home for most of the time I had the house, but if I weren't then I would have preferred to have the computer in the small living room.

  • Bedroom 3 was used as a guest bedroom for the once or twice a year I had overnight guests.

  • Bedroom 4 was a junk collector.

  • There was a powder room off the entryway that was used frequently and a full bath upstairs apart from the master that was used rarely. I once thought about asking my girlfriend to move all of her hair stuff, etc. into the other bathroom but decided against bringing it up because a) she didn't live there so it's not like there was a ton of it and b) the master had a jack and jill so it would seem a bit ridiculous.

  • It had a finished basement that contained a couch that the prior owner left there and a bike trainer. I had a small stereo to listen to while on the bike and a 40" TV for Zwift, but that's about it. I only used the bike trainer in the winter. No one ever sat on the couch. The room was primarily used as a way to get from the garage to the upstairs. The basement also had a separate laundry/utility area. I couldn't feasibly use it as a junk collector because, if I entertained, guests had to go through there to get to the kegerator.

  • I had a two-car garage. I used one side for storing my car and the other side for working on bikes, except that I let my girlfriend park there when she stayed over in the winter.

At the time I was living there, the house seemed entirely too large. Cleaning it was a pain in the ass. Heating it was a pain in the ass. Cooling it was fine, but I only turn my AC on if it's going to be above 85 for more than a few days, which in Pittsburgh is only a couple times a year. If I lived in a hotter area or was more sensitive it would have been a pain in the ass. I was able to find use for all the space, but I'd be lying if I said I used it all that much. I bought the house because the price was well below what one would expect due to certain topographic complications involving the lot. For one person, it felt huge.

How big was it? About 2,000 square feet. For someone with kids, it would have been fine. I could understand going a little bigger. My uncle's house is 2600 square feet and it seemed more than big enough for three kids. But 4,000? Larger living rooms just put you farther away from the TV. Larger bedrooms add nothing. A larger kitchen does nothing once you have sufficient counter space. There are only so many rooms you can hang out in. The house I'm in now is about 1400 square feet and I don't see any appreciable decrease in my standard of living.

When I said that jazz was difficult, I didn't mean to imply that it wasn't easy on the ears. It's difficult in the sense that it's hard for someone accustomed to pop music to appreciate, especially if they don't have any musical training. One common complaint I've heard from friends who listen to rock and try to get into jazz is that they like the part at the beginning where everyone plays together, but they get bored by the endless soloing. When I tell them that the solos are more or less the whole point (not entirely true, but you get my drift), they give me an odd look. If you're used to structured music with to-the-point melodies and solos that don't go on for more than 8 bars (giving you a taste of possible variation but not getting off track), it's understandable why someone blowing out 5 choruses followed by another guy doing the same thing may seem tiresome. I don't think it's a coincidence that people who already like jam bands tend to also like jazz.

As for Bright Size Life, I bought that album on vinyl at a used record shop back in the 2000s, and when I took it home to play I noticed that someone had stashed ripped-out pages from a porn magazine in the sleeve. I decided to hang on to them as an investment and they're still there to this day. And how can you say that this is Jaco's best work and not his solo album? It's obviously very good, but he's clearly a sideman here and doesn't get to show his full potential. Pretty much every track on Jaco Pastorius exhibits a new possibility for what the bass guitar can be, particularly "Portrait of Tracy". And "Opus Pocus" is probably the only example in recorded music of menacing-sounding steel drums.

I'm glad you mentioned Laufey because I wanted to make a post about her shortly after that video came out but never got around to it. Anyway, I think that a big part of the problem when it comes to determining whether or not something qualifies as jazz is that, like with most genres, it's hard to define jazz to begin with. Most books on the subject start with a perfunctory description that goes something like this: Jazz is an African American music, is based in the blues, is heavily improvised, and relies on a swing feel. But none of these elements are exclusive to jazz, and none applies to all jazz.

  • African Americans were obviously central to the development of jazz throughout its history, and continue to be central in the present day. But people of all races and ethnicities have participated in its development as far back as New Orleans, and since then it has seen continued development from musicians across the globe, from Europe to Africa to Japan. And it isn't especially popular among African Americans today, much as it isn't especially popular among any demographic group, at least in the US.

  • We think of jazz as having derived from the blues and being closely associated with it, but what is the blues, exactly? A scale? A lyrical style? A feeling? An attitude? Trying to define the blues is fraught with the same problems as trying to define jazz. And whatever the blues is, there's plenty of jazz out there whose incorporation of it is arguable at best, and plenty else with no discernible blues influence whatsoever. Not only that, but pretty much every style of American music, from rock to soul to country, derives from the blues to the same degree that jazz does, so this isn't exactly a unique feature.

  • The level of improvisation in jazz runs the gamut from styles that are almost wholly improvised (such as free jazz) to styles with little to no improvisation at all (much of the very early New Orleans stuff wasn't improvised). And there's plenty of other music from around the world that's improvised; even classical music, the composed music par excellence, often includes improvised cadenzas.

  • Swing presents even more problems than the blues does when it comes to defining it, and titans such as John Coltrane and Duke Ellington have openly questioned its necessity to jazz.

I'm not going to belabor the point by going into too much detail, but writers have added components to this list, such as the use of certain forms, certain harmonic devices, a vocalized tone, rhythmic elements, a focus on an individualized sound, and so on. It's never clear what role all of these features are supposed to play, though, or whether there's some magical combination that is distinct, if difficult to describe. The focus of more contemporary critics, Neely included, tends to be on participation in a "jazz tradition", defined by the scene and by the audience. As far as Laufey is concerned, the argument is that she isn't jazz because she isn't part of the community of jazz musicians (she didn't get her start in jazz clubs, she doesn't collaborate with established musicians, etc.) and because she doesn't attempt to appeal to a jazz audience. Her press comes from mainstream publications, not jazz critics. This is all, of course, independent of what her music actually sounds like.

For all of their faults, any of these approaches allows us to come up with a reasonable, or at least workable, definition of jazz. And then someone insists that Kenny G is a jazz musician and the arguments start flying. Mr. G is, in fact, an acoustic musician who plays the soprano saxophone, a traditional jazz instrument most closely associated with Sidney Bechet and John Coltrane, two undisputed jazz legends. He has a distinct sound, at least occasionally plays with a swing feel, has at least some blues "feeling", came up on the edge of the jazz tradition (he got his start as a teenager in Barry White's band), etc. These definitional exercises are useful, but they have their limits.

For the record, I'm not going to argue that Kenny G is a jazz musician; his style is more adequately described as instrumental pop. But his connection to jazz is more direct than one would think. If we trace the lineage back, we find two converging streams. Before going solo, Kenny G played with the Jeff Lorber Fusion, who were peers of the later-period fusion acts like Bob James, Earl Klugh, and Fourplay, who trace their lineage back to people like Joe Sample and Donald Byrd, who played a more smoothed out version of the soul jazz of Shirley Scott and Richard "Groove" Holmes from the late 1960s, who in turn are successors to organ grinders like Jimmy Smith from the early '60s, who came out of the more blues-oriented wing of hard-bop, which is jazz qua jazz. On the other hand, his solo work is the direct successor to the instrumental pop of the 70s, which was popularized by people like George Benson, Maynard Ferguson, and Chuck Mangione. But these people started their careers as straightahead jazz musicians before chasing pop hits. They did this, at least in part, in response to Wes Montgomery's recordings of contemporary pop songs for Creed Taylor's CTI label in the late 1960s. Montgomery is a jazz legend in his own right, though his CTI recordings aren't of any particular interest to most jazz fans. And then there's John Klemmer, whose 1975 album Touch is probably the most direct progenitor of the Kenny G sound (song titles like "Waterwheels" and "Glass Dolphins" say it all), but I don't know where he fits into all of this.

To say that the situation is complicated is an understatement, but when trying to define and discuss genres and influences everything is bound to be complicated, because influences come from all directions. The remarkable thing about the whole Kenny G controversy, though, is that his audience didn't seem to care whether he was considered a jazz musician or not. I grew up during the height of his popularity, and while he was far more mainstream than Laufey will ever be, no one really seemed to love his music. His audience, to the extent that he had one, was the kind of person who didn't pay too much attention to the music they listened to. He was background music for people who only listen to music as background. He was played on radio stations that marketed themselves as "the station everyone at work can agree on", which eventually evolved into a "smooth jazz" format that revolved around Mr. G himself. He was at his most ubiquitous at catered events; anyone who attended a wedding, football banquet, or charity fundraiser in the 90s would be forced to suffer through his mindless wailing during dinner, and on at least two occasions I heard the DJ announce to the room that "this is Kenny G and gosh, doesn't this make great dinner music" in a patently self-congratulatory manner.

The problem with Kenny G, from a jazz fan's perspective, wasn't that his fans were impostors who were misidentifying themselves, but that the music industry was incorrectly marketing him as a jazz musician. It wasn't so much that he was being proclaimed the Savior of Jazz the way Laufey is (he wasn't), or any other active attempt, but the more subtle, lazy stuff. His recordings were sold in the jazz section of record stores. He won Grammys in jazz categories. He appeared on the Billboard jazz charts. Stuff like that. This might not have been so much of a problem, but for all of his popularity, there was a significant backlash among jazz fans and non-fans alike. As my friend's dad so eloquently put it when I suggested that his son could be as famous as Kenny G if he practiced his instrument enough: "Kenny G doesn't play the saxophone; he plays the kazoo". Or as my own father put it when I asked about the context of his breaking the record for the longest-held note (Did he make a recording of it? Did he bore a concert audience for 45 minutes?): "He was in a music store and got his horn out and started playing it. A crowd started to gather around him, and after 30 seconds they started throwing stuff at him".

This backlash created a concern that jazz would only be marginalized further, as potential fans would be turned off by association. As guitarist Pat Metheny famously put it when describing why jazz wasn't popular in the United States:

Well, I can understand why a lot of people say they don't like jazz because right now sometimes you say the word jazz and people think of some of the worst music on earth, like, for instance, Kenny G. I mean, you know there's nothing more stupid than that. Let's face it, that's the dumbest music there could ever be. In the history of human beings there could never be any music worse than that and now people think that that's what jazz is. Well, that's not what jazz is at all. Jazz is, at its best, the most incredible music; it's just that like rock and roll 95% of it really sucks. It's just the really good stuff that's really great, and that's exactly the same in pop music... It takes a little bit longer to discover the good stuff in jazz because you go into the record store and there's so many records there you don't even know where to begin. It's good to find somebody to help you learn about jazz, somebody who knows about it.

Metheny (who is, in my opinion, the greatest guitarist of all time, period) made these comments in 1987 as part of an interview for an Iron Curtain-era Polish children's program, so this was well before the mainstream backlash in the US started, and it is unlikely that anyone here would have heard them contemporaneously. But when the video clip was posted to the internet in 2000, Pat was asked about it on his website, and his response stands as one of the greatest takedowns in internet history, so brutal that it deserves to be read in its entirety. On the question of whether Kenny G is a jazz musician, Metheny writes:

Jazz musicians and audiences have gone so far as to say that what he is playing is not even jazz at all. Stepping back for a minute, if we examine the way he plays, especially if one can remove the actual improvising from the often mundane background environment that it is delivered in, we see that his saxophone style is in fact clearly in the tradition of the kind of playing that most reasonably objective listeners WOULD normally quantify as being jazz. It’s just that as jazz or even as music in a general sense, with these standards in mind, it is simply not up to the level of playing that we historically associate with professional improvising musicians. So, lately I have been advocating that we go ahead and just include it under the word jazz – since pretty much of the rest of the world OUTSIDE of the jazz community does anyway – and let the chips fall where they may.

And after all, why he should be judged by any other standard, why he should be exempt from that that all other serious musicians on his instrument are judged by if they attempt to use their abilities in an improvisational context playing with a rhythm section as he does? He SHOULD be compared to John Coltrane or Wayne Shorter, for instance, on his abilities (or lack thereof) to play the soprano saxophone and his success (or lack thereof) at finding a way to deploy that instrument in an ensemble in order to accurately gauge his abilities and put them in the context of his instrument’s legacy and potential.

Laufey, however, faces the opposite problem. Jazz fans, critics, and musicians don't have a problem with her music. No one is making fun of it, and it would be hugely surprising if she ever faced the same kind of backlash that Kenny G has faced. Neely isn't concerned that categorizing Laufey as jazz will tarnish the reputation of the genre; he's concerned with definitional integrity. There's a tendency among some groups to view genre labels as an indicator of quality, or lack thereof. The whole "poptimism" debate of the 2010s was an attempt to rectify what was seen as decades of denigration of pop music. By the 1970s, rock had established itself as serious music for serious people, and anything that didn't meet certain criteria could be casually dismissed as "pop" music and ignored entirely. We don't need to consider the artistic merits of Paper Lace or Bo Donaldson & the Haywoods because it's accepted as axiomatic that they don't have any. When a pop artist does come along that we feel has value, like Madonna, we can retroactively define her as rock and lobby the Rock and Roll Hall of Fame for her induction to legitimize our opinion.

What the poptimism movement sought to do was remove the idea of genres as value indicators. It may be pop music, but there's nothing inherently wrong with that, and it doesn't excuse the critic from evaluating it on its own terms. In later iterations, the concept was taken by some as a license to assume that popularity was synonymous with quality, and a backlash set in. The ultimate problem was a misinterpretation of the logic: if a critic gives a pop album a bad review, is it because they took it seriously and rejected it, or because they're simply refusing to take it seriously? And how do you show that you're taking pop music seriously unless you're heaping praise on pop albums? It didn't help matters that some "rockist" critics (the term is either proud or pejorative, depending on who is making the argument) stubbornly clung to the old paradigm that pop artists simply weren't deserving of serious treatment. As more serious people like Ted Gioia got involved (he said poptimism had caused music criticism to devolve into lifestyle reporting), poptimism's influence waned. By this time, Zoomers were starting to come of age, wholly influenced by the idea of gatekeeping.

So Neely, or I, or anyone from our generation doesn't understand why Laufey has to be jazz. People born between 1981 and 1995 aren't supposed to view genres as quality indicators. Laufey being jazz does not mean that Laufey is good. Kenny G might not be jazz, but even if he is, he still sucks. But there's also a bit of bullshit to this argument. Jazz is difficult. For most people, Dixieland conjures up images of old cartoons, and the big band era brings to mind senior citizens. Prewar jazz has been reified to a degree that makes appreciation among the youth difficult, if only due to its cultural connotations. Postwar jazz is too esoteric; it developed at a time when the music was moving further away from mainstream sensibilities and toward the avant garde. Even at its most accessible, it involves harmonic structures that are quite different from most contemporary pop music, and the centerpiece is long improvisations that require close listening to fully appreciate. The upshot is that if someone who mostly listens to Thelonious Monk and Anthony Braxton tells you that the new Taylor Swift album is good, it carries different weight than someone who listens to Ed Sheeran and Lady Gaga telling you the same thing.

The argument over Laufey is even further complicated by the fact that singers have always had an uncertain status within jazz itself. More recent singers like Dianne Reeves have managed to stay firmly within the jazz camp, and older ones like Sarah Vaughan and Blossom Dearie mostly have, but even stalwarts like Ella Fitzgerald have trouble staking a solid claim. Neely places Laufey in the traditional pop category, but pretty much everyone in that category shares the same uneasy relationship. One problem is that most of these singers came to prominence in an era when American musical theater provided most of the repertory, whether in jazz or in pop music, and the boundaries between the two were much more blurred prior to the mid-60s. The other problem is that improvisation is an important part of jazz, while singers are expected to sing composed melodies with composed lyrics; there isn't a ton of room to maneuver. Scat singing was developed as a sort of workaround, but it's still hard to see the voice as an instrument on par with, say, the trombone unless you're willing to make certain allowances.

A singer like Frank Sinatra is a case in point. Is he jazz or traditional pop? He first came to prominence with the big bands of Harry James and Tommy Dorsey, so he has a jazz background. But that was at a time when jazz was America's Popular Music, and the material done with singers was often "sweeter" than the purely instrumental stuff. When he went solo in the 1940s, the Axel Stordahl arrangements didn't try to be jazz, and by the early 50s he was largely a novelty act. When he resurrected his career with Capitol beginning in 1953, he was most often paired with either Nelson Riddle or Gordon Jenkins. Riddle favored more jazz-oriented arrangements, while Jenkins had a lush, Hollywood style. He'd use a variety of arrangers throughout the rest of his career, some more jazz-oriented than others. But in the 1960s he made a famous string of recordings with the Count Basie Orchestra that are unquestionably jazz, and he'd record an album with Duke Ellington in 1968. He was never a great scat singer. Was Sinatra a jazz musician? Sometimes.

So I've now reached the part of the essay where it's customary to offer my own opinion on whether Laufey is jazz, but I'm not going to do so, because I don't know. I will say this: I'm not going to cynically toss her into the deep end of the jazz pool the way Metheny does with Kenny G in order to force comparisons with Dianne Reeves or Sarah Vaughan; I don't know what that would accomplish. I'm not going to say she's traditional pop rather than jazz, because those lines have always been blurred more than Neely leads his viewers to believe. And I'm not going to say she isn't jazz because she isn't part of the jazz community, for the same reason I wouldn't say that Charles Mingus wasn't jazz if I found out that he was similarly disengaged.

I'm not going to say she definitively is jazz, either, because I haven't heard enough of her music to make that determination. From what I have heard, it sounds like she's on the periphery, akin to someone like Norah Jones or Eva Cassidy. But I'm glad that a popular performer is taking the kind of interest in jazz that suggests she actually listens to it regularly, rather than just claiming its influence to gain cultural cachet. I'm glad that at least some teenage girls are excited to hear "Misty", not as an academic exercise but as a genuine emotional experience. I'm glad that a popular musician is treating the voice as an instrument to be explored and not as a vehicle for sub-Mariah Carey histrionics. Whatever Laufey is, I think her emergence is a good thing.

For instance, it's not at all obvious to me as a layperson why accepting a pardon would or should invalidate someone's right against self-incrimination; as I understand it, accepting a pardon is not an admission of guilt, and a person may have perfectly reasonable opposition to testifying as to their factual guilt. If a pardon doesn't stop a pardoned murderer from being compelled to state under oath and before the whole community that they murdered their housekeeper, or something, well... seems like it's a blessing with a curse.

Well, let's look at the actual language of the Fifth Amendment:

No person... shall be compelled in any criminal case to be a witness against himself

The underlying thrust of this, especially when read together with the rest of the amendment, is that the Constitution offers protection against prosecution. Not embarrassment, not reputational harm, not even civil liability. Once a pardon is issued, the pardoned individual can't be a witness against himself in a criminal case because there can be no criminal case. Look at what happened with Bill Cosby: the prosecutor dropped a weak criminal case to give the alleged victim the opportunity to pursue a civil suit. Once the non-prosecution agreement was in place, the alleged victim could depose Cosby, and he couldn't invoke the Fifth Amendment privilege because nothing he said could incriminate him. (Ignore the fact that the trial court misapplied the law and he ended up serving prison time over these statements; the appellate court's clarification upholds the principle.)

It might not be as hard to catch him as you think. The stories that broke throughout the day were kind of scattershot, but I just saw one on CBS that had a more complete reconstruction of the events. Initial reports said that the shooter escaped on a Citi Bike. Assuming at the time that it was a regular pedal bike, you could probably track him down based on that, though it might take quite a lot of legwork. The CBS story said, though, that he arrived at the scene on foot, shot the guy, then ran to a docking station and picked up an ebike, which he then used to escape. These require either an app or a credit card to use, so you can already nail down the station it was taken from and the window of time when it was rented. It gets better, though. While the regular bikes don't have any kind of tracking, the ebikes do. You now know where he returned the bike, along with the exact escape route. This can be invaluable information if you're looking for clothing, the murder weapon, or any other evidence he may have ditched along the way. I guess there's a chance he could have used a stolen credit card that wasn't cancelled yet, or a stolen phone that happened to have a Citi Bike account on it, or some other way of getting the bike, but for all this guy seems to have done right, the getaway plan was fairly stupid.

I commented about this above, but to reiterate: the lawyers accepted the case on a contingency basis. Since the lawyers take on a considerable amount of risk by working on contingency, they're entitled to compensation beyond what they would get for time and effort expended, defined in terms of a percentage of the settlement. Delaware law provides guidelines for how attorney's fees are to be calculated in these kinds of cases, but attorney's fees are always subject to court approval for reasonableness. A strict reading of one test entitles the attorneys to (roughly, I'm going from memory) 10% of the amount saved if the case is settled early, 20% if the case is settled after discovery, and the full third if the case goes to trial. By that test, the attorneys in this case would theoretically be entitled to something like 18 billion, but they knew there was no way in hell the judge would ever agree to that, so instead they asked for something like 6 billion, based on some byzantine calculation in which they used various discount rates to claim they were entitled to 11% of the total. The judge still disagreed, saying they were nuts to expect that kind of windfall based solely on the unusually high value of the case. The judge did agree that the number was going to be high: she pointed to the fact that the litigation took 6 years and was disrupted numerous times (most notably by COVID and Musk's acquisition of Twitter), that counsel billed 20,000 hours, that numerous experts were required, that numerous people had to be deposed, that an inordinate amount of records had to be examined, and that the issues involved were incredibly complicated. She then looked at the counterproposal from the defendants, which suggested that the attorneys should instead get 15% of some lower number that I'm not entirely sure how they arrived at. The judge accepted that proposal.
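If it helps to see the arithmetic laid out, here's a minimal sketch of that tiered test as I've described it from memory. The percentages are the rough ones above, and the ~$55 billion value of the rescinded package is my own back-of-the-envelope assumption (chosen because a third of it lands near $18 billion and 11% lands near $6 billion), not a figure taken from the opinion.

```python
# Minimal sketch of the stage-based percentage test described above.
# All figures are approximations for illustration, not the court's numbers.

def tiered_fee(amount_saved: float, stage: str) -> float:
    """Fee under the rough tiered test: a percentage of the amount saved,
    scaled by how far the litigation progressed before resolution."""
    rates = {
        "early": 0.10,      # settled early
        "discovery": 0.20,  # settled after discovery
        "trial": 1 / 3,     # taken all the way through trial
    }
    return amount_saved * rates[stage]

saved = 55e9  # assumed rough value of the rescinded pay package

print(f"Full-third (trial) figure: ~${tiered_fee(saved, 'trial') / 1e9:.0f}B")  # ~18B
print(f"The ~11% request:          ~${saved * 0.11 / 1e9:.0f}B")                # ~6B
```

The point the sketch makes is just that the dispute wasn't over the method so much as the base and the percentage: everyone agreed the fee scales with the amount saved, and the fight was over how big a slice of an enormous number is still "reasonable."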