This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I'm pretty sure it is, yes. Provably beating the average human, or even reaching the same level would be a huge milestone that Elon would be shouting from every roof. That ancient rationalist prophecy about truck drivers getting replaced by AI would have already come true.
Eh, it takes time for change to percolate, and truck drivers are sufficiently selected that we can assume they're better drivers than average -- the average, after all, includes lots of people who insist on driving drunk/high, texting while driving, etc.
Ok, are you actually saying FSD is as safe as a human driver right now, or are you just pointing out reasons why being as safe doesn't necessarily mean wide-scale adoption? The former is an interesting conversation, but the latter strikes me as a zero-stakes one about angels dancing on a pin.
Same question @jkf.
Both. I think the following things can be true- trucking is (understandably) highly regulated and it will take a long time to get major changes like self driving trucks into the mainstream, truckers are above-average drivers and so self-driving software will need to improve massively to replace them, driving a semi truck is a different problem from driving a car and needs beefier software, and liability reasons don’t affect the calculus much because the trucking industry is structured around making insurance companies pay for accidents anyways.
If you add it all up, I think this points to ‘robot Uber’ before it points to ‘self driving trucks’. After all, Uber drivers do not go to school and get a special license and take regular drug tests. These are also regular cars in a far less regulated field.
I'm saying that the truly "average driver" as reflected in accident statistics does not really exist -- the famous "paradox" about how 80% of people think they are better drivers than average is actually kind of true. There are a certain number of really bad drivers out there, and quite a lot more who are pretty good -- and the latter would be scared shitless by riding with a robot that drove like them 80% of the time, but like a drunken maniac the other 20% (which would be as safe as the "average driver", statistically).
IDK whether FSD is even that safe at the moment -- I don't think it's knowable right now due to lack of adoption and/or public testing. Seems worse to talk about than dancing angels to me, unless somebody wants to bring some stats -- but if you insist, wouldn't people be, like, using this in prod if that were the case?
As I recall Elon promised to FSD from San Francisco to NYC 5+ years ago -- why hasn't he done so by now?
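The "average driver" point above can be made concrete with a toy mixture model. All the rates below are invented for illustration, not real accident statistics:

```python
# Toy mixture model of driver skill -- every number here is invented.
# 80% of drivers are careful, 20% are reckless (drunk/high/texting).
careful_rate = 1.0    # accidents per million miles for a careful driver
reckless_rate = 20.0  # accidents per million miles for a reckless driver

population_avg = 0.8 * careful_rate + 0.2 * reckless_rate
median_driver = careful_rate  # the median driver falls in the careful 80%

print(population_avg)  # 4.8 -- the "average driver" accident rate
print(median_driver)   # 1.0 -- the typical driver is ~5x safer than average
```

With a skewed distribution like this, "most drivers are better than average" is literally true, and a robot that merely matches the population average is far more dangerous than the driver it would replace.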
Sure, but I don't see why there needs to be. There's an X amount of miles driven by humans and a Y amount of miles driven by AI; if

Y / (AI accidents) >= X / (human accidents)

then AI is "better than the average driver" even if there's no such individual person to benchmark against.

This would work if you are planning to force everybody to ride with the robots -- in reality, the ones who are causing most of the crashes -- criminals and poor people -- are the least likely to adopt self-driving cars anytime soon. So rolling out (hypothetical) "average driver" safety-level FSD cars on an optional basis would replace the safer drivers (rich, sane people) with something somewhat worse, while leaving the ones actually causing the accidents still out there tooling around.
tl;dr -- for me to be interested, they need to be better than me -- I don't care if they are better than the average driver. The average driver is pretty bad.
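The fleet-level comparison proposed above can be sketched as follows; the accident counts and mileages are made-up placeholders, not real FSD or NHTSA figures:

```python
# Hypothetical fleet-level comparison; accident counts and mileages are
# invented placeholders, not real data.
def accidents_per_million_miles(accidents: int, miles: float) -> float:
    """Normalize a raw accident count by total fleet mileage."""
    return accidents / (miles / 1e6)

human_rate = accidents_per_million_miles(accidents=2000, miles=1000e6)
ai_rate = accidents_per_million_miles(accidents=150, miles=100e6)

print(human_rate)            # 2.0
print(ai_rate)               # 1.5
print(ai_rate < human_rate)  # True: fleet-wide, the AI beats the human average
```

Note that this is exactly where the selection-effect objection bites: `human_rate` averages over everyone, so an AI that beats it can still be worse than the above-average drivers who would actually adopt it.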
And yet still insufficient -- the set of human drivers includes a lot of people who are drunk/stoned/distracted/angry at any given moment -- perhaps unsurprisingly, these people cause a lot of accidents, which brings the average performance down substantially.
All you need to do to be much safer than average is not do those things; for me to feel safe sleeping (for example) in a robot car, I'd want a couple std deviations better than human average at least. I imagine trucking companies feel the same way (maybe even less risk tolerant) -- particularly considering that with automated trucks they no longer have a human to throw under the bus when he does something dumb.
All you need to do to be much safer than average is not live near certain low-IQ/low-conscientiousness/high-time-preference populations, and yet if you attempt to do that it's the second coming of the apocalypse and the libs cry foul to the moon.
Perhaps we need segregation for the roads, have an AI and Emergency vehicles only lane. Anyone else caught driving there unless they are gunning it for the hospital gets cited/jailed.
While truckers who get in an at-fault accident will be immediately fired and not hired by any trucking company ever again, ambulance chasers don't go after them because they don't have the money to give a big payday. Trucking lawsuits usually hinge on getting a big insurance payout on the basis of 'you should be liable for hiring/overworking/undermanaging him'. There's no inherent reason a trucking company wouldn't prefer to have an ambulance chaser fighting Tesla's lawyers than State Farm's.
I'm more thinking of assuaging the public's lust for blood when a truck takes out a schoolbus or something -- the driver still wears the criminal penalties for any mistakes in a way that I don't think Tesla will be prepared to accept.