This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Hey, I'm quite libertarian, but there's good reason to believe that our comfortable society would not survive long if small groups had the ability to make deadly, highly infectious pathogens. We're at least lucky that there's not an easy, cheap, undetectable way to make nuclear weapons.
Yes, "we overlords need to prevent you from doing X for safety" CAN BE and IS abused all the time, and I'm with you in beating that drum as often as I can. Unfortunately, that does not mean that there aren't a few Xs that the overlords really do need to prevent us from doing.
It is not really possible; knowledge isn't the major bottleneck. The bottlenecks are process, materials, equipment, and skillset. This is just a confusion that some of the more knowledge-oriented professions have about difficulty in other fields.
I don't see how that's the case.
If you were already reasonably wealthy (~few million USD at hand) or magically given the money, then you absolutely would be bottlenecked by knowledge.
You could purchase lab equipment, reagents, etc., and hire staff without much difficulty. I think you would rapidly find out that your staff have thoughts when they get an inkling of what you're up to. I can think of a semi-legitimate way to avoid scrutiny, but thanks to @faul_sname's reminder, I'm not going to blab. It's very obvious to me even as someone not directly involved in microbiology, so any competent actor would recognize it as their best bet. Even [REDACTED] would only get you so far.
Alternatively, you could go do a bachelor's and masters in microbiology and try and manage as much as you could yourself, but that still leaves plenty of scope for being unmasked.
Right now, I think you need a state-level actor to safely make bioweapons at scale. Smaller, if you accept the massive risk of failing and dying because of error. Much of that is a combination of knowing the right things/hiring the right people, and then motivating them properly.
As it stands, I think a blanket ban on anything with a whiff of bioweapons research seems warranted. What are the upsides, really? If you have a legitimate use case, you want the government on your side, and probably enough organizational weight to negotiate for looser restraints.
This and the fear that a layman can use an LLM to make bioweapons are in completely different realms of argumentation. Only a tiny fraction of the population makes enough money to have a few million USD on hand.
As you pointed out, you can go get the knowledge, the skillset, the knowledge of the process; nothing is stopping you except, you know, the time to do all of that. The fear is that an LLM can skip the 4-year degree plus 2-year master's in providing you all of that. I don't know much about biology, but I am passingly familiar with explosives.
The cost of bioweapons development has dropped dramatically. While I can't quote a sticker price for a whole bioweapons project (for understandable reasons), I can point out that all the necessary components, such as access to genetic sequencing and engineering, lab equipment, etc., have drastically dropped in price over time.
I'm not claiming that an oracular AGI will let the average American with the average bank account make a pandemic in his garage. This is partly predicated on similarly (or likely more) powerful AI being deployed in screening and defense.
My point is that we risk moving from a regime where it takes:
To:
It is clear to me that this relaxation will balloon the number of people/orgs who meet the criteria of knowledge/motivation/wealth.
Explosives do not, as a rule, self-replicate or mutate. Completely different ballpark. Any redneck can make a pipe bomb, and many without blowing off a finger. Nuclear bombs, which are on the same scale of lethality, require far more effort.
Money? I am positing both independent wealth and the ability to get a degree. Just the degree isn't sufficient unless you have millions of dollars, as a rough bound. Most terrorists are somewhat broken individuals; they are unlikely to go to all that bother or stick it out.
Please do not try to bait people into explaining in detail why this particular thing is easier than it looks.
Is it really baiting? For the majority of nitro chemistry, you take something organic, some nitric acid, and some sulfuric acid as a catalyst, and the resulting thing will probably make a nice boom. The tricky part is getting the stuff to make boom when you tell it to, which requires reagents with high purity. And the guys at Merck do know what to look for if someone starts making purchases. And it is not a field in which you can learn from your mistakes, either in production or procurement.
We have had total synthesis of cocaine for more than a century. The market is huge, and yet it is cheaper and easier to grow it in Bolivia and ship it to Europe and the US than to make it domestically with high purity and untraceably.
Making terrorist-related materiel is easy in principle. But it is often a many-step process with a complicated supply chain, and every step is one where you could draw some unwanted attention. Or kill yourself.
Any man who is able to lone-wolf a terrorist attack of the kind safetyists fear won't be one who needs ChatGPT guidance.
Yeah I'm not at all concerned about chemical weapons.
Is this bait? This was my honest assessment.
Hey, I'm not a biologist, and you might be right (...although I don't know why you listed "process" and "skillset" as not being knowledge-based?). But are you willing to bet civilization on it? The stakes are pretty high here, so I think it's fair to raise the burden of proof that "this is actually hard" beyond the normal level of an Internet argument.
Note that entire nations have tried and failed to create nuclear weapons for 80 years, which is good evidence that it's genuinely hard. Meanwhile, it's conceivable (if not proven) that a worldwide pandemic spread inadvertently from a small biolab in Wuhan. The two levels of effort are orders of magnitude apart.
I'm not a biologist either. But I am in defense research, and one of the things parts of the defense/intel establishment intensely want is the ability to create biological compounds and medical supplies at austere forward bases. Think 3D-printing drugs, bandages, needles, etc. There is a LOT of money being thrown at that problem, and it hasn't gone anywhere (fast). If it were just "well, we need to know the formula," it would be solved. But it's not.

Skillset != knowledge. My girlfriend's father is a bit of an anarchist. He gave her several books on the chemical processes and formulas for making bombs, and then said never to use them, because he has a friend who tried and now no longer has thumbs. Making explosives (knowledge of the formula) and having the skill to keep all of your limbs are two different things, and LLMs can't give you the skillset.

"Process" is knowledge only in the sense that austere manufacturing is knowledge: it's how you create clean rooms, how you create biological precursors from everyday chemicals, how you titrate, filter, mix, and combine to get the right compounds. Chemical engineering is literally the field of making chemical processes more efficient and practical, and its practitioners are paid big bucks to do it. If it were easy, why are they paid so well? The problem with these internet arguments is that they abstract away all of the details, and the details are fucking hard.
Note that it took a bunch of highly skilled biologists to create the virus; the "spread" was what was inadvertent. The effort that goes into creation and the effort that goes into safety protocols are two different things. Since we are talking about creation, I'll bet you that a jailbroken LLM cannot tell you how to create a novel virus via gain-of-function without you already having a biology background.
They don't and they won't. Things like that, just like making nuclear weapons, require a bunch of physical infrastructure that costs a lot of money and takes a lot of effort to build, and you certainly can't just build it unnoticed. Even if you can ask ChatGPT for the recipe and it just spits it out, there's nothing you can actually do with it. What we're really relying on is that random small groups don't have the resources to do these kinds of things.
They can't actually stop us from doing things.
They can arrest us after the fact. Normal people behave because they care about their reputation and about the consequences of their actions (even if just the "I'll be arrested" part of the consequences). But that does not really work on crazies or fanatics. They don't care.
If we really do, somehow, get to the point where random small groups can easily produce deadly pathogens, we're in trouble anyway. For example, look at what Aum Shinrikyo managed to do. The cult was disbanded and the leader executed afterwards, but that's afterwards. If they had managed to make something really deadly, they wouldn't have been stopped in time.
I'm open to that, I just want ideally to:
a) set an expectation that it has to be really, really bad before the company starts cutting you off. Apocalypse bad, not misgendering-bad or said-nigger bad
b) require serious defence of the above assertion to a hostile audience
Killing people isn't that hard. If you're worried about big, society-spanning plagues, then those are difficult (plague is spread by fleas; are you breeding those too?) and potentially possible to mitigate without sending the police into everybody's browser. I don't want 'suppress info' to be the default response.
In the software world we call this "missing test coverage". If your safety features don't get tested until any test failure is apocalyptic, you don't actually have safety features. Maybe we should be picking more politically neutral or less politically relevant test cases, but anything is better than nothing.
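To make the software analogy concrete, here is a minimal sketch (all names hypothetical, not from any particular codebase) of what "missing test coverage" on a safety feature looks like: a shutdown check is only trustworthy if some test actually drives the system into the failure regime.

```python
def emergency_shutdown(temp_c: float, limit_c: float = 90.0) -> bool:
    """Hypothetical safety feature: halt the system when it overheats."""
    return temp_c >= limit_c

# If every test only ever runs the system at normal temperatures, the
# shutdown branch has "missing test coverage": a bug in it (say, a typo
# flipping >= to <=) would go unnoticed until a real overheat, i.e.
# until the failure is already catastrophic.
def test_shutdown_triggers_above_limit() -> None:
    assert emergency_shutdown(95.0) is True

def test_no_shutdown_below_limit() -> None:
    assert emergency_shutdown(20.0) is False
```

The parallel to the comment above: a safety mechanism that is never exercised before the stakes become apocalyptic gives you no evidence that it works.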
If they're pre-existing plagues, then they're difficult-to-impossible. Anything you can get by introducing a few mutations into some virus is at most a few mutations away from a virus that wasn't currently a society-spanning plague. Centuries ago you could have a germ slowly co-evolve with the immune systems of some subset of humanity and then eventually make its way out to devastate a larger immunologically unprepared population, but these days there aren't many subsets of humanity that aren't at most a weekly airplane flight away from the rest of us.
If they're not pre-existing plagues, it's kind of harder to say, isn't it? Gunpowder would have been a pretty awesome capability for a predator to have, but it was impossible to evolve except by the extremely roundabout method of "get intelligence to come up with it". There may be similarly awesome capabilities that are only possible to put into germs in the same way.
Nor do I ... but while I'm libertarian enough to have voted (L) in every presidential election, I'm also pessimistic enough to wonder how amenable to my desires the universe really is. Totalitarian suppression of change is itself an existential risk, whether it fails (which historically tends to be a bloody process) or succeeds (in which case a "boot crushing a sapient face forever" is itself a possible contributor to the Fermi paradox), but the seemingly-obvious solution of "just don't do that" might seem less obvious in a world where a home biolab ends up being a thousand times more dangerous than an airline ticket and a boxcutter were in our world.
Context Copy link
More options
Context Copy link
More options
Context Copy link