Using the wide definition you gave, where 'utility' can be pretty much anything, sure.
Even if bees did suffer 7% as much as humans, that isn't much. Consider that doctors use a 1–10 scale when asking patients to describe pain. A 1 on this scale barely registers, and dividing the scale into tenths is evidently the most useful way to do things (a 1–100 scale would be quite unwieldy). I know it's not the most useful analogy, but if 10 is the worst pain imaginable then a 1 on the scale is pretty low, definitely in the "mild annoyance" category. So basically bees are capable of experiencing pain to the extent that my thumb knuckle decided to start hurting a few minutes ago, or that my Achilles tendons are tight in the morning. And this is the most suffering they are capable of experiencing, i.e. the same suffering that the majority of people experience on a day-to-day basis without thinking about it, even excluding those who are suffering more.
I only had them for the first time last year. The first time I had them I think they'd been frozen and weren't particularly nice, but earlier this year I had fresh ones. With some lemon juice and red onion (or alternatively, tabasco), I thought they were delicious.
Terrible ban. We get stuff posted here at a similar level of snarling, but pointed at the left, and it regularly doesn't draw these kinds of bans.
Which of his statements was actually even worthy of the ban here?
I don’t know about you, but I have all sorts of preferences which don’t lead to minmaxing.
Oysters are particularly weird, to be honest. No brain! They just sort of stick around, filtering the water. Like plants, but meaty.
I don’t trust them.
Electrolytes, magnesium, and adjusting your macros may help, but if your intensity is too high for your current stamina levels, you have to pay the piper while your body adjusts. The obvious thing to try is adjusting the intensity until you're happy with your energy levels afterwards: for example, have a day where you go harder on your arms and go easier on the hike, then the next day switch it around.
Unfortunately, I think that there are two layers of nonsense compounding on one another in the article.
- The reporter isn't specific in exactly where and how the cuts are being made, much less the reasoning behind them.
- Satya Nadella saying "1/3 of code is written by AI" is a nothingburger all its own.
First, the obvious question is "what kind of code?" Does he mean boilerplate stuff that, before LLMs, was handled mostly by copying previous projects and re-using the basics? Does he mean config files and deployment scripts for infrastructure? This is very much still code, but not in the user-facing, self-contained full product sense.
Looking deeper, the next questions are "so what?" and "how much code can AI actually write?" I am reminded of the classic The Mythical Man-Month. Writing code isn't a linear function: 1.5x the inputs does not yield 1.5x the outputs. Time spent writing code often follows a Pareto or power-law distribution; you spend 80% of your time on 20% (or less!) of the codebase. Much like the hard part of writing is editing, the real slog in coding comes in debugging and, later, refactoring. Shitting out shitty but "hey, it works" code is easy.
Every mid-to-senior-level developer, data scientist, and ML engineer I've had discussions with comes to more or less the same conclusion: AI is really handy, right now, for discrete problems. It's a massive time saver. It's actually extra handy for writing tests. In the not-so-distant future, it will probably be able to do some real systems engineering work.
But it can't replace all the devs because, at some point, using more LLMs in your development will actually cause the project to take longer (again, reference The Mythical Man-Month). If you look at the "thinking" output of chain-of-thought models, you can see how it flirts with recursion loops. It tells itself to think about x, but also to make sure it considers y, and, oh yeah, definitely make sure z is in there too. And that's for simple chat-based prompts. If you have an LLM read a detailed system design plan and then hit the "do it" button, my worry isn't that it would output broken, internally inconsistent code, but that it would never actually output anything functional. Instead, I imagine millions of lines of incomplete functions with a lot of extraneous documentation and the wholesale swapping in and out of design patterns. Spaghetti code, but without even a "fuck it, it works" level of functionality.
I don’t care if they’re living. I care if they experience something that I would recognize as pain. The closer they get to having our sort of neurotransmitters or whatever, the more empathy I feel.
Apes and chimpanzees might scoff at us feeble humans: how we have to cook our food before eating it instead of consuming it raw
Great. Now imagine what happens when we not only become dependent on cooking, but we also lose our ability to cook. That's the issue being raised here. Do you think that's not happening? That it's impossible?
Surely aborting fetuses for having a trait has a different moral calculus than removing that trait while leaving the person otherwise intact?
The disconnect is that you conflate sexual intercourse (i.e. PIV) with "firing the gun", when under your metaphor PIV would be shooting to kill (fulfilling the act's primary purpose) and non-procreative sex would be target practice (fulfilling a secondary, recreational purpose).
"Sorry I let your kid drown, but since I saved 15 bees today I'm still 5% ahead of the game."
And FFXIV: Endwalker (cw: level 90 spoilers).
I think that level of imprecision is pretty darn normal when describing preferences. It’s not a technical term like “gluten-free” or “kosher.”
Hell, even the latter is subject to complex edge cases.
I don’t think we’re talking past each other much.
FWIW I completely endorse your perspective on the rat and the kitten, though I don't expect most others to get it. Regardless of intentions, learning to kill animals well requires botching the process rather a lot of times.
Cats, uh, do not generally deliver a humane and quick death, and hunters who place their shots poorly are not thought well of. I certainly wouldn’t tolerate a novice hunter making gut shots. But that’s an unrealistic expectation of cats; what animals do to each other is not bound by the golden rule.
My usual reaction to catching a rodent is to break its neck and feed it to the cat already dead, not to torture it. But the cat needed to learn to catch rats, to be clear (and that is how cats learn: their mothers bring them wounded prey). It is, of course, beneath a person to maim an animal because its attempts at escape are amusing, Ivan the Terrible style. But we shouldn’t worry about the suffering of lower animals in veal or shrimp or egg production. Their suffering is instrumental, not intentional.
Methods of Proof did not strike me (a programmer with many years of experience) as particularly relevant to software engineering. On the other hand, it's a very short course, I think about 1800 XP. Discrete math I think should be more relevant. You will definitely get questions about complexity analysis (Big-O) in interviews.
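To give a concrete sense of what those interview questions look like, here's a hypothetical sketch in Python (the names and problem choice are mine, not from any course): the classic "two sum" task, solved in O(n²) with nested loops and in O(n) with a hash map. Interviewers typically want you to name both complexities and explain the space-for-time trade-off.

```python
def two_sum_naive(nums, target):
    """O(n^2) time, O(1) space: check every pair of indices."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return (i, j)
    return None


def two_sum_hashmap(nums, target):
    """O(n) time, O(n) space: one pass, remembering seen values.

    The dict maps each value to its index, so the complement
    lookup (target - x) is an O(1) average-case operation.
    """
    seen = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return (seen[target - x], i)
        seen[x] = i
    return None
```

Being able to walk through why the second version trades memory for speed is exactly the kind of complexity-analysis reasoning a discrete math course drills.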
M4ML is a selection of the most ML-relevant lessons from the LA => MVC => Stats sequence. I think it covers about half of LA and stats, and a third of MVC. I chose to take M4ML first in order to benefit from the interleaving of topics, instead of doing the full sequence one subject at a time.
You can definitely do software engineering without all this math. I studied most of this in college and haven't really ever used it, except some concepts from discrete math. There are specific domains where it can be useful or even essential, but you can have a solid career and make a lot of money basically never using math. One of the best engineers I've ever met told me he almost failed out of college because he wasn't good at math.
I don't know much about how much math you need for ML, as I'm not an ML developer. I would start with discrete math and M4ML, just in case.
Yes, it kind of is. The more the average man’s opinion matters to the ruler, the more likely it is that the country is democratic
Ok, but that means there are absolute monarchies that are "democratic" and liberal democracies that aren't (and the "liberal" qualifier is important, because Botond already implied it's not really a democracy if it's not liberal enough, but your claim would imply the amount of liberalism is irrelevant). I can imagine a coherent view being extracted from this, but I think it would boil down to "democracy" == "rule of the majority", and then I don't see how you can claim there's a tradition of democracy in the west.
Isn’t the main alt-right and alt-left anti-democracy argument that people’s opinions don’t matter, that it’s all ‘elites’, ‘lobbies’, hidden and less-hidden power brokers who decide? Even they agree that this average-joe pressure is democratic in nature.
I can't speak for everyone, but kinda. It's more that they punch below their weight. And like I said above - I can accept democracy being the will of the majority, but I think it derails the previous arguments more than clarifies them.
Flame of Frenzy in Elden Ring.
But then it turns out that when you offload vital cognitive function to this device, the brain never develops them itself, so now every child grows into an adult dependent on this device for life.
Funnily enough, I finally got around to reading The Secret of Our Success by Joseph Henrich after reading Scott's review of it ~7 years ago. I'm only about 70 pages in, but Henrich has already clearly elucidated that this pattern you're describing (of humans becoming frail and atrophied in some domain because of our life-or-death dependence on technological interventions) is also known as "the history of the human species" or perhaps even "the very thing that makes us human"*. Apes and chimpanzees might scoff at us feeble humans: how we have to cook our food before eating it instead of consuming it raw, how we always lose against them in unarmed fights (even fights between a burly adult male human and a juvenile chimp). Who cares what they think? We took over the planet, not them. In the distant future, who's more likely to colonise the solar system: the humans who stubbornly insist on hanging on to their pain receptors in spite of the fact that they've never laid eyes on a rusty nail in their entire lives; or the humans who've outsourced that cognitive module to an external gadget, and can hence devote that extra processing power to optimising their local Dyson sphere? Trick question: the former group won't even exist, having been ruthlessly outcompeted by the latter, just as the proto-humans who weren't onboard with this whole "applying heat to raw meat before eating it" thing got outcompeted by those who were.
*Reading it is making me even more contemptuous of Luddites than I was before. If, as Ted Kaczynski would have us believe, the industrial revolution was a disaster for the human race, why stop at 1750? Technological developments and our reliance on increasingly complex tools have been changing who we are, at a cellular, neural level, for as long as the human race has existed.
wouldn’t you care if someone were purposely buying bees only to kill them?
Only if they were doing so on an industrial scale and fucking with the bee economy. Human beings annihilate insects in their quadrillions every year; if they're worth anything more than zero then I guess we should all just kill ourselves right now to make room for them.
There’s possibly an element of Jewish thought in this reasoning + Singer’s. Because there’s an eagerness to heap up behavioral proscriptions, however numerous; there’s the love of rules and the eagerness to find extrapolations to the rules which defy normal intuition; there’s the arbitrary basis to begin morality; and there’s the obsession with trivia and edge cases over more substantive issues. That’s immaterial, but just interesting to note — it’s possible some of Matthew’s moral intuitions come from a different traditional framework.
This is called autism, not Jewishness. Autism can lead to people not having an innate understanding of why social rules work the way they do and trying to make sense of them in arcane ways that take them overly literally.
I admit I have trouble parsing your arguments here as anything but "do not ever attempt to change anything for the better (unless we define "better" as things that have existed before and now don't), you moron, you absolute buffoon".
It seems like you are conflating things. Why would having sexual intercourse in general be the equivalent of shooting to kill? Where is that even coming from?
For my metaphor, the equivalent of shooting to kill would be having intercourse specifically on a day the woman is expected to be most fertile, in the hopes that it will bring forth new life. In fact, there are supposed tricks to time sexual intercourse to have a male or female baby (male sperm swim faster, so if you abstain from sex up until the moment of ovulation, there's like a 30% higher chance a male sperm will get there first.) Something like that would be shooting to kill. But just any old act of sexual intercourse is not this.
"Fufilling the act's primary purpose" - I'm not talking about the act's purpose. I'm talking about the object's purpose. That's the conflation many people are making on this thread. The object (Genitals, guns and bullets) and the act (firing a gun, engaging in sexual intercourse) are different. Still different are the things that the act can do (have a kid, target practice, etc). I'm not appealing to the Act's purpose at any point to describe what someone should do with the object.