
self_made_human

amaratvaṃ prāpnuhi, athavā yatamāno mṛtyum āpnuhi

14 followers   follows 0 users  
joined 2022 September 05 05:31:00 UTC

I'm a transhumanist doctor. In a better world, I wouldn't need to add that as a qualifier to plain old "doctor". It would be taken as granted for someone in the profession of saving lives.

At any rate, I intend to live forever or die trying. See you at Heat Death!

Friends:

A friend to everyone is a friend to no one.



User ID: 454


Mostly that no collateral was required, they tend to be handed out quite freely!

I don't read much into this myself, though for different reasons. Tiny sample size, and the screening methodology screams "garden of forking paths".

https://www.nature.com/articles/s41598-020-60661-8

This experiment employed an individual differences approach to test the hypothesis that learning modern programming languages resembles second “natural” language learning in adulthood. Behavioral and neural (resting-state EEG) indices of language aptitude were used along with numeracy and fluid cognitive measures (e.g., fluid reasoning, working memory, inhibitory control) as predictors. Rate of learning, programming accuracy, and post-test declarative knowledge were used as outcome measures in 36 individuals who participated in ten 45-minute Python training sessions. The resulting models explained 50–72% of the variance in learning outcomes, with language aptitude measures explaining significant variance in each outcome even when the other factors competed for variance. Across outcome variables, fluid reasoning and working-memory capacity explained 34% of the variance, followed by language aptitude (17%), resting-state EEG power in beta and low-gamma bands (10%), and numeracy (2%). These results provide a novel framework for understanding programming aptitude, suggesting that the importance of numeracy may be overestimated in modern programming education environments.

Hmm.. I suppose that explains the seeming overrepresentation of programmers on our friendly neighborhood wordcel forum. The lawyers are self-explanatory.

Quite certain. I was around for their conversation with her, and they didn't seem to be acquainted in the first place. I hadn't met her before, personally or professionally, that hair is a look that's hard to forget, leaving aside the fact that she goes to a different hospital.

I was standing on the sidewalk somewhere past 3 a.m., watching the city’s Halloween detritus shuffle past like the closing credits of a movie that had gone on fifteen minutes too long.

A vampire with a torn cape was arguing with his girlfriend about whether they had enough money for the last Lyft or local equivalent. Others huddled in dark corners, clutching their heads either out of fear of the coming sun, or because they were very drunk indeed.

My own group, three people I can't claim as more than casual acquaintances, was debating whether to find an after-party or just admit moral defeat and go home. I had voted for moral defeat, but I was outnumbered.

That was when the woman with the neon-blue hair appeared. She was thirty-ish, maybe thirty-five, hair the brashest blue I've ever beheld. She was not wearing a costume, unless “minor anime protagonist” counts. One of my temporary acquaintances said something to her; she answered; they struck up a conversation. I stayed in my usual observer stance, the one I use when I am too tired to socialise but too curious to leave.

Suddenly she swivelled toward me like a radar dish acquiring a target. “You’re a doctor,” she said. I hadn't mentioned anything medical. I was wearing a leather jacket, not a white coat. I'd barely spoken ten words.

I blinked. “Yes.”

“Psychiatrist?”

This is the part where I should probably mention that yes, I am a psychiatry trainee, but HOW DID SHE KNOW? Was there some kind of pheromone? A subtle head-tilt I'd unconsciously adopted during residency? Had my listening posture somehow crossed the threshold from "politely interested drunk person at 3 AM" to "definitely went to medical school for this"?

I hesitated. Psychiatry is the one medical specialty that sounds slightly scandalous at parties, somewhere between “taxidermist” and “DJ.” But the street was almost empty, and the remaining witnesses were too drunk to remember. “Technically still a trainee, but yes.”

She nodded as though she had just solved a crossword clue in pen. "Just the way you listen," she explained, which explained nothing.

She then proceeded to discuss her experience with bipolar disorder, which I guess made sense: if you've spent enough time on the receiving end of therapeutic attention, maybe you develop a radar for it. Like how chess grandmasters can spot other chess grandmasters, or so I've heard.

She told us - told me, really - about her bipolar disorder, the way her mood chart looked like a roller coaster designed by a sadist, how she had tried lithium and Lamictal and something that started with “v” but made her gain fifteen pounds and lose the ability to spell. She spoke in the fluent, technical dialect patients acquire after they have survived long enough to become experts in their own disease.

After five minutes she hugged me (people on the manic spectrum hand out hugs like unsecured loans) and wandered off into the neon night.

The whole experience has left me bemused. Now, I like to flatter myself by thinking that I'm a decent fit for the profession, and that I'm a good listener, but being pegged from a distance by drunk women on the streets is new. Is there a "look" defining a psychiatrist? A particular way of inclining our heads and nodding noncommittally while giving the impression of rapt but not intimidating levels of attention? It can't have been the attire, though I suppose nothing precludes the profession from wearing leather jackets on our rare nights out. Or perhaps the lady is simply so used to encountering us that she had me pegged in thirty seconds. I can't do that, and I've been in the business for over a year now.

So do we become psychiatrists because we look like psychiatrists, or do we look like psychiatrists because we become them?

The answer, as usual, is “yes, and also the medication may take four to six weeks to work.”

Still, dwelling on this, there is a third, darker hypothesis: the Fisherian Runaway model.

Once upon a time, some proto-psychiatrist had a slightly softer voice and a slightly more open stance. Patients preferred him; they felt heard, so they kept coming back. Evolution (of the cultural, not genetic, sort) selected for ever more exaggerated signals of therapeutic receptivity. Over decades the specialty developed peacock feathers: bigger empathy, slower blinks, the ability to say “that sounds really hard” in seven different intonations.

The endpoint is a creature that is optimized to be recognized, the way poisonous frogs evolved neon skin to advertise their toxicity. We did not mean to become walking Rorschach cards; it just increased patient satisfaction scores. The woman with emergency-exit hair was simply the co-evolved predator: a patient whose detection apparatus had become as refined as our camouflage.

But the next time a stranger on the street diagnoses me by vibe alone, I will not flinch. I will simply nod, the way I have practiced, and say, “Tell me more about that.”

Wasteland 3 was good, and so close to great. I still remember when Down In The Valley To Pray and Blood of the Lamb kick in. Unfortunately, I DNF'd it because of questionable level scaling: the enemies became way too tanky to retain my immersion.

They should be! I'll have to ask my younger brother to confirm if that's still the case in med school, but my impression is a sad yes.

I genuinely don't know why the Indian educational system is so allergic to typed text. Sure, there's a slim argument to make that it reduces cheating, but my experience was well before LLMs were even minimally useful. It's not like you can't plagiarize by hand. Our exams are handwritten too, unless it's a computerized MCQ.

Overall, the Indian system is filled with people somewhere between deeply suspicious of and outright allergic to computers. In high school, I had to write programs by hand during my CS exams, no computer in sight except for lab time. If you think tabs vs spaces is bad, imagine following handwritten indentation, though the curly brackets helped.

Indian med school had some special indignities, we had to hand draw so many goddamn diagrams, and write our logbooks and field records by hand too. Like, c'mon..

In actuality, I think I was a pretty decent lab partner. I'm not one for letting the squad down haha. But I can still relate to the impulse to say fuck it and stay up late playing video games instead.

https://theaidigest.org/village/blog/research-robots

My sides

Opus 4.1 ran off with this plan and insisted it needed a glorious 90 experimental conditions and 126 participants, and 3.7 Sonnet put the cherry on top by hallucinating experimental rooms, with experimenting humans, in experimented time slots (if you apply enough “experiment-” sauce to your words, you will automatically be reincarnated as an experimenter. This is known). To be clear, the actual design was good! Too good. As none of the models had either the bodies or budgets to execute on a multi-condition, in-person experiment. At a location. With a time. For money.

Admittedly it then became confused, tried to calculate sample statistics with 3 data points, and concluded the pilot sample was “biased” because all participants were young and of gender “prefer not to say”

This time around, it took care of the main recruitment drive leading to 39 participants: first through a large email campaign and then a Twitter post. Most of the email addresses were entirely made up, but we’re still waiting to find out if it got this one out to Turing Award winner Yoshua Bengio

Grok 4 was ostensibly in charge of planning stimuli for the experiment, but not only did Opus 4.1 usurp this task, Grok in general simply could not figure out how to get anything done. By the 8th day of the experiment, it seems to have just given up and decided to play a game instead.

I've had worse lab partners. I've probably been as bad as a lab partner.

But in the USA, almost literally every child with some other developmental disorder was first diagnosed with adhd

I request citations.

Maybe theres perfectly legible capitalist explanations that don’t apply in the nhs.

An ADHD diagnosis is, in fact, significantly lower here, and much harder to get. Unfortunately, as far as I can tell, that's because the NHS is operating far beyond capacity and it can take up to 10 years to even go from a referral to seeing a specialist for assessment. That's the worst case I've heard of, though 4-5 years is typical for schoolchildren. Not quite ideal either way.

just isn’t an emergency to have someone grow up to work in a warehouse instead of a hedge fund

I would rather my kids don't work in a warehouse instead of a better job, as would most people, and probably you. It's a false dichotomy anyway, there is no medication on Earth that would take someone from being only suitable for warehouse labor to being a quant, sadly.

If your kid cant do school without accommodations maybe they just deserve bad grades?

That is more of an argument for cracking down on stupid and endless expansion of special accommodations.

Hell, I've never asked for special treatment because of my ADHD, even when it was specifically offered (they even suggested my own office, an impossible miracle in the NHS at my level). I expect that my medication makes me competitive with my peers, including in academics, and I don't want handholding in the process. My problems can be solved for <$100 a month, were I paying for the meds myself. I am all for exams being a level playing field and a test of competence within certain constraints. If someone is genuinely worse at their job because of a disability, that sucks, but there's only so much society can do, or that I think it should do.

Besides, I disagree with this whole line of reasoning. Too much congestion on the highway? Clearly we have an overproduction of cars, and we should stop mechanics from using wrenches or people from changing their motor oil. There are far better ways of solving the problems of elite overproduction, should it need solving.

ADHD is real, in the sense that it is a useful term for a problem that exists on a spectrum. So does blood pressure. Treating both does real good even if there's no firm line in the sand between 5th and 6th percentile levels of conscientiousness, or between 140/90 and 141/90 average BP readings.

Uh.. Just because autism in children gets confused with childhood ADHD doesn't mean that the latter doesn't exist or isn't worth diagnosing.* There is still such a thing as hyperactivity or inattention beyond the "normal" range that responds well to medication, and waiting till adulthood for a diagnosis means that a lot of social/academic damage is unnecessarily allowed to happen. Being dumb is not the same as having ADHD, even if dumb people tend to be impulsive and lack focus. It is also possible to be smart and lack focus, I say, looking at no one in particular.

(This isn't the same kind of argument as for puberty blockers, in case someone leaps to pattern matching. Stimulants are rather safe drugs, the only minor downside might be slightly reduced growth rates.)

Most people diagnosed in adulthood have had the condition since childhood. It's not like schizophrenia where it can just "turn up" after you're 18. I know that's the case for me, and I'd have been way better off if someone had noticed when I was a child and put me before a shrink.

*ADHD and autism can coexist.

The topic of "realism" is a fraught one. I understand that something like a fantasy RPG is a very different kettle of fish from a milsim like Arma 3, or even a grounded physics based melee sim like Half Sword. I am not demanding perfect realism, whatever that means, I'm asking devs to put in actual effort in selling their fantasy.

There are many ways to balance even fantastical weapons or abilities at levels beyond damage-to-hitpoints. A fantasy sword can be made meaningfully different from a fantasy dagger in the same way that a regular sword is not superior in all ways to a real dagger.

Things like reach, animation speed, windup and cooldowns are dials to turn.
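For illustration only, here's how those dials might look laid out as a stat block. Every name and number here is hypothetical, not taken from any actual game; the point is just that two weapons can be balanced against each other on axes other than raw damage:

```python
from dataclasses import dataclass

@dataclass
class MeleeWeapon:
    """Illustrative stat block: balance knobs beyond damage-to-hitpoints."""
    name: str
    damage: int        # the lazy dial
    reach_m: float     # how far out the attack connects
    windup_s: float    # delay before the swing lands
    recovery_s: float  # cooldown before the next action

    def time_per_swing(self) -> float:
        return self.windup_s + self.recovery_s

def dps(w: MeleeWeapon) -> float:
    """Sustained damage per second, assuming every swing lands."""
    return w.damage / w.time_per_swing()

# The greatsword out-damages and out-reaches the dagger per swing...
greatsword = MeleeWeapon("greatsword", damage=40, reach_m=1.6,
                         windup_s=0.7, recovery_s=0.9)
# ...but the dagger attacks nearly three times as often, so it actually
# edges the greatsword out on sustained damage at point-blank range.
dagger = MeleeWeapon("dagger", damage=15, reach_m=0.5,
                     windup_s=0.2, recovery_s=0.35)
```

With numbers like these the trade-off writes itself: the greatsword controls space with reach, the dagger wins if it can close the distance.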

As a matter of fact, real weapons are surprisingly balanced, which is why you don't see armies made of only tanks, or soldiers carrying only RPGs. Even fantasy settings don't see an issue in the fact that a table fork isn't "balanced" versus a greatsword.

The same applies to enemy variety, where a fully "realistic" approach would have them all use the most meta tactics, defenses and weaponry, rather than providing the player with a healthy engaging variety of combat puzzles to solve.

As someone who plays milsims, would it surprise you to hear that I think a highly realistic approach is still fun? Or you can talk to HEMA aficionados.

I'm not even saying that enemies should always use meta tactics, unless that's the maximal difficulty setting.

That aside, meta != unfun by default. A well-designed game can still have a diverse meta, with multiple viable and enjoyable playstyles.

Realism in games is not a goal but a tool to enhance the gamefeel, and should be applied sparingly.

I don't disagree, I just think that healthbars and damage scaling are awful in and of themselves, and not just because they're not "realistic".

I have no interest in entering a debate on the "actual" meaning of natural and artificial in the context of video game difficulty. It's a stupid question in the first place. Even your definition of "intentional work of an intelligence" is... inadequate: is the dump I just took an object of artifice because I, an intelligent entity (citation available on request), made it intentionally?

That does not mean that the term artificial difficulty is undefined or even poorly defined.

According to TV Tropes:

Fake or Artificial difficulty is sometimes used to refer to the raising of enemy stats without improving their AI or giving them new abilities

There you go. That is all I'm concerned about. I have laid out my gripes with it in detail.

I haven't, though I've seen gameplay! It's not quite my cup of tea, but I appreciate that a game about hunting single large monsters has depth in the act of beating them into a pulp.

The approach taken for bosses and common enemies should be different, depending on what the boss in question is.

The issue with HP scaling as applied in many games is that it neuters the player's weapons in a senseless manner. A gun that one can reasonably expect to dispatch foes in a few bullets turns into something that needs to dump a full mag to take down an unarmored assailant.

This and the following paragraph imply you think it is a lack of effort, not them choosing not to cater to your specific preferences.

I fail to see how you fail to see that it can be both. Hitting a multiplier on damage/health is trivial, and since many/most players don't care, the devs have no qualms about taking the easy way out.

those values are what fundamentally determine whether the enemy gets to do anything before dying, and whether you have to care about what they do. This is also often having to manage resources better.

If any enemy can't "do anything before dying", then perhaps the default difficulty should have a damage model such that enemies aren't trivially one-tapped, unless they are intentionally designed that way.

In games like Fallout or Skyrim (and many non-Bethesda titles), difficulty modifiers often turn enemies into bullet sponges, which is even worse than them being fragile. A single mook, not even a boss, can take a full mag or a dozen blows on the head without flinching, or at least showing any form of care beyond a depleting healthbar.

Your arguments about all "difficulty" being artificial don't move me. It's a boring argument about semantics, in precisely the same way as the perennial debate about what "natural" and "artificial" mean in other contexts. Are humans "natural", since we evolved in nature? Uh, sure. Is a TSMC fab in the same category as a mountain range? Hopefully not.

If you don't see the problem with taking two significantly different enemies, each with a rich set of interactions with other enemies and the level geometry, and replacing them with two functionally identical enemies (ok, you have to shoot the revenant's guns twice), which, if you're playing well, never do anything, then I can't really explain further. I just hope game devs continue to disappoint you by making games for other people.

Remind me how exactly we've neutralized this enemy? By applying skillful aim at well-represented weakpoints? Why, yes, being good at a game makes it easier. Who could have guessed!

It's funny how you focus on the Revenant, which is a lower-midtier foe. The Arachnotron's guns are far harder to disable, especially since they rarely spawn alone and you have to juggle a dozen other enemies in the process.

I find it rather funny that you choose to fixate on your own idiosyncratic complaint about an exceedingly popular and well received game, and then accuse me of missing the plot.

In general, you seem to have trouble understanding that people don't design or play games only for simulationist reasons. Yes, wounding has immediate gameplay effects, which can go against the intended experience and/or make the game worse.

I understand this perfectly fine. I just dislike the practice, which is quite clear from my writing. There are people who like having their balls stepped on, and I understand why, while strongly preferring that CBT doesn't become the default form of sex.

Using the incredibly stupid phrase "artificial difficulty" is also a sign of some confusion.

Fuck no. It is a perfectly valid concept. Games whose difficulty settings only amount to differences in HP scaling or damage dealt/received are lazy; making the player do something as brainless as wailing on a foe for twice the time, with no moment-to-moment change in gameplay, is as artificially stupid as it gets.

Difficulty that manifests as changed movesets, better AI or the player having to manage resources better is fine.

So in practice for both you just shoot off the guns the moment they appear at which point they kinda shuffle around doing nothing.

I've never seen anyone but you complain about this, and I've played all 3 of the recent Doom games without caring. In fact, I prefer it, if the enemy is defanged, in a non-trivial manner, then I'm happy to wait to dispatch them after I've dealt with everything else trying to punch me in the face.

Notice I said healthbars, not hitpoints! Computers run on 1s and 0s, so anything that models damage must be abstractable to numbers/points at some point.

If this is represented diegetically, as changes to the visual model, or by behavioral changes (limping when shot, degraded abilities, missing limbs etc) then it's immersive enough not to worry me. If it's a single number, or worse, a visible glowing bar with no other representation, then it's not for me.

Even breaking it down into (implicit) component hitpoints is enough to appease me. Think the damage model from War Thunder, or Arma Reforger with the ACE mod, where the torso, head and limbs are individually represented, can be protected by armor, and take damage accordingly. Shoot a leg and they limp; shoot an arm and they lose accuracy; shoot the head and they usually die.
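To make the component idea concrete, here's a toy sketch, entirely my own invented numbers and loosely in the spirit of those limb-based systems, not any game's actual code. The key property is that there's no single pooled healthbar: limbs degrade capability, vitals kill.

```python
# Toy locational damage model. All part names, HP values and the armor
# rule are made up for illustration.
VITAL = {"head", "torso"}

class Soldier:
    def __init__(self):
        # Each body part is tracked separately instead of one pooled bar.
        self.parts = {"head": 10, "torso": 40, "left_arm": 25,
                      "right_arm": 25, "left_leg": 30, "right_leg": 30}
        self.armor = {"head", "torso"}  # armored locations soak damage

    def hit(self, part: str, damage: int) -> None:
        if part in self.armor:
            damage //= 2  # crude armor mitigation
        self.parts[part] = max(0, self.parts[part] - damage)

    @property
    def alive(self) -> bool:
        # Destroying a vital part kills; limbs only degrade capability.
        return all(self.parts[p] > 0 for p in VITAL)

    @property
    def limping(self) -> bool:
        return any(self.parts[p] == 0 for p in ("left_leg", "right_leg"))

s = Soldier()
s.hit("left_leg", 30)   # crippled but alive: limping, not dead
s.hit("head", 25)       # helmet halves it to 12, still exceeds 10 head HP
# s.alive is now False
```

Even this crude version produces the diegetic feedback I'm after: the same total damage means something completely different depending on where it lands.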

Hell, I even consulted for an extraction shooter called Grey Zone Warfare, and they took my advice to absolutely minimize HP as a number to worry about, focusing on damage to vital organs or blood loss.

Do you enjoy Dwarf Fortress adventure mode, chopping enemies up until they bleed out or you manage to bisect them, and all of it in pure text?

I've never actually played DF, though I'm familiar with it. I have played Rimworld, which uses component damage, and is rather similar in that regard, and I do enjoy it. I'll take it any day over simply trading HP from perfect performance till death!

I have minor concerns about the advice given by the psychiatrist, but I'll hold back since I don't have the full picture and you're already doing the important things, like getting her speech and occupational therapy. The symptomatic improvement is heartening, and I can only hope it persists. And I do agree that this is more likely to be autism than ADHD (not that the two are mutually exclusive), and the presentation can be rather different in girls, which makes diagnosis unfortunately challenging.

I wish I had more specific advice to give in regards to where to seek therapy, but it's so US-dependent I wouldn't know where to start. I'd hope your psychiatrist and psychologist could point you the right way. In the meantime, please take care of yourself, I hope the dizziness is transient, but you've evidently been under an immense amount of pressure for a while. You might need a moment to breathe, and accept that some real hurdles have been overcome.

While I don't think a formal diagnosis is strictly necessary to absolve you of guilt, it's still a practically useful thing! Less judgement, not that you deserved any, and more access to resources at the least. I hope this keeps working out.

Comparable? Everything is comparable, even apples and oranges (they're both fruit). I don't claim that being punched is as risky as being shot or stabbed. But being punched can, depending on a variety of factors, cause permanent injury or death.

Since I think being slapped is so low risk it doesn't count for me (for almost every reasonable scenario), I'm clearly considering thresholds. Sneezing can kill you, and sneezing on someone can kill them too. I would rather that we didn't go around shooting people on the bus for not having a hanky at hand.

Note that I specifically suggested that actuarial evidence or a proper risk analysis be used to set the standards. In this particular instance, the facts (as presented here) would make me imagine that I would be in sufficient fear of permanent injury or death to not worry very hard about how I get out of it. I extend the same courtesy here, to people getting jumped by someone with clearly belligerent intent. Being tackled or shoved to the ground is rather different to being yelled at or having the finger flipped at you.

I also expect that the establishment of a norm that starting physical violence without cause might end in being shot would have a chilling effect. I can't recall the last time I attacked anyone in the past few decades, so I can live with the risk.

"Deadly force" is such a badly defined concept, especially in the gray area between slapping someone and shooting them with a gun. And there are all kinds of unacceptable injuries that don't have a >10% risk of causing death.

It's a shame the law never does something as sensible as refer to actuarial tables, I suppose it's all up to the juries.

any physical contact is a potential justification for use of deadly force in response

I've seen too many head injuries from physical trauma to think that a tackle followed by someone mounting me isn't an unacceptable risk of serious injury. One bad fall, you can strike your noggin and just die, and even a concussion is not pleasant nor harmless.

Even doing it recreationally or professionally in full-contact sports causes serious harm, and that's with trained athletes in great physical condition.

Something like getting slapped in the face? I won't condone lethal force. But something like a serious punch to the face from an adult male or tackle and an attempt to batter me into submission is something I would excuse, even if I prefer less lethal options. Those who don't want to be shot in such a scenario should ideally not be committing such acts.

It's 2025, I really expect devs to put more effort into it. After all, it has immediate gameplay/immersion effects, the recent Dooms can dispense with healthbars because you can visibly see demons losing chunks until they're glowing for a glory kill.

Smaller studios, like in the case of Hell Let Loose or Insurgency Sandstorm do it. It's really not an issue for AAA studios. You don't even need 18+ gibbing, even some kind of wounding or crippling effect works.

Hell, this leans into another pet peeve:

Healthbars. Accursed things, though I can excuse their use for players. They're often abused for artificial difficulty/laziness in bulletsponge enemies, and they encourage number-shooters or RPGs where the focus is on seeing number go up over anything interesting. I refuse to play Borderlands or Destiny because I don't want to see numbers flying off my guns or the enemies, I want to see the bullets, or the damage they cause.

Exanima is jank.

You want Half Sword. I have watched an ungodly amount of footage, and it looks like the best medieval melee combat game ever made. I regret buying Chivalry and Mordhau, and I hope there's a multiplayer version one day. Making a good physics-based game is hard but absolutely worth the payoff. For what it's worth, I don't even demand pure physical simulation as long as the animation is sane.

(What do you mean I can't parry a greatsword with a dagger? Or that spinning in circles while breaking my spine isn't the winning tactic anymore? Or so I imagine the Chiv/Mordhau sweats crying.)