Friday Fun Thread for February 23, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

I completely understand what you're saying and I agree, 999 times out of a thousand, any possible mistake in transcription / medication is probably not a big deal at all. And you don't need special technology to make a mistake as a human being.

Oh, and try to forget too, while you're at it, that anonymized patient case reports are regularly published in medical journals, because doctors have to learn things as well.

I'm not really concerned how the data is used, more about potential patient-related issues.

This is the kind of thing American hospitals sometimes do, and this one case is an extreme example:

- An 18-year-old patient was treated for a brain aneurysm by a hospital.

- The patient and family decide that they've stayed in the hospital long enough (two months).

- The hospital denies all transfers and files a legal motion to give itself guardianship of the adult patient, alleging insufficient mental capacity to make his own medical decisions.

- (The normal course would be to grant guardianship to the family.) They also had the adult patient sign a bunch of consent forms at the same time they were attempting that legal process.

- The family heists the patient out of the hospital, and gets cell-phone-tracked and chased by the police.

- The police finally back off after seeing that a second hospital disagreed with the first.

I've recently been made aware of a case of a family with three children. One child somehow received third-degree burns from a boiling pot of water and was taken to the hospital for treatment. The parents were locked up on heavy bail, and all three children were sent to foster care.

Here's one hypothetical case:

- A parent comes into an American hospital with a child with a black eye / skull fracture / some other kind of injury that could come from something completely innocuous or from domestic abuse.

- Staff follow the process of performing tests without telling the parent that they're looking for confirmation of domestic abuse.

- The tests come back negative, because the child was not abused.

- A low-paid worker / poor technology / ??? introduces some kind of typo.

- The parents get arrested for domestic abuse, and the kid ends up abused in foster care.

This is all in normal times, when there is not a powerful coalition of interests working to prevent family members from visiting patients, to decrease any kind of oversight over what happens inside hospitals, and to introduce some kind of hero worship for healthcare workers.

That is nowhere near the issues I covered.

A discharge summary is precisely that. It's written up when a patient is being sent out of the hospital (and still alive, thankfully; death certificates are a pain), and it exists solely to summarize events and therapies, as well as ongoing medical care and planned follow-up. Discharge summaries are both medicolegal documents and necessary for continuity of care (if you're a doctor relying on a patient's memory to cover everything they have or have had done to them, you'd better have good malpractice insurance).

There is no universe in which a transcription error or misspelled drug leads to a hospital getting into a tussle with the family over a patient being discharged, or to criminal charges being brought against them. That does not happen, or, if it has happened, it's so vanishingly rare as not to be worth worrying about. Those are issues of hospital policy, legality and overpolicing, not anything related to "imported doctors" using "shady technology". I invite you to show me the relevance.

Further, this is simply an evolution of existing techniques, such as human transcriptionists (who may or may not be licensed for medical transcription, for what that's worth) and voice-dictation software. Whisper is just more advanced in terms of functionality; Dragon VTT is probably old enough to predate modern ML/DL, though some of that might have been folded in since. Whisper is also free and open-source; I just happen to have found a way to get it for "free" through the ChatGPT app, without relying on the otherwise unreliable 3.5 model for handling patient data. Hence all the rambling about me reviewing the text, copying it over and editing it for errors: 3.5 isn't doing anything of note, it exists as a dumb receptacle for a far more contextually useful service, the Whisper service, in much the way someone can use an AI waifu or Twitter bot to learn how the Navier-Stokes equations work (real example from a meme).

Your objections make no sense in that regard. I would hope that Western hospitals have more robust systems (who am I kidding? It's another junior doctor pulling their hair out), but it saves me time, and it does a better job for the poor bastards who are waiting for discharge paperwork and insurance to clear before they get to leave.
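For what it's worth, the manual review step described above could in principle be partly automated: a transcribed drug name that isn't an exact hit in a formulary but is suspiciously close to one is a likely dictation typo. A minimal sketch with Python's standard-library difflib, assuming a toy formulary list (the drug names and the `flag_suspect_drug` helper are illustrative, not anyone's actual tooling):

```python
import difflib

# Illustrative formulary; a real workflow would check against a proper drug database.
FORMULARY = ["amoxicillin", "metformin", "atorvastatin", "lisinopril", "omeprazole"]

def flag_suspect_drug(token, cutoff=0.8):
    """Return the closest formulary match for a token that is *not* an exact hit
    (a likely transcription typo), or None if the token is spelled correctly
    or resembles nothing in the formulary."""
    t = token.lower()
    if t in FORMULARY:
        return None  # exact match: nothing to flag
    matches = difflib.get_close_matches(t, FORMULARY, n=1, cutoff=cutoff)
    return matches[0] if matches else None

# A Whisper-style slip like "metfornin" gets flagged with the probable intended drug:
print(flag_suspect_drug("metfornin"))  # → metformin
print(flag_suspect_drug("metformin"))  # → None (correct as written)
```

This would only ever be a prompt for the human reviewer, not a substitute for the review itself, since the whole point of the workflow is that the doctor signs off on the final text.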