I have no such advice. I use a Linux-based NAS myself.
Seems you also use a bit of FreeBSD in your setup besides Linux. Still FOSS though!
The problem with incorrect transcription exists with my secretary too. In the system I work in, the secretary writes out my recording and sends it to me, and I read it. I can edit the text at this point and then digitally sign it with a personal private key. This usually happens at least a day after the recording was made. All prescriptions and orders to my nurses are given in another system, separate from the raw text in the medical records. I can’t easily explain the practical workings, but I really don’t see that the AI system will introduce more errors.
But I agree that in the event of a system failure, there will be a catastrophic situation.
Ah sorry, I mean removing the option of using the keyboard as an input method in the medical records system. The keyboard itself isn’t physically removed from the computer clients.
But I agree that in the event of a system failure the hospital will halt.
Unfortunately the interface of the medical records system will be changed when this is implemented. The keyboard input method will be entirely removed.
That’s another issue and doesn’t lessen the importance of this one. Both are important but separate: one is about patient data, the other about my voice model. Also, I have no control over the medical records, and in my case they’re already stored outside the hospital.
Sure, that’s another problem, but this data is already sent beyond the hospital. We have a national system in place gathering all medical records.
My biometric data, in this case my voice: an AI trained and tailored to my voice, out of my control, hosted as a cloud solution.
Of course there is an aspect of patient confidentiality too, but this battle is already lost. The data in the medical records is already hosted outside of my hospital.
I take this as humour - I understand my situation, and my IT suite isn’t more insecure than many others :)
I don’t know if it’s common practice in other countries. In Sweden, where I work, it is. I think the rationale is the following:
Of course we have to review the transcribed result. At my hospital, all doctors carry smart cards and use the private key stored on them to digitally sign every transcribed medical record entry.
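To make the signing step concrete, here’s a minimal sketch of the principle in Python with the `cryptography` library, using a throwaway RSA key. It’s illustrative only: in the real system the private key is generated on and never leaves the smart card, and the PKI is run by the region, so treat all names and the sample text here as assumptions.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Throwaway key pair for illustration only; in practice the private key
# lives on the doctor's smart card and never leaves it.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The reviewed (and possibly edited) transcription of a record entry.
record_entry = "Ward round note: patient stable, plan unchanged.".encode("utf-8")

# Sign the exact text the doctor reviewed.
signature = private_key.sign(
    record_entry,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Anyone holding the public key can later check that the entry was signed
# by me and hasn't changed since; verify() raises InvalidSignature if
# either the text or the signature was tampered with.
public_key.verify(
    signature,
    record_entry,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
```

The point is that the signature binds me to the exact text I reviewed, so transcription errors are caught at review time the same way whether a secretary or an AI produced the draft.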
I agree, and I suspect this planned system might get scuttled before release due to legal problems. That’s why I framed it in a non-legal way. I want my bosses to understand the privacy issue, both in this particular case and in future cases.
Yes, I agree. Broadening the scope a little, I’m frankly just waiting for a big leak of medical records. The system we use is a bird’s nest of different software, countless APIs, and all sorts of database backends. Many subsystems stem from MS-DOS, just embedded in a slightly more modern integrated environment. There are just so many flaws, and I’m amazed a leak hasn’t happened (or at least surfaced) yet.
I’m not sure what exact service will be used. I won’t be able to type as the IT environment is tightly controlled and they will even remove the keyboard as an input device for the medical records.
I understand the fight will be hard, and I’m not getting into it if I can’t present something they will understand. I’m definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.
I work in Sweden and it falls under GDPR. There probably are GDPR implications, but as I wrote, the question is not legal. I want my bosses to be aware of the general issue, and this is but the first of many similar problems.
The training data is to be per person, resulting in a model tailored to every single doctor.
Thanks for the advice, but I’m not against using AI models to transcribe me, just not a cloud model specifically trained on my voice without any control by me. A local model, or even better a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.
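For what it’s worth, a general local model is already practical today. Here’s a sketch using the open-source Whisper model via the `openai-whisper` Python package; the file path is just a placeholder. Nothing is sent anywhere, and nothing is trained on my voice:

```python
import whisper

# Load a general-purpose speech model; runs entirely on the local machine.
# "base" is small enough for a workstation; larger variants are more accurate.
model = whisper.load_model("base")

# Transcribe a dictation file (placeholder path), hinting the language
# since my dictations are in Swedish.
result = model.transcribe("dictation.wav", language="sv")

print(result["text"])
```

That’s the kind of setup I’d have no objection to: a general model, hosted locally, with no per-doctor voice profile ever leaving the building.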
That’s correct! I’m not against using technology to cut costs or provide better healthcare. My question is entirely about the privacy implications.
Sure, but what about my peers? I want to get the point across and build an understanding of the privacy implications. I’m certain that this is just the first of many reforms made without proper analysis of the privacy implications.
It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented, my only input device to the medical records will be the AI transcriber (stupidity).
I’m a psychiatrist in the field of substance abuse and withdrawal. Sure, there’s a shortage of us too, but I want the hospital to understand the problem, not just me getting to keep an old-school secretary by threatening to go to another hospital.
Debian-based custom-built thing. Nothing special.