Tuesday, December 31, 2019

Nope, I'm Not Back - Just Leaving a Copy of My Comment Under a Med Tech Article

*Edited.

Nope, haven't changed my mind - I still feel the same way about blogging/social media that I wrote about when I left in November.

However, after just learning that Bezos and Amazon have jumped into the medical transcription AI/VR game - and none of us are too happy about it - I left a comment under a medical technology article, and I needed to post a copy of it here to verify that it was me (considering a certain someone likes to impersonate me).

Here it is below - which I felt was a necessary thing to do on behalf of all the (mostly) women and (a few) men still in this field, whose livelihoods have been affected by the tumultuous transition our field has undergone in the past 15 years - followed by a New Year's note to all my fellow transcriptionists/editors (and only here) :)

_________________


For starters, Amazon/Dr. Alexa being responsible for collecting and tracking people's private medical information? I don't think so - like, ever. Security risks and HIPAA? No security fortress in the world is completely impenetrable and un-hackable. Everyone always says their system is safe because it's encrypted, but so were Anthem's insurance claims, and we know how that worked out. Amazon collecting and tracking the commercial products I buy is fine, but they're not welcome to collect my private healthcare information, sorry.

Bezos is not offering this "cheap" service out of the kindness of his heart; he's not exactly known for being charitable. There's always an endgame with Bezos, so what is it?
 
So that Amazon, via "Dr. Alexa," can sync that info and start recommending Amazon products that patients can buy for their medical condition - without "Dr. Alexa" having any true understanding of the mildness or severity of that condition, or considering other comorbidities patients might have that affect that condition (or vice versa) and its treatment?

Well, that's an idea that's potentially dangerous to patients' health. Not to mention, in that case, it's not just medical transcriptionists/editors like me who should be worried for their jobs - clinicians should be worried about where "Dr. Alexa" is headed, too.

Haven't we commercialized healthcare in the US enough?

If I found out that my personal doctor was using Amazon Alexa-ish software to do my transcription, I'd change physicians immediately lol.
And let's review our history and do an actual cost-savings recap of the transcription transformation that began in the early 2000s. First, outsourcing to India was the cheaper answer. Then it was VR/AI. Then EHR/EMR and "checking boxes." All of these "revolutions" ultimately resulted in hospitals and clinics finding out they still needed to pay humans - humans whose first language matched the dictator's - to go back and proofread and edit the reports, and that the box-checking aspect of EHR alone wasn't working.
In the end, the cost of VR licensing, plus EHR/EMR implementation and tech support - plus the later addition of editing contractors (the clinicians themselves do not have time) - came out the same as, if not more than, just hiring a handful of human employees from the get-go; not to mention it displaced thousands of American workers, contributing to the last recession.
Even Nuance, the current "best" in VR, is still far from "captain's log, stardate." I know, because I've been a medical transcriptionist for 24 years, now editing Nuance-based VR software's mistakes. The going industry rate for doing so has now dropped to around 4 cents a line, regardless of contracting company, which works out to about $7 to $8 an hour on average (and that's if the dictation is clear, with no background noise and no accent) - with no benefits - as we are all now subcontractors.
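For anyone curious how that math shakes out, here's a rough back-of-the-envelope sketch. The 4-cents-a-line rate is the one quoted above; the 175-to-200-lines-an-hour throughput is just my own assumed range for clean audio, since real editing speed varies with accents and background noise:

```python
# Back-of-the-envelope sketch of the pay figures quoted above.
# rate_per_line comes from the 4-cents-a-line rate in the text;
# lines_per_hour is an *assumed* throughput range for clean audio.
rate_per_line = 0.04                 # dollars per edited line (quoted)
for lines_per_hour in (175, 200):    # assumed editing speed
    hourly = rate_per_line * lines_per_hour
    print(f"{lines_per_hour} lines/hour -> ${hourly:.2f}/hour")
# 175 lines/hour -> $7.00/hour
# 200 lines/hour -> $8.00/hour
```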
Editors are still needed because VR often cannot understand accented speech for even simple words, let alone complex medical terminology, pharmaceuticals, and medical devices, especially if there is background noise like beeping machines, babies crying, etc. It also cannot translate slang, idioms, and figures of speech.
It often cannot distinguish critical medical terms like hypertension versus hypotension - a distinction that matters enormously for patient care. Cheaper programs cannot even distinguish between "discreet" and "discrete," and often not even "their" and "there."
Even Nuance VR/AI, the industry's current gold standard, still puts in things like "The patient had a cabbage" instead of "The patient had a CABG" (meaning coronary artery bypass grafting) - and that's from systems that claim to have "machine learning."

At some point, healthcare needs to realize that you still need humans for this one, at least to proofread and edit the reports, because we're talking about the interpretation of human language and communication. Technology cannot pick up on all of the subtleties of language or keep up with our evolving forms of communication - and perhaps it never will.
That is because interpreting all the subtleties of human communication, especially medical terminology, is a specialized skill - a distinctly human skill - and the clinicians themselves do not have the time, nor did they pay all of that money for medical school, to spend most of their time proofreading their own reports.
These are legal documents; thus, there's an inherent, imperative need for them to be accurate - not only for the communication and coordination of proper patient care between clinicians, but also because they are often used later in court depositions for disability claims or malpractice lawsuits. Of even more interest to the readers here, though: both public and private insurance companies require these documents to substantiate your patients' need for treatment before they will pay you for your services.
Since they're that important, they need to be accurate - and if you want that kind of accuracy, you need to pay for it.

And not only have I personally seen patient care suffer due to VR and EHR "checked-box" mistakes, but choosing this route to cut costs is penny-wise and pound-foolish - you will pay more in the long run, whether from having to hire scribes or editors or from insurance payments lost to inaccuracies on reports.
But please continue - realizing that despite all the previous promises this industry has heard with the advent of each newer, cheaper, better idea, they have all ultimately resulted in the same thing: work returning to the US to some cheap contracting company that edits the reports for you and that, almost universally, exploits its subcontractors by paying them less than minimum wage with no benefits, claiming it has to in order to compete with India and VR/AI. And in the end, with the cost of clinic- or hospital-wide software licenses and launch, plus editors, the total will end up being the same or greater than just hiring a few human transcriptionists or scribes outright as actual employees.
Lastly, this is healthcare - you're supposed to care for people - all people - including the people being exploited, sweatshop style, to edit these VR reports for you: to make sure your important medical-legal documents are accurate, so that your patients are well cared for, so that insurance companies pay for your services, and so that your butts are covered when you get sued for malpractice or are called in to give a deposition on someone's condition for disability claims ;)
Please value the skilled work that we do to provide this service for you with the accuracy you require (and often demand), because it is a specialized skill - and if you haven't realized that yet, you soon will, after implementing VR alone with no one to edit it. At the very least, please make sure the contracting company you eventually hire to edit the reports for you is not exploiting its subcontractors (which, sadly, is a rarity, especially among the larger contracting companies).
_________________________________

Dear Fellow Contractors and Subcontracting Medical Transcriptionists/Editors,

I understand we've all been through a lot in this field over the past 15 years - that there's little work, and for low pay, and that it has to be that way to compete with EHR/EMR checkbox systems, AI/VR, and India.

However, just because the humanity of this field is being replaced by automation doesn't mean we have to lose our own humanity and begin to behave in an automated way (or worse).

There's no need to be overly competitive, cut-throat, rude, bullying, and/or verbally abusive, demanding perfection for pennies per line - resources truly are scarce, but there's still some work. And there's no android apocalypse yet - no Terminator saying, "Are you Sawah Connah?" - I promise :)
If it is no longer possible to pay people properly in this field, because we've lost value by market standards, then the least we can do is continue to value each other.

Please be kind to each other. This business has become hard enough on us; nobody needs your nasty on top of it. Please give others the same benefit of the doubt you would like them to give you - kindness doesn't cost anyone a thing.
Warmly wishing you bright days in the coming New Year - Happy New Year :)
~ Chrystal Chaplow
_______________________________

Otherwise, I've just put some favorite posts from the past year back up - either my favorites, other people's favorites, or posts that had a good private memory attached to them (parking in draft any that revealed too much personal info) - enjoy :)

(Comments are still closed, however, due to persistent harassment from a certain individual.)

My best to you in 2020 :)

Since A Couple of People Mentioned Missing Seeing My Christmas Decorations This Year (Thank You) ... :)

I hope your Christmas was full of as much love and light this year as mine was :)