Vic Crain, Jan 7

OK, let's get real. There are people who abuse the health insurance system in the US for personal financial gain. Some of these schemes are "legal," while most are not. Examples:

- The doctor who brags about placing 78 stents into a patient. (1,2)
- The nurse practitioner who writes opioid scripts for non-existent patients. Yes, docs can bill for writing scripts. She got flagged for writing scripts at a rate of one every 15 or so seconds, faster than she could possibly see any patient. (3,4)
- The 20-something son of a wealthy corporate executive who lives at home, works under the table, and is accepted into Medicaid. Yes, that's legal, just not ethical. Some of the top "cheats" are rich and white. (I wonder how many children of Congressmen are on Medicaid? Anyone know?)
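How easy is the statistical screen? The scripts-every-15-seconds case above can be caught by simply looking at the gaps between a prescriber's billed scripts. Here is a minimal sketch, assuming timestamped claim records per prescriber; the prescriber IDs, the data, and the 5-minute plausibility threshold are all hypothetical:

```python
from collections import defaultdict
from datetime import datetime, timedelta
from statistics import median

# Hypothetical claim records: (prescriber_id, timestamp the script was billed)
claims = [
    ("NP-1042", datetime(2022, 11, 3, 9, 0, 0) + timedelta(seconds=15 * i))
    for i in range(200)  # one script every 15 seconds -- implausible
] + [
    ("MD-2001", datetime(2022, 11, 3, 9, 0, 0) + timedelta(minutes=20 * i))
    for i in range(20)   # one script every 20 minutes -- plausible
]

def flag_implausible_prescribers(claims, min_seconds_per_script=300):
    """Flag prescribers whose median gap between billed scripts is
    shorter than a plausible visit length (default: 5 minutes)."""
    by_prescriber = defaultdict(list)
    for prescriber, ts in claims:
        by_prescriber[prescriber].append(ts)

    flagged = {}
    for prescriber, stamps in by_prescriber.items():
        stamps.sort()
        gaps = [(b - a).total_seconds() for a, b in zip(stamps, stamps[1:])]
        if gaps and median(gaps) < min_seconds_per_script:
            flagged[prescriber] = median(gaps)
    return flagged

print(flag_implausible_prescribers(claims))  # → {'NP-1042': 15.0}
```

A real screen would be more careful (batch-entered claims, covering physicians, refills), but the point stands: the outliers in the cases above are orders of magnitude away from plausible, and a few lines of analysis find them.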
It's perfectly fair for insurance companies to want to stop this abuse, and a relatively easy statistical analysis can catch much of it. Except that's not what the health insurance industry is doing. Instead, there are three common approaches, each flawed, that waste time and money and carry potential pain and health consequences for patients.

- Step therapy: This is a process in which patients are required to try less expensive treatments before moving to those that are more costly. It makes logical sense, certainly: if the cheaper option works, why go the more expensive route?
The problem occurs when the insurance company doesn't know whether the cheaper options have already been tried, either under a different insurer or with a different doctor. Treatment is then delayed while the patient gathers and submits the documentation, often two or three times, because insurers have done little to automate this process and patients are amateurs at the task.

- Prior authorization: Insurers can require patients or doctors to submit a request for coverage before a procedure is performed or a medication is prescribed. The problem is that neither patients nor doctors may know when prior authorization is required and when it is not. And the insurer may not promptly inform either patient or doctor when there is a denial.
What else is wrong with this procedure?

- The absence of industry-wide standardized forms indicating the information required for approval. Doctors themselves may not know what is needed, or may delegate preparation of the request to a staff member who is less than fully knowledgeable.
- Lack of a clear and industry-wide standard for when prior authorization is required and when it is not.
- Lack of coverage for weekends and holidays.
- In one recent case, a neck surgery drew three prior authorization denials for different aspects of the procedure. Two were overturned on appeal by the doctor. The third was found to be irrelevant once the surgeon physically entered the neck.
Most stunningly, on denying the third appeal, a representative called the patient and explained, "Since you are medically viable for another surgery for another 25 years, there is no reason to spend extra money now to minimize the risk of a surgery you might not need." Even though the extra cost was for a thumbnail-size piece of titanium mesh.

- Misuse of AI and doctors to approve or deny prior authorizations: There are several issues here:
- Insurers may refer prior approval requests to doctors who are not qualified in the medical area in question. Bluntly, the person who reviews the request for a back surgery may be a hand or foot specialist.
- Insurers tend to hire doctors who have had malpractice complaints. After all, what successful doctor is going to give up a lucrative private practice for a 9-to-5 job with an insurer?
- There are now class-action lawsuits filed against Cigna, UnitedHealthcare, and Humana over the use of AI to decide approvals of Medicare patients' requests for rehabilitation services. The lawsuits allege that doctors reviewing the claims were told to align their opinions with what the AI system said was correct. The lawsuits also claim that 90% of the prior authorization requests reviewed using AI are rejected.
Bluntly, AI has nothing to do with "intelligence." It is a form of correlational analysis of patterns in words and graphical information. It is not competent at this time to make medical judgments, or, for that matter, to handle many of the other applications people are trying to use it for. The implications of the use of these tools include:

- Delays in patient care, which can increase pain and negatively affect patient outcomes.
- Wasting the time of doctors and office staff.
- Wasting time and increasing costs for insurance carriers, which they in turn try to pass along to patients and government programs.
At this point, both competent rulemaking and common sense seem lacking among legislators, regulators, and health insurance company executives. We can only hope that the lawsuits keep coming and growing until they force change. Or we trash the US health insurance system and move everyone to national health care.

Sources:

- Alltucker, Ken, "Once every seven minutes, a Medicare patient gets a coronary stent they don't need, report says", 31 October 2023. https://www.usatoday.com/story/news/health/2023/10/31/hospitals-overuse-coronary-stents-medicare-patients/71393725007/
- Walter, Michael, "Once every seven minutes, a Medicare patient gets a coronary stent they don't need, report says", 31 October 2023. https://cardiovascularbusiness.com/topics/clinical/interventional-cardiology/coronary-stents-pci-cardiologists-unnecessary-800m
- U.S. Attorney's Office, Middle District of Florida, Press Release, "Brandon Nurse Pleads Guilty To Unlawful Drug Distribution And Acquiring Controlled Substances By Fraud", 12 December 2022.
- Diaz, Jaclyn, "78 people face charges for $2.5 billion in attempted health care fraud, DOJ says", 28 June 2023. https://www.npr.org/2023/06/28/1184795720/78-people-face-charges-for-2-5-billion-in-attempted-health-care-fraud-doj-says
- Bendix, Jeffrey, "Cigna using AI to reject claims, lawsuit charges", 7 August 2023. https://www.medicaleconomics.com/view/cigna-using-ai-to-reject-claims-lawsuit-charges
- Mole, Beth, "UnitedHealth uses AI model with 90% error rate to deny care, lawsuit alleges", https://arstechnica.com/health/2023/11/ai-with-90-error-rate-forces-elderly-out-of-rehab-nursing-homes-suit-claims/
- Goforth, Allan, "Humana is 2nd insurer to face lawsuit for AI-based denials of Medicare Advantage claims", 29 December 2023. https://www.benefitspro.com/2023/12/29/humana-is-2nd-insurer-to-face-lawsuit-for-ai-based-denials-of-medicare-advantage-claims/
Please note that some sources of information for this post cannot be listed due to HIPAA rules.

Found online, and still relevant.