
Thursday, January 9, 2014

Response to Misleading WaPo Hospice Article: Part the Third

(If you missed Part 1 or Part 2, click on the links. If you don’t have time, here is the quick summary: The Washington Post published an article on December 26, 2013 claiming hospice care was taking billions from Medicare, presumably through waste and fraud. This series offers a critical review debunking the claims and offering a more insightful view of the challenges hospices face. Today Dr. Scott tackles some statistics and the way forward. - Ed. Sinclair)

One of the consistent errors made by people commenting on this story, either in the comments sections or on social media, is the failure to understand the difference between estimated marginal life expectancy and actual marginal life span. The hospice regulations require two physicians to certify that the patient’s life expectancy is less than or equal to six months if the disease/illness runs its normal course, given their current condition and taking into account any decisions made by the patient/family to forgo potential treatments. Many commenters are erroneously conflating actual life span with estimated life expectancy.

If the regulations instead required two physicians to certify that the patient’s actual marginal life span will be six months or less, then these commenters would be right. In that case, patients who lived longer than six months would be evidence of error, though you would probably allow a few of them as an acceptable error rate. One commenter argued that patients who lived longer than six months are analogous to patients who had suspected appendicitis, underwent surgery, and turned out to have a normal appendix: you accept a percentage of surgeries on normal appendixes in order to minimize the risk of missing abnormal ones. (This was a more common idea before imaging was used extensively to guide the surgery decision.) It is an inappropriate analogy. Performing an appendectomy on someone with a normal appendix is a mistake; no benefit accrues to the patient. A patient who lives for seven months after being admitted to hospice is not necessarily a mistake. It does not follow that the original certification was wrong. In fact, it was probably right, or at worst borderline. Half the patients in a group with a median life expectancy of six months will live longer than six months. Those who do still receive a benefit from being on hospice, and Medicare still saves money because they were on hospice.

Since the regulations stipulate life expectancy, it would be useful to think about what a frequency distribution curve would look like for a set of patients with a median life expectancy of less than six months. It would be a positively-skewed distribution. It bunches up at the left side, since it is bounded at zero days. Patients cannot die sooner than right now. It tails off to the right, with what are called outliers. The flatter the curve, the further out that tail will go. For non-cancer diagnoses that are harder to prognosticate, the curve is flatter, with more outliers and more extreme values for outliers. This is not indicative of fraud. Claiming it is represents a misunderstanding of the mathematics involved.
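
To make that shape concrete, here is a quick simulation sketch (mine, not the Post’s or MedPAC’s; the log-normal form and the numbers are purely illustrative assumptions, not hospice data). A distribution whose median is well under six months still produces plenty of patients who outlive six months, and some who outlive a year:

# Illustrative only: a hypothetical positively-skewed survival-time
# distribution (the log-normal shape is an assumption, not real hospice data).
import numpy as np

rng = np.random.default_rng(0)

# Median survival of 4 months: exp(mu) = 4.
survival_months = rng.lognormal(mean=np.log(4), sigma=0.9, size=100_000)

print(f"median: {np.median(survival_months):.1f} months")   # about 4
print(f"mean:   {survival_months.mean():.1f} months")        # about 6, pulled right by the tail
print(f"living > 6 months:  {(survival_months > 6).mean():.0%}")   # roughly a third
print(f"living > 12 months: {(survival_months > 12).mean():.0%}")  # roughly one in ten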


Hospices have traditionally enrolled patients with life expectancies considerably less than six months. There has been a concerted effort to enroll patients sooner, and this has been encouraged (even by MedPAC) as a way to save Medicare a great deal more money. Consider the “ideal” situation, where most patients were enrolled when they reached a median life expectancy of six months. In this case, half of the patients would have a life expectancy greater than six months, and (for most diagnoses) there would be outliers stretching into years. A hospice that succeeds in enrolling patients earlier in the process, closer to the six-month estimate, will end up with more patients who live 7+ months. This is simply the mathematical consequence of a positively-skewed distribution, which is what a life-expectancy distribution for hospice patients looks like. (The stats geeks reading this, who are already seething about my over-simplifications here, may be thinking that the distribution is actually bi-modal. I hear you. I feel your pain. But I’m not going to try to go into that idea now.)
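
Continuing the same toy simulation (again, assumed parameters rather than real data), pushing the median of a skewed survival distribution closer to six months, which is what earlier enrollment does, necessarily increases the share of patients who outlive six months and produces more multi-year outliers:

# Illustrative only: same assumed log-normal shape as above, with the
# median shifted to mimic later vs. earlier enrollment.
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.9  # assumed spread

for median_months in (2, 4, 6):
    s = rng.lognormal(mean=np.log(median_months), sigma=sigma, size=100_000)
    print(f"median {median_months} mo: {(s > 6).mean():.0%} live > 6 months, "
          f"{(s > 24).mean():.1%} live > 2 years")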

Currently, when you see differences in the MEAN length of stay for for-profit hospices compared to not-for-profit hospices, it is because of differences in the mix of cancer and non-cancer diagnoses; the MEDIAN length of stay shows no difference. Cancer has a more predictable course, so its life-expectancy distribution curves are narrower. Non-cancer diagnoses have flatter curves, with more skew to the right, and their outliers sit further from the median. The more of these patients a hospice enrolls, the more extreme outliers you expect, and the further the mean is pulled away from the median. The authors admitted that NHPCO had explained this to them, but then ignored that explanation (or didn’t understand it) when they asserted that “the growth in the average duration of hospice care stems less from the decline in the proportion of cancer patients than from another trend. Patients who are suffering from a non-cancer ailment began staying longer on hospice: Their average stay in hospice care grew from six weeks to almost 11 weeks on average between 2002 and 2012.” Do note that the authors are using an increase in average length of stay to 11 weeks as part of their argument. Eleven weeks is less than three months. The Medicare Hospice benefit asks us to certify patients with an expected life expectancy of six months or less, more than twice that average. Any argument that seriously treats an 11-week mean length of stay as evidence that an unacceptably high proportion of these patients really had a life expectancy greater than six months is prima facie a bad argument.
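
Here is one more hedged sketch of that case-mix effect (the distributions and numbers are made up for illustration; they are not the Post’s or NHPCO’s figures). Two groups share the same median length of stay, but the flatter, non-cancer-like group drags the mean upward as its share of the mix grows, while the median barely moves:

# Illustrative only: assumed length-of-stay distributions, in days.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

cancer = rng.lognormal(mean=np.log(18), sigma=0.7, size=n)       # narrower curve
non_cancer = rng.lognormal(mean=np.log(18), sigma=1.4, size=n)   # flatter, more right skew

for non_cancer_share in (0.3, 0.6):
    k = int(n * non_cancer_share)
    mix = np.concatenate([cancer[:n - k], non_cancer[:k]])
    print(f"{non_cancer_share:.0%} non-cancer: "
          f"median {np.median(mix):.0f} days, mean {mix.mean():.0f} days")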

Others commented that non-death discharges should be viewed as errors as well. While I agree that the very high percentage of non-death discharges is worth looking into further, I vehemently disagree with the idea that non-death discharges should in general be considered failures, wastes of money, or fraud. There are many reasons for non-death discharges. Those of us who are hospice providers consider many of these reasons to be successes rather than failures.

Take the cancer patient who would have wanted to try chemotherapy but whose oncologist feels that performance status is too poor and that burden outweighs benefit. This patient can enroll in hospice. Patients often feel better, and even improve, after enrolling in hospice. The extra help at home. The excellent symptom management. The optimization of the medication regimen (which often includes discontinuing cholesterol, blood pressure, and diabetes medications whose burden outweighs any net benefit). All of these things can contribute to a patient improving. Some of these patients will improve sufficiently that they become candidates for chemotherapy. We happily discharge these patients, knowing that we helped them and that they are now getting the treatment that is consistent with their goals. We call these patients “hospice graduates”.

The same sort of improvement can be seen in non-cancer patients as well. Dementia patients are a prime example. We see improvements for the same reasons: more care at home, better symptom management, optimization of medication regimens. Some of these patients improve enough that they no longer fit the category of “life expectancy less than six months”. We discharge these patients, but LESS happily. We are happy that they improved, but we’d like to keep them, since we think our interventions led to the improvements. In Alzheimer’s disease, these patients are STILL DYING. They still have a terminal illness; they just have a life expectancy estimate that exceeds six months. Most of us wish we didn’t have to discharge these patients. We discharge them because we play by the rules. A rational, well-designed Medicare hospice policy wouldn’t require us to do so. Note that the scrutiny placed on hospices over these patients has actually led to more patients being discharged (as hospices fear they will be challenged on these admissions). If you are more interested in scoring rhetorical points than in having an honest discussion, you can use this “increase in live discharges” to claim that it proves hospices were doing something wrong all along.

The authors inappropriately used the mean as the measure of central tendency when discussing length of stay. Hospice length-of-stay data graph out as a positively-skewed distribution. The honest measure of central tendency for such a distribution is the median. Using the mean instead is considered deceptive; it is the sort of trick you would find in any “How to Lie with Statistics” handbook. This is basic, intro-level stuff. There are only two possibilities: either the authors knew that the mean was inappropriate or they didn’t. If they didn’t know, then they are not qualified to be incorporating such statistics into their articles without seeking help. (Their editor should have caught this. If neither the editor nor the authors were sufficiently familiar with intro-level statistics, they should have asked a starving adjunct professor teaching at the local community college to help them out.) If they DID know that the mean was inappropriate and chose to use it anyway, then they are guilty of deception, of believing that their narrative was so important that the ends justified the means when trying to convince others. Including both mean and median would be acceptable reporting (and is what both NHPCO and MedPAC do).
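
A toy example with made-up numbers (not the Post’s data) shows why this matters: one long-stay outlier is enough to drag the mean far away from the typical patient, while the median barely notices.

# Made-up lengths of stay, in days; not real hospice data.
stays = [3, 5, 8, 11, 14, 17, 22, 30, 45, 400]

stays.sort()
n = len(stays)
median = (stays[n // 2 - 1] + stays[n // 2]) / 2  # even count, so average the middle two
mean = sum(stays) / n

print(f"median: {median} days")  # 15.5: close to the typical stay
print(f"mean:   {mean} days")    # 55.5: inflated by the single 400-day outlier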

The mean length of stay did increase between 2000 and 2011, but it did not change substantially across 2009, 2010, and 2011. Between 2009 and 2010, both the mean and median length of stay DECREASED: the median (50th percentile) length of service in 2010 was 19.7 days, down from 21.1 days in 2009, and the average length of service dropped to 67.4 days in 2010 from 69 days in 2009. Comparing 2000 to 2011, the average length of stay increased from 54 to 86 days, but the median stayed the same at 17 days. Over the same period, the total number of hospice patients increased from 534,000 to 1,219,000.

For access to the primary data:
http://www.medpac.gov/chapters/Mar13_Ch12.pdf
http://www.nhpco.org/press-room/press-releases/research-published-jama
http://www.nhpco.org/press-room/press-releases/hospice-facts-figures
And for annual stats back to 2006 see this Pallimed post.

Thus short-stay and long-stay patients are being added in roughly the same proportions. Long-stay patients defray the costs of taking care of short-stay patients. And long-stay patients enrolled in hospice still save Medicare money, even the outliers.

The authors used flawed MedPAC data when discussing profit margin. MedPAC itself admits that its data are inaccurate and incomplete: they fail to include the costs of the federally mandated volunteer services (at least 5% of patient care hours) and bereavement services (for at least 13 months after a patient’s death). These are real costs to hospices, and the data ignore them.

From this NHPCO press release:
“The discrepancy in the numbers is an indication of a change in the calculation methodology, by excluding the costs of delivering statutorily mandated services, rather than pointing to the fact that hospice margins are actually shrinking. For MedPAC to recommend countering an erroneous growth in hospice margins by reducing the annual inflationary adjustment is absurd and potentially devastating to the hospice community”.
Finally, let me reiterate that I actually agree that there is some fraud by hospice firms. There is some fraud in all situations involving companies with a profit motive. The authors did not make their case that it is widespread. They didn’t come close to making their case that hospice firms are costing Medicare any money at all, much less “draining billions”. They pointed out that some lawsuits are pending; I’d point out that these represent potential problems, not yet proven ones. Judges have ruled against Medicare and for hospice organizations in past hospice CAP payment cases.

Despite grossly exaggerating the cost to Medicare, falsely suggesting that discharged patients weren’t actually dying, and deceptively using statistics to advance their narrative, the authors did raise some important points as well. They described some fairly unpleasant recruitment tactics. I consider some of these tactics to be unethical, and I would not want to be involved with a hospice that used them. The recruitment bonuses are particularly galling. Note that the NHPCO considers these to be unethical as well. They are not, however, against the current rules, nor are they illegal. And they are the same sorts of things that happen at businesses around the country. I want hospices to be better than businesses around the country. My gut feeling is that hospices ARE, in general, better than businesses around the country. (I think that this stems partly from the fact that hospice team members, current blogger excepted, are in general very nice people.) But we can’t expect all hospice firms to be better than other corporations just because it’s the nice thing to do. With money involved, it isn’t a surprise that some companies aren’t. If Medicare would like to write some better rules, I’d be among those cheering.

Other posts in this series:
Part 1 (Tue): Debunking the hyperbolic headlines
Part 2 (Wed): Did these hospices enroll patients inappropriately? Do for-profit and not-for profit hospices differ?
Part 3 (Thu): Digging into the statistics and the way forward

Bruce Scott (@skipbidder) is an academic physician in Ohio, fellowship-trained and board certified in Geriatrics and in Hospice and Palliative Medicine. His hobbies include boardgaming, cooking, and pedantry.
