This week, the Office of Health Strategy (OHS) unveiled their plan to monitor for unintended consequences of capping healthcare cost increases. OHS acknowledged in the plan that the Cap “may cause providers to reduce provision of necessary healthcare services so as not to exceed the benchmark.” However, OHS’s plan would detect only a very small fraction of the potential harms from the Cost Cap. We outline a few examples below. To protect Connecticut residents’ health, OHS needs to have a broad, robust, and tested underservice monitoring system in place before applying the Cost Cap.
Underservice includes inappropriate undertreatment (stinting) of patients as well as cherry picking more lucrative patients and dumping (lemon dropping) more costly or difficult patients (adverse selection). Underservice poses a bigger risk for underserved populations and people with high healthcare needs.
OHS and their consultants have stated that there is no evidence of underservice from Massachusetts’s Cost Cap, which began in 2013 but has not made coverage more affordable in that state. OHS and the consultants have offered neither evidence for that assertion nor a description of an underservice monitoring plan in Massachusetts.
Underservice poses a bigger risk for underserved populations and can widen health disparities. The Institute of Medicine, in their Unequal Treatment report, found that, “Financial factors, such as capitation and health plan incentives to providers to practice frugally, can pose greater barriers to racial and ethnic minority patients than to white patients, even among patients insured at the same level.”
OHS is relying on parts of the underservice plan for PCMH Plus, Medicaid’s controversial shared savings program, for most of their Cost Cap plan. Advocates have raised deep concerns that PCMH Plus’s underservice monitoring is insufficient and leaves people at risk. And unlike PCMH Plus, which uses a control group of Medicaid members outside the program to identify underservice, the Cost Cap monitoring plan has no control group, because the Cap will cover the entire state.
OHS’s underservice monitoring plan relies on process measures to identify underservice, not on health outcomes or declines in the health status of Connecticut residents. OHS is choosing to include fewer than half the measures used by the PCMH Plus program. OHS also relies on prevention metrics, which are important but generally inexpensive, and therefore unlikely targets for underservice.
The majority of OHS’s thirteen measures track screenings, not treatment. Concerns were raised in the Technical Team meeting reviewing the plan that monitoring only screenings will miss whether people receive follow-up care for the problems identified.
|Underservice measures (outcome measures in green; measures self-reported by providers in red)|PCMH+|Proposed Cost Cap|
|---|---|---|
|Avoidable ED visits|✓| |
|Adolescent well care visits|✓|½ ✓|
|Avoid antibiotics for adult acute bronchitis|✓| |
|Developmental screenings, ages 0–3|✓|✓|
|Diabetes HbA1c screening|✓|✓|
|Medication management for people with asthma|✓| |
|Behavioral health screening, ages 1–17|✓|✓|
|Metabolic monitoring for children on antipsychotics|✓| |
|Readmission within 30 days|✓| |
|Anti-depression medication management|✓| |
|Prenatal and postpartum care|✓|✓|
|Follow up after hospitalization for mental illness|✓| |
|Follow up after ED visit for mental illness|✓| |
|Annual fluoride treatment, ages 0–4|✓| |
|Annual monitoring for persistent medications|✓| |
|Appropriate treatment for children with upper respiratory infection|✓| |
|Asthma medication ratio|✓|✓|
|Breast cancer screening|✓|✓|
|Cervical cancer screening|✓|✓|
|Chlamydia screening for women|✓|✓|
|Diabetes eye exam|✓|✓|
|Diabetes: medical attention for nephropathy|✓| |
|Follow up for children prescribed ADHD medication|✓| |
|HPV vaccine for female adolescents|✓| |
|Oral evaluation, dental services|✓|✓|
|Use of imaging studies for low back pain|✓| |
|Well child visits, ages 2–5|✓|½ ✓|
|Colorectal cancer screening| |✓|
|Controlling high blood pressure| |✓|

½ ✓ — measure is aggregated with another measure
Like PCMH Plus, the Cost Cap monitoring plan also relies on patient satisfaction surveys and on patient complaints and grievances. While patient experience is a critical piece of good care, survey results do not correlate with the quality of healthcare provided or with the health outcomes that could be harmed by the Cost Cap and underservice. Complaints and grievances are also unreliable indicators of underservice. Literature reviews have found that the most common patient complaints concern being treated poorly and poor communication with providers. This is entirely understandable: patients are experts in how they should be treated and communicated with, but most would not know if they were given an inferior treatment or drug to save money.
It is important to note that PCMH Plus consumer notices did not mention the risks of underservice or that providers benefit financially by lowering the costs of members’ care. Consequently, it is not surprising that consumers didn’t file many complaints or grievances about the impact of a program change they were unaware of.
Unlike PCMH Plus and Medicaid, OHS is not proposing to include mystery shopper surveys to assess access to care or patient steering to less appropriate care. OHS has no plans to survey local community providers, safety net providers, constituent workers, local public health departments, or social service organizations to identify underservice. OHS has no plans to monitor marketing materials, patient communications, or decision support tools for evidence of cherry-picking/lemon dropping or steering for underservice.
Also as in PCMH Plus, the Cost Cap monitoring plan relies on shifts in patient risk scores between practices and health systems to identify cherry picking and lemon dropping. Risk scores are assigned to patients by an algorithm, usually proprietary, based on their demographics and diagnoses, and are meant to reflect the likelihood of incurring high healthcare costs. In PCMH Plus, there is evidence either of health systems gaming the risk scoring system for financial gain or, far worse, of a decline in the health of members in the program.
Below are a few examples, collected from advocates, of the many ways underservice could cause unintended harm that OHS’s monitoring plan would not detect.
- Reducing home health hours and range of services provided
- Mental health – reduced access to inpatient care and Intensive Outpatient Programs, limited panels of therapists, less access to therapists who can also prescribe, reduced frequency and duration of care, more administrative barriers to entering or remaining in therapy, fewer community placement options to avoid inpatient care and to facilitate discharge, and consequent increases in suicides, criminal justice involvement, and impacts on education, family, and income
- Note: The Alternative Quality Contract, which was designed to reduce the total cost of care, reduced access to mental health care in Massachusetts.
- Substance abuse – longer waits for, or fewer admissions to, inpatient rehabilitation facilities and Intensive Outpatient Programs; reduced access to therapy, as with mental health; more reliance on medication management over therapy; and increases in relapses, overdoses, suicides, criminal justice involvement, and impacts on education, family, and income
- Breast and other cancers – reduced access to reconstructive surgery and to the latest therapies; mortality or recurrence of cancers due to less follow-up care
- Shift to less effective, less costly medications with more side effects for any condition
- Fewer transplants, dialysis at home instead of in a facility
- Denying people with disabilities services or care unrelated to their disability that would be provided to anyone else; a higher bar for approval of services for people with disabilities
It is highly unlikely that OHS’s proposal, either the portion that can be implemented now or the future plan, would detect much underservice. It is important to remember that the lack of evidence is not evidence that there is no underservice.
 OHS’s Cost Cap monitoring plan incorrectly labelled adverse selection as “stinting.”