Quality in Managed Long-Term Services and Supports Programs

A. Audits


All the MLTSS programs conduct routine audits of MCOs; what varies among them is the frequency and intensity of focus. Half of the programs conduct annual audits (Pennsylvania, North Carolina, Texas, and Wisconsin): two perform them in-house (Pennsylvania, North Carolina) and two delegate this responsibility to the EQRO (Texas, Wisconsin).

Currently, Michigan conducts audits of each MCO every other year; in the off year the state focuses on validating that the MCO has implemented its corrective action plan from the previous year. Both Arizona and Minnesota audit the MCOs on a three-year cycle. Tennessee conducts different types of audits with varying frequencies:

  • Annual: Fiscal Employer Agent (FEA) Audit; Area Agencies on Aging and Disability7 (AAAD) Audit; Money-Follows-the-Person8 (MFP) Audit; Provider Qualifications Audit.

  • Semi-annual: Care Coordination Audit; Critical Incident Audit.

  • Quarterly: New Member Audit; Referral Audit.

  • Monthly: Network Adequacy Audit.

Minnesota's audit process begins with annual audits conducted internally by each MCO; the MCOs submit the results of their audits to the state. Eighteen months after receiving each MCO's audit report, the state conducts a "look-behind" audit focused on the MCO's implementation of remediation activities in response to any issues or deficiencies the MCO had identified.

One of Tennessee's additional care coordination auditing activities, which sits outside the semi-annual audit, bears mentioning. The state has instituted "ride-alongs" in which state staff accompany the MCO care coordinator on member visits and assess the care coordinator's ability to meet contractual care coordination requirements. These ride-alongs occur six times per quarter per MCO, and state staff debrief with MCO management afterward.

B. Managed Care Organization Performance Reporting

Information on MCO performance is not always articulated as "performance measures" per se, but may be found in reports that states require the MCO to submit. Several of the MLTSS programs are "combo" waivers (i.e., programs combining the 1915(c) authority for Medicaid HCBS waivers with the managed care authority of the 1915(a) or 1915(b)). With combo waivers, CMS requires that the state collect, use, and report performance measures demonstrating the state's adherence to the 1915(c) assurances, most of which are quality-related. States typically require that the data for the assurance-based performance measures, or the measures themselves, be reported by the MCOs. In recent years the "Terms and Conditions" of 1115 demonstration waivers, another regulatory vehicle used for MLTSS, have required performance measures for some of the 1915(c) assurances as well.

It is not surprising that many of the performance measures in MLTSS programs are similar to those found in the FFS 1915(c) programs. Even if CMS did not mandate "c-like" measures, one would still expect similar measures given the commonality of populations and expectations for good practice in assessment, person-centered planning, and safeguards for member health and welfare. Many of the process measures that the case study states report are related to timeliness of screening, assessment, care planning and service delivery as well as the extent to which defined processes for addressing critical incidents and grievances are followed.

In both FFS and MLTSS there is keen interest in the development and use of outcome measures for Medicaid LTSS programs.9, 10, 11, 12, 13, 14 Consensus about what constitutes "good" outcomes for individuals using LTSS is somewhat more elusive than in the health arena where treatment outcomes are more definitive. That said, the states in our study are collecting data on several outcome measures; some are population-specific and others are applied across populations.

EXHIBIT 3. Examples of Process Measures in Study States
  • Timeliness of screening/assessment/reassessment (based on state standard)
  • Timeliness of service plan development (based on state standard)
  • Timeliness of service initiation (based on state standard)
  • Timeliness from FEA referral to receipt of consumer-directed services (based on state standard)  
  • Timeliness of care coordinator face-to-face and telephonic contacts
  • Care coordinator caseload & staffing ratio
  • Percent of complaints received & resolved
  • Late/missed visits by service type
  • Percent of grievances received & resolved

Since there is considerable overlap in the measures used by the eight case study states, Exhibit 3 and Exhibit 4 present examples of some process and outcome measures, respectively, without identifying the states utilizing them. More details on performance measures employed by each state may be found in the Appendices at the end of this report.

EXHIBIT 4. Examples of Outcome Measures by Study States
  • Number of episodes of law enforcement involvement
  • Number of psychiatric inpatient & emergency room hospitalizations
  • Number of mental health crisis interventions
  • Percent in competitive employment
  • Percent living in a private residence alone, with spouse or non-relative
  • Number of substantiated recipient rights complaints per 100 beneficiaries served  
  • Increases in:
    • Annual dental exams
    • Diabetes management
    • Annual gynecological exams
  • Community tenure of persons transitioned from nursing homes
  • Number of persons transitioned from nursing home to community
  • Number of persons entering nursing home
  • Potentially preventable readmissions
  • Potentially preventable complications

The reader will notice that Exhibit 4 lists some health-related outcomes. Conceptually, states are supportive of including such measures, especially since one of the hallmarks of MLTSS is coordination of LTSS and medical care, with the intended effect being the achievement of better health outcomes. However, at this juncture Medicaid agencies are somewhat reluctant to include health outcomes as performance measures until their plans are fully integrated with Medicare. While the MCOs may be expected to coordinate with the Medicare providers that their members use, ultimately they do not have control over those providers. The states argue that neither they nor their MCOs should be held accountable for outcomes over which they do not exert control. This argument should abate as states begin participating in the CMS Duals Demonstrations.

C. Verification of Service Receipt

Verifying the delivery of home and community-based LTSS is a critical component of managed care oversight because of the vulnerability of the populations served. Late or missed visits, especially those that provide assistance with essential everyday activities, place the member at potential risk of untoward outcomes. Moreover, managed care entities are required by federal regulation to monitor delivery of services by providers and to take corrective action if service delivery is late or missed.15 In MLTSS, this requirement is closely connected to ensuring member safeguards, and it is important to allaying beneficiary and advocate fears that MCOs "skimp" on services in order to contain costs and maximize profit.

Five of the eight programs verify service receipt against what was authorized in the service plan (Arizona, Michigan, Pennsylvania, Tennessee, Wisconsin); they compare whether members receive the services identified in their service plans. Two programs (Michigan, North Carolina) verify service receipt against reimbursement; this is a proxy approach because verification in this instance is not directly tied to the service plan. In seven of the eight study states, the state monitors service receipt retrospectively through reports submitted by the MCO.

Only in Tennessee is service verification done on a real-time basis. Tennessee utilizes an electronic visit verification (EVV) system where direct care providers clock-in and clock-out via phone from the member's home. The days and times that providers are expected to arrive are programmed into the system; if the worker does not clock-in within 15 minutes of the scheduled start time, an alert is sent to both the provider and the MCO. The MCO/provider is expected to deploy back-up workers and they, as well as the state, have the ability to track whether and when the replacement worker clocked in. The EVV system produces reports on missed and late visits by MCO, provider and service type.
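To make the alert logic described above concrete, the sketch below shows, in Python, one way a late/missed-visit check of this kind could be implemented. It is purely illustrative and is not Tennessee's actual EVV system; the 15-minute grace window comes from the description above, and all class, field, and function names are hypothetical.

    from collections import Counter
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    # Illustrative grace window taken from the description above; a real EVV
    # vendor's rules may differ.
    GRACE_WINDOW = timedelta(minutes=15)

    @dataclass
    class Visit:
        mco_id: str
        provider_id: str
        service_type: str
        scheduled_start: datetime
        clock_in: Optional[datetime] = None  # set when the worker phones in

    def visit_status(visit: Visit, now: datetime) -> Optional[str]:
        """Return 'missed' or 'late' when the grace window is exceeded, else None."""
        deadline = visit.scheduled_start + GRACE_WINDOW
        if visit.clock_in is None:
            # No clock-in yet: once the window passes, alert the provider and MCO.
            return "missed" if now > deadline else None
        return "late" if visit.clock_in > deadline else None

    def exception_report(visits: list[Visit], now: datetime) -> Counter:
        """Tally late and missed visits by MCO, provider, and service type."""
        tally: Counter = Counter()
        for v in visits:
            status = visit_status(v, now)
            if status:
                tally[(v.mco_id, v.provider_id, v.service_type, status)] += 1
        return tally

In such a sketch, a scheduler would run the check periodically against the day's expected visits, route any "missed" result to provider and MCO staff so a back-up worker can be deployed, and use the tally to produce the kind of missed/late-visit reporting described above.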

The frequency of MCO reports on service verification varies from monthly in Arizona and quarterly in Pennsylvania to annually in North Carolina and Michigan.

Although its approach is still retrospective in nature, Arizona has implemented a "gap report" that the MCO must submit monthly. What distinguishes Arizona from the other retrospective approaches is that its reporting requirements go beyond counts of missed visits to include the reason for the service gap as well as the actions taken at the individual level to address the missed visit.

While the retrospective validation approach is the most common, unless the MCO has a systematic mechanism for being alerted in a timely fashion when service delivery is late or missed, deployment of needed back-up help cannot be assured. On the other hand, while the EVV system in Tennessee is considered by many to be a promising practice, it does have potential cost implications associated with up-front installation, as well as costs associated with the staffing resources needed to monitor the EVV system for no-shows. For this approach to be most effective, it needs to be monitored (by providers and the MCO) in real-time so that when an alert is sent indicating a worker no-show, either the provider or the MCO proactively contacts the member to assess the immediate need and then deploys a back-up worker as necessary. In addition, the EVV approach may pose some challenges for verifying self-directed services. One of the features of self-direction is that it allows members flexibility in the day and time a service is delivered. As currently configured, EVV is driven by the date and time the worker is supposed to arrive, and if a member changes this without formally requesting a change, a worker no-show alert will be triggered. Moving forward it will be instructive to follow how Tennessee addresses this seeming constraint in the EVV system.

D. Mortality Reviews

In 2008, the U.S. Government Accountability Office (GAO) recommended that CMS encourage states to conduct mortality reviews in 1915(c) HCBS waivers serving individuals with developmental disabilities.16 The mortality review process typically involves screening a death to ascertain whether it meets pre-determined criteria for an in-depth review, investigation by a mortality review committee of the circumstances that led to the death, and a systems-level review that examines commonalities across deaths in order to identify and recommend changes to reduce future risk of death.

The GAO was silent on the advisability of mortality reviews for 1915(c) waivers serving other populations and for MLTSS programs. However, good practice in community-based LTSS suggests that mortality reviews are an important oversight mechanism in LTSS programs.17 Among the programs reviewed, we found evidence that seven conduct mortality reviews; six delegate this responsibility to the MCO. The Michigan program, which enrolls members with severe mental illness and IDD, requires investigation of unexpected deaths only. Arizona requires MCOs to conduct mortality reviews only for deaths among members with IDD. The Tennessee CHOICES program, serving the Aged/Disabled population, does not require mortality reviews.

EXHIBIT 5. Member Feedback Surveys
  State  Entity Conducting Survey  Survey Type              Notes
  AZ     State and MCO             Satisfaction
  MI     MCO                       MHSIP1                   Mail survey to members with mental illness.
  MI     State                     Core Indicators          In-person survey with IDD members; 1-year grant from ACL2 to cover costs of data collection; uncertain about sustainability due to cost.
  MN     State                     Satisfaction             Managed Care Public Programs Satisfaction Survey.
  NC     MCO                       Satisfaction             MCO must contract with external vendor; MCO surveys must be approved by the state.
  NC     Contractor                Core Indicators          In-person survey with IDD members.
  PA     MCO
  TN     Contractor (AAAD)         Experience of Care       Based on items from the PES & MFP Quality of Life Survey.
  TN     State                     HCBS Experience of Care  Participating in pilot study for CMS-funded HCBS Experience of Care Survey (external survey vendor).
  TN     Contractor (FEA)          Satisfaction             Survey of consumer-directed members.
  TX     Contractor (EQRO)         Experience of Care       LTSS-focused items added to CAHPS survey.
  WI     State                     Experience of Care       State-developed PEONIES.
  WI     MCO                       Satisfaction             Nature of survey at MCO discretion.
  1. Mental Health Statistics Improvement Program Survey.
  2. Administration for Community Living, Michigan Department of Health and Human Services.

E. Member Feedback

CMS' recent MLTSS guidance calls for states and/or MCOs to measure members' experience of care and quality of life. All of the programs reviewed field either satisfaction or experience of care surveys, with most administering them on an annual basis. In some instances the surveys are conducted by the state, whereas in others they are conducted by the MCO. There are a few examples of these surveys being administered by a contractor (EQRO, AAAD,18 FEA19). In a few cases, there is a dual-survey approach in which both the state and either the MCO or another contractor conduct them. In Michigan, separate surveys are conducted with members having mental illness and members with IDD. A few states use externally tested instruments (Mental Health Statistics Improvement Program [MHSIP], Core Indicators, Participant Experience Survey [PES], HCBS Experience of Care Survey), while others rely on state-developed or MCO-developed instruments. In addition to surveying members, two of the programs conduct focus groups (Pennsylvania) or listening sessions (Wisconsin) with members.

F. Member Oversight

"Stakeholder engagement", inclusive of program oversight, is considered a key element in CMS' guidance document. Moving forward, CMS expects states to involve stakeholders, including members, in program evaluation and monitoring. CMS also expects states to require MCOs to convene member advisory committees to provide feedback on MCO MLTSS operations. We were therefore interested in learning how the established MLTSS programs engage members in monitoring and broader program oversight.

In Michigan, members sit on a quality committee, and in North Carolina and Texas they have seats on advisory committees. Five programs require advisory committees or state staff to elicit input from members as part of an MCO's annual review or periodically through member focus groups. Three programs require MCOs to engage members either by having them serve on the MCO's governing board (Wisconsin) or by giving them seats on the MCO's Advisory and/or Quality Committees (Pennsylvania, Tennessee). Minnesota requires that each MCO have a Member Advisory Committee and that it meet regularly. Tennessee is unique in requiring that at least 51% of the seats on each MCO's Advisory Group be held by members or their authorized representatives.

G. External Quality Review Organization Responsibilities

Our interest in the EQRO pertains to activities they perform above and beyond those required under the Medicaid managed care regulations (compliance review, validation of encounter data, performance measures and PIPs). In particular, we were focused on additional quality management activities for which states employ EQROs in their MLTSS programs.

Four of the study states maintain a more traditional relationship with their EQRO (Arizona, Michigan, Minnesota, and Pennsylvania). In Wisconsin, however, the EQRO takes on the added task of conducting the care management review in the MCOs. The EQRO assumes multiple additional tasks in Tennessee; rather than just validating PIPs, it assists the MCOs with PIP implementation and trains the MCOs and state staff on quality-related issues. The Tennessee EQRO also reviews all MCO corrective action plans from its annual compliance review and conducts a legislatively mandated network adequacy review. The EQRO's scope of work in Texas includes focused studies, an annual member satisfaction survey, encounter data validation, and development of data for the program's performance dashboard and planned MCO report cards.

H. Long-Term Services and Supports Performance Improvement Projects

All Medicaid managed care programs must have an ongoing series of PIPs focused on clinical and non-clinical areas.20 In this inquiry, our interest was to discover the types of PIPs MLTSS programs conduct and whether they have particular relevance to MLTSS services and/or populations--in essence, whether the programs require their MCOs to engage in LTSS-specific PIPs.

PIPs often span more than one year, as they require time for design and implementation, as well as time to review results and draw conclusions about the PIP's impact. Two of the study programs require the MCOs to conduct at least one LTSS PIP (Pennsylvania, Wisconsin); two require two LTSS PIPs (Michigan, Tennessee); and two require three LTSS PIPs (North Carolina, Texas). In some states, PIP topics are dictated by the state, while in others they are at the discretion of the MCO. In some cases, the state may periodically mandate a specific PIP (e.g., in 2012 Tennessee required a PIP on rebalancing). In Texas, the EQRO establishes two of the three PIPs, with the third at the MCO's discretion. Examples of LTSS PIPs, in addition to Tennessee's rebalancing PIP, include improvement initiatives on:

  • Increased use of adult day care;
  • Increased integration of behavioral and physical health;
  • Increases in depression screenings;
  • Reduction in preventable hospitalizations;
  • Increases in diabetic care; and
  • Reduction in nursing facility rates.

Another approach that surfaced is an initiative in Texas and Minnesota in which MCOs work together to develop collaborative PIPs. The advantage of this approach is that the MCOs are not working at odds with each other, and it is especially helpful for providers who work for more than one MCO and may be involved in implementing PIPs.

I. Quality-Related Financial Incentives, Penalties and Withholds

States have the opportunity, in designing their payment structures, to reward MCOs for quality care and outcomes and to dis-incentivize performance below acceptable thresholds. In our interviews with states, as well as in reviewing MCO contracts and other supporting information on state websites, we identified multiple examples of states using monetary incentives, penalties, or withholds to support quality-related program expectations and goals.

Five programs offer quality-related incentives (Michigan, Minnesota, Tennessee, Texas, and Wisconsin), two issue monetary penalties (Michigan, Tennessee), and four impose quality-related withholds (Minnesota, Pennsylvania, Tennessee, Texas). Monetary incentives are offered for:

  • Transitioning members from institutional settings to community (Wisconsin, Tennessee).

  • Increasing number of members with self-determination arrangements (Michigan).

  • Improvement in number of consumers engaged in meaningful employment (Michigan).

  • Improvement in number of consumers in private residence (Michigan).

  • Improvement in number of consumers discharged from detoxification unit and seen for follow-up within seven days (Michigan).

  • Superior clinical quality, service delivery, access to care and/or member satisfaction (Texas).

  • Reductions in inpatient hospital costs (Texas).

  • Optimal chronic disease care (limited to diabetes care, coronary/vascular disease care) (Minnesota).

In Michigan, penalties can be levied for patterns of non-compliance, poor performance against a performance indicator standard, substantial inappropriate denial of services, and substantial or repeated health and safety violations. Tennessee is a strong advocate for assessing liquidated damages, and its MCO contracts include detailed tables of amounts per infraction for "transgressions or omissions" ranging from threats to the smooth and efficient operation of the program to actions or inactions that result in a threat to the member. Penalties can range from $100 per day to $10,000 per month depending on the breach.

Withholds of MCO payments are a tool used by Pennsylvania, Tennessee, and Texas to encourage delivery of good quality of care and services. Minnesota uses withholds for promoting MCO compliance with completing and submitting care plan audits and health risk screenings/assessments.

J. Report Cards

Two states were in the process of developing report cards at the time the study was being conducted. In Texas, the EQRO was assisting the state in finalizing a legislatively mandated MCO report card, which will eventually be published on the state's website.

Tennessee was developing its report card from a combination of data from required MCO reports and audit results. At the time of the study, the report card was being used internally by state monitoring staff in MCO oversight. In the future, the state expects to integrate the MCO performance data into the larger report card structure for the entire Medicaid managed care program (TennCare).
