Opportunities for Engaging Long-Term and Post-Acute Care Providers in Health Information Exchange Activities: Exchanging Interoperable Patient Assessment Information. Creation and Development of the Minimum Data Set

12/01/2011

Based on recommendations in the 1986 IOM study, in 1988 HCFA's Health Standards and Quality Bureau contracted with a project team led by the Research Triangle Institute (RTI), with subcontractors from the Social Gerontological Research Center, HRCA (Boston); the Center for Gerontology and Health Care Research, Brown University (Providence, Rhode Island); and the Institute of Gerontology, University of Michigan (Ann Arbor), to develop and evaluate a national assessment instrument and data system for nursing home assessment in the United States.48 The Minimum Data Set for Nursing Home Resident Assessment and Care Screening (MDS) and the RAPs, which are triggered by individual MDS assessment items or combinations of items, were developed through this contract. An expert panel representing a wide variety of clinical disciplines and professional organizations involved in geriatrics served in an advisory role. From 1989 to 1991, these experts participated at every stage of the design and testing of the MDS.49

Version 2.0 of the RAI/MDS was developed under a second contract awarded by HCFA in 1994 to the HRCA, a subcontractor on the original contract.50 The 1995 training manual for version 2.0 was written by HRCA in conjunction with HCFA, and the 2002 and 2007 updates to the manual appear to have been written by staff at CMS (HCFA's successor agency). The interRAI web site states that members of interRAI developed the RAI and the RAPs for MDS version 2.0, and calls the RAI "The interRAI LTCF".51 InterRAI refers to the RAPs as Clinical Assessment Protocols (CAPs), "in recognition of their applicability to more populations than nursing home residents alone."52 While it is undoubtedly true that major contributors to MDS 2.0 are or were also members of interRAI, a 2001 letter to the editor of The Gerontologist from an HCFA employee made clear that, in terms of the contract for developing MDS 2.0, there was no direct relationship between HCFA and interRAI.53

The MDS 3.0 revision appears to have originated within the Office of Clinical Standards and Quality at CMS and to have been revised based on comments received from the nursing home industry, professional groups, individual providers, and expert panels. To initiate the revision, CMS worked with stakeholders to identify objectives, chief among which was improving clinical relevance.54 CMS' goal for the revision was to reduce provider burden and improve clinical items so that the data collected would be clinically relevant, accurate, and useful. CMS also sought to limit the data submitted to information the Federal Government needed to know, such as issues surrounding payment, quality, and regulatory oversight.55 The data collection form was restructured for greater usability, and items that were confusing or unnecessary were deleted. Another goal of MDS 3.0 was to improve user satisfaction and increase the efficiency of collecting data for reporting purposes. Long-term goals include moving toward standardized nomenclature and integration of the assessment into EHRs.56

A draft MDS 3.0 was released in April 2003 for public comment. At the same time, CMS awarded a contract to the RAND Corporation to evaluate the revision, including validating new and revised sections of the draft in community populations and facilities. Areas of emphasis in the revision include diagnostic coding, delirium, pain, falls, depression, behavior disorders, quality of life, and palliative care. Key changes include basing assessments, when possible, on resident interview, as well as a focus on improving accuracy and efficiency.57 The Commonwealth Fund provided RAND with grant money to convene a panel of nursing home experts to provide input.58

The evaluation team, in addition to RAND, included the Harvard Medical School Department of Health Care Policy, the Colorado Foundation for Medical Care (a Quality Improvement Organization), Carelink (for developing the Instructions and Guides), the Kleinmann Group, and RSS Consulting Services.59 In December 2003, the scope of the project was expanded when CMS signed a Memorandum of Understanding (MOU) with the Veterans Health Administration (VHA) to work together to improve the MDS 3.0. In October 2004, VHA Health Services Research and Development (VHA HSR&D) initiated a large research project to validate changes in MDS 3.0 in VA nursing homes, in order to contribute to the 3.0 revision.60

As part of the RAND study, a workgroup was assembled to review the instruction manual developed for MDS 3.0. This workgroup included representatives from the RAI Coordinator Group, the American Association of Nurse Assessment Coordinators, the American Health Care Association, the American Association of Homes & Services for the Aging, and the VHA. The RAND contract for evaluating MDS 3.0 ended March 31, 2008, and the report was released in April 2008.61

Initially, MDS 3.0 appeared to be on a fast track, with a revision expected to be available by December 2004.62 However, a coalition of LTC stakeholder organizations submitted a letter of concerns, including the need for development of MDS 3.0 to be coordinated with activities promoting HIT and HIT standards.63 In August 2004, HHS's ASPE and CMS co-funded a project through which Apelon Systems, a medical terminology and vocabulary contractor, would attempt to apply HIT standards to a sample of the MDS to demonstrate how standardization would support the use of content and messaging standards and assure that patient data would be interoperable and comparable across settings.64 As noted above, these HIT content and messaging standards were approved by the Secretary of HHS as accepted CHI standards and announced in a Federal Register notice in 2007.65 In 2007, the AHIMA Foundation, with subcontractors from Regenstrief (LOINC), Apelon Systems, and Altshuler Associates (HL7), began work on a contract with ASPE to apply content and exchange standards to the full MDS, starting with the MDS 2.0 data set and moving to MDS 3.0 once CMS made clear its intent to implement the revised assessment tool and data set.

CMS extended the original implementation date from October 2009 to October 2010, based partially on concerns voiced by a number of stakeholders about how data submitted to CMS under MDS 3.0 would work with electronic records, and about the limited time available, between the release of final materials and the proposed implementation date, to implement system updates and provide staff training. The American Association of Homes and Services for the Aging wrote a letter to President-elect Obama's transition team encouraging the delay in order to make the MDS 3.0 interoperable, arguing that CMS could achieve interoperability under the MDS by adopting certain standards instead of CMS' proprietary data exchange formats.66 Others voiced similar concerns after the proposed rule for implementing MDS 3.0 was published in May 2009.

In the FY 2010 proposed rule for the skilled nursing facility (SNF) PPS [74 FR 22208], CMS acknowledged the concerns about interoperability and announced it would implement MDS 3.0 using the LOINC representation of the MDS 3.0 data set. CMS considered use of the HL7 Clinical Document Architecture (CDA) for exchanging standardized assessment content, but was not comfortable adopting it without further study to gauge the impact of its use on such a large-scale process as the submission of MDS data, which involves approximately 30 million submissions annually. Similarly, CMS studied the use of the Systematized Nomenclature of Medicine-Clinical Terms (SNOMED-CT), but did not feel the semantic matching to MDS data was sufficient for CMS' payment, survey, and quality measurement needs. CMS indicated it had no plans to include the HL7 CDA, messaging standards, or SNOMED-CT in the October 2010 release of MDS 3.0. CMS is considering the use of HL7 messaging standards with the CARE tool, but stated, "We are soliciting comments on the most appropriate clinical standards to use for clinical assessment instruments."67 In the final FY 2010 SNF PPS rule [74 FR 40288], the issue of interoperability standards was not addressed.
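The practical effect of a LOINC representation is that each MDS item and its recorded response can be expressed as a standardized question/answer code pair rather than as a proprietary field name, so the same data can be understood across systems. A minimal sketch of that idea follows; the LOINC codes and answer code shown are hypothetical placeholders for illustration, not the codes actually published for MDS 3.0.

```python
# Illustrative sketch only: representing one MDS 3.0 assessment item as a
# LOINC question/answer code pair in a flat submission record. The LOINC
# codes and answer code below are hypothetical placeholders, not the
# actual codes published for MDS 3.0.

mds_item = {
    "mds_item_id": "C0100",          # MDS item identifier (example)
    "question_loinc": "00000-0",     # placeholder LOINC question code
    "question_text": "Should Brief Interview for Mental Status be conducted?",
    "answer_loinc": "LA00-0",        # placeholder LOINC answer-list code
}

def to_submission_field(item: dict) -> str:
    """Render one item as a 'questionLOINC=answerLOINC' pair, the kind of
    standardized key/value encoding a LOINC-based submission could use
    instead of a proprietary field name."""
    return f"{item['question_loinc']}={item['answer_loinc']}"

print(to_submission_field(mds_item))
```

Because both sides of the pair are drawn from a shared vocabulary, a receiving system needs no knowledge of CMS-specific field layouts to interpret the response, which is the interoperability benefit stakeholders were asking for.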

The final versions of the MDS 3.0 item set, data specifications, and resident assessment manual were also delayed to provide time to work on related pieces, such as the care area assessments (which replace the RAPs), the RAI user's manual, quality measurements, and CMS' Five Star Quality Rating System for nursing homes. Portions of The Long-Term Care Facility Resident Assessment Instrument User's Manual for Version 3.0 were released in November 2009, and the complete manual was expected to be available sometime in early 2010. Copyright information contained in the RAI manual indicates it is a public document and may be copied freely. The manual recognizes a number of organizations and stakeholders, LTC experts, contractors, and CMS staff for their contributions to the "development, testing, writing, formatting, and review of the MDS 3.0 RAI Manual, MDS 3.0 Data Item Set, and MDS 3.0 Data Specifications."68 The RUG Version IV (RUG-IV), a new classification system designed for use with MDS 3.0, was developed through the CMS-sponsored STRIVE (Staff Time and Resource Intensity Verification) project, carried out by the Iowa Foundation for Medical Care of West Des Moines, Iowa.69

The development of MDS 3.0, though separate from the development of a new assessment tool, the CARE instrument, is linked to it, and MDS expertise has been shared with the developers of CARE. The principal investigator on the MDS 3.0 project is also an advisor to the CARE demonstration project.70 CMS is developing a roadmap to address the future, along with a strategic vision for the assessment instruments, including CARE and MDS. Despite the delays, MDS 3.0 is now on schedule for implementation in October 2010, while a report to Congress with the results from the CARE demonstration is required in 2011.71

Full report and appendices (PDF):

"StratEng.pdf" (1.74 MB)
"StratEng-A.pdf" (198.14 KB)
"StratEng-B.pdf" (534.93 KB)
"StratEng-C.pdf" (214.97 KB)
"StratEng-D.pdf" (4.65 MB)
"StratEng-D1.pdf" (12.32 MB)
"StratEng-D2a.pdf" (3.62 MB)
"StratEng-D2b.pdf" (3.58 MB)
"StratEng-E.pdf" (3.66 MB)
"StratEng-F.pdf" (162.5 KB)
"StratEng-G.pdf" (340.24 KB)
"StratEng-H.pdf" (173.91 KB)
"StratEng-I.pdf" (194.52 KB)
"StratEng-J.pdf" (311.03 KB)
"StratEng-K.pdf" (4 MB)
"StratEng-L.pdf" (2.33 MB)
"StratEng-M.pdf" (164.3 KB)