Each qualitative evaluation activity explored beneficiaries’ healthcare priorities, methods for managing health information, computer experience, challenges, and the perceived impact of using the PHR. Together, these activities assessed beneficiary preferences and identified key challenges and barriers, as well as enablers, for PHR use among this population. Additionally, analysis of utilization data revealed beneficiary PHR usage patterns and frequency of use, and suggested preferences for PHR functions within this population.
For each evaluation activity, NORC submitted an Institutional Review Board (IRB) package internally to ensure that the evaluation plan and methods met NORC’s requirements for appropriate and ethical research standards. Once official IRB approval to conduct each evaluation activity was received, NORC began the recruitment process. NORC contracted with Alan Newman Research (ANR) to assist with all recruitment activities and meeting logistics.
All instrumentation for data collection activities was developed in close consultation with ASPE, CMS, Dr. Patricia Flatley Brennan, and the members of the PHR Technical Expert Panel. Below, we describe specific processes related to design, data collection, and analysis for each evaluation component, beginning with the beneficiary discussion groups.
Discussion Groups with Beneficiary MyPHRSC Users. NORC conducted two in-person meetings with beneficiaries considered ‘users’ of MyPHRSC. In close collaboration with CMS, ASPE, and key management staff at QSSI, NORC secured an initial selection pool of beneficiaries who had used the PHR one or more times since registering for the tool and who were physically located within the Columbia, SC region.
NORC developed a pre-discussion screener to provide context for the discussion groups and to prepare beneficiaries to recall PHR functions and use. ANR assisted in fielding the brief pre-discussion screener, which asked beneficiaries when they signed up for the PHR; how often they used the PHR; how they found out about the PHR; and any issues or concerns they had in using the PHR. Beneficiaries were also asked to spend time using the PHR prior to the discussions so that they could accurately recall and share their experiences. The pre-discussion screener is included in Appendix 9, while the full discussion group guide used for this activity can be found in Appendix 5. The topics covered in the discussion group guide include:
- Overall Computer Experience
- Perceptions of PHR Utility
- Perceptions of PHR Usability
- Perceived Impact of PHR Use on Health Status
- Consumer Satisfaction with PHRs
Two ninety-minute group meetings were held in Columbia, South Carolina on September 11th and September 12th, 2008. NORC staff documented each session, including all the individual comments from beneficiaries. Each meeting was led by two facilitators accompanied by one note taker to record the conversations. The first meeting was attended by four beneficiaries, while eight beneficiaries attended the second meeting, yielding a total of 12 participants.
NORC compiled meeting notes and resolved any discrepancies by checking the notes against audio recordings taken during each meeting. Final notes from each discussion group were reviewed by discussion topic and response. NORC analyzed the meeting results to identify emergent categories and recurring themes relating to the key research questions, particularly those related to use, usability, and utility of the PHR from the patient perspective.
Semi-Structured Telephone Interviews with MyPHRSC Nonusers. Nonuser perceptions of the PHR were gathered through nine semi-structured telephone interviews held from November 20th to December 12th, 2008. QSSI provided NORC with names and contact information for 50 beneficiaries considered nonusers: beneficiaries who had not logged in since registering for MyPHRSC and who were physically located in the Columbia, South Carolina region. NORC placed telephone calls to 12 beneficiaries on this list to assess their interest in participating, and 9 of the 12 agreed to take part.
The approved semi-structured interview guide for the nonuser interviews covered the major topic areas listed below. The full semi-structured interview guide is included as Appendix 6.
- Overall Computer Experience
- Perceptions of Potential PHR Utility
- Perceptions of PHR Usability
- Perceived Impact of PHR Use on Health Status
- Consumer Satisfaction with PHRs
- Reasons for not using the PHR
NORC staff documented each beneficiary telephone interview, carefully recording all responses. Each interview was led by one facilitator accompanied by one note taker to record the conversation. NORC analyzed the interview results to identify emergent categories and recurring themes relating to the key research questions, particularly those related to usability and usefulness of PHRs from the beneficiary perspective.
Observations of Beneficiary PHR Users. NORC conducted the user observations with MyPHRSC pilot participants in order to gain context-based insights into how PHRs fit into the beneficiaries’ home lives, their relationships with health providers, as well as into the overall healthcare delivery process. This task leveraged a set of social science techniques commonly used in the private sector to create a unique data set that added a strong complement to the other evaluation activities.
The goals of the user observations were as follows:
- Examine the key usability components of MyPHRSC
- Examine the key utility components of MyPHRSC
- Uncover the key issues, concerns and perceptions related to MyPHRSC use and adoption
- Explore the impact of the PHR on patient-provider interactions, patient self-care and self-management
Beneficiary contact information was provided by QSSI. The participant selection method for this task was not designed to obtain a representative sample; for this task, obtaining productive observation sessions with the selected participants was more important than representativeness. NORC recruited beneficiaries from three groups: 1) beneficiaries who were eligible for Medicare due to age or disability; 2) beneficiaries who had primary care responsibility for themselves or had a caregiver; and 3) beneficiaries who had or had not participated in the discussion groups. A total of five respondents were interviewed in five sessions over the course of two days. All participants were from the Columbia, South Carolina area.
| Participant | Eligibility | Independent or Caregiver | Participant in Beneficiary Discussion Group |
|---|---|---|---|
| Participant #1 | Age eligible | Independent | Yes |
| Participant #2 | Age eligible | Independent | Yes |
| Participant #3 | Age eligible | Caregiver | Yes |
| Participant #4 | Disability status | Independent | Yes |
| Participant #5 | Age eligible | Independent | Yes |
Beneficiaries were asked to complete a Personal Health Record Booklet prior to the user observation. This booklet included an informed consent form and the following three activities: 1) Use your personal health record; 2) Explain where you keep your health information; and 3) Write a Journal.
The final discussion guide covered the major topic areas related to beneficiary background, comfort with technology, health information management, current use of MyPHRSC and key challenges. The full discussion guide is included as Appendix 7.
During the observation, participants were asked to demonstrate how they would use the PHR in various situations. NORC also administered data entry tasks to each participant to identify any variability. Although not every participant viewed every page of the PHR, all were asked to view the health record summary and to conduct a search on medical information. Most participants also entered medication information into the PHR, while some were asked to complete a Wallet Card.
Given that MyPHRSC was a novel piece of technology, it was thought that beneficiaries might have difficulty relating to a direct question about their general impressions of using the application. NORC therefore used a social marketing technique of emotional affinity to provide a framework for participants to share some of the emotions they experienced when using MyPHRSC. Participants were asked to select from a set of images, which acted as a surrogate for the array of emotions they might experience when using the PHR. Although not part of the initial goals for the user observations, understanding the emotional reactions MyPHRSC provoked in users emerged as a supplemental question over the course of the study.
Three staff from NORC conducted the user observations, including two co-facilitators who led the protocol and one note-taker who took detailed notes. All sessions were audio recorded.
Discussion Group with Providers. Provider perceptions of PHRs were gathered through one 90-minute discussion group held via WebEx and teleconference on November 13th, 2008 with a total of 9 providers. NORC contracted with ANR to assist in recruitment of providers practicing in South Carolina with a practice population base of 25% or more Fee-for-Service Medicare beneficiaries. ANR also assisted in fielding a brief pre-discussion screener. The sample screener is included as Appendix 10.
The approved provider discussion guide addressed the major topic areas as listed below. The full discussion guide is included as Appendix 8.
- Experience using Health Information Technology in practice
- Overall Perceptions of PHRs
- Perceptions of PHR Utility
NORC staff documented the session, including all individual comments from providers. The meeting was led by two facilitators accompanied by one note taker to record the conversations.
NORC performed analysis of the meeting results to identify emergent categories or recurring themes relating to the key research questions, particularly those related to provider perceptions of use, usability and usefulness of PHRs.
Analysis of Usage Data. NORC analyzed one year of registration and usage data from the PHR application to describe characteristics of beneficiaries that registered for and used MyPHRSC, as well as to understand broad patterns of use for MyPHRSC. The quantitative analysis was intended to be used in conjunction with the qualitative analysis, as the findings provided further avenues of research to be explored with more comprehensive data. The analysis provides information relevant to the following key research questions:
- Key utility and usability components of MyPHRSC. How frequently do users return to the PHR? What are the general patterns of use? What are the key features of the PHR? Which PHR functions are used most and least often by beneficiaries? Based on the usage data, we can also make inferences about PHR usability.
- Impact of the MyPHRSC on disease management and patient self-care. What is the prevalence of chronic conditions among the population? Are there any differences in use between registrants with chronic conditions and other users?
To guide the analysis of the usage data, NORC developed a set of three hypotheses. The development of the initial set of hypotheses considered the overall research objectives of this evaluation, key findings from the PHR environmental scan and a preliminary analysis of the usage data. The hypotheses were also reviewed with ASPE, CMS and the PHR Expert Panel for feedback. The three hypotheses were:
Beneficiaries with chronic conditions or diagnoses are more likely to use MyPHRSC than beneficiaries without such conditions. Previous research has shown that these beneficiaries exhibit a greater need for and higher interest in PHRs.  
Women are more likely than men to use MyPHRSC. Previous research has shown that females may exhibit a greater need for and higher interest in PHRs.
Younger beneficiaries are more likely than older beneficiaries to use MyPHRSC. Younger individuals are more likely to have access to computers, be more comfortable making use of technological tools and may have fewer physical and cognitive issues that may prevent use of PHRs.
Data Sources. Over the course of the evaluation, NORC received summary data tables from QSSI which could then be linked by the user’s unique identifier. The data were generated from the reporting tool in MyPHRSC. All data received were de-identified by QSSI. These data were summarized over different time periods, including daily, monthly, and quarterly periods (Table 2).
| Reporting Period | Variable Name |
|---|---|
| Daily | User log-in count |
| Monthly | User log-in count |
| Quarterly | Total views for each MyPHRSC page |
| Not time-dependent | User ID |
| Not time-dependent | Date of birth |
Variables of interest. The analyses involved predictor variables, such as age and gender, and dependent variables, like user log-ins and page views. Some dependent variables were constructed from the data provided.
Gender, age, and illness/condition diagnoses were selected as predictor variables since they were independent of MyPHRSC. Age as of the midpoint of the pilot (September 30, 2008) was calculated from the beneficiary’s birth date. Information on the illness/condition diagnoses for each user was based on a translation of International Statistical Classification of Diseases and Related Health Problems (ICD-9) diagnostic codes into Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) descriptions that participants might recognize. The illness/condition diagnosis list contained only those conditions for which Medicare had processed a claim containing that particular code. MyPHRSC automatically included the Medicare claims descriptive terms, and users could enter additional conditions as needed.
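The age calculation described above is a simple derivation from the birth date. As an illustrative sketch (not NORC's actual code; the function name and signature are our own), it can be written as:

```python
from datetime import date

# Midpoint of the pilot period, as stated in the report.
PILOT_MIDPOINT = date(2008, 9, 30)

def age_at_midpoint(birth_date, as_of=PILOT_MIDPOINT):
    """Age in completed years as of the pilot midpoint (illustrative sketch)."""
    years = as_of.year - birth_date.year
    # Subtract one year if the birthday has not yet occurred by the midpoint date.
    if (as_of.month, as_of.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years
```

For example, a beneficiary born October 1, 1940 would be counted as 67 as of September 30, 2008, while one born a day earlier would be counted as 68.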
Some MyPHRSC users were not the actual Medicare beneficiaries. Authorized representatives were individuals whom a beneficiary designated to view his or her health information on MyPHRSC. These individuals were assigned a unique ID separate from the beneficiary’s user ID. Since NORC wanted to assess the use of the PHR by Medicare beneficiaries themselves, authorized representatives were excluded from the analysis.
The two dependent variables included:
- Login category: never, single, or multiple
- Months used
The two dependent variables were less straightforward to develop. First, it was necessary to operationally define “use” of MyPHRSC. Based on the data available, “use” could be defined as any of the following:
- registering for MyPHRSC
- user log-ins to MyPHRSC
- logging in once
- logging in more than once
- viewing particular pages in MyPHRSC
All beneficiaries who participated in the pilot had to register for MyPHRSC in order to participate. For the purposes of this analysis, dependent measures were constructed primarily from the log-in data. For the first dependent variable, registrants were initially classified into three categories that differentiate between non-use, one-time use, and repeat-use of MyPHRSC:
- Never logged in to MyPHRSC during the study period
- Logged in to MyPHRSC one time during the study period
- Logged in to MyPHRSC more than one time during the study period
Additionally, a dependent measure was constructed that tabulated the total number of months during the study period in which a registrant logged in to MyPHRSC at least once. Rather than aggregating total log-ins, which may have been clustered in a single month or only a few months, this measure allowed for an investigation of repeated use of MyPHRSC over time and may more accurately represent sustained usage. A beneficiary might, for example, have logged in several times in the first month, lost interest, and never logged in again. This is different from a beneficiary who logged in once or twice every month to keep track of prescription drug use or to review processed claims.
Finally, the number of times a beneficiary viewed each page of MyPHRSC was included as a variable in this analysis.
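The two log-in-based dependent measures described above can be sketched in a few lines. This is an illustrative reconstruction under the assumption that each registrant's log-ins are available as per-month counts; it is not NORC's actual analysis code.

```python
def login_category(monthly_logins):
    """Classify a registrant as a never, single, or multiple user
    from per-month log-in counts over the study period."""
    total = sum(monthly_logins)
    if total == 0:
        return "never"
    if total == 1:
        return "single"
    return "multiple"

def months_used(monthly_logins):
    """Number of months in which the registrant logged in at least once.
    Distinguishes sustained use from log-ins clustered in one month."""
    return sum(1 for count in monthly_logins if count > 0)
```

A registrant with monthly counts of [5, 0, 0] and one with [1, 1, 1] are both "multiple" users, but their months-used values (1 and 3) separate a burst of early interest from sustained use.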
The data analysis strategy revolved around developing descriptive information about different kinds of registrants, as well as testing the proposed hypotheses. For all variables, whether predictor or dependent, we provided a frequency, or count, of the number of occurrences of that variable. These frequencies provided us with a broad-brush understanding of the beneficiaries who utilized MyPHRSC and general findings about their usage patterns.
Because age and gender both have implications for technology use, we conducted bivariate analyses of categorical variables separated by age and gender and tested these associations for significance. A p-value of less than 0.05 indicates that, if the null hypothesis is true, there is less than a 5% chance of obtaining the observed result or one more extreme. The bivariate analyses provided associations between pairs of variables; for example, examining usage by age range within each gender may be more informative than examining gender or age alone. To understand the combined effects of the variables of interest, we conducted multivariate analyses.
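A bivariate association between two categorical variables is commonly tested with a Pearson chi-square statistic on a contingency table. The report does not name the specific test used, so the following is a hedged sketch of the 2x2 case with invented counts:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table,
    e.g. rows = gender, columns = used / never used MyPHRSC (illustrative)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    observed = [[a, b], [c, d]]
    row = [a + b, c + d]
    col = [a + c, b + d]
    # Expected cell counts under the null hypothesis of independence.
    expected = [[row[i] * col[j] / n for j in range(2)] for i in range(2)]
    return sum(
        (observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
        for i in range(2)
        for j in range(2)
    )
```

The statistic is then compared against a chi-square distribution with one degree of freedom; values above roughly 3.84 correspond to p < 0.05.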
To test the hypothesis that beneficiaries with chronic conditions or diagnoses were more likely to use MyPHRSC than beneficiaries without such conditions, we conducted a multivariate logistic regression predicting MyPHRSC use by illness/condition diagnosis, adjusting for age and gender. This adjustment was made because elderly beneficiaries are more likely to have chronic diseases and may have different usage patterns, and women may have different usage patterns than men.
To test the second hypothesis, that women are more likely than men to use MyPHRSC, NORC used a multivariate logistic regression with use of MyPHRSC as the dependent variable and gender as the main explanatory variable, adjusting for age.
To test the third hypothesis, that younger beneficiaries are more likely than older beneficiaries to use MyPHRSC, NORC used multivariate logistic regression, adjusting for gender since women may have different usage patterns than men.
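The report does not include the estimation code for these models. As a self-contained sketch of the modeling approach, the following fits a logistic regression of PHR use on a chronic-condition indicator; the data are invented, only one predictor is shown for brevity, and plain gradient descent stands in for whatever statistical package NORC actually used.

```python
import math

def sigmoid(z):
    """Logistic function mapping a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=20000):
    """Fit logistic regression by batch gradient descent (illustrative sketch).
    X: feature rows (e.g. a chronic-condition indicator; age and gender
       could be added as further columns to adjust for them);
    y: 1 if the registrant used MyPHRSC, else 0.
    Returns [intercept, coef_1, ..., coef_k]."""
    n, k = len(X), len(X[0])
    w = [0.0] * (k + 1)  # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            err = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))) - yi
            grad[0] += err
            for j in range(k):
                grad[j + 1] += err * xi[j]
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Invented illustration: 3 of 4 chronic-condition registrants used the PHR,
# versus 1 of 4 registrants without a chronic condition.
X = [[1], [1], [1], [1], [0], [0], [0], [0]]
y = [1, 1, 1, 0, 0, 0, 0, 1]
intercept, chronic_coef = fit_logistic(X, y)
```

A positive coefficient on the chronic-condition indicator corresponds to greater odds of PHR use among beneficiaries with chronic conditions, which is the pattern the first hypothesis predicts.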