Figure 2 provides an overview of the interviewer responsibilities.
Before the in-person interview, directors were mailed and asked to complete the PIW (Appendix XIV). This mailing provided:
- Interview date, time, name of interviewer, and contact number to call in case the appointment needed to be rescheduled.
- Reminder to prepare a resident census list before the interview.
- Questions asked during the in-person interview that likely would require referring to records or consulting other staff, and instructions for completing the worksheet.
About three-fourths of facilities had a PIW filled out at the time of the in-person interview.
Figure 2. Interviewer responsibilities
SOURCE: CDC/NCHS, National Survey of Residential Care Facilities
An interviewer’s initial task was to contact the facility about 4 days before the in-person interview to remind the director of the appointment, using the Reminder Call Script (Appendix XV). Before placing the call, interviewers reviewed the information entered in the Facesheet by recruiters to better understand the nuances and specifics of the facility. In most cases, the reminder call was made by phone, but if the facility director requested it, an e-mail was sent instead. The goal of this call was to reconfirm the appointment, thereby reducing no-shows, cancellations, and break-offs, and to remind the director to complete the PIW and prepare a resident census list before the interview.
During data collection, 40 previously set appointments with directors required rescheduling, either because of interviewer unavailability or, for efficiency, to include the case in a travel cluster assignment. When this occurred, interviewers used the Appointment Reschedule Call Script (Appendix XVI) to reschedule the visit. Thirty-six of these broken appointments were rescheduled, three could not be rescheduled, and one case was ineligible.
Facility In-person Interview
For most facilities (73%), the entire in-person interview was conducted with one respondent who was usually the facility director, administrator, owner, or operator of the facility or its residential care component. The interview consisted of asking questions about the facility, conducting sampling for a preselected number of current residents depending on the size of the facility (three, four, or six), and then asking questions about those sampled residents. The in-person interview took, on average, 2 hours and 7 minutes to complete for small and medium facilities, 2 hours and 26 minutes to complete for large facilities, and 3 hours and 3 minutes to complete for very large facilities. After every interview, interviewers gave respondents a ruler, pen, and a thank you letter (Appendix XVII) as tokens of appreciation for participating in the survey.
CAPI Questionnaire Instruments
Data were collected using CAPI software on laptop computers. The CAPI system allowed interviewers to move correctly and efficiently through the questionnaire and use modified question wordings based on responses to prior questions. Only questions specific to the individual facility or resident were asked, skipping unnecessary questions. CAPI included hard and soft range checks, and hard and soft consistency checks among question items. Hard edits required the interviewer to fix the discrepant data before the interview could continue. Soft edits resulted in a prompt for the interviewer to either correct the data or suppress the edit. Use of the CAPI system also eliminated the need to enter data from a hard-copy questionnaire, thereby reducing this type of data entry error. The CAPI instrument was also programmed to enable the interviewer to complete the facility questionnaire or the resident selection questionnaire in any order, to accommodate the schedules of the facility director and staff. In all but 43 cases, interviewers began with the facility questionnaire.
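The hard/soft edit distinction described above can be sketched in a few lines of code. This is only an illustrative model of the behavior, not the actual NSRCF CAPI implementation; the function name, the age ranges, and the messages are all hypothetical.

```python
# Illustrative sketch of CAPI-style range checks. The thresholds and
# messages below are hypothetical, not taken from the NSRCF instrument.

def check_resident_age(age):
    """Return (status, message) for an entered age value.

    Hard edits block values that cannot be valid; the interviewer must
    fix the data before the interview can continue. Soft edits flag
    unusual values, prompting the interviewer to either correct the
    entry or suppress the edit.
    """
    if not 0 <= age <= 120:
        # Hard edit: out of the permissible range entirely.
        return ("hard", "Age must be between 0 and 120. Please re-enter.")
    if age < 55:
        # Soft edit: plausible but unusual for residential care.
        return ("soft", "Age under 55 is uncommon here. Confirm or correct.")
    return ("ok", "")

print(check_resident_age(130))  # triggers the hard edit
print(check_resident_age(45))   # triggers the soft edit
print(check_resident_age(82))   # passes both checks
```

Consistency checks among items (e.g., an admission date later than the interview date) would follow the same pattern, returning a hard or soft status depending on whether the discrepancy can ever be legitimate.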
The NSRCF CAPI instrument was composed of four different modules (viewable at: ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/Dataset_Questionnaires/nsrcf/2010/).
The facility instrument included questions on facility characteristics such as ownership, size, types of living arrangements and amenities, policies, staffing, services, and general resident characteristics. The question wordings, response options, and accompanying skip patterns for this questionnaire are provided in Appendix XVIII. The facility questionnaire took an average of 65 minutes to complete.
For the resident selection instrument, interviewers entered the number of residents currently residing at the facility; based on the facility's number of beds, CAPI then randomly generated three, four, or six numbers. Interviewers used these numbers to identify which residents on the census list to select for the resident questionnaires. The question wordings, response options, and accompanying skip patterns for this questionnaire are provided in Appendix XIX. The resident selection instrument took about 6 minutes to complete.
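The selection step above amounts to drawing a fixed number of positions at random from the census list. A minimal sketch follows, assuming simple random sampling without replacement (the source does not spell out the randomization method, so treat this as an assumption):

```python
import random

def select_residents(census_count, sample_size):
    """Randomly pick `sample_size` distinct line numbers (1-based)
    from a census list of `census_count` current residents.

    In the NSRCF design, sample_size is three, four, or six,
    depending on facility size. The sampling scheme here (simple
    random sampling without replacement) is an assumption.
    """
    if sample_size >= census_count:
        # Fewer residents than the target: select everyone on the list.
        return list(range(1, census_count + 1))
    return sorted(random.sample(range(1, census_count + 1), sample_size))

# Example: a facility with 40 current residents, 4 residents sampled.
print(select_residents(40, 4))
```

Sorting the generated numbers mirrors how an interviewer would work down the census list from top to bottom when marking the selected residents.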
The resident questionnaire included questions on the sampled residents’ demographics, living arrangements, activities, health conditions, cognitive and physical functioning, and services received. The question wordings, response options, and accompanying skip patterns for this questionnaire are provided in Appendix XX. Respondents to the resident questionnaire were those who were most knowledgeable about the sampled residents and had access to their records; in addition to directors, respondents included registered nurses (RNs), licensed practical nurses (LPNs), and personal care aides who provided direct care services. The resident questionnaire took an average of 19 minutes per sampled resident to complete.
After leaving the facility, interviewers also filled out a debriefing questionnaire that contained questions about the interviewers’ experience, including a personal assessment of the accuracy of answers obtained and difficulties encountered in conducting the interview. These questions are provided in Appendix XXI. The debriefing questionnaire took on average about 7 minutes to complete.
When a respondent needed to end the in-person interview before all of the required questionnaires were completed, an attempt was usually made to complete the interview either by phone or in person. If only one resident questionnaire remained, the interviewer attempted to complete it by telephone. If several components of the survey were not completed, field management staff decided whether the interview was to be continued on a revisit, continued by phone, or not continued at all. Of the 119 facilities that broke off the interview before completion, 64 later completed it by telephone, 34 later completed it in person on a revisit, and 21 never completed it.
Refusal Aversion and Conversion Strategies
Initial refusals or resistance to participate in NSRCF occurred at several points during the recruiting and interviewing phases: 1) when explaining the study and completing the screener questionnaire, 2) when setting the appointment, and 3) when rescheduling with facility respondents who cancelled an appointment, no-showed for an appointment, or broke off the interview midway through the questionnaire. Altogether, 941 facilities refused at least once over the course of the field period. The most prevalent reason given by facility respondents at all phases was that they did not have the time to complete the interview. Other common reasons included: not interested, confidentiality concerns, chain or supervisor approvals, and study concerns.
Different strategies were employed to convert these cases, depending on the reason given. The study protocol included the following approaches:
Using the FAQs and knowledge of the survey to explain the study, address directors’ concerns, and explain how the survey protocols were flexible enough to handle scheduling needs.
Allowing a cooling-off period between the refusal and attempts to convert the refusal.
Performing a drive-by visit to make personal connection with the respondent.
Sending targeted letters (e.g., Having Trouble Reaching You, No Time, Not Interested/Not if Voluntary, Cancelled Appointment) (Appendix XXII).
Facilitating supervisor and chain approval.
Additional strategies were also developed during data collection to enhance these refusal conversion efforts. They included:
Perform drive-by visits more widely, expanding them not only to gain cooperation but also to screen for eligibility, set appointments, and complete some interviews on the spot.
Reduce the overall length of the interview for very large facilities, by changing the number of required resident interviews from six to four residents.
Offer facilities the option to complete the last resident interview by phone.
For facilities that screened as eligible but delayed or hesitated to set an appointment, develop and mail a Schedule Appointment Reminder Letter; also send a Reminder Postcard, 3 weeks before the end of data collection, to facilities still being actively pursued (Appendix XXIII).
Of the 269 facilities where a drive-by visit was made, 154 (57%) were favorably resolved; that is, 21 facilities completed the facility interview that same day, 95 facilities completed the interview later on, and 38 were found to be either ineligible or out of business. About 375 facilities received some type of conversion letter, and of these, 156 (42%) eventually completed the facility interview. The Reminder Postcard was mailed late in the data collection period and yielded only two additional completed facility interviews.
Follow-up With Chains
When a facility told the recruiter or interviewer that they could not proceed with the survey without receiving approval from their chain office, further efforts were made to gain the cooperation of the chain. These steps included:
Recruiter or interviewer asked the facility for a chain contact name.
Chain Outreach package (Appendix VIII) was mailed to the chain contact.
RTI scientific and senior staff contacted the chain official by telephone.
These calls began in May and continued until November 9, 2010, during which time the chain either granted approval, denied approval, or, more commonly, stopped communicating with NSRCF project staff. Facilities’ requests for help in obtaining chain approval resulted in 51 chains requiring follow-up, corresponding to about 5% of all chains in the sample. About one-third (n = 16) of the 51 chains that were called granted approval for their facilities to participate, resulting in 49 facilities ultimately receiving chain approval via this approach.
Additional Strategies to Increase Participation
Several other strategies were initiated toward the end of the data collection period to boost the number of final completed cases. These included:
Extending the field period by 6 weeks.
Reassigning cases to different supervisors and recruiters.
Changing supervisor responsibilities to allow more time for case review and strategizing.
Using high-performing interviewers as recruiters.
Setting weekly production targets per stratum.
Offering financial payments to interviewers and recruiters for setting facility appointments that resulted in a completed interview.
Assigning NSRCF project senior staff to contact a few facilities that were identified as being potentially receptive to participating.
Reducing the number of resident interviews from six to four in very large facilities.
Targeting efforts on remaining cases located in specific size strata and geographic regions where nonresponse was more prevalent in order to reach survey completion goals for these areas.
Lengthening the field period, in combination with the other strategies employed, resulted in about 240 additional completed facility interviews during the extended 6 weeks of data collection, or about 10% of all interviews completed in NSRCF. The incentive program alone produced 188 of these completed interviews.
Cases that were still in the process of recruitment for participation after data collection ended were sent a letter (Appendix XXIV) to thank them and to let them know the study had ended.