The in-state trainings provided an opportunity for case managers to learn specific skills to help homeless individuals complete SSI/SSDI applications. This section describes the recruitment and participation of case managers in the trainings, the extent to which in-state trainings adhered to the SOAR curriculum, and case managers' perceptions of the trainings.
For in-state trainings to be useful, it is critical that stakeholders identify the right set of individuals to be trained. Some states initially tried to train as many case managers as possible. To recruit participants, training organizers leveraged existing networks by reaching out to organizations that had already agreed to participate in SOAR as well as other partners. In many communities, information about trainings also spread by word of mouth. States often recruited supervisors along with case managers for initial in-state trainings. Some states included supervisors to build enthusiasm for SOAR, and others thought that since case managers turned over rapidly, educating supervisors would provide continuity. Later, many state leads and trainers agreed that supervisors did not benefit from learning all of the details in the full two-day training.
Recruitment practices tended to become more narrowly targeted as the initiative matured. As states refined their SOAR efforts, they directed trainings to organizations already participating in SOAR, particularly to case managers who would actually be responsible for assisting homeless individuals. To maximize the training's impact, states further focused on case managers who had the time and resources to implement SOAR and who had the support of their supervisors. Because of staff turnover, however, it was often not enough to offer training to case managers from a particular agency just once. Whether training was available to new case managers who replaced those who left varied with organizational commitments and the availability of trainers.
For in-state trainings to be useful, participants and their supervisors must also understand the commitment involved in and expectations associated with attending training. In some communities, the state lead or trainers did little to build attendees' enthusiasm for SOAR or enlist their supervisors' support before the training sessions. Case managers were also not obligated to commit to using SOAR prior to attending the training. As a result, they and their supervisors sometimes viewed SOAR training as one of many staff development programs and not as an opportunity to develop new skills or a mandate to put those skills to use.
The number of trainings in states we visited ranged from two to twelve. The number of participants at those trainings varied substantially, from fewer than ten to as many as sixty per training. Formal contracts helped to ensure that trainings were conducted as planned. For example, in one state, a local organization had a contract from a city government office to conduct at least one SOAR training per year, complete SOAR applications, convene stakeholders, and track data. Another had a contract with the state PATH agency to conduct four trainings per year. In each of these states, trainings occurred as planned. Other states were able to conduct trainings regularly without issuing formal contracts, but in still others, trainings were conducted haphazardly and not all that were planned actually took place.
The number of in-state trainings, however, is not necessarily a good indicator of how much effort states have made to implement SOAR or how well states are implementing it. Some states conducted a large number of trainings but completed few SOAR applications and lacked the infrastructure to support trained case managers. Other states held few trainings but completed a relatively large number of applications and/or achieved notable application approval rates. More important was whether states targeted trainings to case managers with the time and resources to implement SOAR, and whether trainers had support from their supervisors to spend time and resources providing ongoing assistance to case managers and other stakeholders.
The TA contractor provided states with assistance designed to ensure fidelity to the SSTR curriculum. The TA contractor observed the first in-state training and provided trainers with detailed feedback. The tool that the TA contractor used to provide feedback is included in Appendix C. This tool provided numerical ratings and comments on trainers' fidelity to several elements of each module of the SSTR curriculum. We analyzed these fidelity assessments from 24 of the 25 Rounds One and Two states and found that, overall, most trainers covered all of the curriculum's modules and taught the critical components described in Chapter I. While the TA contractor offered constructive criticism to each trainer to improve their approach, this feedback rarely changed how trainers conducted the trainings.
As the initiative progressed, trainers tailored the SSTR to their local community and the perceived needs of trainees. Some trainers conducted abbreviated trainings and omitted content regarding medical summary reports. The SSTR curriculum covers a great deal of material in a short amount of time, and some trainers saw a tension between providing comprehensive information and not overwhelming case managers. Many trainers abbreviated the curriculum to focus on the essentials of the application process rather than the relationship-building components of the initiative or the sections on engaging and empathizing with homeless clients. They deemed the latter unnecessary because most case managers came to the in-state trainings with substantial experience working with homeless individuals. The extent to which trainers provided instruction on completing medical summary reports varied according to DDS staff feedback on how useful they deemed the reports. Some DDS staff reported that they did not use the medical summary reports (preferring strictly medical evidence without a third party's observations of functionality), while others found them useful.
In several states, informal or substantially modified trainings sometimes supplemented or replaced in-depth SOAR trainings. One organization designed special trainings for benefits specialists, conducted by a DDS examiner and a DDS psychological consultant; these three-and-a-half-hour trainings focused on DDS procedures and requirements. The specialists did not receive the complete SSTR training but had SOAR manuals available for reference. Organizational leaders thought that these procedural trainings better met the needs of the benefits specialists than the SOAR training, which they perceived to be designed for clinical staff. In another state, the state lead gave brief presentations to social service organizations to raise awareness about SOAR practices, and conducted six-hour trainings for case managers that excluded the community-building components of the SSTR curriculum.
Case managers' feedback about the in-state trainings was overwhelmingly positive. Almost all participants interviewed during our site visits agreed that the training helped them navigate the benefit application process and that the trainers were knowledgeable and engaging. An analysis of data from 470 participant evaluation forms from initial trainings in 23 of the 25 Rounds One and Two states confirmed these positive perceptions. Overall, 97 percent of the respondents agreed or strongly agreed that the training improved their understanding of the disability determination process and how to develop medical evidence, and 96 percent agreed or strongly agreed that the training would help them assist clients with SSI/SSDI applications. Respondents also praised the presenters and the curriculum content. The evaluation form is attached in Appendix D.
In the first round of in-state trainings, participants demonstrated a statistically significant increase in knowledge about the SSI/SSDI application process. We analyzed the results of a pre-post true-false questionnaire administered by the TA contractor during the training (the questionnaire is attached in Appendix B). Because the test changed slightly between Rounds One and Two, we analyzed these data separately. Pre-post data were available for six of the fourteen Round One states and eight of the eleven Round Two states. Table V.1 presents the results.
| Round | Number of Participants | Average Score Before Training | Average Score After Training | Difference |
|-------|------------------------|-------------------------------|------------------------------|------------|
| 1 | 180 | 48 percent | 76 percent | 28 percentage points*** |
| 2 | 207 | 62 percent | 84 percent | 22 percentage points*** |

*** Statistically significant at the .001 level.
Despite the positive feedback and short-term knowledge gains, case managers, particularly those with little prior experience with the SSI/SSDI application process, often left the trainings overwhelmed at the prospect of having to put the training information into practice. On questionnaires administered after the first round of in-state trainings, 46 percent of respondents agreed and another 30 percent strongly agreed that the pace of the training was just right: not too fast and not too slow. However, during our site visits, some trainers and case managers complained that the training covered too much material too quickly, and that it was unrealistic to expect case managers to obtain a level of knowledge they could apply in practice in two days. Some thought it would be beneficial for case managers to attend multiple refresher trainings. Others, particularly case managers with prior background in SSI/SSDI applications, said that the training was intense but they appreciated the comprehensive curriculum.
"index.pdf" (pdf, 1.23Mb)
"apb.pdf" (pdf, 330.32Kb)