This section discusses the activities involved in developing the LAS database, including database design, quality control checks, resolution of database errors, and coding of text responses.
A database was designed for data entry, with numeric fields to capture nominal or scale responses and text fields to capture text responses. Survey data were entered into the database exactly as recorded on the survey forms. The database file was saved in comma-delimited format and exported to the Statistical Package for the Social Sciences™ (SPSS) program, along with variable and value labels.
Several steps were taken to ensure the accuracy, validity, and internal consistency of data in the LAS database. A quality control check was performed on all returned surveys to verify that all modules were returned and complete. Problems encountered during this check were recorded on a quality control sheet, for followup with respondents prior to data entry.
Problems encountered during data entry (e.g., illegible responses, questions answered incorrectly) were also documented on the quality control sheet. In addition, the database was designed to limit data entry errors by accepting only valid responses. Finally, quality control and validity checks were designed and run in SPSS to identify remaining errors and make corrections.
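A database that accepts only valid responses amounts to checking each entered value against the set of codes defined for that question. The sketch below illustrates the idea; the question names and code sets are hypothetical, not taken from the LAS instrument.

```python
# Hypothetical valid-code sets per question; the real LAS codes differ.
VALID_CODES = {
    "q1_agency_type": {1, 2, 3, 4},      # nominal response codes
    "q2_staff_count": range(0, 10000),   # scale response range
}

def validate_entry(question, value):
    """Return True if the value is an acceptable code for the question."""
    allowed = VALID_CODES.get(question)
    if allowed is None:
        raise KeyError(f"Unknown question: {question}")
    return value in allowed
```

An entry screen built on such a check would reject the keystroke or prompt the data entry operator rather than store an invalid code.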
To facilitate the resolution of quality control errors in the database, the errors were divided into the following categories:
- Skip pattern violations — a response was given to a question that the skip pattern directed the respondent to skip;
- Multiple responses — more than one response when only one response was expected;
- Missing data — questions not completed by the respondent;
- Illegible responses — response could not be read;
- Internal inconsistency — response was inconsistent with another response in the survey. (This occurred primarily among questions asking the number of personnel employed at an agency.); and
- Out-of-range responses — response did not fit within the specified parameters.
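The categories above can be expressed as simple checks over a survey record. The sketch below is illustrative: the field names, codes, and rules are hypothetical stand-ins, not the actual LAS variables (the internal-inconsistency check mirrors the personnel-count example mentioned above).

```python
def flag_errors(record):
    """Classify quality control errors in one survey record (illustrative rules)."""
    errors = []
    # Skip pattern violation: a followup answered although the root
    # question (hypothetical code 2 = "No") directed no response.
    if record.get("root_q") == 2 and record.get("followup_q") is not None:
        errors.append("skip pattern violation")
    # Missing data: a required question left blank.
    if record.get("required_q") is None:
        errors.append("missing data")
    # Out-of-range response: staff count outside the specified parameters.
    staff = record.get("staff_total")
    if staff is not None and not (0 <= staff <= 9999):
        errors.append("out of range")
    # Internal inconsistency: component personnel counts exceed the total.
    if staff is not None:
        parts = record.get("staff_screening", 0) + record.get("staff_investigation", 0)
        if parts > staff:
            errors.append("internal inconsistency")
    return errors
```

Multiple responses and illegibility arise on the paper form itself, so they would be caught during data entry rather than by record-level checks like these.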
Different mechanisms were employed to resolve errors, depending on their type. First, the team checked all errors against the hardcopy survey. When an error could not be resolved by reviewing the questionnaire, an attempt was made to contact the survey respondent for clarification. This was done for multiple responses, missing data, illegible responses, internal inconsistencies, and out-of-range responses, as well as for missing and incomplete modules.
Skip pattern errors were recoded in SPSS. The recode was designed to accept the most logical correct response and to recode, or deselect, the "root" or followup response. In some cases, the skip pattern errors resulted in the creation of new codes. For example, some questions frequently received multiple responses when only one was expected. Because of the frequency of this error, an additional code was developed to capture both responses selected. This allowed the data to more accurately reflect the actual practice at CPS agencies.
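The recoding described above can be sketched as follows. This is not the actual SPSS syntax used; it is a hypothetical Python rendering of the two rules, with made-up variable names and code values.

```python
def recode_skip_pattern(record):
    """Keep the most logical response and recode the conflicting root
    response (illustrative: 2 = "No" directs a skip, 1 = "Yes" permits
    the followup)."""
    if record.get("followup_q") is not None and record.get("root_q") == 2:
        # A followup answer was given, so treat it as intended and
        # recode the root to the value consistent with it.
        record["root_q"] = 1
    return record

def recode_multiple_response(record):
    """Where two mutually exclusive responses were frequently both
    selected, assign a new code capturing both (illustrative)."""
    if record.get("resp_a") == 1 and record.get("resp_b") == 1:
        record["combined_q"] = 3   # hypothetical new code: "both selected"
    return record
```

In SPSS itself, the equivalent logic would be written with conditional RECODE/IF syntax run over the full data file.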
Finally, some errors were resolved through followup with State-level child welfare agencies. For example, after several counties in one State completed and returned their surveys, the State child welfare director indicated that on a few variables, the agencies had provided incorrect statistics. The director wanted to review the completed surveys for these counties, and provide the correct information, if necessary. The survey administration team provided the State with copies of the completed surveys, and the State in turn provided corrected surveys.
Codes were established for text response questions in some of the survey modules and added to the database. All of the open-ended responses in the fifth module were coded. In Modules 2, 3, and 4, the text responses describing the agency's responsibility for different forms of maltreatment were coded. The remaining text response and other/specify fields were coded only if a question received more than 45 responses.
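Coding a text response amounts to mapping each cleaned response string to a numeric code from an established codebook. The sketch below is hypothetical; the categories and code values shown are illustrative, not the actual LAS codebook.

```python
# Hypothetical codebook for one open-ended field; the real codes differed.
CODEBOOK = {
    "physical abuse": 1,
    "neglect": 2,
    "sexual abuse": 3,
}

def code_text_response(text):
    """Map a cleaned text response to its numeric code; 99 = uncodeable."""
    return CODEBOOK.get(text.strip().lower(), 99)
```

Uncodeable responses (code 99 here) would typically be reviewed by hand and either assigned to an existing category or used to justify a new one.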