Appendix B: The Evaluation of Houston’s Kids
Summer program evaluation description excerpted from: Houston’s Kids: Summer 2006 Pilot Program Final Evaluation Report, September 15, 2006, Roger Durand, Ph.D.
Evaluation of the Summer Program
The process evaluation, conducted by the external evaluator, was done by means of direct observation, a survey of representatives of the Houston’s Kids collaborating organizations, and through other informal “ethnographic” means. (The core collaborative partners included the United Way of the Texas Gulf Coast, the Joint City-County Commission on Children, Alief Independent School District, the YMCA of Greater Houston, Communities In Schools Houston, and the Children’s Museum of Houston.)
I. Evaluation Overview: Types and Activities
In evaluating this project, both a process and an outcomes assessment were conducted. In general, process assessments are concerned with the implementation of a project or program—how the project is implemented, how its various components complement one another, and how the various project participants or “stakeholders” perceive it. In the present instance, the results of the process assessment were used formatively—i.e., for ongoing project development and improvement, as well as to enable the diffusion and dissemination of implementation information to others considering the adoption of a similar or identical program.
Outcomes assessments, on the other hand, are generally concerned with comparing actual goal achievement to desired or expected achievement. In the present instance, the outcomes component was used “summatively”—i.e., to judge the project’s effectiveness in producing desired outcomes.
In conducting the process assessment, the project’s external evaluator (Roger Durand) engaged in the following activities:
- Attended the regular, weekly meetings of the Houston’s Kids collaborative wherein the project’s implementation was discussed;
- Developed and implemented the process evaluation plan;
- Participated in the project’s kickoff activities and demonstration program;
- Attended and participated in a staff meeting of YMCA counselors;
- Met and had email contacts with America’s Promise Alliance staff;
- Conducted an open-ended survey of the collaborative’s partners that focused on sustainability and the evolution of the collaborative;
- Discussed collaborative partners’ perceptions with them over the telephone;
- Met with staff of the United Way of the Texas Gulf Coast concerning the project and its evaluation;
- Kept track of the timing and completion of program activities;
- Was in frequent and sustained contact with Project Director Sul Ross concerning the project’s implementation.
In conducting the process evaluation of the 2006 Houston’s Kids Pilot Project, an open-ended survey of the representatives of the collaborating organizations was conducted during the last week in July 2006. The survey was conducted by email with subsequent telephone contact to clarify and add dimension to some of the responses. Full and complete responses were obtained from eleven participants representing all of the core collaborating Houston’s Kids organizations.
In conducting the outcomes assessment, the external evaluator engaged in the following activities:
- Designed and implemented the project’s outcomes assessment plan;
- Directed discussions at regular meetings of the collaborative about the project’s goals, intended outcomes, standards of success, and results;
- Worked with the staff of the Alief ISD to ensure Institutional Review Board (IRB) approval of the data collection instruments used to gather information from human subjects, especially children;
- Designed a survey of parents whose children participated in the Houston’s Kids Summer Pilot Program;
- Designed a survey (with two different versions) of children and youth who participated in the Summer Pilot Program;
- Designed a set of focus group questions which were utilized by YMCA counselors to obtain information about the needs of children and youth;
- Oversaw the collection of data gathered by means of surveys and focus groups;
- Coordinated the entry onto electronic media of data gathered by means of surveys and focus groups;
- Oversaw the consent process and maintained the confidentiality of all data collected from human subjects for the project;
- Conducted a systematic, statistical analysis of the survey and focus group evidence gathered for the project;
- Interpreted the results derived from the survey and focus group evidence;
- Prepared summary reports on project results.
Details on the methods for the outcomes assessment are discussed at length below.
II. Project Goals and Success Standards
Representatives of the collaborating organizations discussed the goals of the Houston’s Kids Summer Pilot Project, as well as the standards by which to judge it “successful” or not, at considerable length. Indeed, this important discussion continued even through the start of the project at the end of May. Eventually, the representatives achieved consensus concerning goals and success standards.
The representatives ultimately adopted eight goals for the Summer Pilot Project. These goals (see below) were based upon the America’s Promise Alliance Five Promises—resources that young people need to become “productive citizens who contribute to their communities.”
In addition to the Five Promises, the eight goals for the Summer Project and their associated success standards were based upon the important idea of “developing assets” in young people. This idea has been the subject of pioneering work by partners within the America’s Promise Alliance, particularly the YMCA and the Search Institute, through the Abundant Assets Alliance. Central to the “developing assets” idea is the view that children and youth need certain positive experiences and qualities as basic building blocks for their lives. In other words, the developmental assets framework is a strength-based approach to healthy development, identifying a set of factors deemed critical to young people’s growth.
The developing assets idea is supported by research, particularly survey studies, conducted by the Search Institute. One such survey study, Search Institute Profiles of Student Life: Attitudes and Behaviors, interviewed young people in 318 communities and 33 states across the nation (https://www.search-institute.org/surveys/choosing-a-survey/ab/).
The Houston’s Kids collaborative representatives used the results from this extensive investigation to arrive at a set of “success standards” for the Summer Pilot Project (see below). The representatives reviewed the Search Institute’s nationwide findings and adjusted them for two factors: the particular needs of the targeted Katrina-Rita children and the relatively short period planned for the Summer Pilot Project.
III. Outcomes Evaluation Design and the Activities Evaluated
The important yet extended discussion among the Houston’s Kids collaborative representatives about goals and success standards, together with the relatively short time period of the planned summer project, limited the type of evaluation design that could be adopted to assess the project’s outcomes. More specifically, a planned “pre-post” evaluation design had to be abandoned in favor of a less rigorous one: a post-hoc-only design. A post-hoc design generally admits more possible “threats to internal validity” into an evaluation, making it more difficult to establish true cause and effect. Nonetheless, given the time and resources available to the project, the collaborative’s representatives viewed a post-hoc design as the best option, and the external evaluator concurred.
The limited summer time period, together with the relatively late decision about goals and standards, led to two other compromise decisions. First, the Houston’s Kids collaborative representatives had strongly desired to evaluate the impact of the Summer Youth Employment and Training Program for high school students. However, the late decision about goals and standards and the short summer project time period—coupled with an even shorter period of employment training (two weeks) and the subsequent dispersal of youth to jobs throughout the geographically extensive Houston area—proved to be difficult barriers to overcome. The collaborative representatives decided against evaluating the Employment and Training Program for the 2006 summer period, choosing instead to conduct a more thorough and comprehensive assessment at a future time (i.e., during the expected year-long project to follow). In this decision the external evaluator again completely concurred.
The second compromise decision concerned the eliciting of outcomes evaluation information from Alief ISD principals, teachers, and staff. The original evaluation plan called for surveying these principals, teachers, and staff members concerning their views of students’ progress in “developing assets” as well as of other outcome related matters (e.g., the availability of learning opportunities for children and youth). Again, however, the relatively short time period of the Summer Pilot Project together with the many other activities of the project led the collaborative representatives to decide not to collect such information, but to defer such collection until the anticipated yearlong program. Once again, the external evaluator concurred.
IV. Outcomes Data Collection
Data to assess the Summer Pilot Project outcomes were gathered principally by means of a post-hoc survey of parents whose children participated in the summer program; a post-hoc survey of children and youth participating in the summer program; and a set of focus groups conducted by YMCA counselors who were in direct contact with the children and youth.
All of these data collection instruments were reviewed and subsequently approved by the Institutional Review Board (IRB) of the Alief Independent School District. The same was true of the informed consent procedures employed in the evaluation. Particular care was taken to obtain the written consent of parents in regard to surveying children and youth enrolled in the Summer Pilot Project.
Two slightly different versions of the survey of children and youth were employed: one intended for high school youth, the other for elementary and middle school children. In all cases a parent’s signed written consent form was obtained for each child interviewed. The surveys were administered during a YMCA camp session in the latter part of July or early part of August. In total, full and complete survey questionnaires were obtained from 73 children and youth participating in the Summer Pilot Project.
Also included as part of the outcomes component of the evaluation was a self-administered survey of parents whose child or children participated in the Summer Pilot Project. This survey was administered at two different school locations at the end of July 2006 during a parent “wrap-up” meeting. Survey responses were obtained from 53 parents.
The outcomes evaluation component further included a set of focus group questions and a summary focus group form. Each YMCA counselor was asked to discuss the focus group questions with the child/youth members of his/her group. Following discussion of the questions, which took place during the last few weeks in July, each counselor was asked to record his/her impressions of the results on a prepared form. In total, impressions were obtained from YMCA counselors who worked with 21 groups of children and youth during the Summer Pilot Project.
Evaluation of the School Year Program
The year-long program created opportunities for three enhancements to the evaluation:
- The use of both pre- and post-surveys;
- The use of comparison groups;
- The use of some additional evaluation instruments.
Survey instruments used for the fall evaluation were similar to those used in summer, with a few small improvements. Three surveys were used: one for parents; one for elementary, intermediate, and middle school students; and one for high school students. Parent surveys were administered to parents with children in the program during the fall semester. Student surveys for elementary, intermediate, and middle school students were sent home with the students and also administered on site for students whose parent permission forms were completed. Finally, a survey was administered in the fall semester to high school students participating in the job training portion of the program.
Surveys were also sent to parents and students in schools without the Houston’s Kids program. Those schools were identified by the district as having similar demographics to the schools with the program.
The pre-test data for the fall evaluation are currently being tabulated. A post-survey will be administered to parents and students (both program participants and control groups) at the end of the spring semester.
As in the summer, focus groups will also be held to reinforce the survey data and create an opportunity for more qualitative evaluation. Focus groups will occur three times during the school year: once in the fall semester, once in the early spring semester, and once near the end of the spring semester. YMCA counselors facilitate these focus groups with students participating in the Houston’s Kids program.
In addition to the surveys and focus groups used in the summer, additional instruments are being used to evaluate the year-long program:
- Grades and behavior of Houston’s Kids participants will be measured against comparable groups of students who did not participate in the program.
- A school environment survey has been administered to teachers, site coordinators, principals, and school nurses at each of the Houston’s Kids sites. There will be a post-test in the spring semester. This school environment study will allow for additional feedback on students’ progress from the full complement of staff who have one-on-one relationships with the students in the program.