OUT-OF-SCHOOL TIME PROGRAM EVALUATION - Education Northwest


OUT-OF-SCHOOL TIME PROGRAM EVALUATION
Tools for Action

Elke Geiger
Brenda Britsch

Education, Career, and Community Program
Northwest Regional Educational Laboratory

The mission of the Northwest Regional Educational Laboratory (NWREL) is to improve educational results for children, youth, and adults by providing research and development assistance in delivering equitable, high-quality educational programs. A private, nonprofit corporation, NWREL provides research and development assistance to education, government, community agencies, business, and labor. NWREL primarily serves the Northwest region of Alaska, Idaho, Montana, Oregon, and Washington.

The Education, Career, and Community (ECC) Program at NWREL has a strong commitment to connecting schools, communities, and families. The ECC Program has developed a focus area working with a range of out-of-school time programs, including 21st Century Community Learning Centers (CCLC) programs, in a variety of ways. For the past three years, ECC staff members have presented at 21st CCLC statewide bidders’ conferences throughout the Northwest, where they assisted attendees with the design and evaluation of their 21st CCLC programs. ECC staff have also provided training and evaluation assistance to 21st CCLC grantees and have participated in national meetings revolving around out-of-school time issues.

During the past 33 years, NWREL has conducted hundreds of evaluations at the school, district, state, and national levels. The ECC Program has extensive experience evaluating the connections between schools and communities. We are currently working with several school districts on their out-of-school time evaluations.

NWREL evaluators have extensive experience in developing brief, user-friendly surveys for students, staff, parents, and community members. The materials in this resource manual have been tested and then revised to ensure they are clear and understandable to their respective audiences.

If you would like more information about NWREL’s out-of-school time projects and evaluation services, please call 1-800-547-6339, ext. 757. This resource is also available online at www.nwrel.org/ecc/21century/index.html.

Acknowledgments

We would like to extend our deepest thanks to the many programs — students and staff — with whom we have had the opportunity and the pleasure to work. These programs are part of the Anchorage School District (AK), Astoria School District (OR), Bering Strait School District (AK), Central School District (OR), Crook-Deschutes Education Service District (OR), Oregon City School District (OR), Oroville School District (WA), Region IX Education Service District (OR), Seattle Public Schools (WA), Whitepine School District (ID), and Yamhill Education Service District (OR). We would also like to thank the parents and teachers of the students for their time in completing surveys and talking with us about their out-of-school time program.

Several NWREL staff members have also been a part of the process. A special thank-you goes to Judith Devine, who assisted in the development of the surveys that can be found in the appendices of this manual. Judith also assisted with the initial analysis of the survey data; the data that are displayed are based on her earlier work. Thank you to Denise Crabtree for the cover design, Rich Melo and Eugenia Cooper Potter for editing, and Rhonda Barton for writing about this work in NWREL’s NW Report.

A final thank-you goes to our program director, Eve McDermott, for goading us into this venture and for finding the funds it took to mold our knowledge into a resource to share with those who need it.

Table of Contents

Introduction
What the Research Says About Out-of-School Time
How This Resource Manual Fits Into Your Comprehensive Program Evaluation
Getting Started: Surveys and Focus Groups
    Student Participants
    Teachers
    Parents
    Staff
    Partnerships
    Adult Participants
Analysis, Display, and Utilization of Results
Evaluation Outcomes in the Northwest
Taking Action
Appendix A: Student Participant Baseline Survey
Appendix B: Student Participant Survey
Appendix C: Spanish Student Participant Survey
Appendix D: Student Participant Focus Group Questions
Appendix E: Focus Group Protocol
Appendix F: Teacher Survey
Appendix G: Parent Survey
Appendix H: Spanish Parent Survey
Appendix I: Parent Focus Group Questions
Appendix J: Staff Survey
Appendix K: Staff Focus Group Questions
Appendix L: Partnership Survey
Appendix M: Adult Participant Survey
Appendix N: Additional Evaluation Resources

Introduction

“The map is not the territory.”
—Alfred Korzybski, founder of general semantics

If you are working with an out-of-school time program, either as a director or as a consultant providing technical assistance, you undoubtedly want to be well informed about what the program is accomplishing. Evaluating the program on a regular basis is the best way to get feedback on what is working and what isn’t so you can make continuous improvements. The evaluation process can also provide you with evidence that the program has value.

You may already be collecting student outcome data to show that your program is having a positive effect. If you are operating on a grant, you may have reporting requirements such as changes in student grades and test scores. You may also be looking at improvements in school-day attendance and behavior. These data are contained in school district databases and are often readily obtainable.

But how do you know what all your participants are feeling about the program? How do you know what parents think about what happens during the program? If your program has an academic component, does it connect to what happens during the regular school day?

This resource manual helps you answer those questions by providing you with stakeholder surveys and focus group questions for student participants, parents, teachers (survey only), program staff, and program partners (survey only). Surveys for participants of adult Community Learning Center (CLC) classes are also included. Each survey is presented with suggestions for administration as well as with information about what it can assess. Focus group questions are presented with suggestions for how best to organize and conduct the groups. Finally, this manual suggests methods of analyzing and displaying data so that you can document accomplishments for present and future grantors as well as promote your program to the community. Examples of real outcomes are provided.

What the Research Says About Out-of-School Time

Research on out-of-school time programming has increased dramatically in the last few years. The research is becoming more sophisticated and is providing valuable information for programs and those working with programs. While the body of research is still incomplete, there is accessible literature that can greatly assist programs. This manual briefly summarizes a portion of the research and provides resources for you to access further information. Also included are recommendations based on the existing research to help guide programming and evaluation efforts.

A leader in out-of-school time programming research, the Harvard Family Research Project (HFRP) hosts the Out-of-School Time Learning and Development Project. As a part of this project, the HFRP Web site includes an online evaluation database with descriptions of various out-of-school time program evaluations. The Web site also contains other rich resources, such as publications relating to out-of-school time programming. Visit the HFRP Web site for more information.

One of the largest and best-known out-of-school time program evaluations looks at the LA’s BEST (Better Educated Students for Tomorrow) program. LA’s BEST serves more than 18,000 students in 105 elementary schools in Los Angeles, California. The program focuses on providing a safe environment, as well as enrichment and recreational activities, to elementary students. The program evaluation was conducted by UCLA’s Center for the Study of Evaluation during the past 10 years. The evaluation found that participation in LA’s BEST correlated with fewer school days missed; positive achievement on standardized tests in math, reading, and language arts; positive attitudes toward school and self; and improved grades. Find out more at www.lasbest.org/.

The U.S. Department of Education contracted with Mathematica Policy Research, Inc., to evaluate the impact and implementation of the national 21st Century Community Learning Centers program (funded by the USDE and the C.S. Mott Foundation). While the study has been controversial because of its findings, it does offer valuable information. Key findings from the first year of Mathematica’s evaluation include limited academic impact among participants, improved parental involvement, low levels of student participation, and programs staffed

predominantly by school-day teachers. Other research has shown that linking out-of-school time programs to the school day is beneficial, and staffing the program with school-day teachers is one way to accomplish this. The first-year report can be found at www.ed.gov/pubs/21cent/firstyear.

Public/Private Ventures and the Manpower Demonstration Research Corporation conducted an evaluation of the Extended-Service Schools Initiative (ESS), funded by the Wallace-Reader’s Digest Fund. The ESS includes 60 after-school programs in 20 communities. This evaluation assessed a broad range of programming, including the quality of activities, the benefits to participants, and program costs. The study isolated key components of programs, such as adult-youth relationships, peer support, decisionmaking and leadership opportunities for youth, and youth engagement. Outcomes related to program participation included staying out of trouble, improved school attitudes and behaviors, social benefits, and improved skills and self-confidence. The report can be found on the Public/Private Ventures Web site.

The Afterschool Corporation (TASC) provides funding for nonprofit organizations to partner with schools to develop after-school programs. The programs are in place in more than 200 schools across New York state. The C.S. Mott Foundation, the Carnegie Corporation, and the William T. Grant Foundation have funded a five-year evaluation of TASC, conducted by Policy Studies Associates. The evaluation has found that students who participate regularly in the program have increased their rates of school attendance and experienced educational gains. Students, parents, and educators also report high levels of satisfaction with the program. More information on the evaluation can be found at TASC’s Web site: www.tascorp.org.

Recommendations

The following recommendations are based on research on out-of-school time programming.
Assess outcomes that the program is addressing.
It is important to focus research and evaluation efforts on the specific outcomes on which the program focuses. For example, if the program has a strong academic component, it’s appropriate to measure academic improvement. If the program only focuses on improving reading skills, however, you would only expect to see improvements in that area.

The world of out-of-school time outcomes is extensive.
As noted above, the possible outcomes for out-of-school time programs are wide ranging. Programs should not limit themselves to measuring only academic achievement or student and parent satisfaction with the program. Again, match the measured outcomes with program goals, but be sure to think broadly about the types of outcomes the program could be affecting.

Look at how others have assessed particular elements; don’t reinvent the wheel.
It is likely that someone else has thought about assessing a particular program component that you would like to assess. Review the research and see how other programs have assessed the outcomes you are looking for to see if you can adapt their methodologies.

A word of warning about assessing academic achievement.
Out-of-school time programming research has made some links with improved academic achievement. However, these links are not always apparent. If your program includes academic components, assess the components as specifically as possible. For example, if the program focuses on reading comprehension, then assess reading comprehension as well as achievement in language arts class.

How This Resource Manual Fits Into Your Comprehensive Program Evaluation

When you are conducting a comprehensive program evaluation, you should be looking at multiple data sources. These data include attendance rates, student grades and test scores, survey results, observations, and interviews/focus groups. While an explanation of how to collect and analyze all data is beyond the scope of this manual[1], it will be beneficial to look at multiple sources to understand how surveys and focus groups fit into your overall evaluation plan. Table 1 shows a number of data sources that are relevant to evaluating out-of-school time programs and what those data sources present. The starred (*) areas can be assessed using the tools in this manual.

Table 1.
Outcomes/Outputs: Data Source(s)
Student achievement: Grades, test scores, teacher reports*
Student behavior: Attendance and behavior data, surveys*
Perceptions of benefits, enjoyment, and quality of programs: Student, parent, staff, and teacher surveys and focus groups; adult participant surveys*
Perceptions of program quality in core academic areas and satisfaction with enrichment and support activities, including the link with the regular school day: Student, parent, staff, and teacher surveys and focus groups*
Satisfaction with services directed specifically at parents and adult participants: Parent surveys and focus groups; adult participant surveys*
Success of partnerships, building of relationships: Staff surveys and focus groups; partnership surveys*
Effective communication among stakeholders: Student, parent, staff, and teacher surveys and focus groups*
Operational support for program effectiveness: Staff surveys and focus groups; partnership surveys*

Tools to help you collect this information can be found in the appendices.

Ultimately, much of the work of evaluation will lead to program improvement (formative evaluation) rather than simply addressing accountability issues (summative evaluation). Your evaluation should provide useful information that can be directly linked to your program goals. The development and use of logic models can assist you with such alignment (see Appendix N for resources on logic models).

[1] Additional evaluation resources are listed in Appendix N.

Getting Started: Surveys and Focus Groups

Once you have mapped out a general plan for your comprehensive evaluation, you can begin collecting data. Surveys and focus groups can be conducted even in the early stages of your program. Early data can provide you with some starting points for program development; they may also be used as a baseline for comparison with future data.

While surveys and focus groups are by no means mutually exclusive, each is better suited to particular circumstances. Surveys are helpful when you want information from a large group of stakeholders, while focus groups provide opinions from a more limited number of people. You can pass out surveys to 100 people (provided you can handle the data that the surveys generate, as discussed later in this manual), but conducting 10 focus groups (assuming 10 people per group) would likely take at least two days.

Survey questions generally limit respondents’ answers, with the exception of a few open-ended questions that allow for comment. Essentially, surveys utilize designs that allow for maximum stakeholder participation. Focus group questions are all open-ended, allowing the person conducting the group to “dig deeper” if appropriate. Survey data are generally much easier to analyze (once the data are entered into an electronic file) than focus group data, which can easily result in mounds of notes filled with participant responses. For example, a survey may ask a parent to rate satisfaction with a program’s homework time on a four-point scale where 1 = Strongly Disagree and 4 = Strongly Agree (limited response). Out of 100 surveys, the average response may be 3.7, which represents strong satisfaction with homework time. In a focus group, a question about homework time may elicit a broad range of responses, such as “Now that Marissa does her homework in the out-of-school time program, we don’t argue about it after dinner.” While this kind of response provides rich information, 20 different responses would be much harder to summarize than simple, quantifiable survey responses.

Both survey and focus group responses provide information about the benefits of having homework time as part of regular programming, for example. They differ, however, in the amount and depth of information they offer. Using both techniques will provide you with quality information about your program and will give you a chance to document accomplishments.
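The survey-summary arithmetic described above is simple enough to sketch in a few lines. This is a minimal illustration, not part of the manual's toolkit; the parent ratings below are hypothetical, coded on the four-point scale where 1 = Strongly Disagree and 4 = Strongly Agree.

```python
# Sketch: summarizing limited-response (Likert-style) survey data.
# The ratings are hypothetical; 1 = Strongly Disagree, 4 = Strongly Agree.
from statistics import mean

# Hypothetical parent ratings of satisfaction with homework time
responses = [4, 3, 4, 4, 3, 4, 2, 4, 4, 3]

avg = mean(responses)
pct_agree = sum(r >= 3 for r in responses) / len(responses) * 100

print(f"Average rating: {avg:.1f}")            # prints "Average rating: 3.5"
print(f"Agree or strongly agree: {pct_agree:.0f}%")  # prints "Agree or strongly agree: 90%"
```

Reporting both the average and the percentage agreeing is a common choice because an average alone can hide how responses are spread across the scale.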

Finally, always keep your program goals in mind as you proceed. You may elect to add questions to surveys or focus groups to help you assess your accomplishments as they relate to these goals.

Student Participants

Student Participant Surveys

The student participant surveys are designed[2] to assess several areas, such as participants’ school experience and attitudes, their experience in the out-of-school time program, and reasons they participate in the program.

[2] Questions 1, 2, 5, 12, and 15 are based on the U.S. Department of Education’s 21st CCLC Program Annual Performance Report (APR) student surveys. The scales have been adjusted here to allow for a broader range of responses than the original “yes/no” choice.

The surveys are intended for participants in fourth grade and above. Because the survey attempts to assess program impact on student success that is partially linked to school, it is best suited to programs with an academic component (e.g., homework time, tutoring, or academic enrichment). Examples of how survey results can be displayed are shown in the Displaying Results section.

Baseline Survey

The baseline survey (Appendix A) allows you to get a sense of what students think about school and the program early on (i.e., a month or two into the program). The baseline may be used for comparison purposes with the spring survey; however, a meaningful comparison will be limited if your program does not have a high number of consistent participants.

Participant Survey

The participant survey (Appendix B) is best used with regular program attendees. The definition of regular can be determined by the program directors. Federal- and state-administered 21st CCLC programs define regular as participating 30 days or more in the program; this definition may exclude newcomers who have not yet participated a given number of days, but who will likely attend frequently before the end of the school year (if your program follows the standard school calendar). You may also want to use the participant survey with non-regular attendees, but look at the data separately for the two groups. A Spanish version of the survey can be found in Appendix C.

Suggestions for Distribution

If the program is following the school calendar, spring (because it is near the end of the school year) is a good time to administer the survey. Student participant surveys are most easily distributed and collected during program hours. Let students know that the surveys are a way for their individual voices to be heard. Thus, students should complete the surveys individually and should be encouraged to take them seriously, since their answers can help improve the program. It is important to note that some states, such as Alaska, require parental permission before students participate in any surveys. It is also helpful to include information about any data collection that will occur in program registration materials.

The survey respondent is identified through the use of an identification (ID) number (preferably that of the school or district). This may seem cumbersome for staff. However, an ID number protects student confidentiality and allows students to express themselves more freely, since their names are not marked on the survey. The ID number also serves to track regular attendees over several years.

Student Participant Focus Groups

The student focus group questions (Appendix D) mirror the survey questions somewhat. However, they allow participants to tell more about what happens on a daily basis in the program. When students talk in small groups about their out-of-school time experience, they will think of and mention things they might not include in the surveys’ open-ended questions. Students love to tell stories, and hearing each other’s stories encourages further discussion.

When talking with students, the group should include no more than eight students. These groups are best conducted during regular program hours.
Take the group to a quiet corner of the room for 10 to 20 minutes, depending on the age group. When talking with students, it is particularly important not to let one or two of the most energetic children monopolize the discussion.

The best selection method for focus group participants is random sampling, so that each student who participates in the program has an equal chance of being selected for the focus group. A second option is for the group to be selected by the site coordinator with an eye toward diversity. Sometimes the situation does not lend itself to formal grouping; the focus group questions can then be used to guide interviews with students one-on-one or in pairs.

An overall focus group protocol can be found in Appendix E.

Teachers

Teacher Survey

The teacher survey (Appendix F) is designed[3] to assess two major areas.

Part I is completed for each regular[4] student program attendee. Teachers reflect on improvements in attendance, homework completion, behavior, and so forth, since the student has participated in the program. (Students are identified both by name and ID for ease of use. If someone other than program staff is entering or analyzing the data, names should be blacked out for student confidentiality.)

Part II, which is completed once by each teacher, assesses perceptions about the program. Teachers are asked whether they believe the program relates to what is taught during the school day and offers a variety of enrichment activities; this part also assesses communication between teachers and program staff. As with the student participant surveys, the teacher survey is best suited to programs that have an academic component (e.g., homework time, tutoring, or academic enrichment). Examples of how survey results can be displayed are shown later in the manual.

[3] Questions 2 through 11 in Part I are taken from the U.S. Department of Education’s 21st CCLC Program Annual Performance Report (APR) teacher surveys. The scales have been adjusted to allow for a broader range of responses than the original “yes/no” choice. Also, teachers are asked why they agree or disagree with statements about a student’s improvement. For example, some students may not have improved their class attendance because attendance was not a problem prior to program participation.

[4] The definition of “regular” can be determined by the program directors. Federal- and state-administered 21st CCLC programs define “regular” as participating 30 or more days in the program; this definition may exclude newcomers who have not yet participated a given number of days, but who will likely attend frequently.
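The 30-day definition of a regular attendee described above, combined with ID-number tracking and random focus-group selection, can be sketched briefly. This is a hypothetical illustration only: the student IDs, the attendance log, and the field layout are all invented for the example.

```python
# Sketch: flagging "regular" attendees (30 or more days, per the 21st CCLC
# definition above) and drawing a random focus-group sample.
# The attendance log and student IDs are hypothetical.
import random
from collections import Counter

# Hypothetical log: one student ID recorded per program day attended
attendance_log = (
    ["S01"] * 42 + ["S02"] * 31 + ["S03"] * 12 + ["S04"] * 55 + ["S05"] * 8
)

days_attended = Counter(attendance_log)

# Split attendees so the two groups can be analyzed separately, as suggested
regular = sorted(sid for sid, days in days_attended.items() if days >= 30)
non_regular = sorted(sid for sid, days in days_attended.items() if days < 30)

print("Regular attendees:", regular)         # prints ['S01', 'S02', 'S04']
print("Analyze separately:", non_regular)    # prints ['S03', 'S05']

# Random selection gives each regular attendee an equal chance of being
# chosen, with the group capped at eight students as suggested above
focus_group = random.sample(regular, k=min(8, len(regular)))
```

Using ID numbers rather than names here mirrors the manual's confidentiality advice: the same IDs can track regular attendees across years without putting names on the data.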

Suggestions for Distribution

If the program is following the school calendar, it is best to administer this survey late in the school year. Teacher surveys are most easily distributed to teacher mailboxes or directly to classrooms. Attach a note that explains the purpose of the survey and tells teachers where to place surveys when they are completed (e.g., a box in the front office). Be sure to give teachers a reasonable deadline (one week is optimal; otherwise, you risk having the survey work its way to the bottom of the stack).

Parents

Parent Survey

The parent survey (Appendix G) asks parents to rate their satisfaction with the out-of-school time program, reflect on the impact it has had on their child(ren), and describe why their child(ren) participate(s). A Spanish version of the survey can be found in Appendix H. Examples of how survey results can be displayed are shown later in the manual.

Suggestions for Distribution

As with the student and teacher surveys, it is best to administer this survey in the spring (i.e., toward the end of the school year) if the program is following the school calendar. Parent surveys are best distributed by sending them directly to homes through the postal service. Include a stamped envelope with a return address so that parents can return their surveys easily. The major drawback to this method is that mailing can be very costly.

There are other options for parent survey administration, though they are less desirable. Online surveys are a possibility, though these can be exclusionary, since not all homes have computers or are connected to the Internet. Having parents complete surveys when they pick up their children, or on family nights, is another possibility; but, again, you may not reach as many parents as you will if you mail the surveys (unless parents are required to pick up their children).

The parent group is the most difficult in terms of return rates. When surveys are mailed, there is a decreased likelihood that they will be completed and returned. You can improve the return rate in several ways. First, include a letter that explains what you are doing and why it is so important. Let parents know that their feedback will allow you to serve them better by providing quality out-of-school time programming. Give a reasonable deadline for completing the survey (e.g., “within seven days of receiving this letter” or two weeks from the date on the letter). You may also offer an incentive for parents, such as an entry for a drawing that they can return with their completed survey. (The drawing entry form should be separate from the rest of the survey so that it can be easily detached and so that parents’ names are not on the survey.)

Parent Focus Groups

The focus group questions (Appendix I) allow parents to give more detailed answers than the survey. When talking with parents, group size should be 10 to 12 individuals. These groups can be conducted any time parents are available and will likely take place on site. Usually, the conversation will last about 30 minutes, depending on the number of parents in the group. Parents love to talk about their children but, as with student groups, it is important not to let one or two parents monopolize the discussion. Also, it is always a good idea to provide food (anything from cookies and coffee to a pizza event) if possible. Providing childcare will also result in a greater turnout.

The site coordinator can arrange the meeting times and places with a selected group of parents. An overall focus group protocol can be found in Appendix E.

Staff

Staff Survey

The staff survey (Appendix J) asks site coordinators and program staff to describe their roles and activities in the program. It asks about professional development needs, communication with teachers and parents, support, and the perceived impact of the program on students.

Suggestions for Distribution

As with the other surveys, it is best to administer this survey in the spring (i.e., toward the end of the school year) if the program is following the school calendar. Staff surveys are best distributed during regular program hours. A drop box or envelope should be made available for confidentiality.

Staff Focus Groups

The focus group questions (Appendix K) explore staff backgrounds and philosophies, as well as assessing positive outcomes and challenges in greater detail than the survey. Group size will vary, but should not exceed 10 individuals.

If you are a program director and are conducting the evaluation on your own, you may want to ask a professional from outside the program to conduct this group for you.

Groups can be conducted any time that staff members are available, either before or after the program or on staff development or planning days. Usually, the conversation will last about 30 to 45 minutes, depending on the number of staff members in the group. The program director can arrange the meeting times and places. An overall focus group protocol can be found in Appendix E.

Partnerships

Partnership Survey

The partnership survey (Appendix L) focuses on services that partners provide and the relationship between programs and partners. While many partners have a regular presence in the program, they are not considered staff in most cases because the out

