INTERACTIVE USER’S GUIDE FOR EVALUATING LEARNING OUTCOMES


INTERACTIVE USER’S GUIDE FOR EVALUATING LEARNING OUTCOMES FROM CITIZEN SCIENCE

Tina Phillips
Holly Faulkner
Marion Ferguson
Matthew Minarchek
Norman Porticella
Rick Bonney

INTERACTIVE USER’S GUIDE FOR EVALUATING LEARNING OUTCOMES IN CITIZEN SCIENCE
Originally developed by the Program Development and Evaluation group at the Cornell Lab of Ornithology

Project Leader: Tina B. Phillips
Interactive Design: Holly Faulkner

User’s Guide Writers:
Tina B. Phillips, Evaluation Associate
Marion Ferguson, Research Assistant
Matthew Minarchek, Research Assistant
Norman Porticella, Post Doctoral Associate
Rick Bonney, Director, Program Development and Evaluation

Website: Jennifer Shirk, www.citizenscience.org

Consultants:
Cecilia Garibay
Kate Haley-Goldman
Joe Heimlich
Bruce Lewenstein
Kirsten Ellenbogen

If you have questions about any aspect of this user’s guide, please contact Tina Phillips:
tina.phillips@cornell.edu
800-843-2473
159 Sapsucker Woods Road, Ithaca, NY 14850
www.citizenscience.org

Recommended citation: Phillips, T. B., Faulkner, H., Ferguson, M., Minarchek, M., Porticella, N., and Bonney, R. 2017. Interactive User’s Guide for Evaluating Learning Outcomes in Citizen Science. Ithaca, NY: Cornell Lab of Ornithology.

This document is based upon work supported by the National Science Foundation under Grant No. 1010744: DEVISE (Developing, Validating, and Implementing Situated Evaluation Instruments). Any opinions, findings, and conclusions or recommendations expressed in these materials are those of the authors and do not necessarily reflect the views of the National Science Foundation.

The Cornell Lab of Ornithology is a nonprofit membership institution whose mission is to interpret and conserve the earth’s biological diversity through research, education, and citizen science focused on birds.

Copyright 2017 Cornell Lab of Ornithology, 159 Sapsucker Woods Road, Ithaca, NY 14850

NAVIGATING THIS INTERACTIVE DOCUMENT

This document is a shortened version of the User’s Guide for Evaluating Learning Outcomes from Citizen Science by the same authors. It contains interactive elements designed to make your reading experience more engaging. Scroll to move to the next page, and note the interactive icons below:

If you see this symbol, hover or roll over the text next to it to activate interactive items.
If you see a star, hover or roll over with your mouse to show extra evaluator tips.
On some pages, you will see an option to print documents in the upper right hand corner; this opens the print dialog and is set to print only the specified page.
Clicking the BACK TO TOP button will bring you back to the table of contents. One can be found on every page.

BACKGROUND
PHASE 1: PLAN YOUR EVALUATION
PHASE 2: IMPLEMENT YOUR EVALUATION
PHASE 3: SHARE YOUR EVALUATION
APPENDICES

BACKGROUND

WHAT IS THE PURPOSE OF THIS GUIDE?
...to guide practitioners interested in evaluating outcomes from their citizen-science projects. This guide incorporates many perspectives and focuses mainly on summative evaluations, particularly on individual learning outcomes, including those that are cognitive, affective, and behavioral.

CITIZEN SCIENCE (a.k.a. Public Participation in Scientific Research, PPSR): the engagement of volunteers and scientists in collaborative research to generate new science-based knowledge.

CITIZEN SCIENCE CAN BE:
- CONTRIBUTORY (scientist-driven)
- COLLABORATIVE (partnership)
- CO-CREATED (community-driven)

Although citizen science has extended to nearly every scientific discipline, every evaluation will be unique to its project. Most citizen science projects operate in a similar structure and strive to meet a common set of goals related to research, conservation, education, or a combination of these. Thus, even different types of citizen science projects can share many common outcomes, particularly for participant learning.

BACKGROUND

EVALUATION 101
Evaluation is the systematic collection of data to determine strengths and weaknesses of programs, policies, or products, in order to improve their overall effectiveness.

WHY EVALUATE?
Evaluation can happen for any number of purposes, but some of the major reasons that organizations or projects undertake evaluation include:
- Determining program strengths and weaknesses
- Gathering evidence of success
- Sustaining or obtaining additional funding
- Understanding audience needs

WHO IS INVOLVED?
An evaluation team is often made up of a diverse group of people who are involved in your project. Knowing who your stakeholders are and agreeing on the purpose of the evaluation is essential before you begin.

WHEN TO EVALUATE?
Evaluation can happen at any time in a project’s life cycle, but for best results, it should be considered through the entire life of a project: before, during, and after.

TYPES OF EVALUATION
Understanding the type of evaluation that you are undertaking is fundamental to the rest of the planning and implementation process.
- Front-End Evaluation: occurs during the defining phase of a project to obtain baseline information about an audience.
- Formative Evaluation: occurs during project development; provides direction for improving implementation and operation.
- Summative Evaluation: occurs once a project has been established; used to describe a project’s outcomes, effectiveness, or value. (This guide will focus mainly on Summative Evaluation.)

BACKGROUND

ETHICS
Following a tumultuous time in history (1950s-1990s) when ethical standards for conducting research on human subjects were neglected, contemporary social and medical research now operates under a set of standards aimed at protecting human rights. Conducting evaluation means respecting the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders. Any evaluation involving people should attempt to follow these five basic ethical standards:

Voluntary participation: Requires that people agree to participate in research without any coercion.

Informed consent: Tells potential participants about the procedures and risks involved in the research and ensures that they give their consent to participate.

Explaining risks of harm: Ensures that participants are not placed in a situation where they risk being harmed physically or mentally.

Confidentiality: Assures participants that their information will not be made available to anyone who is not directly involved in the study.

Anonymity: Guarantees privacy, ensuring that even the researchers cannot identify individuals in the study. In many types of evaluation, anonymity can be difficult to achieve.

For more information on ethics in evaluation, see “What is an IRB?” in Appendix C.

A NOTE ABOUT CULTURAL COMPETENCE IN EVALUATION
Evaluators interact with a broad range of people from many political, religious, ethnic, language, and racial groups and need special qualities to conduct culturally competent work. To conduct a culturally responsive evaluation, be conscious of how it can attend to issues of culture and context, and find ways to make this part of your planning. For example, you might ensure that partners or advisors from the community are included to help inform the evaluation process and implementation.

BACKGROUND

HOW TO USE THIS GUIDE
Designing and implementing a quality evaluation does not have to be complex. Throughout this guide, we provide practical advice to increase practitioner comfort and confidence in carrying out evaluations within the context of citizen science. The guide also provides an evaluation framework that we hope will be widely adopted by citizen science practitioners to facilitate comparisons of individual learning outcomes across projects.

The guide is organized around three phases, each with its own steps:
PHASE 1: PLAN (Inventory, Define, Design)
PHASE 2: IMPLEMENT
PHASE 3: SHARE (Report, Disseminate)

The Appendix section contains resources, templates, and worksheets that can be modified for your own use, including a matrix of learning outcomes and indicators for citizen science.

PHASE 1: PLAN
INVENTORY

DESCRIBE THE PROJECT TO BE EVALUATED AND ITS AUDIENCE
You may want to include things such as:
- Description
- Project Staff
- Funding Sources
- Intended Audience
- Partners
- Deliverables
- Organizational Structure
- Additional Stakeholders
- Additional Details

ARTICULATE GOALS AND TARGETED OUTCOMES OF THE PROJECT
Goals are usually broad and abstract, but vague goals are hard to measure. Using goals to develop targeted outcomes will help determine if the goals are met. Outcomes are more specific, and refer to concrete and measurable statements.

OUTCOMES SHOULD:
- be aligned to the experience the participant will have
- include information about the setting/conditions
- include a description of the desired behavior
- be time-bound

There are three types of outcomes: Programmatic, Community-based, and Individual Learning outcomes. THIS GUIDE WILL FOCUS ON Individual Learning Outcomes (ILOs).

In determining what outcomes might be most relevant to citizen-science projects, researchers at the Cornell Lab of Ornithology used survey data, along with the LSIE document and the NSF framework, to create a framework (Figure 1) for measuring ILOs that are common among citizen-science projects.

Project developers and evaluators must work closely to determine the most relevant outcomes for individual projects. Articulating broad goals and measurable outcomes will provide a road map for your overall project evaluation. See Appendix A for examples of various learning outcomes commonly used in citizen science.

PHASE 1: PLAN

FIGURE 1: A guiding framework for evaluating individual learning outcomes from citizen-science projects. The framework comprises six outcome categories: Interest in Science & the Environment; Motivation; Self-efficacy; Knowledge of the Nature of Science; Skills of Science Inquiry; and Behavior & Stewardship. Not all projects should try to achieve every outcome.

PHASE 1: PLAN

You can now draft a preliminary logic model to share with relevant stakeholders who may be affected by the evaluation, such as funders, program staff, and volunteers.

LOGIC MODELS
Logic models are graphical representations of projects that show the relationships between each project component and the expected outcomes. Logic models:
- help to articulate objectives and strategies
- focus attention on key interventions and intended outcomes
- are usually presented as inputs, activities, outputs, outcomes, and impacts

INPUTS: Resources dedicated or consumed by a project; typically include things like funding agencies, scientists, staff, volunteers, and technology infrastructure.

ACTIVITIES: Ways the project uses the inputs; focused on tasks that directly relate to the participants. The activities of volunteers typically revolve around data collection but may vary widely. Any trainings for participants should be included here.

OUTPUTS: The direct products of the stated activities; they demonstrate immediate results of activities, are easy to quantify, and focus on things done by participants.

OUTCOMES: The changes in behavior that a project is intended to produce as a result of project participation; more difficult to quantify than outputs; often described as short-, medium-, or long-term.

IMPACTS: Long-term outcomes that are broad in scope; aimed at expanding knowledge and capacity for a particular field of study and meant to provide benefits to society; difficult to measure; of particular interest to funding agencies.

See Appendix B for a Logic Model template (adapted from University of Wisconsin Extension, 2005). For more information on developing a logic model, see W. K. Kellogg Foundation (1998 and 2004). You may also find it helpful to articulate your Theory of Change.

A brief illustrative logic model follows below; your project will likely have different, and fewer, entries.
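As a concrete illustration (a hypothetical example, not the example table from the interactive guide), a logic model for an imagined volunteer bird-monitoring project might read:

INPUTS: Grant funding; project scientists and staff; volunteers; an online data-entry system
ACTIVITIES: Volunteer training workshops; monthly bird counts; data submission
OUTPUTS: Number of volunteers trained; number of checklists submitted
OUTCOMES: Short-term: volunteers gain species-identification and data-collection skills; long-term: volunteers adopt stewardship behaviors
IMPACTS: A sustained, large-scale dataset that expands scientific understanding of regional bird populations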

PHASE 1: PLAN
DEFINE

The next step in planning your evaluation is working with stakeholders to define what exactly will be evaluated, determine key evaluation questions, and lay out the evaluation timeline, budget, and limitations.

STATE THE PURPOSE OF THE EVALUATION
Try not to evaluate every aspect of the program. Instead, focus on one or two main reasons:
- Gauge participant learning
- Identify project strengths and weaknesses
- Promote a project more broadly
- Obtain additional funding or support
- Clarify program purpose or theory
- Increase organizational capacity building
- Reflect on project history
- Provide recommendations to improve project functioning

Once you decide on the main purpose, discuss it with significant stakeholders and then document what is and what is not going to be evaluated.

DEVELOP AND PRIORITIZE KEY QUESTIONS THAT YOU HOPE WILL BE ANSWERED AS A RESULT OF THE EVALUATION
Defining and refining your evaluation questions is perhaps the most critical aspect of planning your evaluation because it homes in on what to evaluate. Evaluation questions should be broad enough to frame the overall evaluation, yet specific enough to focus it. You will likely come up with many questions you would like answered, but remember that not all questions can be answered given the allotted time and resources, nor will they all have the same importance to every stakeholder.

Ensure that the questions are:
- Answerable
- Appropriate for the various stages of evaluation
- Aligned to the desired outcomes
- Able to provide important information for stakeholders

In addition to these criteria, prioritize the questions by considering:
- The resources needed
- The time required
- The value of the information in informing the evaluation purpose

At the end of this process you should feel comfortable knowing that the questions you focus on will demonstrate measurability, relevance, and feasibility, while setting the stage for the rest of the evaluation. An illustrative example follows below.
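For instance (a hypothetical example, not from the original guide), a well-scoped evaluation question for a bird-monitoring project might be: “To what extent do new volunteers improve their ability to identify local bird species after one field season?” It is answerable, aligned to a skills outcome, and specific enough to guide the choice of methods and instruments.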

PHASE 1: PLAN

DETERMINE THE INDICATORS FOR EACH INTENDED OUTCOME
With goals, outcomes, and evaluation questions articulated, the next task is developing quality indicators. Indicators provide specific types of information that let you know that an outcome has been achieved. Effective indicators align directly to outcomes and are clear, unbiased, measurable, and sensitive to change. While indicators are measurable, they do not always need to be quantifiable; they can be qualitative and descriptive.

EXAMPLE:
GOAL: Participation in citizen science will result in development of science inquiry skills.
SHORT-TERM OUTCOME: Within three months of joining the project, at least half of participants will be able to successfully collect and submit data.
INDICATOR: The number of new participants that submit data and show increased confidence in being able to collect data.

When developing indicators, it is important to consider the type of program or activity, the duration of the program, and the logistics of administering the measurement instrument.

CONSTRUCT A TIMELINE FOR THE EVALUATION
Develop an estimated timeline that provides anticipated start and end dates for completing key tasks and meeting established milestones. These are often presented in calendar format. Although all timelines inevitably change, having the timeline be as accurate as possible early in the evaluation will help avoid frustration later because of initial unrealistic expectations.

CONSTRUCT A ROUGH BUDGET
Estimate how much the evaluation will cost. This can vary greatly depending on the complexity of the evaluation. Obviously, larger, more complex evaluations will cost more than those that are smaller in scope. Include costs related to salaries, travel, consultants, printing, mailing, copying, telephoning, and any necessary software or equipment.

PHASE 1: PLAN
DESIGN

The last part of the planning phase is designing the plan for data collection. The goal is to create a data collection strategy that identifies procedures that are feasible, cost-effective, and viable to keep the project focused and on schedule.

DETERMINE YOUR STUDY DESIGN
The design should reflect:
- The types of questions you need answered
- The reason for conducting the evaluation
- The amount of resources you can commit to the evaluation
- The information stakeholders hope to learn
- The methods that best address the evaluation questions

Different study designs are better suited for different types of evaluation questions. Researchers usually consider evaluation designs to be either QUANTITATIVE, where results can be analyzed numerically (e.g., surveys), or QUALITATIVE, where results must be interpreted by the researcher (e.g., interviews).

If you are comfortable, you can combine these approaches in a mixed-methods design. Combining quantitative and qualitative methods can increase the validity of your results by triangulating the findings (Creswell, 2003).

In deciding on a study design, consider the following questions:
- How well do you know your topic? You may need to conduct a literature review to understand the topic and determine how past evaluations have been designed.
- What approach to research are you most comfortable with (i.e., qualitative, quantitative, or mixed)?
- Can you dedicate enough time, resources, and staff expertise to the evaluation? If not, you may need to consider hiring an external evaluator.
- Who are the stakeholders and what do they hope to learn from the evaluation?

PHASE 1: PLAN

FOR EACH OUTCOME, DETERMINE THE POPULATION TO SAMPLE FROM AND THE APPROPRIATE SAMPLE SIZE
The sample is a representative subset of the larger group or population. A representative sample will help minimize sampling bias and error.

SIMPLE RANDOM SAMPLE: Each member of the population has an equal chance of selection; this is a preferred method.
STRATIFIED SAMPLE: Used when you have more than one subset of the population you need to include.
CONVENIENCE SAMPLE: Used when your study is not aiming to generalize to the whole population; here you can include those who are easy to contact.
PURPOSEFUL SAMPLE: Emphasizes extreme cases or those that provide maximum variation.

The procedure for determining sample size (the number of people you need for your study) can be complicated, but if you are comfortable with a 95% confidence interval and a 5% margin of error, the table below provides a general rule of thumb for determining sample size from a given population.

POPULATION        SAMPLE
50 or less        50 or less
500               200
1,000             280
10,000            370
U.S. population   400

Visit www.research-advisors.com for a table of suggested sample sizes if you are seeking a different confidence interval or margin of error.

For example, if your citizen science project has approximately 1,000 participants, your population would be 1,000 and your sample size would be approximately 280. If your population is the entire U.S., your sample size only needs to be about 400 to provide accurate generalizations. A programmatic sketch of this calculation appears at the end of this section.

DRAFT THE DATA COLLECTION STRATEGY
For each targeted outcome, identify the data collection methods to be used, the timing of collection (i.e., before, during, or after the project/intervention), and the source of the data. Common data collection methods are presented below; many of these methods are often used in combination:
- Surveys
- Interviews
- Focus groups
- Observations
- Journals
- Tests/quizzes
- Concept maps
- Tracking & timing
- Web analytics
- Creative expression
- Professional critique/expert review
- Portfolio reviews
- Content analysis
- Examining email/listserv messages
- Case study analysis
- Literature review
- Simulations

Descriptions and a comparison of strengths and weaknesses of these data collection methods are presented in Appendix D. Once you have completed the previous steps of the design phase, compile all of the information into a table such as the one shown in Appendix E.
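The sample-size rule of thumb above can also be computed directly. Below is a minimal Python sketch; it assumes Cochran’s formula with a finite population correction, which the guide itself does not name, and its results land near (not exactly on) the rounded figures in the table:

import math

def sample_size(population, z=1.96, margin_of_error=0.05, p=0.5):
    """Estimate survey sample size for a finite population.

    Cochran's formula with a finite population correction. z = 1.96
    corresponds to a 95% confidence interval; p = 0.5 is the most
    conservative assumption about response variability.
    """
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite population correction
    return math.ceil(n0 / (1 + (n0 - 1) / population))

for pop in (500, 1_000, 10_000, 330_000_000):
    print(f"population {pop:>11,} -> sample {sample_size(pop)}")
# population         500 -> sample 218
# population       1,000 -> sample 278
# population      10,000 -> sample 370
# population 330,000,000 -> sample 385

Note that the guide’s table rounds some of these values (200 rather than 218 for a population of 500, and about 400 for the entire U.S. population).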

PHASE 2: IMPLEMENT
DEVELOP

The previous section described how to plan an evaluation. This section explains how to use the plan to guide implementation of your evaluation. The goal of the implementation phase is to collect credible data that will increase the accuracy and utility of the evaluation.

CHOOSE YOUR INSTRUMENT
Instruments can include:
- Protocols for interviews
