Qualitative Evaluation Methods
Pediatric Obesity Mini-CoIIN
September 21, 2017

WELCOME! While you are waiting, please use the webinar "chat" box to briefly tell us about your team's experiences with focus groups, interviews, and other qualitative approaches.

Donna B. Johnson, RD, PhD
Professor Emeritus, Nutritional Sciences Program & Department of Health Services
Tori Bradford, CHES, MPH
Research Coordinator, University of Washington, Center for Public Health Nutrition
Learning Objectives
1. Explain what qualitative methods can add to program evaluation and identify situations/reasons when qualitative methods may be appropriate.
2. Identify different types of qualitative evaluation data collection and analysis and list steps involved in doing them.
3. Apply best practices for qualitative methods in relation to program evaluation.
Qualitative Data
Information that is difficult to measure, count, or express in numerical terms.
Qualitative Evaluation Methods
- Open-ended questions in structured questionnaires
- Workshops: table-based/group exercises
- Semi-structured and in-depth interviews
- Go-along interviews / shadowing
- Focus groups
- Document analysis
- Photo-elicitation / photovoice
- Mental models / mind maps
- Participant diaries, logbooks, recordings
- Observation
Why Use Qualitative Data?
- Provides contextual data to explain complex situations
- Complements quantitative data by explaining "why" and "how" to "get at the story" behind quantitative findings
- May be the most appropriate approach with small samples, early formative evaluation, and pilot testing, when a quantitative evaluation question is premature
- Allows for discovery of unintended consequences
- Provides an "insider" perspective
- Adds a human voice with richness, depth, and meaning
- Makes reports more interesting to read
Limitations of Qualitative Data
- Limited generalizability
- Can be time consuming and costly to do with rigor
- Can be challenging to analyze and interpret
When to Use Qualitative Data?
When they are the best way to answer evaluation questions.
Evaluation questions:
- Are based on how you think your program may progress (theory of change)
- Are developed as the program is being designed, in consultation with stakeholders, with dissemination in mind
- Can be about:
  - Process: How is implementation going? Why does it seem to be going that way?
  - Outcome: What impact have we had? Have we had different impacts on different groups of people?
Mini-CoIIN Evaluation Model

Funding Committed; Applications Submitted
- How did the application process go?
- How useful were information packet materials?
- What are project goals?
Methods: Interviews; Document review

Project Implementation Underway
- How is progress toward goals?
- How are teams developed and functioning?
- What barriers and facilitators exist?
- How useful is training and TA?
- What other TA and support do teams need?
Methods: Interviews; Document review; Electronic surveys

Looking to the Future: Sustainability
- What is project reach and impact?
- How are efforts embedded in ECE systems?
- Will the project last after funding ends? Why or why not?
- What state-level benefits are attributable to participation in the Mini-CoIIN?
Methods: Interviews
Choosing Your Data Collection Methods
- Purpose of evaluation: what method seems most appropriate for the purpose and the questions you want answered?
- Users of evaluation: will the method allow you to gather credible and useful information for stakeholders?
- Respondents from whom you will collect data: Where and how can respondents best be reached? What is culturally and linguistically appropriate? Do data already exist?
- Resources available: time, money, volunteers, travel expenses, supplies
- Degree of intrusiveness: will the method be disruptive or be seen as intrusive?
- Type of information: do you want representative information, or do you want to examine the range and diversity of experiences, or tell a story about the target population?
Source: Selecting%20Data%20Collection%20Methods.pdf
Methods: Advantages and Disadvantages

Surveys
- Advantages: Anonymous completion possible; can administer to groups of people at the same time; can be efficient and cost effective
- Disadvantages: Forced choices may miss certain responses; wording may bias responses; impersonal

Interviews
- Advantages: Can build rapport with participant; can probe to get additional information; can get breadth or depth of information
- Disadvantages: Time consuming; can be expensive; interviewing styles and wording may affect responses

Focus Groups
- Advantages: Can get common impressions quickly; can be an efficient way to get breadth and depth of information in a short time frame
- Disadvantages: Need experienced facilitator; can be difficult and costly to schedule a group; time consuming to analyze responses

Observation
- Advantages: Can view program operations as they occur
- Disadvantages: Difficult to interpret observed behaviors; may influence behaviors of program participants; may be expensive and time consuming to record each individual event

Document Review
- Advantages: Can document historical information about your program; does not interrupt program routine; information already exists
- Disadvantages: May be time consuming; available information may be incomplete; gathering information is dependent on quality of records kept
How To Do Qualitative Evaluation
Please use the webinar "chat" box to submit any questions you have about doing qualitative evaluation.
Developing Your Questions
1. Look at your program model: do questions reflect what you expect to happen as a result of the program?
2. Review your goals.
3. How will you USE these data? If they won't be used to improve, defend, or sustain your program, you don't need them.
4. Will the data help you answer your evaluation question?
5. Consider: "do I really need to know the answer to this, or will it burden the participant?"
Guidelines for Choosing Words and Forming Questions
1. Make sure the question applies to the respondent
2. Ask one question at a time
3. Use simple and familiar words
4. Use specific and concrete words to specify the concepts clearly
5. Use as few words as possible to pose the question
6. Use complete sentences with simple sentence structures
7. Make sure "yes" means yes and "no" means no

Dillman et al. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd Edition. 2009 John Wiley & Sons, Inc.
Data Collection Method 1: Surveys
1. Consider open-ended vs. closed-ended question formats
2. Create your survey tool: write your questions
3. Pilot test your questions (ideally with target audience)
4. Invite participants to complete survey
5. Thank participants for their time; provide incentive (if applicable)

Dillman et al. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd Edition. 2009 John Wiley & Sons, Inc.
Best Practice: Increasing Survey Response Rate
1. Choose an appropriate survey length for your audience
2. Make sure the survey is easy to take and return

Do this:
- Choose the most effective and reliable method to reach your participants
- Include at least a personalized greeting
- Use a minimal number of questions
- Vary the format
- Keep phone surveys short and to the point!

Instead of this:
- Neglecting to personalize materials
- Squeezing more questions onto a page to save costs
- Using identical formatting for every mailing
- Designing long-winded telephone survey questions
Best Practice: Increasing Survey Response Rate, cont'd.
3. Contact participants multiple times
4. Choose the right delivery method

Web Based
- Advantages: Fast (many who will respond to an email invitation will do so within a few days); multimedia options and graphics are available; you can adjust which questions participants see; low data entry costs; more accurate answers to sensitive questions
- Disadvantages: Many dislike unsolicited email (be sure to adhere to anti-spam regulations); web surveys don't reflect the population as a whole; people can easily abandon in the middle of a questionnaire; potential response rate problems in lower-education or low-literacy populations

Paper/Mail
- Advantages: Least expensive; can include diagrams and graphics; respondents can answer at their leisure, so it doesn't feel as intrusive; high accuracy when forms are scanned
- Disadvantages: Response time is usually longer than other methods; response rates are often low or unpredictable and may result in biased results
Survey Example
Post-training evaluation surveys
1. Mix of open- and closed-ended questions
2. 3-5 questions or more
3. Captive audience at the end of the training session; provide certificate for training once evaluation survey is completed and returned
Data Collection Method 2: Focus Groups
1. Determine what you want to learn from the focus groups
2. Develop your question guide
3. Find a facilitator
4. Schedule a time and place that is convenient for participants
5. Advertise focus group, invite participants
6. Hold focus group: facilitator, note taker
7. Thank participants; provide incentive (could be food)
Categories of Focus Group Questions
- Opening: participants get acquainted and feel connected
- Introductory: begins discussion of topic
- Transition: moves smoothly and seamlessly into key questions
- Key: obtains insight on areas of central concern in the study
- Ending: helps team determine where to place emphasis and brings closure to the discussion

Krueger. The Focus Group Kit: Developing Questions for Focus Groups. 1998 Sage Publications, Inc.
Focus Groups in Ohio
Emily Torres
Ohio Department of Health, Early Childhood Obesity Prevention Program
EMILY.TORRES@ODH.OHIO.GOV
First Steps
1. Decided what we wanted feedback on
2. Created moderator guidelines
3. Created a reference document
4. Created an invitation template
5. Started recruitment and scheduling
During the Focus Group
1. Introductions
2. Focus group set-up
3. Background on project
4. Ease into questions
5. Gather essential feedback
6. Allow for any other comments
7. Give thanks
Hiring a Facilitator
Helped make the process more objective
Expert at:
- Defining scope of work
- Understanding what works with participants
- Administrative set-up
- Collating and presenting feedback
Outcomes
March–July 2017
- 9 focus groups, 66 participants
- Center providers, center administrators, family child care providers, school administrators, licensing specialists/QRIS monitors, and others
- Not a single category without comment
- Team is about to embark on revisions
- Unintended information
Lessons Learned
- Budget time for administrative back and forth
- Think about the utility of a hired facilitator
- Make moderator guidelines adaptable for group size and time length
- During focus groups:
  - Ask for permission at the beginning to keep things moving
  - Be aware of hot-button topics
- Use a template to record notes
- Understand the limitations of your work
Data Collection Method 3: Semi-Structured Interviews
1. Draft and test interview questions
2. Determine setting: in-person vs. phone
3. Send questions to participants in advance
4. Schedule interviews
5. Conduct interview: recording, transcription, or notes
6. Thank participants; provide incentive
The Interviewers
The interviewer is an active part of the research process.
Interviewers:
- should be aware of their own biases, paradigms, and belief systems
- should not lead participants to desired or preconceived conclusions
- should not use non-verbal language to reinforce or discourage certain responses (e.g., nodding, rolling eyes)
Ideally, interviewers:
- do not know participants personally
- are not staff who are designing or implementing the program
- are competent in the language and culture of the participants
The Questions
- Use open-ended questions or conversational prompts: "Tell me about your experience participating in this program."
- Use probes when needed
- Pilot test with practice runs
  - Try to choose 5-10 people from your target group
  - Try to use an interview setting where respondents are asked to explain reactions to question form, wording, and order
Interview Example
Pilot-testing training materials
1. Develop interview guide
2. Recruit participants; have them complete the training
3. Schedule phone call
4. 10-15 minutes; use an online form to help manage data
5. Send $10 gift card
So We Have These Data, Now What? Analysis
1. Read text, then read it again
2. Code text: inductive vs. deductive, or by question
3. Summarize themes; find typifying quotes
4. Think back to your goal and purpose
5. Discuss as a team and decide how to apply this information
Analysis Principles
- Let your objectives guide the analysis
- Don't get locked into one way of thinking
- Questions are the raw material of analysis
- Effective analysis goes beyond words
- Computers can help, or hinder
- Analysis must have the appropriate level of interpretation
- Analysis must be practical

Krueger. The Focus Group Kit: Analyzing & Reporting Focus Group Results. 1998 Sage Publications, Inc.
Using the Data: Pilot Test Example

Quotes:
- "Probably some ideas about kids with allergies would be helpful"
- "Would be helpful to add something about one year olds - considered toddlers."
Theme: Barriers to family-style dining not addressed in the training (allergies; younger children lacking fine motor skills)
Action: Added content and two new resources about these topics to the training

Quotes:
- "Took over 75 minutes, 90 minutes max"
- "Took a little over an hour"
- "Over 2 hours"
- "About an hour and a half"
- "A little less than an hour"
- "Around an hour"
Theme: Training is too long to be completed in one hour
Action: Trimmed content and several questions to make it shorter
Best Practice: Incentives – Factors to Consider
1. Budget: what can you afford?
2. Incentive type: will everyone get it, or will they get entered into a lottery for a chance to win?
3. Your audience: be sure the incentive is something they want and will value (cash incentives offer the highest response rates but are difficult to administer)
4. Delivery method: be sure it's easy to deliver and redeem!
5. When to offer it: surprisingly, incentives offered up front are most effective
6. Anonymous surveys are trickier. One way is to create a redirect action at the end of the survey that sends the respondent to a second survey that collects their contact information.
Assuring Rigor in Qualitative Evaluation
- Sample
- Questions
- Interviewers & data collection/analysis
- Trustworthiness & credibility
The Sample
- Best method: continue to collect data until no new information emerges (data saturation)
- Sample size depends on the evaluation purpose and questions
  - Breadth of experience across variation in the population → larger sample
  - Exploration of a narrow phenomenon in depth in a specific group → smaller sample
- Purposive sampling can reduce sampling bias
  - Example: WA state breastfeeding policy study with goals for 15 home-based & 15 center-based ECE spread between rural and urban; 20 hospitals; 20 community clinics; 20 community coalitions; 20 businesses.
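One way to operationalize purposive sampling goals like the WA example is to set a quota per stratum and draw randomly within each stratum. The sketch below is a hypothetical illustration: the strata, frame sizes, and quotas are invented (not the actual WA study design), and real purposive sampling often selects information-rich cases deliberately rather than at random.

```python
# Minimal sketch of quota-based sampling within strata, loosely modeled on
# the purposive-sampling example above. Strata and quotas are hypothetical.
import random

random.seed(42)  # reproducible selection

# Hypothetical sampling frame: one (site_id, stratum) pair per eligible site
strata_sizes = {"home_ece_rural": 20, "home_ece_urban": 25,
                "hospital": 40, "community_clinic": 35}
frame = [(f"{stratum}_{i}", stratum)
         for stratum, n in strata_sizes.items()
         for i in range(n)]

# Recruitment goals per stratum (analogous to "15 home-based, 20 hospitals...")
quotas = {"home_ece_rural": 8, "home_ece_urban": 7,
          "hospital": 20, "community_clinic": 20}

def draw_quota_sample(frame, quotas):
    """Randomly draw the quota of sites from each stratum."""
    sample = []
    for stratum, quota in quotas.items():
        eligible = [site for site, s in frame if s == stratum]
        sample.extend(random.sample(eligible, quota))
    return sample

sample = draw_quota_sample(frame, quotas)
print(len(sample))  # 8 + 7 + 20 + 20 = 55 sites
```

Fixing the random seed makes the draw auditable, which helps when documenting sampling decisions for an evaluation report.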
Questions
Pretest, pretest, pretest to assure that questions are worded so that you get the information you really need.
Invest in Data Collection & Analysis
All interviewers, observers, focus group leaders, etc. should be as well trained and supported as possible:
- Practice with groups and individuals similar to program participants
- Include possible challenging situations in simulations
- Emphasize participant safety, confidentiality, and non-judgmental approaches
- Provide consultation and oversight during data collection
- All equipment should be frequently tested; use duplicate equipment
- More than one person should be involved in data analysis
Trustworthiness & Credibility
- Triangulation: cross-checking data from multiple data sources, methods of data collection, data collectors, and/or using a mixed-methods approach to explore and understand inconsistencies
- Participant feedback: checking with participants concerning the accuracy of the data and interpretations
- Alternative explanations: think about other possible stories the data may be telling
- Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Tong et al.
Summary: Qualitative Methods
- Are often the best approaches for answering evaluation questions about "how" an initiative is going and "why" it is going that way
- Can be especially useful early in a project when there is uncertainty
- Are supported by thoughtful planning for data collection & analysis that pays off in quality results
Resources
- A brief introduction: Qualitative Program Evaluation
- A more detailed primer: Qualitative Research Methods in Program Evaluation: Considerations for Federal … (acyf/qualitative research methods in program evaluation.pdf)
- Krueger & Casey. Focus Groups: A Practical Guide for Applied Research. Sage Publications, Inc.
- Dillman et al. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd Edition. 2009 John Wiley & Sons, Inc.
- Tong et al. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6).