Behavior Assessment In RtI - Direct Behavior Ratings


Behavior assessment in RtI: Considerations when selecting assessment tools
Sandra M. Chafouleas, Ph.D., Neag School of Education, University of Connecticut
T. Chris Riley-Tillman, Ph.D., Department of Psychology, East Carolina University
Amy M. Briesch, Ph.D., Department of Counseling and Applied Psychology, Northeastern University

Purpose: Review strengths and limitations of different school-based behavior assessment methods within a multi-tiered model of assessment.

CONSIDER:
- 1 in 3 knows someone who has left the profession due to issues related to discipline and behavior (Public Agenda, 2004)
- A disproportionate amount of time is spent responding to significant behavior challenges presented by a small number of students (U.S. Dept. of Ed., 2000)
- 1 in 5 children has mental health needs, yet the majority will not receive needed services (Hoagwood & Erwin, 1997; U.S. Surgeon General, 1999)
- School discipline is a top concern of the American public (Rose & Gallup, 2005)

Response to Intervention (RTI): An assessment and intervention process for systematically monitoring student progress and making data-based decisions about the need for instructional modifications or increasingly intensified services (see www.rti4success.org).

Tertiary (FEW): Reduce complications, intensity, severity of current cases
Secondary (SOME): Reduce current cases of problem behavior
Primary (ALL): Reduce new cases of problem behavior

Individual: IEP or other student-specific behavior goal related to acquisition of appropriate social skills; student-specific goal related to decrease in problem behavior
Targeted: Increase the number of appropriate behaviors for this group of students; decrease the number of inappropriate behaviors for this group
Universal: Increases in pro-social behavior of student body; decreases in inappropriate behavior of student body; decreases in number of students referred for an evaluation for behavior-related disorders

… but then, how will you know what you are doing is working? To make effective decisions about which practices are needed, and then to evaluate the outcomes of those practices, you need DATA!

Developing evidence-based assessment (EBA) begins through a priori delineation of (a) the purposes of assessment, and then (b) identification of the special requirements for each purpose (and associated criteria for stating when a requirement is met). Commentary by Kazdin (2005).

A Few Caveats to Establishing EBA (Kazdin, 2005):
- Absence of a gold standard criterion
- One measure can't do it all; multiple measures are needed to evaluate different facets
- Co-morbidity of "problems": What are the most relevant problem features?
- Multiple perspectives are valuable, yet agreement may (will) be low!
So what is THE measure I should use?

Screening: Who needs help?
Diagnosis: Why is the problem occurring?
Progress Monitoring: Is intervention working?
Evaluation: How well are we doing overall?
(Emphasized by the National Center on Response to Intervention)

Individual: ALL BELOW plus functional assessment data
Targeted: ALL BELOW plus norm-referenced comparison data, more detailed profiles of students' strengths/weaknesses, and formative data sources sensitive to incremental change
Universal: EFFICIENT, EXTANT SOURCES such as the total number of office discipline referrals, the number of students who are suspended or expelled, and the number of students referred or found eligible for special education, particularly in the area of emotional disturbance

Screening, Diagnosis, Evaluation, Progress Monitoring

Extant data
Standardized behavior rating scales
Systematic direct observation
Direct Behavior Rating
(Standardized rating scales and systematic direct observation currently dominate in clinic and research.)

Definition: Data sources that already exist within the setting ("permanent products").
Advantages:
- Already available
- Highly contextually relevant
- Natural occurrence can reduce/limit reactivity
(Adapted from Chafouleas, Riley-Tillman, & Sugai, 2007)

Examples:
- Office discipline referrals (ODRs)
- Attendance and tardy records
- Suspension/expulsion data
- Special education data (e.g., referrals for emotional disturbance)
- Data from existing behavior management plans (e.g., token economy)

ODRs have been defined as "an event in which (a) a student engaged in a behavior that violated a rule/social norm in the school, (b) a problem behavior was observed by a member of the school staff, and (c) the event resulted in a consequence delivered by administrative staff who produced a permanent (written) product defining the whole event" (Sugai, Horner, & Walker, 2000, p. 96).

MAJOR: Abusive Language/Inappropriate Language/Profanity; Arson; Bomb Threat; Defiance/Disrespect/Non-compliance; Disruption; Dress Code Violation; Fighting/Physical Aggression; Forgery/Theft; Gang Affiliation Display; Harassment/Bullying; Inappropriate Display of Affection; Inappropriate Location/Out of Bounds; Lying/Cheating; Other Behavior; Property Damage/Vandalism; Skip Class; Truancy; Tardy; Technology Violation; Unknown; Use/Possession of Alcohol; Use/Possession of Combustibles; Use/Possession of Drugs; Use/Possession of Tobacco; Use/Possession of Weapons
MINOR: Defiance/Disrespect/Non-compliance; Disruption; Dress Code Violation; Inappropriate Language; Other; Physical Contact/Physical Aggression; Property Misuse; Tardy; Technology Violation; Unknown
Source: 2009-10 Referral Definitions, www.swis.org

Screening, Diagnosis, Evaluation, Progress Monitoring

But how much is too much? Discrepancy ratio = student behavior / peer behavior, with a ratio of 2x or greater indicating a significant discrepancy. Example: Willie's ODRs = 10, peer mean ODRs = 3.5, so DR = 10 / 3.5 = 2.86.
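To make the arithmetic concrete, here is a minimal sketch of the discrepancy-ratio calculation; the numbers and the 2x flagging threshold come from the slide, while the helper function itself is just an illustration:

```python
def discrepancy_ratio(student_count: float, peer_mean: float) -> float:
    """Ratio of a student's behavior count to the peer average."""
    if peer_mean <= 0:
        raise ValueError("peer_mean must be positive")
    return student_count / peer_mean

# Worked example from the slide: Willie has 10 ODRs; the peer mean is 3.5.
dr = discrepancy_ratio(10, 3.5)
print(f"DR = {dr:.2f}")  # DR = 2.86
print("Significant discrepancy (2x or more)" if dr >= 2 else "Within typical range")
```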

Staff at West High School note concern about the number of fights occurring among students. ODRs over the past 2 months are reviewed. The review reveals that (a) most fighting incidents are occurring outside the cafeteria and in the bus loading area, AND (b) Johnny and Sam are the most likely culprits. Staff are re-assigned to increase levels of active supervision in those areas at key times, and "Johnny and Sam" are brought to the Behavior Support Team for additional support planning.

[Figure: FRMS total office discipline referrals by academic year, 1994-95 through 2005-06, showing sustained impact (pre vs. post); total ODRs plotted on a 0-3,000 scale.]

Example for diagnosis with ODR?

Strengths:
- Complements other sources in providing contextually relevant information
- Source of progress monitoring information
- Less resource intensive (data readily available!)
Limitations:
- Limited application within prevention
- Tough to establish and maintain consistent/accurate use
- Unknown psychometric adequacy
- Challenging to create a system for efficient organization and interpretation

Screening – yes, but may be limited in prevention/early intervention roles
Progress monitoring – yes, but creating a usable system for interpretation/presentation can be challenging
Diagnosis – maybe, with regard to adding contextual relevance
Evaluation – yes, relevance within the specific setting but limited with regard to norm comparisons

Definition: Tools that require an individual to rate the behavior of another based on past observation of that person's behaviors (Kratochwill, Sheridan, Carlson, & Lasecki, 1999).
Examples:
- Behavior Assessment System for Children – 2 (BASC-2)
- Achenbach System of Empirically-Based Assessment (e.g., CBCL)
- Conners' Rating Scales – 3
- Social Skills Improvement System (SSIS)

Comprehensive scales: a large number of items (often 100 or more) that cluster together to assess a wide range of behaviors
- "General purpose" (Merrell, 2008)
- Often include broadband and narrow-band syndromes (Ramsey, Reynolds, & Kamphaus, 2002)
Narrow-band scales: focused on one or two behavioral constructs
- Attention (Brown ADD Scales; Brown, 2001)
- Adaptive behavior (Vineland-II; Sparrow, Balla, & Cicchetti, 1984)

[Figure: Example subscale structure from a comprehensive rating scale, including Problem Behaviors subscales (e.g., Internalizing, Autism Spectrum) and Social Skills subscales (e.g., Empathy, Engagement, Self-Control).]

Example narrow-band self-report scale: ages 7-17, 27 items (2-week reflection period). Subscales with sample item anchors:
- Negative Mood: "I am sad once in a while" to "I am sad all the time"
- Interpersonal Problems: "I like being with people" to "I do not want to be with people at all"
- Ineffectiveness: "I do most things O.K." to "I do most things wrong"
- Anhedonia: "I am tired once in a while" to "I am tired all the time"
- Negative Self-Esteem: "I look O.K." to "I look ugly"

Scale and item examples (ages 8-19):
- Physical Symptoms: "I have pains in my chest"; "My hands feel sweaty or cold"
- Harm Avoidance: "I check to make sure things are safe"; "I worry about doing something stupid or embarrassing"
- Social Anxiety: "I have trouble asking other kids to play with me"; "I worry about other people laughing at me"
- Separation/Panic: "I keep the light on at night"; "I avoid going places without my family"

But what about screening and progress monitoring?

T score of 61-70: elevated risk. T score of 71 or above: highly elevated risk.
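For screening, cut-points like these are applied to each student's T score; the sketch below is illustrative only (the band labels follow the cut-points above, and the student data are hypothetical):

```python
def risk_band(t_score: float) -> str:
    """Map a T score to a screening band using the cut-points above."""
    if t_score >= 71:
        return "highly elevated risk"
    if t_score >= 61:
        return "elevated risk"
    return "not elevated"

# Hypothetical screening results: student -> T score
scores = {"Student A": 58, "Student B": 64, "Student C": 73}
for student, t in scores.items():
    print(f"{student}: T = {t} -> {risk_band(t)}")
```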

Conners short and index forms:
- Short form (approx. 40 items): fewer items per scale (Inattention, Hyperactivity/Impulsivity, Learning Problems, Aggression, Executive Functioning, Peer Relations, Family Relations); recommended for progress monitoring.
- ADHD Index: the 10 items that best differentiate children with ADHD from those without a clinical diagnosis; recommended for screening and progress monitoring.
- Global Index: the 10 best items from the original Conners' Rating Scales (Temper outbursts, Excitable/impulsive, Restless, Cries often, Inattentive, Fidgeting, Disturbs other children, Easily frustrated, Fails to finish things, Mood changes quickly); recommended for progress monitoring.

47 items designed to assess scales of Attention Problems, Hyperactivity, Internalizing Problems, and Adaptive Skills.

Strengths:
- May be most helpful in diagnostic assessment; provide a common understanding of the specific behaviors that are indicative of a given cluster term.
- May also be suited for use in screening and evaluative assessment practices.
Limitations:
- May not be sensitive to incremental change.
- May be feasible only for occasional use given time/cost.
- Many are clinically focused (i.e., focus on problem rather than pro-social behavior).
- Do not directly assess behavior; rater bias may be present.

Screening – yes, but the scope and size of measures vary widely
Progress monitoring – not likely
Diagnosis – yes, most common use within clinical settings
Evaluation – maybe, if the period of time is sufficient and the constructs measured are relevant

Definition: Data collected by an observer watching an environment/person for some period of time.
Examples:
- Percentage of intervals observed to be actively engaged
- Frequency of positive peer initiations throughout the day
- Recording how long it takes to transition in the hallway (duration)

Studies suggest moderate to high levels of reported use: 67% of school psychologists report using direct observation in 4 of their last 10 cases (Shapiro & Heick, 2004), and 63% to 73% of school psychologists report moderate to frequent use (Riley-Tillman et al., 2008).

Frequency – number of events in a period of time (e.g., 4 hits in a 6-hour day)
Rate – number of events per unit of time (e.g., 4 social initiations per hour)
Percentage of opportunities – use if behaviors follow specific cues (e.g., followed directions given on 40% of occasions)
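A quick numerical sketch of these event-based summaries (the tallies and the 6-hour window are hypothetical, chosen to mirror the examples above):

```python
# Hypothetical tallies from one 6-hour school day
hits = 4
observed_hours = 6.0
social_initiations = 24
directions_given = 10
directions_followed = 4

frequency = hits                                     # events within the period: 4 hits
rate = social_initiations / observed_hours           # events per unit time: 4.0 per hour
pct_opportunities = 100 * directions_followed / directions_given  # 40% of occasions

print(f"Frequency: {frequency} hits in {observed_hours:.0f} hours")
print(f"Rate: {rate:.1f} social initiations per hour")
print(f"Followed directions on {pct_opportunities:.0f}% of opportunities")
```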

Data recorded during pre-specified intervals of time, then summarized into a percentage of intervals with behavioral occurrences. Time-based techniques result in approximations of behavioral events because behavior is sampled in one of three basic ways:
- Whole interval recording
- Partial interval recording
- Momentary time sampling
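Because the three methods score the same stream of behavior differently, each yields a different approximation. Here is a minimal sketch comparing them on one hypothetical 60-second session; the interval length, session data, and function name are all illustrative assumptions:

```python
def summarize_intervals(occurrence_seconds, session_length, interval_length, method):
    """Return % of intervals scored as an occurrence under a given sampling rule.

    occurrence_seconds: set of whole seconds during which the behavior occurred.
    method: 'whole', 'partial', or 'momentary'.
    """
    n_intervals = session_length // interval_length
    scored = 0
    for i in range(n_intervals):
        start, end = i * interval_length, (i + 1) * interval_length
        seconds = range(start, end)
        if method == "whole":        # behavior present for the entire interval
            hit = all(s in occurrence_seconds for s in seconds)
        elif method == "partial":    # behavior present at any point in the interval
            hit = any(s in occurrence_seconds for s in seconds)
        elif method == "momentary":  # behavior present at the moment the interval ends
            hit = (end - 1) in occurrence_seconds
        else:
            raise ValueError("method must be 'whole', 'partial', or 'momentary'")
        scored += hit
    return 100 * scored / n_intervals

# Hypothetical 60-second session with 10-second intervals:
# the behavior occurs during seconds 5-24 and 40-44.
occurrences = set(range(5, 25)) | set(range(40, 45))
for m in ("whole", "partial", "momentary"):
    print(f"{m}: {summarize_intervals(occurrences, 60, 10, m):.1f}% of intervals")
```

With this data the whole-interval rule scores about 17% of intervals, partial-interval about 67%, and momentary time sampling about 33%, which illustrates why interval methods are approximations rather than exact counts.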

Duration:
- Total time (e.g., actively engaged in reading for 12 minutes)
- Percent of time (e.g., out of seat for 35% of the reading period)
- Average time per event (e.g., each temper tantrum lasted an average of 7.5 minutes)
Latency – time for behavior to begin after a prompt or antecedent cue is provided (e.g., on average 2 minutes to begin the task after a teacher direction is given)
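And a small sketch of the duration and latency summaries described above, using hypothetical event timestamps (all variable names and data are assumptions):

```python
# Hypothetical tantrum episodes recorded as (start_minute, end_minute) within a 60-minute period
episodes = [(3.0, 10.0), (25.0, 33.5), (50.0, 57.0)]
period_length = 60.0

durations = [end - start for start, end in episodes]
total_time = sum(durations)                        # total duration, in minutes
percent_of_time = 100 * total_time / period_length
average_per_event = total_time / len(episodes)

# Latency: minutes from the teacher's direction to task initiation (hypothetical)
prompt_given, task_started = 14.0, 16.5
latency = task_started - prompt_given

print(f"Total duration: {total_time} min")
print(f"Percent of period: {percent_of_time:.0f}%")
print(f"Average per event: {average_per_event:.1f} min")
print(f"Latency: {latency} min")
```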

BASC-2 Student Observation System (Reynolds & Kamphaus, 2004): 15-minute observation with 30-second intervals; categories include Response to Teacher, Peer Interaction, Works on School Subjects, Transition Movement, Inappropriate Movement, Inattention, Inappropriate Vocalization, Somatization, Repetitive Motor Movements, Aggression, Self-Injurious Behavior, Inappropriate Sexual Behavior, Bowel/Bladder Problems.
Academic Engaged Time Code of the SSBD (Walker & Severson, 1990): records time spent engaged in academic material; let the stopwatch run, then divide AET by total observation time.
Behavioral Observation of Students in Schools (Shapiro, 2004): 15-minute observation with 15-second intervals; Active/Passive Engaged, Off-Task Motor/Verbal/Passive, Teacher-Directed Instruction.

Direct Observation Form (Achenbach, 1986): 10-minute observation with 1-minute on/off-task intervals.
ADHD School Observation Code (Gadow et al., 1996): 15-minute observation with 15-second intervals; Interference, Motor Movement, Noncompliance, Non-physical Aggression, Off-Task.
Classroom Observation Code (Abikoff & Gittelman, 1985): 30-minute observation; Interference, Minor Motor Movement, Gross Motor Standing/Vigorous, Physical/Verbal Aggression, Solicitation of Teacher, Off-Task, Noncompliance, Out of Chair, Absence of Behavior.
State-Event Classroom Observation System (Saudargas, 1997): 20-minute observation with 15-second intervals; School Work, Looking Around, Social Interaction with Child/Teacher, Out of Seat, Raise Hand, Calling Out, Approach Teacher.

Screening, Diagnosis, Evaluation, Progress Monitoring

A single SDO is rather feasible (10-15 minutes), but feasibility decreases as the number of observations increases. Assuming a minimum number of observations per case (5), this balloons to 50-75 minutes of observation, plus additional entry/exit time. Over 100 cases (a rather typical school psychologist yearly load), this is 5,000-7,500 minutes, or roughly 83-125 hours.
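The workload estimate is simple multiplication; a minimal sketch using the figures from the slide (the variable names are illustrative):

```python
minutes_per_observation = (10, 15)   # typical length of a single SDO
observations_per_case = 5            # assumed minimum per case, as on the slide
cases_per_year = 100                 # a typical school psychologist yearly load

low, high = (m * observations_per_case * cases_per_year for m in minutes_per_observation)
print(f"Total observation time: {low:,}-{high:,} minutes "
      f"({low / 60:.0f}-{high / 60:.0f} hours)")
# Total observation time: 5,000-7,500 minutes (83-125 hours)
```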

[Figure: Dawn's percentage of off-task behavior in history class across alternating phases: Baseline, Function-Based SM, and Not Function-Based SM.]

[Figure: Sample intervention graph; y-axis shows the number of times the student is reported, x-axis shows Days 1-20.]

[Figure: Percent of off-task behavior; discrepancy ratio for off-task behavior = 43/17, approximately 2.5x.]

Strengths:
- Highly flexible
- Useful in progress monitoring
- Directness
- Standardized procedures
- Minimal cost for materials
Limitations:
- Potential reactivity
- Observer error/drift
- Limited feasibility re: training and intrusiveness
- Difficult to monitor low-frequency behaviors
- Generalizability

Screening – not likely in universal assessment
Progress monitoring – yes
Diagnosis – maybe, particularly if within FBA
Evaluation – not likely

Definition: A tool that involves a brief rating of a target behavior following a specified observation period (e.g., class activity) by those persons who are naturally occurring in the context of interest.
Examples:
- Behavior Report Card
- Home-School Note
- Daily Progress Report
- Good Behavior Note
- Check-In Check-Out Card

Example DBR scales

[Figure: Example weekly behavior report card. Each row lists a student and a space to specify the target behavior; columns run Monday through Friday; each cell is rated on a three-point face scale (smiley/neutral/frowny).]

Example: Standard form for single-item DBR scales. Download: www.directbehaviorratings.com

Academic Engagement: Actively or passively participating in the classroom activity.
Respectful: Compliant and polite behavior in response to adult direction and/or interactions with peers and adults.
Disruptive Behavior: A student action that interrupts regular school or classroom activity.
(Form headings: "Keys to Success," with scales for Academically Engaged, Respectful, and Non-Disruptive.)

1) Complete the top portion of the form: student's name, date, rating period(s); review behavior definitions and rating directions.
2) Have the form ready for completion following each pre-identified activity period (e.g., reading block, independent seat work).
3) Immediately following the activity period, complete the ratings. Do not complete the rating if you aren't confident you directly observed the student for a sufficient amount of time.

Ratings should correspond to the proportion of time that you actually observed the student display the target behavior. Complete the rating immediately following the activity period, and do not complete it if you did not observe for a sufficient amount of time. When rating, each behavior should be considered independently of the other targets; that is, total ratings across behaviors do not have to equal 100%. For example, a student may be engaged 50% of the time and disruptive 20%. A student may also be engaged for 100% of the time and disruptive for 10%.
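To make the independence of the ratings concrete, here is a minimal sketch of how a single DBR-style record might be stored and checked; the field names and the 0-10 scale anchors (0 = 0% of the observed time, 10 = 100%) are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class DBRRecord:
    """One rating occasion: each behavior is rated independently on a 0-10 scale,
    where the rating reflects the proportion of observed time (0 = 0%, 10 = 100%)."""
    student: str
    activity: str
    academically_engaged: int
    disruptive: int
    respectful: int

    def __post_init__(self):
        for name in ("academically_engaged", "disruptive", "respectful"):
            value = getattr(self, name)
            if not 0 <= value <= 10:
                raise ValueError(f"{name} must be between 0 and 10")

# Ratings need not sum to anything in particular: engaged 100% of the time
# and disruptive 10% of the time is a perfectly valid record.
record = DBRRecord("Student A", "Reading block",
                   academically_engaged=10, disruptive=1, respectful=9)
print(record)
```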

Screening, Diagnosis, Evaluation, Progress Monitoring

[Figure: Direct Behavior Ratings (afternoon) of Academic Engagement and Disruptive Behavior plotted across daily rating dates on a 0-10 scale, with "local" and normative cut-points marked.]

How often? We recommend (5 to) 10 data points per phase, but the emphasis is on idiographic analysis and high/low-stakes decisions. [Chart shows Academically Engaged and Disruptive Behavior series.]

[Figure: Direct Behavior Ratings, afternoon (11/13 to 1/8), plotted on a 0-10 scale.]

www.directbehaviorratings.com/index.html

Strengths:
- Highly flexible
- Useful in progress monitoring
- Directness
- Potential for standardized procedures
- Minimal cost for materials
Limitations:
- Generalizability
- Rater bias is likely present
- Training requirements unknown
- Limited psychometric knowledge beyond DBR-SIS

Screening – maybe
Progress monitoring – yes
Diagnosis – maybe, particularly if within FBA
Evaluation – not likely

Extant data, standardized behavior rating scales, systematic direct observation, Direct Behavior Rating: WHICH TO USE? Consider psychometric adequacy, usability, and contextual relevance.

Why do I need data?
- At what level should the problem be solved? (Primary, Secondary, Tertiary)
- What is the purpose of assessment? (Screening, Progress Monitoring, Evaluation, Diagnosis)
Which data do I need? Which tools can answer these questions?
- Which tools are best matched to assess the behavior of interest? (Contextual relevance)
- What decisions will be made using these data? (Psychometric adequacy)
- What resources are available to collect data? (Usability)
Adapted from Chafouleas, Riley-Tillman, & Sugai, 2007

Individual: ALL BELOW, with emphasis on functional assessment data
Targeted: EXTANT DATA, BEHAVIOR RATING SCALES, SYSTEMATIC DIRECT OBSERVATION, DIRECT BEHAVIOR RATING
Universal: EXTANT DATA, BEHAVIOR RATING SCALES developed for universal screening, DIRECT BEHAVIOR RATING

How do we develop school "buy-in" and capacity regarding roles in prevention related to social behavior and mental health?
How do we facilitate capacity for schools to include universal screening?
How can schools integrate a common logic and language within the domains of social behavior?
How do we forge new directions in the development and evaluation of assessments that are technically adequate, contextually relevant, and usable in schools?

Further information: Chafouleas, S. M., Riley-Tillman, T. C., & Sugai, G. (2007). School-Based Behavioral Assessment: Informing Instruction and Intervention. New York: Guilford.
Note. This presentation can be downloaded @ uconn.edu; contact: rileytillmant@ecu.edu, a.briesch@neu.edu
