A COMPARISON OF CURRICULUM BASED MEASURES OF ORAL READING FLUENCY


A COMPARISON OF CURRICULUM BASED MEASURES OF
ORAL READING FLUENCY

By

Trish Merrill

B.A. University of Southern Maine, 2006
M.P.P. University of Southern Maine, 2010
M.S. University of Southern Maine, 2016

A Dissertation Presented in Partial Fulfillment of the Requirements for the Degree
Doctor of Psychology (in School Psychology)

The University of Southern Maine
March, 2018

Advisory Committee:

Rachel Brown, Associate Professor of Educational and School Psychology, Advisor
Eileen Harris, School Psychologist, Kennebunk, Maine
Garry Wickerd, Assistant Professor of Educational and School Psychology

ProQuest Number: 10838430

All rights reserved

INFORMATION TO ALL USERS
The quality of this reproduction is dependent upon the quality of the copy submitted.

In the unlikely event that the author did not send a complete manuscript and there are missing pages, these will be noted. Also, if material had to be removed, a note will indicate the deletion.

ProQuest 10838430
Published by ProQuest LLC (2018). Copyright of the Dissertation is held by the Author.

All rights reserved.
This work is protected against unauthorized copying under Title 17, United States Code.
Microform Edition © ProQuest LLC.

ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106-1346

© 2018 Trish Merrill
All Rights Reserved

LIBRARY RIGHTS STATEMENT

In presenting this dissertation, THE COMPARISON OF CURRICULUM BASED MEASURES OF READING FLUENCY, in partial fulfillment of the requirements for an advanced degree at the University of Southern Maine, I agree that the Library shall make it freely available for inspection. I further agree that permission for copying, as provided for by the Copyright Law of the United States (Title 17, U.S. Code), of this Dissertation for scholarly purposes may be granted. It is understood that any copying or publication of this Dissertation for financial gain shall not be allowed without my written permission. I hereby grant permission to the University of Southern Maine Library to use my Dissertation for scholarly purposes.

Signature: Trish Merrill
Date: 3/19/2018

A COMPARISON OF CURRICULUM BASED MEASURES OF
ORAL READING FLUENCY

By Trish Merrill, MPP, MS
Dissertation Advisor: Dr. Rachel Brown

An Abstract of the Dissertation Presented
in Partial Fulfillment of the Requirements for the
Degree of Doctor of Psychology
(in School Psychology)

March, 2018

Curriculum Based Measurements (CBM) are a widely used tool for Response to Intervention (RTI) progress monitoring. In addition, they can be used in the determination of learning disabilities and special education qualification. The most widely used type of CBM is a measure of oral reading fluency (ORF), which involves having a student read out loud for 1 minute while the examiner records any errors. Also known as reading curriculum-based measures (RCBM), various published forms of these measures have been documented to be reliable and valid indicators of overall reading skill. Nonetheless, not all RCBM forms are the same, and differences in features across published versions could affect student scores. This study examined the textual composition of three different published versions of RCBM probes to determine passage similarity and difficulty. The study also examined the consistency of student reading levels across the RCBM passage sets. A total of 202 students completed three passages from each of the selected probe sets, for a total of nine passages each.

Results indicated that all RCBM passages were correlated with each other and with a statewide assessment of reading. Mixed results were obtained when analyzing correlations between RCBM and a computer-administered universal screening measure in reading. Significant differences were found in the overall number of words read correctly, depending on the passage set. Significant differences were also noted in the number of students identified as at risk of reading difficulties, or in need of reading intervention, based on each of the RCBM passage sets as compared to other standardized tests of reading. Regarding the textual composition of the three versions, the passage sets appeared similar when passages of similar length were compared; however, descriptive statistics suggested that passage-level difficulty may vary depending on the passage within a set.

ACKNOWLEDGEMENTS

I would like to thank Dr. Rachel Brown for her support throughout my time in the program and with the completion of this dissertation. I would also like to thank my committee members, Dr. Eileen Harris and Dr. Garry Wickerd, for their interest and involvement in this project.

Thank you to the school district where I was an intern and where I completed this study. Thank you for having me, teaching me, and allowing me to conduct this research. Thank you to the volunteers who listened to the young readers. To the teachers, thank you for your flexibility and support. The care you have for the students in your school was inspiring. Finally, thank you to the first, second, and third graders who participated in this study.

I would also like to thank all of my practicum and internship supervisors, who have shared their knowledge and resources and provided guidance throughout the program: Dr. Alexis Kiburis, Dr. Rebekah Bickford, Dr. Eileen Harris, Dr. Richard Guare, Dr. Peg Dawson, Dr. Mark Steege, and Dr. Jamie Pratt.

Finally, I would like to thank my parents, Teresa and Charles LaPointe, and my Morgan family (Mama Morgan, Dad Morgan, Jakoba, Jamesie, Dirk, Kris, Jantje, and Sietske). Thank you all for your love and support throughout the years. And most of all, thank you to Prashant. From preparation for the GRE to the “completion” of this project, you have been there. Thank you for letting me practice my reading lessons and standardized tests on you. Thank you for the many walking discussions that shaped this project, the ongoing brainstorming sessions, and for your statistical and text analytics expertise. You always push me to do the best work, and it was the best to share this project with you.

I can’t wait to continue this work and my life with you. I love you.

TABLE OF CONTENTS

LIST OF TABLES ........ ix
LIST OF FIGURES ........ x
CHAPTER 1: INTRODUCTION ........ 1
    Previous Studies ........ 4
    Technical Properties of Selected RCBM ........ 6
    AIMSweb ........ 7
    DIBELS ........ 7
    FAST ........ 9
    Research Questions ........ 12
CHAPTER 2: METHOD ........ 13
    Design ........ 13
    Participants ........ 13
    Materials ........ 15
    Procedures ........ 17
    Data Analysis ........ 19
CHAPTER 3: RESULTS ........ 23
    Research Question 1: Score Similarities and Differences ........ 23
    Research Question 2: Consistency in Grade Level Placement Accuracy ........ 29
    Research Question 3: Correlation Analysis ........ 34
    Research Question 4: Textual Analysis ........ 35
CHAPTER 4: DISCUSSION ........ 40
    Limitations and Implications for Future Research ........ 46
CHAPTER 5: SUMMARY ........ 48
REFERENCES ........ 50
BIOGRAPHY OF THE AUTHOR ........ 54

LIST OF TABLES

Table 1.1 Comparison of the Technical Characteristics of AIMSweb, DIBELS Next, & FAST RCBM Passages for Grades 1-3 ........ 11
Table 2.1 Demographic Characteristics of Study Participants ........ 14
Table 2.2 Special Education Identification Categories by Grade Level ........ 15
Table 2.3 Inter-Observer Agreement by Rater ........ 18
Table 3.1 Reading Passage Score Means (SD) by Student Level Characteristics and Publisher ........ 25
Table 3.3 Means and Standard Deviations for Probes by Grade Level ........ 27
Table 3.4 Analysis of Variance (ANOVA) Differences Based on Order of Administration ........ 27
Table 3.5 Descriptive Statistics for Within-Batch Effects (Practice Effect) ........ 28
Table 3.6 Analysis of Variance (ANOVA) Differences Based on Order of Administration Within-Batch (Practice Effect) ........ 28
Table 3.7 Average Number of Words Read Correctly (WRC) by Publisher and Grade Level ........ 29
Table 3.8 Overall Placement Agreement ........ 30
Table 3.9 Placement Agreement by Grade ........ 30
Table 3.10 Comparison of Packages in Identifying Students below Grade Level ........ 31
Table 3.11 Comparison of Actual Risk and STAR Risk to AIMSweb, DIBELS Next, & FAST Risks by Numbers and (Percentages) ........ 32
Table 3.12 Specificity, Sensitivity, and Area Under the Curve (AUC) for each RCBM Package ........ 33
Table 3.13 Predicted and Actual Risk Levels for each RCBM Package ........ 33
Table 3.14 Differences Between Specificity and Sensitivity by Publisher ........ 34
Table 3.16 Correlation Analysis Results for Second Grade ........ 35
Table 3.17 Correlation Analysis Results for Third Grade ........ 36
Table 3.18 Average Number of Words per Passage by Publisher and Grade ........ 36
Table 3.19 Average Number of Irregular Words per Passage by Company and Grade ........ 37
Table 3.20 Average Number of Syllable Types per Passage by Publisher and Grade ........ 38

LIST OF FIGURES

Figure 3.1: ROC Curve AIMSweb ........ 33
Figure 3.2: ROC Curve DIBELS ........ 33
Figure 3.3: ROC Curve FastBridge ........ 33

CHAPTER 1: INTRODUCTION

According to the U.S. National Center for Education Statistics, only 36% of fourth graders nationwide are reading at or above the proficient level on the National Assessment of Educational Progress (NAEP, 2015). However, research clearly articulates that nearly all struggling readers can learn to read when provided explicit and systematic instruction in the five areas of reading: phonemic awareness, phonics, fluency, vocabulary, and comprehension (Kilpatrick, 2015). With the passing of the No Child Left Behind (NCLB) legislation in 2002, schools began to be held accountable for student outcomes as demonstrated by their performance on high-stakes tests (Deno, 2015). Additionally, the legislation mandated that scientifically based programs and curricula be used in teaching students to read (Ritchey & Goeke, 2006). The NCLB Act stated that “a learning system or program of reading instruction must be based on scientifically based reading research” (Ritchey & Goeke, 2006, p. 172). In the most recent federal legislation, the Every Student Succeeds Act (ESSA), signed into law in December of 2015, standards for evidence-based interventions were upheld, and definitions of “evidence-based interventions” were provided to further assist schools in using those curricula with the best rationale for effectiveness. “The term evidence-based is understood to mean that a particular practice has been shown to be effective in two or more studies with different groups and settings of students” (Brown, 2016).

At the same time, curriculum-based measures (CBM) have become increasingly commonplace as quick and inexpensive tools that can identify students at risk of academic failure and also monitor the effectiveness of the evidence-based interventions being implemented (Deno, 2015).

Schools across the country are currently engaged in efforts to implement Response to Intervention (RTI), also known as a Multi-Tiered System of Support (MTSS) model, as “part of their efforts to screen and identify students who are academically at risk and then to monitor their growth rates as they move into different tiers, or levels of intensified intervention” (Deno, 2015, p. 21). With the passing of the Individuals with Disabilities Education Improvement Act (IDEA) of 2004, schools were encouraged to use problem-solving methods alongside CBM to prevent learning problems. Recognizing that traditional processes of special education referral and programming often delay early intervention, CBM have been adopted for use within problem-solving models of identification and intervention (Deno, 2015). The commonly accepted model of problem solving includes the following steps: (a) identifying the problem, (b) defining the problem, (c) exploring alternative interventions, (d) applying the alternative intervention, and (e) analyzing the effects of the intervention (Deno, 2015). CBM can be used to identify and define the problem and to analyze the effects of the intervention.

Although CBM assessment tools were originally developed to monitor the progress of students with disabilities, they can also be used for early identification of students who are struggling to learn in the general education classroom and for evaluating the effectiveness of their educational programs (Hosp, Hosp, & Howell, 2007). While such systems existed in other domains, such as Applied Behavior Analysis (ABA), there was no coherent system for implementing an intervention and measuring its effectiveness in the academic domain (Hosp et al., 2007). The early development of CBM was led by Stan Deno and Phyllis Mirkin in the 1970s and 1980s at the Minnesota Institute for Research on Learning Disabilities (Hosp et al., 2007). CBM were developed in the areas of reading, mathematics, spelling, and writing.

Oral reading fluency is the most commonly used and well-researched CBM (Wayman et al., 2007). In addition to measuring decoding skills, oral reading fluency measures have been found to correlate with reading comprehension (Fuchs et al., 1988; Reschly, 2009). Each package of CBM in oral reading fluency consists of a series of short stories or passages that students are asked to read aloud for one minute. The number of words read correctly in one minute is then calculated and compared to established benchmarks to determine the student's performance relative to grade-level standards and same-aged peers. Each package of oral reading fluency probes includes standardized procedures for administering and scoring the assessment. When used for universal screening, each student's performance is used to determine which students require additional instruction. When used as progress monitoring assessments, students' scores can be compared over time to determine the rate of improvement in reading when provided with appropriate intervention.
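To make the scoring procedure concrete, the sketch below (in Python) reduces a one-minute probe to a words-read-correctly (WRC) score and compares it against a benchmark. The benchmark values and function names are hypothetical placeholders used for illustration; they are not the published norms or decision rules of AIMSweb, DIBELS, or FAST.

    # Hypothetical fall benchmarks (words read correctly per minute) by grade;
    # illustrative placeholders only, not any publisher's published cut scores.
    HYPOTHETICAL_FALL_BENCHMARKS = {1: 20, 2: 50, 3: 70}

    def words_read_correctly(words_attempted: int, errors: int) -> int:
        """Score a one-minute probe: words attempted in the minute minus errors."""
        if errors > words_attempted:
            raise ValueError("errors cannot exceed words attempted")
        return words_attempted - errors

    def needs_additional_instruction(wrc: int, grade: int) -> bool:
        """Flag a student whose score falls below the grade-level benchmark."""
        return wrc < HYPOTHETICAL_FALL_BENCHMARKS[grade]

    if __name__ == "__main__":
        wrc = words_read_correctly(words_attempted=62, errors=4)
        print(wrc, needs_additional_instruction(wrc, grade=2))  # 58 False

Repeating the same calculation across weekly probes produces the series of scores that progress monitoring compares over time to estimate a student's rate of improvement.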

A number of different sets of CBM reading passages (e.g., RCBM) are commercially available (e.g., AIMSweb, DIBELS, EasyCBM, FAST). Each of the commercially available RCBM was developed according to different word and sentence selection methods. Initially, the passages used in RCBM were pulled directly from grade-level curricula (Ardoin & Christ, 2009). Due to the variation in curricula used across grade levels, as well as student familiarity with the passages, this was an inconsistent measure of true reading ability that resulted in high levels of test error (Ardoin & Christ, 2009). Using passages directly from the curriculum resulted in “inconsistent student performance” that was likely a result of differing levels of text difficulty, which depended more on which curriculum and which passage was used, and less on student ability (Ardoin & Christ, 2009).

As an alternative, educators and researchers began to compose uniform, curriculum-neutral passages for grade-level assessment, relying heavily on readability formulas (Ardoin & Christ, 2009). While many studies have documented the technical adequacy of RCBM for predicting later reading proficiency, a remaining question concerns the equivalence of passages within and between packages (Christ, 2015; Ardoin & Christ, 2009). Passages were developed based on a variety of different readability formulas, though most used frequency counts of text characteristics such as the number of syllables per word, words per sentence, or number of high-frequency words (Ardoin & Christ, 2009). One problem with this method was that these characteristics resulted in a measure more closely aligned with reading comprehension than with decoding ability (Ardoin & Christ, 2009). Additionally, research showed that the readability scores of passages did not predict student performance on those passages (Christ, 2015; Ardoin & Christ, 2009). Other studies have identified the elements of passages most correlated with reading fluency, including the number of syllables per 100 words, the number of words in a passage not included on a high-frequency word list, the number of decodable words per passage, the number of words with more than one syllable per passage, and overall sentence length (Wayman et al., 2007).
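As a rough illustration of the frequency-count readability indices described above, the sketch below combines words per sentence and syllables per word using the Flesch-Kincaid grade-level formula. The vowel-group syllable counter is a deliberately naive assumption made for brevity; the passage-development procedures used by the publishers discussed here rely on more refined word lists and syllabification.

    import re

    # Frequency-count readability sketch: average sentence length and word length
    # feed the Flesch-Kincaid grade-level formula.

    def count_syllables(word: str) -> int:
        """Approximate syllables as runs of vowels (crude, for illustration only)."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text: str) -> float:
        """Estimate grade level from words per sentence and syllables per word."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        words_per_sentence = len(words) / len(sentences)
        syllables_per_word = syllables / len(words)
        return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

    if __name__ == "__main__":
        passage = "The dog ran to the park. It saw a big red ball and barked."
        print(round(flesch_kincaid_grade(passage), 2))  # well below first grade for this simple text

Text features of this kind, such as words per passage, irregular words, and syllable types, are the quantities the textual analysis in Research Question 4 compares across publishers.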

Previous Studies

In a study using generalizability theory (G theory), a statistical methodology that attributes the amount of error in test scores to its sources, Poncy (2005) found that the majority of the variance in RCBM probes was due to individual student differences and grade level, as would be expected. However, as much as 10% of the variation in scores was attributable to variation in passage difficulty, and an additional 9% of the variation was not explained.

In studies comparing AIMSweb, DIBELS and FAIP-R (the earlier versio

