
Supplemental Information for Appendix A of the Common Core State Standards for English Language Arts and Literacy: New Research on Text Complexity

I. Summary Introduction

Appendix A of the Common Core State Standards (hereafter CCSS) contains a review of the research stressing the importance of being able to read complex text for success in college and career. The research shows that while the complexity of reading demands for college, career, and citizenship has held steady or risen over the past half century, the complexity of texts students are exposed to has steadily decreased over the same interval. To address this gap, the CCSS emphasize increasing the complexity of texts students read as a key element in improving reading comprehension.

The importance of text complexity to student success had been known for many years before the release of the CCSS, but that release spurred subsequent research with implications for how the CCSS define and measure text complexity. As a result of new research on the quantitative dimensions of text complexity called for at the time of the standards' release,1 this report expands upon the three-part model outlined in Appendix A of the CCSS in ELA/Literacy, which blends quantitative and qualitative measures of text complexity with reader and task considerations. It also presents new field-tested tools for helping educators assess the qualitative features of text complexity.

II. New Findings Regarding the Quantitative Dimension of Text Complexity

The quantitative dimension of text complexity refers to those aspects—such as word frequency, sentence length, and text cohesion (to name just three)—that are difficult for a human reader to evaluate when examining a text. These factors are more efficiently measured by computer programs. The creators of several of these quantitative measures volunteered to take part in a research study comparing the different measurement systems against one another. The goal of the study was to provide state-of-the-science information regarding the variety of ways text complexity can be measured quantitatively and to encourage the development of text complexity tools that are valid, transparent, user friendly, and reliable.2 The six different computer programs that factored in the research study are briefly described below:

1 The full report, Measures of Text Difficulty, and other resources can be accessed at www.achievethecore.org/text-complexity.
2 The following list of participants in the research study is not an exhaustive list of programs that exist for the purpose of measuring text complexity, nor is their inclusion intended as an endorsement of one method or program over another.

ATOS by Renaissance Learning
ATOS incorporates two formulas: ATOS for Text (which can be applied to virtually any text sample, including speeches, plays, and articles) and ATOS for Books. Both formulas take into account three variables: words per sentence, average grade level of words (established via the Graded Vocabulary List), and characters per word.

Degrees of Reading Power (DRP) by Questar Assessment, Inc.
The DRP Analyzer employs a derivation of a Bormuth mean cloze readability formula based on three measurable features of text: word length, sentence length, and word familiarity. DRP text difficulty is expressed in DRP units on a continuous scale with a theoretical range from 0 to 100. In practice, commonly encountered English text ranges from about 25 to 85 DRP units, with higher values representing more difficult text. Both the measurement of students' reading ability and the readability of instructional materials are reported on the same DRP scale.

Flesch-Kincaid (public domain)
Like many of the non-proprietary formulas for measuring the readability of various types of texts, the widely used Flesch-Kincaid Grade Level test considers two factors: words and sentences. In this case, Flesch-Kincaid uses word and sentence length as proxies for semantic and syntactic complexity, respectively (i.e., proxies for vocabulary difficulty and sentence structure). A computational sketch of this formula appears after these tool descriptions.

The Lexile Framework for Reading by MetaMetrics
A Lexile measure represents both the complexity of a text, such as a book or article, and an individual's reading ability. Lexile measures include the variables of word frequency and sentence length. Lexile measures are expressed as numeric measures followed by an "L" (for example, 850L), which are then placed on the Lexile scale for measuring reader ability and text complexity (ranging from below 200L for beginning readers and beginning-reader materials to above 1600L for advanced readers and materials).

Reading Maturity by Pearson Education
The Pearson Reading Maturity Metric uses the computational language model Latent Semantic Analysis (LSA) to estimate how much language experience is required to achieve adult knowledge of the meaning of each word, sentence, and paragraph in a text. It combines the Word Maturity measure with other computational linguistic variables, such as perplexity, sentence length, and semantic coherence metrics, to determine the overall difficulty and complexity of the language used in the text.

SourceRater by Educational Testing Service
SourceRater employs a variety of natural language processing techniques to extract evidence of text standing relative to eight construct-relevant dimensions of text variation: syntactic complexity, vocabulary difficulty, level of abstractness, referential cohesion, connective cohesion, degree of academic orientation, degree of narrative orientation, and paragraph structure. Resulting evidence about text complexity is accumulated via three separate regression models: one optimized for application to informational texts, one optimized for application to literary texts, and one optimized for application to mixed texts.

Easability Indicator by Coh-Metrix
One additional program—the Coh-Metrix Easability Assessor, developed at the University of Memphis and Arizona State University—factored in the research study but was not included in the cross analysis. It analyzes the ease or difficulty of texts on five different dimensions: narrativity, syntactic simplicity, word concreteness, referential cohesion, and deep cohesion.3 This measure was not included in the cross analysis because it does not generate a single quantitative determination of text complexity, but it does have use as a tool to help evaluate text systematically. The Coh-Metrix Easability Assessor creates a profile that offers information regarding the aforementioned features of a text and analyzes how challenging or supportive those features might be in student comprehension of the material.
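Because the Flesch-Kincaid Grade Level formula is in the public domain, it can be computed directly. Below is a minimal Python sketch, not any official implementation: the grade-level formula itself (0.39 * words-per-sentence + 11.8 * syllables-per-word - 15.59) is the standard one, but, as footnote 5 below notes, there is no canonical way to count syllables, so the heuristic counter here is only one reasonable choice.

```python
import re

def count_syllables(word: str) -> int:
    # Heuristic: count vowel groups, discounting a silent final "e".
    # There is no canonical counter; implementations differ (see the
    # note on Flesch-Kincaid variants in footnote 5 of this report).
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Standard public-domain formula:
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / max(len(sentences), 1))
            + 11.8 * (syllables / max(len(words), 1))
            - 15.59)

print(round(flesch_kincaid_grade(
    "The cat sat on the mat. It was happy there."), 2))
```

Run on any longer passage, the same two proxies (sentence length and syllables per word) drive the score, which is why two programs can disagree on the same text when their tokenizers or syllable counters differ.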

The research that has yielded additional information and validated these text measurement tools was led by Jessica Nelson of Carnegie Mellon University, Charles Perfetti of the University of Pittsburgh, and David and Meredith Liben of Student Achievement Partners (in association with Susan Pimentel, lead author of the CCSS for ELA). It had two components: first, all the developers of quantitative tools agreed to compare the ability of each text analyzer to predict the difficulty of text passages as measured by student performance on standardized tests. Second, they agreed to test the tools' ability to predict expert judgment regarding grade placement of texts and educator evaluations of text complexity by examining a wide variety of text types selected for a wide variety of purposes. The first was measured by comparing student results in norming data on two national standardized reading assessments to the difficulty predicted by the text analyzer measures. The second set of data evaluated how well each text analyzer predicted educator judgment of grade-level placement and how well each matched the complexity band placements used for the Appendix B texts of the CCSS. In the final phase of the work, the developers agreed to place their tools on a common scale aligned with the demands of college readiness. This allows these measures to be used with confidence when placing texts within grade bands, as the common scale ensures that each will yield equivalent complexity staircases for reaching college and career readiness levels of text complexity.4

The major comparability finding of the research was that all of the quantitative metrics were reliably and often highly correlated with grade-level and student-performance-based measures of text difficulty across a variety of text sets and reference measures.5 No one of the quantitative measures performed significantly differently from the others in predicting student outcomes.6 While there is variance between and among the measures about where they place any single text, they all climb reliably—though differently—up the text complexity ladder to college and career readiness. Choosing any one of the text-analyzer tools from second grade through high school will provide a scale by which to rate text complexity over a student's career, culminating in levels that match college and career readiness.
In addition, the research produced a new common scale for cross comparisons of the quantitative tools that were part of the study, allowing users to choose one measure or another to generate parallel complexity readings for texts as students move through their K-12 school careers. This common scale is anchored by the complexity of texts representative of those required in typical first-year credit-bearing college courses and in workforce training programs. Each of the measures has realigned its ranges to match the Standards' text complexity grade bands and has adjusted upward its trajectory of reading comprehension development through the grades to indicate that all students should be reading at the college and career readiness level by no later than the end of high school.

Figure 1: Updated Text Complexity Grade Bands and Associated Ranges from Multiple Measures7

| Common Core Band | ATOS | Degrees of Reading Power | Flesch-Kincaid8 | The Lexile Framework | Reading Maturity | SourceRater |
| 2nd–3rd | 2.75–5.14 | 42–54 | 1.98–5.34 | 420–820 | 3.53–6.13 | 0.05–2.48 |
| 4th–5th | 4.97–7.03 | 52–60 | 4.51–7.73 | 740–1010 | 5.42–7.92 | 0.84–5.75 |
| 6th–8th | 7.00–9.98 | 57–67 | 6.51–10.34 | 925–1185 | 7.04–9.57 | 4.11–10.66 |
| 9th–10th | 9.67–12.01 | 62–72 | 8.32–12.12 | 1050–1335 | 8.41–10.81 | 9.02–13.93 |
| 11th–CCR | 11.20–14.10 | 67–74 | 10.34–14.2 | 1185–1385 | 9.57–12.00 | 12.30–14.50 |

3 Narrativity measures whether the passage is story-like and includes events and characters. Syntactic simplicity refers to the ease of the sentence syntax. Word concreteness measures the degree to which words in the passage are imaginable versus abstract. Referential cohesion is the overlap between sentences with respect to major words (nouns, verbs, adjectives). Deep cohesion measures causal, spatial, and temporal relations between events, actions, goals, and states.
4 As a condition of participating, each developer also committed to offering (a) transparency in revealing both the text features it analyzed and the general means of analysis, (b) a program that calibrated text difficulty by grade or band level to match the Common Core Standards' expectations regarding measuring text complexity, and (c) a version of its quantitative tool that could be adapted for public access at the individual user level.
5 When running the passages through Flesch-Kincaid measures, researchers found no single answer for what the Flesch-Kincaid score was for a specific text. The score depended on which version of the Flesch-Kincaid program was run and how that particular program counted syllables, sentence length, and the like. Because Flesch-Kincaid has no 'caretaker' that oversees or maintains the formula, researchers had to make decisions about how to count syllables and sentence length as they programmed the formula to get a 'read' on text(s).
6 Some of the quantitative measures aligned more closely with human judgment regarding where to situate a text within a complexity band, though these measures did not better predict student performance.
7 The band levels themselves have been expanded slightly over the original CCSS scale that appears in Appendix A, at both the top and bottom of each band, to provide for a more modulated climb toward college and career readiness and to offer slightly more overlap between bands. The wider band width allows more flexibility in the younger grades, where students enter school with widely varied preparation levels. This change was provided in response to feedback received since publication of the original scale (published in terms of the Lexile metric) in Appendix A.
8 Since Flesch-Kincaid has no 'caretaker' that oversees or maintains the formula, the research leads worked to bring the measure in line with college and career readiness levels of text complexity based on the version of the formula used by Coh-Metrix.
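To make the Figure 1 ranges concrete, the sketch below hard-codes the Lexile column and reports every band whose range contains a given measure; the same pattern works for any of the six measures. The names are illustrative rather than part of any vendor's tool, and because the bands deliberately overlap (see footnote 7), a single measure can fall within more than one band.

```python
# Lexile ranges from Figure 1. Constant and function names are
# illustrative assumptions, not any vendor's API.
LEXILE_BANDS = [
    ("2nd-3rd", 420, 820),
    ("4th-5th", 740, 1010),
    ("6th-8th", 925, 1185),
    ("9th-10th", 1050, 1335),
    ("11th-CCR", 1185, 1385),
]

def bands_for_lexile(measure: int) -> list[str]:
    # Return every band whose range contains the measure; the ranges
    # overlap by design, so more than one band may match.
    return [name for name, low, high in LEXILE_BANDS if low <= measure <= high]

print(bands_for_lexile(980))   # ['4th-5th', '6th-8th']
print(bands_for_lexile(1200))  # ['9th-10th', '11th-CCR']
```

Where a measure lands in two bands, the qualitative review described in the next section, along with reader and task considerations, should settle the placement.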

III. New Tools for Evaluating the Qualitative Dimension of Text Complexity

Simultaneously with the work on quantitative metrics, additional fieldwork was performed with the goal of helping educators better judge the qualitative features of text complexity. In the CCSS, qualitative measures serve as a necessary complement to quantitative measures, which cannot capture all of the elements that make a text easy or challenging to read and are not equally successful in rating the complexity of all categories of text.

Focus groups of teachers from a variety of CCSS adoption states, representing a wide variety of teaching backgrounds, used the qualitative features first identified in Appendix A to develop and refine an evaluation tool that offers teachers and others greater guidance in rating texts. The evaluation tool views the four qualitative factors identified in Appendix A as lying on continua of difficulty rather than as a succession of discrete "stages" in text complexity. The qualitative factors run from easy (left-hand side) to difficult (right-hand side). Few (if any) authentic texts will be at the low or high ends on all of these measures, and some elements of the dimensions are better suited to literary or to informational texts. Below are brief descriptions of the different qualitative dimensions:

(1) Structure. Texts of low complexity tend to have simple, well-marked, and conventional structures, whereas texts of high complexity tend to have complex, implicit, and (in literary texts) unconventional structures. Simple literary texts tend to relate events in chronological order, while complex literary texts make more frequent use of flashbacks, flash-forwards, multiple points of view, and other manipulations of time and sequence. Simple informational texts are likely not to deviate from the conventions of common genres and subgenres, while complex informational texts might do so if they conform to the norms and conventions of a specific discipline or if they contain a variety of structures (as an academic textbook or history book might). Graphics tend to be simple and either unnecessary or merely supplementary to the meaning of texts of low complexity, whereas texts of high complexity tend to have similarly complex graphics that provide an independent source of information and are essential to understanding the text. (Note that many books for the youngest students rely heavily on graphics to convey meaning and are an exception to the above generalization.)

(2) Language Conventionality and Clarity. Texts that rely on literal, clear, contemporary, and conversational language tend to be easier to read than texts that rely on figurative, ironic, ambiguous, purposefully misleading, archaic, or otherwise unfamiliar language (such as general academic and domain-specific vocabulary).

(3) Knowledge Demands. Texts that make few assumptions about the extent of readers' life experiences and the depth of their cultural/literary and content/discipline knowledge are generally less complex than texts that make many assumptions in one or more of those areas.

(4) Levels of Meaning (literary texts) or Purpose (informational texts). Literary texts with a single level of meaning tend to be easier to read than literary texts with multiple levels of meaning (such as satires, in which the author's literal message is intentionally at odds with his or her underlying message). Similarly, informational texts with an explicitly stated purpose are generally easier to comprehend than informational texts with an implicit, hidden, or obscure purpose.

Figure 2: Qualitative Dimensions of Text Complexity

For each category, the tool provides space for notes and comments on the text (support for placement in this band) and asks, "Where to place within the band?", with five options: Beginning of lower grade, End of lower grade, Beginning of higher grade, End of higher grade, or NOT suited to band.

| Category | Notes and comments on text, support for placement in this band | Where to place within the band? |
| Structure (both story structure or form of piece) | | |
| Language Clarity and Conventions (including vocabulary load) | | |
| Knowledge Demands (life, content, cultural/literary) | | |
| Levels of Meaning/Purpose | | |
| Overall placement | Justification: | |
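For teams that want to record rubric results digitally, the following sketch models the Figure 2 form as plain Python data types. This is an assumed representation for illustration only; the evaluation tool itself is a form, and none of these class or field names come from the report.

```python
from dataclasses import dataclass
from enum import Enum

class Placement(Enum):
    # The five placement options on the Figure 2 rubric.
    BEGINNING_OF_LOWER_GRADE = "Beginning of lower grade"
    END_OF_LOWER_GRADE = "End of lower grade"
    BEGINNING_OF_HIGHER_GRADE = "Beginning of higher grade"
    END_OF_HIGHER_GRADE = "End of higher grade"
    NOT_SUITED_TO_BAND = "NOT suited to band"

@dataclass
class DimensionRating:
    notes: str            # notes/comments supporting the placement
    placement: Placement  # where to place within the band

@dataclass
class QualitativeEvaluation:
    # One rating per qualitative dimension, plus the summary row.
    structure: DimensionRating
    language_clarity_and_conventions: DimensionRating
    knowledge_demands: DimensionRating
    levels_of_meaning_or_purpose: DimensionRating
    overall_placement: Placement
    justification: str
```

One QualitativeEvaluation would then be completed per text, per candidate band, mirroring the rows of the printed rubric.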

IV. Reader and Task Considerations and the Role of Teachers

While the research noted above affects the quantitative and qualitative measures of text complexity, the third element of the three-part model for measuring text complexity—reader and task considerations—remains untouched. While the quantitative and qualitative measures focus on the inherent complexity of the text, they are balanced in the CCSS' model by the expectation that educators will employ professional judgment to match texts to particular tasks or classes of students. Numerous considerations go into such matching. For example, harder texts may be appropriate for highly knowledgeable or skilled readers, who are often willing to put in the extra effort required to read harder texts that tell a story or contain complex information. Students who have a great deal of interest in or motivation for the content are also likely to handle more complex texts.

The RAND Reading Study Group, in its 2002 report Reading for Understanding, also named important task-related variables, including the reader's purpose (which might shift over the course of reading), "the type of reading being done, such as skimming (getting the gist of the text) or studying (reading the text with the intent of retaining the information for a period of time)," and the intended outcome, which could include "an increase in knowledge, a solution to some real-world problem, and/or engagement with the text."9 Teachers employing their professional judgment, experience, and knowledge of their students and their subject are best situated to make such appraisals.

V. The Issue of Text Quality and Coherence in Text Selection

Selecting texts for student reading should depend not only on text complexity but also on considerations of quality and coherence. The Common Core State Standards emphasize that "[t]o become college and career ready, students must grapple with works of exceptional craft and thought whose range extends across genres, cultures, and centuries. Such works offer profound insights into the human condition and serve as models for students' own thinking and writing."10 In addition to choosing high-quality texts, it is also recommended that texts be selected to build coherent knowledge within grades and across grades. For example, the Common Core State Standards illustrate a progression of selected texts across grades K-5 that systematically build knowledge regarding the human body.11 Considerations of quality and coherence should always be at play when selecting texts.

VI. Key Considerations in Implementing Text Complexity

The tools for measuring text complexity are at once useful and imperfect. Each of the tools described above—quantitative and qualitative—has its limitations, and none is completely accurate. The question remains how best to integrate quantitative measures with qualitative measures when locating texts at a grade level. The fact that the quantitative measures operate in bands rather than specific grades gives room for both qualitative and quantitative factors to work in concert when situating texts. The following recommendations, which play to the strengths of each type of tool—quantitative and qualitative—are offered as guidance.
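As one way to picture how banded quantitative scores and qualitative judgment can work in concert, the sketch below has a quantitative measure nominate candidate bands and an educator's qualitative rating settle the choice among them. The band data are taken from Figure 1; the decision rule, like every name here, is an illustrative assumption, not a recommendation from this report.

```python
# Illustrative only: quantitative evidence nominates overlapping
# candidate bands; qualitative (and reader/task) judgment settles
# the placement within them.
LEXILE_BANDS = [
    ("2nd-3rd", 420, 820),
    ("4th-5th", 740, 1010),
    ("6th-8th", 925, 1185),
    ("9th-10th", 1050, 1335),
    ("11th-CCR", 1185, 1385),
]

def situate_text(lexile: int, qualitative_band: str) -> str:
    candidates = [b for b, low, high in LEXILE_BANDS if low <= lexile <= high]
    if qualitative_band in candidates:
        return qualitative_band  # both kinds of evidence agree
    if candidates:
        # Quantitative-only match: flag the disagreement for review.
        return candidates[0] + " (recheck qualitative rating)"
    return "no band match; review text and measures"

print(situate_text(980, "6th-8th"))  # '6th-8th'
```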

