ITBS Guide to Research and Development


Contents

Part 1  Nature and Purposes of The Iowa Tests . . . 1
    The Iowa Tests . . . 1
    Major Purposes of the ITBS Batteries . . . 1
    Validity of the Tests . . . 1
    Description of the ITBS Batteries . . . 2
        Names of the Tests . . . 2
        Description of the Test Batteries . . . 2
        Nature of the Batteries . . . 2
        Nature of the Levels . . . 2
        Grade Levels and Test Levels . . . 3
        Test Lengths and Times . . . 3
        Nature of the Questions . . . 3
        Mode of Responding . . . 3
        Directions . . . 3
    Other Iowa Tests . . . 6
        Iowa Writing Assessment . . . 6
        Listening Assessment for ITBS . . . 6
        Constructed-Response Supplement to The Iowa Tests . . . 6
    Other Manuals . . . 6

Part 2  The National Standardization Program . . . 7
    Planning the National Standardization Program . . . 7
    Procedures for Selecting the Standardization Sample . . . 7
        Public School Sample . . . 7
        Catholic School Sample . . . 8
        Private Non-Catholic School Sample . . . 8
        Summary . . . 8
    Design for Collecting the Standardization Data . . . 8
    Weighting the Samples . . . 8
    Racial-Ethnic Representation . . . 12
    Participation of Students in Special Groups . . . 12
    Empirical Norms Dates . . . 14
    School Systems Included in the 2000 Standardization Samples . . . 16
        New England and Mideast . . . 16
        Southeast . . . 17
        Great Lakes and Plains . . . 19
        West and Far West . . . 22

Part 3  Validity in the Development and Use of The Iowa Tests . . . 25
    Validity in Test Use . . . 25
    Criteria for Evaluating Achievement Tests . . . 25
    Validity of the Tests . . . 25
    Statistical Data to Be Considered . . . 26
    Validity of the Tests in the Local School . . . 26
    Domain Specifications . . . 27
    Content Standards and Development Procedures . . . 28
        Curriculum Review . . . 28
        Preliminary Item Tryout . . . 28
        National Item Tryout . . . 28
        Fairness Review . . . 30
        Development of Individual Tests . . . 30
        Critical Thinking Skills . . . 43
    Other Validity Considerations . . . 44
        Norms Versus Standards . . . 44
        Using Tests to Improve Instruction . . . 44
        Using Tests to Evaluate Instruction . . . 45
        Local Modification of Test Content . . . 45
        Predictive Validity . . . 46
        Readability . . . 48

Part 4  Scaling, Norming, and Equating The Iowa Tests . . . 51
    Frames of Reference for Reporting School Achievement . . . 51
    Comparability of Developmental Scores Across Levels: The Growth Model . . . 51
    The National Standard Score Scale . . . 52
    Development and Monitoring of National Norms for the ITBS . . . 55
    Trends in Achievement Test Performance . . . 55
    Norms for Special School Populations . . . 60
    Equivalence of Forms . . . 60
    Relationships of Forms A and B to Previous Forms . . . 61

Part 5  Reliability of The Iowa Tests . . . 63
    Methods of Determining, Reporting, and Using Reliability Data . . . 63
    Internal-Consistency Reliability Analysis . . . 64
    Equivalent-Forms Reliability Analysis . . . 74
    Sources of Error in Measurement . . . 75
    Standard Errors of Measurement for Selected Score Levels . . . 77
    Effects of Individualized Testing on Reliability . . . 83
    Stability of Scores on the ITBS . . . 83

Part 6  Item and Test Analysis . . . 87
    Difficulty of the Tests . . . 87
    Discrimination . . . 94
    Ceiling and Floor Effects . . . 100
    Completion Rates . . . 100
    Other Test Characteristics . . . 100

Part 7  Group Differences in Item and Test Performance . . . 107
    Standard Errors of Measurement for Groups . . . 107
    Gender Differences in Achievement . . . 107
    Racial-Ethnic Differences in Achievement . . . 114
    Differential Item Functioning . . . 116

Part 8  Relationships in Test Performance . . . 121
    Correlations Among Test Scores for Individuals . . . 121
    Structural Relationships Among Content Domains . . . 121
        Levels 9 through 14 . . . 126
        Levels 7 and 8 . . . 126
        Levels 5 and 6 . . . 126
        Interpretation of Factors . . . 126
    Reliabilities of Differences in Test Performance . . . 127
    Correlations Among Building Averages . . . 127
    Relations Between Achievement and General Cognitive Ability . . . 127
    Predicting Achievement from General Cognitive Ability: Individual Scores . . . 131
        Obtained Versus Expected Achievement . . . 136
    Predicting Achievement from General Cognitive Ability: Group Averages . . . 143

Part 9  Technical Consideration for Other Iowa Tests . . . 149
    Iowa Tests of Basic Skills Survey Battery . . . 149
        Description of the Tests . . . 149
        Other Scores . . . 149
        Test Development . . . 149
        Standardization . . . 149
        Test Score Characteristics . . . 150
    Iowa Early Learning Inventory . . . 150
        Description of the Inventory . . . 150
        Test Development . . . 151
        Standardization . . . 151
    Iowa Writing Assessment . . . 151
        Description of the Test . . . 151
        Test Development . . . 152
        Standardization . . . 152
        Test Score Characteristics . . . 152
    Constructed-Response Supplement to The Iowa Tests . . . 153
        Description of the Tests . . . 153
        Test Development . . . 154
        Joint Scaling with the ITBS . . . 154
        Test Score Characteristics . . . 154
    Listening Assessment for ITBS . . . 155
        Description of the Test . . . 155
        Test Development . . . 155
        Standardization . . . 155
        Test Score Characteristics . . . 157
        Predictive Validity . . . 157
    Integrated Writing Skills Test . . . 157
        Description of the Tests . . . 157
        Test Development . . . 158
        Standardization . . . 158
        Test Score Characteristics . . . 158
    Iowa Algebra Aptitude Test . . . 159
        Description of the Test . . . 159
        Test Development . . . 159
        Standardization . . . 159
        Test Score Characteristics . . . 160

Works Cited . . . 161
Index . . . 167

Tables and Figures

Part 1: Nature and Purposes of The Iowa Tests . . . 1
    Table 1.1   Test and Grade Level Correspondence . . . 3
    Table 1.2   Number of Items and Test Time Limits . . . 4

Part 2: The National Standardization Program . . . 7
    Table 2.1   Summary of Standardization Schedule . . . 9
    Table 2.2   Sample Size and Percent of Students by Type of School . . . 10
    Table 2.3   Percent of Public School Students by Geographic Region . . . 10
    Table 2.4   Percent of Public School Students by SES Category . . . 10
    Table 2.5   Percent of Public School Students by District Enrollment . . . 11
    Table 2.6   Percent of Catholic Students by Diocese Size and Geographic Region . . . 11
    Table 2.7   Percent of Private Non-Catholic Students by Geographic Region . . . 12
    Table 2.8   Racial-Ethnic Representation . . . 13
    Table 2.9   Test Accommodations—Special Education and 504 Students . . . 15
    Table 2.10  Test Accommodations—English Language Learners . . . 15

Part 3: Validity in the Development and Use of The Iowa Tests . . . 25
    Figure 3.1  Steps in Development of the Iowa Tests of Basic Skills . . . 29
    Table 3.1   Distribution of Skills Objectives for the Iowa Tests of Basic Skills, Forms A and B . . . 31
    Table 3.2   Types of Reading Materials . . . 33
    Table 3.3   Reading Content/Process Standards . . . 34
    Table 3.4   Listening Content/Process Standards . . . 34
    Table 3.5   Comparison of Language Tests by Battery . . . 36
    Table 3.6   Computational Skill Level Required for Math Problem Solving and Data Interpretation . . . 41
    Table 3.7   Summary Data from Predictive Validity Studies . . . 47
    Table 3.8   Readability Indices for Selected Tests . . . 49

Part 4: Scaling, Norming, and Equating The Iowa Tests . . . 51
    Table 4.1   Comparison of Grade-to-Grade Overlap . . . 53
    Table 4.2   Differences Between National Percentile Ranks . . . 56
    Figure 4.1  Trends in National Performance . . . 57
    Table 4.3   Summary of Median Differences . . . 58
    Figure 4.2  Trends in Iowa Performance . . . 59
    Table 4.4   Sample Sizes for Equating Forms A and B . . . 61

Part 5: Reliability of The Iowa Tests . . . 63
    Table 5.1   Test Summary Statistics . . . 65
    Table 5.2   Equivalent-Forms Reliabilities, Levels 5–14 . . . 74
    Table 5.3   Estimates of Equivalent-Forms Reliability . . . 75
    Table 5.4   Mean (Grades 3–8) Reliability Coefficients: Reliability Types Analysis by Tests . . . 76
    Table 5.5   Test-Retest Reliabilities, Levels 5–8 . . . 77
    Table 5.6   Standard Errors of Measurement for Selected Standard Score Levels . . . 78
    Table 5.7   Correlations Between Developmental Standard Scores, Forms A and B . . . 84
    Table 5.8   Correlations Between Developmental Standard Scores, Forms K and L . . . 85

Part 6: Item and Test Analysis . . . 87
    Table 6.1   Word Analysis Content Classifications with Item Norms . . . 88
    Table 6.2   Usage and Expression Content Classifications with Item Norms . . . 89
    Table 6.3   Distribution of Item Difficulties . . . 90
    Table 6.4   Summary of Difficulty (Proportion Correct) and Discrimination (Biserial) Indices . . . 95
    Table 6.5   Ceiling Effects, Floor Effects, and Completion Rates . . . 101

Part 7: Group Differences in Item and Test Performance . . . 107
    Table 7.1   Standard Errors of Measurement in the Standard Score Metric for ITBS, by Level and Gender . . . 108; by Level and Group . . . 109
    Table 7.2   Male-Female Effect Sizes for Average Achievement . . . 111
    Table 7.3   Descriptive Statistics by Gender . . . 112
    Table 7.4   Gender Differences in Achievement over Time . . . 113
    Table 7.5   Race Differences in Achievement . . . 114
    Table 7.6   Effect Sizes for Racial-Ethnic Differences in Average Achievement . . . 115
    Table 7.7   Fairness Reviewers . . . 117
    Table 7.8   Number of Items Identified in Category C in National DIF Study . . . 119

Part 8: Relationships in Test Performance . . . 121
    Table 8.1   Correlations Among Developmental Standard Scores . . . 121
    Table 8.2   Reliabilities of Differences Among Scores for Major Test Areas: Developmental Standard Scores . . . 128
    Table 8.3   Reliabilities of Differences Among Tests: Developmental Standard Scores . . . 128
    Table 8.4   Correlations Among School Average Developmental Standard Scores . . . 131
    Table 8.5   Correlations Between Standard Age Scores and Developmental Standard Scores . . . 137
    Table 8.6   Reliabilities of Difference Scores and Standard Deviations of Difference Scores Due to Errors of Measurement . . . 139
    Table 8.7   Correlations, Prediction Constants, and Standard Errors of Estimate for School Averages . . . 145

Part 9: Technical Consideration for Other Iowa Tests . . . 149
    Table 9.1   Test Summary Statistics (Iowa Tests of Basic Skills–Survey Battery, Form A) . . . 150
    Table 9.2   Average Reliability Coefficients, Grades 3–8 (Iowa Writing Assessment) . . . 152
    Table 9.3   Correlations and Reliability of Differences (Iowa Writing Assessment and Iowa Tests of Basic Skills Language Total) . . . 153
    Table 9.4   Internal-Consistency Reliability (Constructed-Response Supplement) . . . 155
    Table 9.5   Correlations and Reliabilities of Differences (Constructed-Response Supplement and Corresponding ITBS Subtests) . . . 155
    Table 9.6   Test Summary Statistics (Listening Assessment for ITBS) . . . 156
    Table 9.7   Correlations Between Listening and ITBS Achievement . . . 156
    Table 9.8   Correlations Between Listening Grade 2 and ITBS Grade 3 . . . 157
    Table 9.9   Test Summary Statistics (Integrated Writing Skills Test, Form M) . . . 158
    Table 9.10  Correlations Between IWST and ITBS Reading and Language Tests . . . 159
    Table 9.11  Test Summary Statistics (Iowa Algebra Aptitude Test–Grade 8) . . . 160
    Table 9.12  Correlations Between IAAT and Algebra Grades and Test Scores . . . 160

PART 1  Nature and Purposes of The Iowa Tests

The Iowa Tests

The Iowa Tests consist of a variety of educational achievement instruments developed by the faculty and professional staff at Iowa Testing Programs at The University of Iowa. The Iowa Tests of Basic Skills (ITBS) measure educational achievement in 15 subject areas for kindergarten through grade 8. The Iowa Tests of Educational Development (ITED) measure educational achievement in nine subject areas for grades 9 through 12. These test batteries sh…

…Guide to Research and Development for the ITED contains technical information about that test battery and related assessments.

Major Purposes of the ITBS Batteries

The purpose of measurement is to provide information that can be used to improve instruction and learning. Assessment of any kind has value to…
