Measuring Performance Fairly


Developing intake adjusted performance measures in Victorian government schools

2011 Revision

Published by the Communications Division for Data, Outcomes and Evaluation Division, Department of Education and Early Childhood Development, Melbourne, March 2011.

© State of Victoria (Department of Education and Early Childhood Development) 2011

The copyright in this document is owned by the State of Victoria (Department of Education and Early Childhood Development), or in the case of some materials, by third parties (third party materials). No part may be reproduced by any process except in accordance with the provisions of the Copyright Act 1968, the National Education Access Licence for Schools (NEALS) (see below) or with permission.

An educational institution situated in Australia which is not conducted for profit, or a body responsible for administering such an institution, may copy and communicate the materials, other than third party materials, for the educational purposes of the institution.

Authorised by the Department of Education and Early Childhood Development, 2 Treasury Place, East Melbourne, Victoria, 3002.

Contents

Introduction to the 2011 Revision
1. Using data to improve
   Taking a comprehensive view of performance
   Two kinds of value added measurement
   Intake adjusted performance measurement
   Presenting intake adjusted performance
   Using intake adjusted performance data
   Working towards value added performance measures
   Ongoing development
2. Technical Appendix
   Working parameters
   An outline of the model
   The model – intake measures
   The model – outcome measures
   The model – matching academic composition measures to outcomes
   The model – possible future enhancements

Introduction to the 2011 Revision

In 2009, the introduction of intake adjusted school performance measures accompanied a new approach to transparency and accountability in Victorian government schools.

Since that time, additional work has been undertaken to strengthen the methodological approach used to calculate these measures, increasing their applicability to the broadest range of school settings and building on the range of intake measures used to account for variations in school level measures of performance.

This revision documents only those changes that have been introduced in 2011. For clarity, all such changes have been labelled with the words “2011 update” and marked in bold font.

1. Using data to improve

Victorian schools have been at the forefront of educational data use for many years. On a daily basis, data from a range of sources feeds into the decisions we make about our work – teachers use data to individualise learning and to monitor how effective they are in supporting every student to progress; school leadership teams use data to evaluate the success of their improvement strategies; and school councils use data to ensure their overall plans for the school are meeting the needs of their local community.

Victorian schools have also seen many innovations in data over the last decade, from the early days of “like” school groups through to the implementation of the Victorian Essential Learning Standards, NAPLAN and percentile comparison charts. And we have pioneered the use of attitudinal data from students, teachers and parents to develop a thorough understanding of our strengths and weaknesses.

We are now ready to take the next step.

Taking a comprehensive view of performance

When it comes to data, what we measure must be determined by what we value.

In Victoria, we are clear about the outcomes we are striving to achieve for our children and young people. From birth through to adulthood, we are not interested in simply getting students to attain higher test scores – we are aiming for much more than that. So we want our data to recognise the comprehensive range of outcomes we’re aiming to achieve.

In Victorian government schools, the Accountability and Improvement Framework clearly articulates three outcome areas for students:
- student learning
- student engagement and wellbeing, and
- student transitions and pathways.

To help us improve outcomes in all three areas, we need a sensitive and sophisticated way to understand our performance. To get this understanding it is important that we take into account the context of each school, the challenges it faces, and consequently the value it is adding in improving student outcomes.

Studies in Victoria and around the world tell us that a student’s background makes a difference to his or her outcomes. This doesn’t mean we should expect less of students from more disadvantaged backgrounds. Nor does it mean that the capacity of these students to achieve is necessarily less than others. But it does mean that if we are to judge each school’s performance fairly – if we are to compare schools – then we first need to take account of students’ differing starting points.

Two kinds of value added measurement

One way to do this is to track the progress of every student in every school over time and construct measures of how cohorts of students in each school progress, taking into account the rate at which students with similar characteristics progress in other schools. This type of measurement is referred to as “value added”.

Value added measurement in Victoria will become possible in the near future, once we have a unique identifier for every student and once we are able to use NAPLAN data to track cohorts over a three year period as they progress through their schooling from Year 3 to Year 5, from 5 to 7 and from 7 through to 9.

In the meantime, there is another type of value added measurement we can use – contextual value added. Recently, we have been working, with support from Professor Stephen Lamb from the University of Melbourne, to develop contextualised value added measures using data from Victorian government schools. Many of these developments draw on the work of Professor David Jesson from the University of York, UK.

Rather than looking at the growth in learning outcomes for individual students, contextual value added measures look at the differences in school performance across a range of outcomes after adjusting for differences in student background characteristics between the schools, including the student learning outcomes of each school’s students. We adjust for the social composition of the school and we adjust for the academic composition.

Importantly, contextualised value added measures allow us to do two things.

First, they allow us to look at outcomes beyond the student learning domains, so that we can understand how we’re performing against measures of student engagement and wellbeing and measures of students’ transitions and pathways. They allow us to measure all of what we value.

And secondly, they focus the measurement on the whole school performance, rather than on that of individual students or teachers. These measures cannot, for instance, be used to attribute performance to a particular cohort of students as they are never based on only one cohort; they use contextual data and broad measures of student outcomes across the entire school.

Intake adjusted performance measurement

We know that a multitude of factors influence school performance outcomes: from parent and community values, to teachers’ content knowledge and instructional expertise, and through to the capacity of the school’s leadership team. Some of these factors we can measure. Many we either cannot or do not. A key strength of contextual value added measures is that they take account of the factors we can measure and tell us how much variation between schools is explained by them. By inference, the unexplained component – the value added – can then be attributed to what we’re not accounting for. A part of this is the performance of the school.

In Victoria, we plan to use the term intake adjusted performance to describe these measures, rather than ‘contextual value added’. Doing so will reserve the term value added for the time when we can develop student level growth measures.

In one sense, intake adjusted performance measures are easy to explain: they measure the performance of each school after taking account of the factors we know make the biggest difference to the variations in outcomes between schools.

To take a concrete example, we know that where students live can make a big difference to where they go after leaving secondary school. For instance, there are often fewer further education and employment options in rural areas compared to metropolitan areas. So we take the rurality of the school into account when measuring the success of its post-school destination outcomes – we “adjust” for rurality.

But once we get past this basic notion – that we adjust for the factors we know make the biggest difference – things start to get more complex. This is because different factors affect different outcomes in different ways.

Taking another example, we know that the school completion rate of Indigenous students is around half that of non-Indigenous students. So we need to take account of the Indigenous composition of each school’s cohort in considering their post-compulsory outcomes. Add this to the adjustments made for rurality and other factors and things quickly get very complex.

Typical school intake characteristics that we measure and can take account of include:
- a measure of the school’s academic composition
- the school’s Student Family Occupation (SFO) Density
- the proportion of students funded under the Program for Students with Disabilities (PSD)
- the proportion of Indigenous students
- the proportion of refugee students
- the proportion of students with English as a Second Language (ESL)
- the school’s rurality, and
- the school size.
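In schematic form – the notation here is illustrative only, with the exact specification given in the technical appendix – adjusting an outcome for these intake characteristics amounts to modelling each school’s outcome as a function of its intake measures:

    y_s = \beta_0 + \beta_1 x_{1s} + \beta_2 x_{2s} + \cdots + \beta_k x_{ks} + \varepsilon_s

where y_s is the outcome for school s, each x_{ks} is one of the intake measures listed above, the fitted value \hat{y}_s is the performance we would estimate given the school’s intake, and the residual y_s - \hat{y}_s is the part of the outcome that the intake measures do not explain.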

The technical appendix at the back of this document gives a complete description of how the intake adjusted performance measures are calculated, including the intake characteristics that make a difference to each of our outcome measures, and which measures of schools’ academic composition are used for each outcome.

Presenting intake adjusted performance

With intake adjusted performance measures, what we’re ultimately interested in is the extent to which each school is performing higher than, lower than, or broadly similar to the level of performance we might estimate given the intake characteristics of its student population.

Based on what we know about the effect each intake measure has on each outcome, we can plot the estimated performance of a school and then measure the gap between that estimated performance and the actual performance. We call this gap the standardised residual, as it is measured in units of standard deviation.

For our purposes, a standardised residual between -1 and 1 means that the school is performing within a similar range to other schools given their intake characteristics. A standardised residual lower than -1 means the school is performing at a lower level, while a value of more than 1 means the school is performing at a higher level.

The standardised residuals tell us whether the school, taking into account the students it has, is performing higher than, lower than, or broadly similar to other schools, taking into account the students they have.

We can plot these residual scores on a chart. Figure 1.1 gives an example for a school’s performance on the Year 5 NAPLAN Reading test, where the blue bar represents the school’s standardised residual and the horizontal scale is in units of standard deviation.

[Figure 1.1: Year 5 NAPLAN Reading: a school with intake adjusted performance at the higher level. The horizontal axis runs from Lower (-1) through Similar to Higher (1).]

The critical factor in Figure 1.1 is not how far the blue bar extends to the right – away from the middle of the chart – but whether or not it crosses the 1 line. In this case it does, showing that the school’s outcome is more than a full standard deviation higher than what we might estimate after adjusting for the appropriate intake characteristics.
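To make the banding rule concrete, here is a minimal sketch in Python. The function name and the example values are invented for illustration; the thresholds are the -1 and 1 lines described above.

    def classify_standardised_residual(z):
        # Band a standardised residual using the -1 / +1 thresholds described above.
        if z > 1:
            return "Higher"   # more than one standard deviation above the estimate
        if z < -1:
            return "Lower"    # more than one standard deviation below the estimate
        return "Similar"      # within the similar range, given the school's intake

    # Hypothetical values echoing Figures 1.1 and 1.2:
    print(classify_standardised_residual(1.3))   # Higher
    print(classify_standardised_residual(-0.2))  # Similar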

Figure 1.2 shows another example. Even though this school’s standardised residual is slightly negative – to the left of the middle of the chart – the important thing is that it is still between -1 and 1, meaning that the school’s intake adjusted performance is within a similar range after adjusting for its intake.

[Figure 1.2: Year 7-10 Real Retention Rate: a school with intake adjusted performance at a similar level. The horizontal axis runs from Lower (-1) through Similar to Higher (1).]

The Details

The technical appendix to this document gives a detailed account of the principles and processes used to construct the intake adjusted performance measures, including which intake characteristics are adjusted for in the calculation of each outcome.

Using intake adjusted performance data

In Victorian government schools, we have long-established ways of working with different types of data – for instance, we favour consistent patterns in the data over point-in-time figures; we use absolute measures to monitor trends over time and relative measures to determine areas for improvement. We reserve judgement in our interpretations. We do not apportion blame for the past but instead focus on the desired future, seeking to answer the questions ‘what would it take to improve these results?’ and ‘how will we know if it’s working?’

And just as we would never use a single piece of assessment data to give a definitive understanding of a student’s learning, we would not use intake adjusted performance as a definitive measure of our school. These data provide us with another perspective – they can highlight areas where we’re doing well, and areas where we may need to focus additional attention and support. But they should always be interpreted alongside the other data we have in our schools.

Importantly, intake adjusted performance measures tell us how we’re going after adjusting for our intake, but they do not tell us about our students’ absolute outcomes and they do not replace or supersede those measures. The different types of data we have available are complementary, and it is the cumulative weight of the evidence we have about our performance that allows us to set out our improvement plans with confidence.

As always, in using these data we need to bear in mind that data alone never provide a complete picture of a school’s performance – for that, we must always add our own contextual and professional knowledge to our interpretations. Intake adjusted performance data, as with all data, are only a starting point for the professional discussions about how we can further improve outcomes for students. These discussions must be moderated by the contextual factors that influence our work – the expectations of our local communities, the organisational structures that we work within, our shared goals and our professional practice itself – the factors that are not captured by the data.

Working towards value added performance measures

By late 2010, the first cohorts of Australian students will have undertaken two successive NAPLAN tests. These students will have progressed from Years 3, 5 and 7 in 2008, through to Years 5, 7 and 9 in 2010. It will be the first chance Australia will have to investigate the growth of students at a “whole-of-population” level.

It will also be an opportune time to investigate how value added measurement can best work in an Australian context. As part of a four-year plan to deliver the goals of the Melbourne Declaration on Educational Goals for Young Australians, governments around the country have agreed that they will, “where appropriate, develop value added measures for schools’ performance and analysing student results over time.”¹

These new value added measures will provide us with another useful perspective on the success of our teaching and learning. It is important, though, that we don’t lose sight of the other outcomes we value: those relating to students’ engagement and wellbeing, and to students’ transitions and pathways. The intake adjusted school performance measures will continue to provide us with a fair measure of our performance in these areas.

Ongoing development

The introduction of intake adjusted measures of school performance in Victoria is the first phase of an ongoing process of development. As we learn more about how these measures can be used to support schools in their improvement efforts, we will work to enhance the underlying statistics: using a wider range of intake measures where this makes sense and including new data sets as they emerge.

¹ MCEETYA four-year plan 2009–2012: A companion document for the Melbourne Declaration on Educational Goals for Young Australians, MCEETYA, 2009 (http://www.mceetya.edu.au/verve/_resources/MCEETYA_Four_Year_Plan_(2009-2012).pdf)

2. Technical Appendix

Working parameters

This model of intake adjusted school performance measurement was constructed under a set of working parameters that influenced some of the methodological decisions.

Importantly, the intake adjusted measures were developed using real world data and, as a result, are subject to all the characteristics of real world data, including data entry errors, missing cases and even contextual circumstances in some schools that mean they simply don’t fit “the model”.

In a similar vein, it is worth remembering that the data used to construct the model were not originally designed for this purpose. This doesn’t mean that they can’t be used in a meaningful way to construct intake adjusted measures, but we do need to be mindful of any resulting limitations.

Design principles

Following are the principles under which the intake adjusted school performance measures were developed. Included under each principle is a short description or example of some of the implications the principle has for the overall model.

(1) The measures must be applicable to all Victorian government Primary, Secondary and Primary/Secondary schools wherever possible.

For this first iteration of these measures, only Primary, Secondary and Primary/Secondary school types have been included in the modelling. We will continue work to include other school types, such as Special schools, over time.

2011 Update: Community Schools and select entry schools have been excluded from the model as the intake measures used do not adequately account for their performance.

(2) Only existing, readily available data sets can be used.

Another way to put this principle is to say that new data could not be developed specifically for the purposes of constructing these measures – it is the principle of “collect once, use many times.” In the interests of minimising school workload and using the available data for a multitude of purposes, what was available was used. The key implication for the methodology is that many of the data available to Victorian government schools are already aggregated; they are not reported at the student, class or teacher level, and so the methodology could not take account of the multilevel nature of the data.

(3) The methodology must be replicable across a defined set of outcome variables relating to Student Learning, Engagement and Wellbeing, and Transitions and Pathways.

Finally, the methodology needed to be applied across the range of outcomes that are valued and measured in Victorian government schools. It was of little value, for instance, to develop a procedure to calculate intake adjusted performance measures for VCE that couldn’t then be applied to attendance data.

An outline of the model

The model is constructed using a series of multiple ordinary least squares linear regressions.

Regression analysis is a statistical technique that gives us an understanding of how one variable (the dependent variable) tends to change when a number of other independent variables are varied.

For simplicity, we can refer to the dependent variable as the outcome we’re interested in (such as teacher judgements against the VELS), and the independent variables as the contextual intake measures we have for our schools, such as SFO or school size.
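As an illustration of the general technique only – the data below are randomly generated and the variable names are invented, so this is a sketch of the approach rather than the Department’s actual model – a single outcome can be regressed on two intake measures and the residuals standardised as follows, using Python with NumPy:

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented illustration: 200 "schools", two intake measures
    # (an SFO-like density and a rurality indicator) and one outcome.
    n = 200
    sfo_density = rng.uniform(0.0, 1.0, n)
    rural = rng.integers(0, 2, n).astype(float)
    outcome = 500 - 40 * sfo_density - 10 * rural + rng.normal(0, 15, n)

    # Design matrix with an intercept column, fitted by ordinary least squares.
    X = np.column_stack([np.ones(n), sfo_density, rural])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

    # Estimated (intake adjusted) performance and the raw residuals.
    estimated = X @ beta
    residuals = outcome - estimated

    # Standardise the residuals into units of standard deviation;
    # values beyond -1 / +1 then mark "lower" / "higher" performance.
    standardised = residuals / residuals.std()
    print("schools above the 'higher' line:", np.sum(standardised > 1))

In the full model, a separate regression of this kind is estimated for each outcome measure, with the set of intake measures chosen to suit that outcome, as described in the sections that follow.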

