2017 K12 ACADEMIC REPORT
TABLE OF CONTENTS

Letter from Nate Davis, Executive Chairman, and Stuart Udell, Chief Executive Officer ... 2
CAO Foreword: Changes in State Testing and School Accountability ... 4
K12 Public School Programs: Performance Analysis and Innovation ... 7
  The "State" of State Testing in 2015–2016 ... 8
  Market Demand for Online Learning ... 28
  K12 Driving Innovation: Accountability Dashboards ... 40
Appendices ... 47
  Appendix A: School Comparisons to the States (2015–2016) ... 48
  Appendix B: Free and Reduced Price Lunch and Special Education Eligibility by School Compared to State ... 99
  Appendix C: K12 Private School Profiles (2015–2016) ... 102
  Appendix D: Alphabetical Guide to Schools Included in 2017 K12 Academic Report ... 103

This report contains certain forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. We have tried, whenever possible, to identify these forward-looking statements using words such as "anticipates," "believes," "estimates," "continues," "likely," "may," "opportunity," "potential," "projects," "will," "expects," "plans," "intends," and similar expressions to identify forward-looking statements, whether in the negative or the affirmative. These statements reflect our current beliefs and are based upon information currently available to us. Accordingly, such forward-looking statements involve known and unknown risks, uncertainties, and other factors that could cause actual academic performance to differ materially from those expressed in, or implied by, such statements.
These risks, uncertainties, factors, and contingencies include, but are not limited to: test result presentations and data interpretations; descriptions of testing and academic outcomes; individual school, grade, and subject performance reporting; educational achievements; the potential inability to further develop, maintain, and enhance our curriculum products, instructional services, and teacher training; the reduction of per pupil funding amounts at the schools we serve; reputation harm resulting from poor academic performance in the managed schools with whom we contract; challenges from online public school or hybrid school opponents; failure of the schools we serve to comply with applicable education requirements, student privacy, and other applicable regulations; inability to recruit, train, and retain quality teachers and employees; and other risks and uncertainties associated with our business described in the Company's filings with the Securities and Exchange Commission. Although the Company believes the expectations reflected in such forward-looking statements are based upon reasonable assumptions, it can give no assurance that the expectations will be attained or that any deviation will not be material.
A Letter from Nate Davis, Executive Chairman, and Stuart Udell, Chief Executive Officer

The fifth annual K12 Academic Report continues our commitment to accountability and transparency. It includes all K12 online and blended public school programs with publicly available state test results for 2015–2016 in grades 3–8 English Language Arts and/or Reading and Mathematics, as well as high school assessments in English and Mathematics/Algebra 1.

This report provides test results for 2015–2016 comparing school performance to state performance and shows the difference between school and state performance for representative schools. We are encouraged to see certain exams and grades in some schools that have exceeded the state proficiency percentages, and we have instituted programs to improve academic performance across all the schools we serve.

The focus on improving instruction in 2015–2016 was (1) reporting critical data to schools in a timely manner so that midyear adjustments could be made, and (2) expanding and refining the Instructional Coaching program to strengthen teacher effectiveness in English Language Arts/Reading and in Mathematics. Teachers routinely received coaching support from experienced teachers in the online and blended environment. Although this transformation across all the schools we serve will likely take more than one year, we know that this investment in teachers is an investment in helping students learn and achieve.

Our analyses indicate that many K12 school1 programs continue to underperform their states in Mathematics, which is not uncommon in public schools with high proportions of economically disadvantaged students.2, 3 We are still seeing the impact of poverty: students who are eligible for free and reduced price lunch continue to underperform students who are not eligible for subsidized meals. Again, this is a common reality in brick and mortar public schools as well.

All schools need to meet the nonacademic needs of students who suffer from the broader impact of poverty. K12 has taken and continues to take this challenge seriously. We expanded our Family Academic Support Team (FAST) initiative to mitigate many of the nonacademic challenges facing students. While supporting the individual needs of students, in 2015–2016 we also initiated a national instructional coaching program for both new and returning teachers to increase their abilities to support every student. And we sustained our ongoing research initiatives to determine the efficacy of instructional programs.

Leveraging the research findings and best practices within our schools and in the industry, we developed a new Academic Excellence Framework as a guide and a set of criteria to improve instructional effectiveness in the online learning environment. This new plan was launched in 2016–2017 across all our managed public school programs, and we will be reporting on its efficacy in future academic reports.

We have extended our view of persistence: students who remain continuously enrolled for three or more years continue to outperform students who are enrolled one year or less. Again, this is a reminder that the impact of mobility occurs in brick and mortar schools as well as in online and blended schools. Students need stability in their educational environment through graduation to be able to succeed. Our Students First initiative this year also included the introduction of a customizable graduation planning tool to keep students on track for commencement. The tool features a centralized repository with complete course credit history to help schools thoughtfully manage each student's personalized graduation roadmap. With a convenient dashboard and data analytics capabilities, the tool identifies credit gaps so that educators can step in and provide assistance exactly when it is needed, rather than after it is too late.

One of the distinct advantages of online and blended learning environments is that many more data points are readily available to heads of school than in traditional brick and mortar schools. While we protect individual student data consistent with state and federal privacy laws, aggregated student engagement information in the online and blended environment helps us to understand learning patterns and how students choose to use their instructional time. Our goal is to identify the different ways we can motivate students to learn rigorous content while stimulating their engagement.

Online and blended schools and programs face many of the same challenges as brick and mortar schools. We continue to share what we have learned through blogs, white papers published throughout each year, and presentations at educator and policy meetings. We look for partnerships across the online and blended learning environment.

The K12 Academic Report is part of our broad research efforts at K12. We are committed to continuing to research the relationship between student achievement and variables such as school structure, teacher effectiveness, and learner preferences for synchronous or asynchronous instructional sessions, as well as any other engagement behaviors that will help us better meet the needs of every student. We regularly collect and examine data at the classroom, school, regional, and national levels to ensure that we are doing everything possible to support student learning. These data hold promise for enhancing the learning outcomes of many students who struggle in traditional school environments and will enhance the learning for the many advanced learners in online and blended schools.

In 2015–2016, K12 served more than 100,000 students in grades kindergarten through grade 12 and graduated more than 5,800 high school seniors. We are proud of the families who choose K12 managed public schools as well as those who use our course offerings at their local traditional brick and mortar districts/schools. These families are searching for the best solutions for their students, and our goal is to meet and exceed their expectations.

We will extend our research to cover new initiatives in future reports as well as in research briefs, white papers, and blogs. All of us at K12 are committed and dedicated to supporting the academic success of students and families who choose the online learning environment. We know that we can only succeed when our students succeed, so we begin and end each day with "Students First."

Nate Davis, Executive Chairman
Stuart Udell, Chief Executive Officer

1 This report sometimes refers to "K12 schools" or "our schools" or "K12 students" as a shorthand way to describe the online and blended public schools we serve pursuant to a contract with an independent not-for-profit board or school district governing board. We do not mean to suggest or imply that K12 Inc. has any ownership or control over those schools. Because the independent boards seek a managed contractual arrangement, the references to "K12 schools" and similar language are simply for ease and do not describe a legal relationship. We are honored to be selected as a vendor to the public boards we serve.
2 J. Isaacs & K. Magnuson, Income and Education as Predictors of Children's School Readiness (Washington, DC: Center on Children and Families, Brookings Institute, 2011).
3 The Impact of Poverty on Student Outcomes (Hanover Institute, January 2015).
CAO FOREWORD: Changes in State Testing and School Accountability

The 2017 K12 Academic Report continues our commitment to effective practices and innovation designed to improve the learning experience for students. The main body of this Academic Report is structured to focus on three areas of interest. First, we present a description of the continuing changes in state testing programs, with examples of results for grades 3–8 and high school. Second, we provide an update on market demand for new approaches to online learning. Third, we preview an innovative approach to school accountability that focuses on students. In the Appendices, we report the 2015–2016 assessment results and demographics for the online and blended public school programs that K12 managed during that year.

The "State" of State Testing in 2015–2016

States have historically wanted autonomy in establishing curricula and testing programs. While the consortia, established through grants from the federal government in 2010, appeared to have caused states to agree on common assessments (PARCC and SBAC),4 states began withdrawing from these collaborative ventures in 2014–2015. The state testing environment continued to change in the 2015–2016 school year. More states chose to depart from the consortia, leaving more and more states with their own state assessment programs.

The number of states using PARCC shrank from 11 states plus the District of Columbia in 2014–2015 to eight states plus the District of Columbia in 2015–2016, and the number using SBAC fell from 18 in 2014–2015 to 15 in 2015–2016. These shifts resulted in six states moving to their own new state assessment systems in 2015–2016. Only 23 states used the same assessments in 2015–2016 that they had administered in prior years.
Of the 33 states plus the District of Columbia in which K12 managed public school programs, the number of states with new state testing programs was 11 in 2014–2015 and 11 in 2015–2016.5 This continued the challenges in interpreting school performance year over year.

K12 works diligently to improve the learning experience and the learning outcomes for students who choose to participate in online and blended schools. In order to ensure that we are making the right decisions about teacher and administrator training, curriculum structure, interim assessments, etc., we have developed several different ways to support credible and valid interpretation of academic performance year over year in such a changing testing environment. In this report, the reader will see examples of one approach: comparing school performance to state performance by subject and grade to understand the extent to which schools are performing on par with the state aggregate percentage of students at or above proficiency. In other documents produced through our rigorous research program, we also report on school comparisons using a methodology that normalizes scores around proficiency cut scores. All of our research is focused on improving teaching, the curriculum, and learning in the environment for students who choose to attend an online or blended school managed by K12.

4 Partnership for Assessment of Readiness for College and Career (PARCC) and the Smarter Balanced Assessment Consortium (SBAC).
5 Calculated from data found in the following articles representing grades 3–8 and high school: L. Jurkowitz & S. Decker, "The National Testing Landscape," Education Week, /map-the-national-k-12-testing-landscape.html, and S. Bannerjee, "State Testing: An Interactive Breakdown of 2015-16 Plans," .html
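The school-to-state comparison described above reduces to a simple difference between a school's percent-proficient figure and the state aggregate, by subject and grade. A minimal sketch of that arithmetic follows; all names and numbers in it are hypothetical illustrations, not data drawn from this report.

```python
# Illustrative sketch of the school-vs.-state comparison: the difference
# between the percentage of a school's students scoring proficient and the
# state aggregate percentage, by subject and grade.
# NOTE: all names and figures below are hypothetical, not report data.

def proficiency_gap(school_pct: float, state_pct: float) -> float:
    """Return school percent-proficient minus state percent-proficient.

    A positive value means the school exceeded the state aggregate;
    a negative value means it trailed the state.
    """
    return round(school_pct - state_pct, 1)

# Hypothetical grade 5 results (percent proficient).
school_results = {"Grade 5 Math": 41.0, "Grade 5 ELA": 55.5}
state_results = {"Grade 5 Math": 47.0, "Grade 5 ELA": 52.0}

gaps = {
    subject: proficiency_gap(school_results[subject], state_results[subject])
    for subject in school_results
}
print(gaps)  # {'Grade 5 Math': -6.0, 'Grade 5 ELA': 3.5}
```

Because each state (or consortium) sets its own tests and cut scores, a gap computed this way is only meaningful against the matching aggregate for that same assessment and year, which is why the report compares each school to its own state or consortium rather than across states.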
With 11 states withdrawing from the testing consortia and establishing their own state-specific testing programs in 2015–2016, it becomes challenging to fully understand whether schools are becoming more or less effective, since the measures themselves are changing. At the same time, we continue to see school and state results reported between the end of the school year and the beginning of the second quarter of the next school year. This lag between testing and reporting makes it impossible to intervene in a timely manner so that students who need additional academic support can be ready for the following school year. Stability of state testing programs and more timely delivery of results back to the schools are important to helping each student learn at grade level.

K12 continues to report school results from 2015–2016 in terms of the percent proficient by grade for English Language Arts or Reading and Mathematics in grades 3–8, and their equivalent content areas in high school. And, because some of the managed public schools are still part of PARCC, some are part of SBAC, some have used the same state-specific testing programs for several years, and others have launched a new testing program in 2015–2016, we compare school results with either the consortium aggregate or with the state aggregate.

Market Demand for Online Learning

K12 continues to innovate in response to market demands. Our partnership with urban school districts provides opportunities for delivering a blended learning model for students. The Chicago Virtual Charter School, founded in 2006, was one of the earliest comprehensive blended models in the country. Public districts and schools want to incorporate technology into teaching and learning, and K12 is eager to support them to best meet and exceed their needs. Additionally, K12 understands the necessity of preparing students both academically and technically for college and career opportunities in their future.
Over the course of the 2016–2017 school year, K12 expanded career technical education (CTE) programs across six schools and up to eight different CTE areas of focus.

K12 Driving Innovation: Accountability Dashboards

We continue to study the effects of student mobility and poverty on academic performance. Because both mobility (movement from school to school) and poverty have long been recognized as having a negative effect on student learning, it makes sense to recognize that "success" in a school with high mobility and high levels of family poverty may not be the same as "success" in a school without those external pressures. It is time to use these findings to innovate school accountability systems to recognize that all students are not the same and that measures of success should reflect the student and family populations that each school serves, along with the unique mission of each school. We believe that measures of school effectiveness need to take these differences into account through a student-centered accountability model.

The Every Student Succeeds Act (ESSA) gives individual states more flexibility in shaping their accountability systems and assessments. We anticipate that states will make varying use of summative and interim assessments to measure within-year growth along with movement toward standards mastery. We hope that many states will revise their growth models to incorporate interim results as well as factoring in both mobility and poverty. Finally, we anticipate that schools and states will choose to report information that their stakeholders (families) are interested in, such as teacher turnover, student attendance, etc.
In the section on K12 Driving Innovation: Accountability Dashboards, we offer a dashboard approach that supports a new and more transparent reporting model, recognizing additional measures that contribute to student success. Everything we examine and research in our online and blended school programs is focused on improving the learning experience and outcomes for the students and families who choose this option for public education. We remain committed to this goal.

Margaret Jorgensen, Chief Academic Officer
K12 Public School Programs: Performance Analysis and Innovation
The "State" of State Testing in 2015–2016

State testing programs have long held great potential and caused great frustration. From educators' perspectives, these programs improve teaching and learning. From the perspectives of students and families, they take too much time away from learning and put too much stress on students to demonstrate "on demand" what they have learned. In 2015–2016, these same potential benefits and costs remained. The tension between these two positions points to important information that both educators and families need to know: (a) What do students know and what do they not know? and (b) How do we reduce the length (and stress) of the testing experience while still capturing valid and reliable information about each student's learning? The assessment results provide, after all, an important, reliable, and valid source of information about what students know and can do.

In addition to these foundational tensions, states have historically wanted autonomy in setting content standards, selecting testing vendors, specifying the content to be assessed, and setting the cut scores that determine student proficiency. Adding complexity to this is the process of states periodically updating learning standards, leading to new state assessments, shifts in professional development for teachers, and new learning goals for students. These factors have resulted in state assessment results that are not directly comparable across states, nor even across years within a state. The National Assessment of Educational Progress (NAEP) does provide state-by-state comparable information, but the assessment is administered to samples of students, not entire populations; it is administered only at certain grade levels and in certain content areas; and it is not administered on an annual basis.
While NAEP provides a valid and reliable longitudinal view of education in the United States, it does not provide information to drive either instruction or school improvement.

The introduction of the Common Core State Standards (CCSS) and the creation of two multistate testing consortia were the recent attempts of the federal government to bring commonality across the states with respect to both content standards and assessment rigor and experience. Many educators, families, and state leaders pushed back on these efforts from the beginning, and by 2015–2016 the depth of discontent with the common content standards, and especially with the testing consortia (the Partnership for Assessment of Readiness for College and Career [PARCC] and the Smarter Balanced Assessment Consortium [SBAC]), was evident. Indicators of discontent included families opting their students out, refusing to have their children participate in PARCC or SBAC testing; administrators noting that the lag time between test administration and score reporting was the same as or longer than that of state-specific testing programs; educators complaining about how early in the school year the assessments were administered (February, March, April); and virtually everyone complaining about the extended time spent in testing as opposed to learning.

For states adopting new state assessments, there are additional delays resulting from the processes of setting proficiency standards on those new assessments. The political process alone can take months, beginning with convening standard-setting committees, reviewing data against the proposed proficiency standards, and, finally, obtaining state board of education approval of those standards.
Federal intervention in education, via the Race to the Top funding of both the adoption of CCSS and membership in the testing consortia, was an opportunity to at least allow direct comparisons of state academic performance in grades K–12. But the notable dropouts from each of the two testing consortia have essentially limited comparing school performance year over year across states.

The reality of 2015–2016 and going forward will be that more states are likely to drop out of the consortia, seeking less expensive assessment alternatives. Families will continue to push back on multiple days of testing in vocal, if not large, numbers. Scoring and reporting will continue to take months, not weeks, complicated periodically by new proficiency standards adopted through a lengthy political review process.

But the most important issue not being addressed is the quality and timeliness of the assessment results to help teachers better meet the needs of families. In a survey conducted with teachers in Tennessee, 65 percent of educators said that assessments do not help redefine teaching practices.6 In response to this deficiency, K12 is innovating around using interim assessments and predictive data so that teachers know whether students have mastered the content needed to reach proficiency and are trending positively toward goals. We are developing methodologies to allow for valid direct comparisons across different assessments (interim and summative) and to ensure that teachers have the information they need, when they need it, to ensure that students are on track to proficiency and higher achievement.
While we recognize the need for state boards of education and other regulatory entities to hold public schools to specific performance standards, the core of school performance and improvement is student learning, and student learning is best informed by timely, reliable, and valid assessment information.

Length of Time Between Testing and Reporting

In the 21st century, when current information is available in real time in banking, entertainment, news, and more, it is concerning that testing information takes weeks, or even months, to become available to families and educators. If the primary purpose of assessment is to tell teachers and families what students know and can do, and what they need to learn to progress through the grade levels, delays of weeks or months make that information less valuable. This delayed data can help when making decisions about the health of a whole school or district, but the irony is that these aggregate data come from students, and those students are not being helped to improve their learning in a timely manner.

Figure 1 shows the lag between the first day of student testing and the date those scores were available to the schools for state assessments administered in grades 3–8. The 21 schedules reported in Figure 1 are for those states that have either a large school (enrollment 3,000) or multiple schools managed by K12 in 2015–2016. Note that only two states (Florida and Texas) reported results before the end of the school year in which the assessments were administered.
Three states delivered results after October 1, 2016, and 16 states delivered reports during the summer. This suggests that, for most students, the opportunity to intervene and remediate before the 2015–2016 school year ended was not available, and in too many states, data were not available to customize instruction before schools and teachers developed plans for the next school year.

The initial purpose of the standards-based assessment movement was to define and measure student performance on the specific content standards for each subject and grade that students need to have mastered in order to be successful in the subsequent grade. Unfortunately, the long period of time between test administration and the receipt of student score reports makes it very difficult for teachers to intervene and provide the appropriate remediation for students before they begin the next school year. This lag between test administration and report availability has several causes. Some of the delays are caused by the scoring of open-ended questions in addition to multiple-choice questions. Some are caused by equating processes to ensure comparability year over year. Some are caused by the standard-setting process when state testing programs change; that process is political and requires the adoption of proficiency category cut scores by state boards of education. Some of the delays are caused by vendor errors and technology issues. Regardless of the reasons, the use of assessment results to intervene in a timely manner for students who need additional instruction is delayed until the next school year, when these students likely have different teachers and are beginning the new school year already behind.

6 J. Gonzales, "Many Tennessee Teachers Find State's Standardized Assessments Unhelpful, Survey Says," Tennessean.com (August 9, 2016).
FIGURE 1: State Testing Time Between Testing and Reporting (Grades 3–8, 2015–2016 School Year). [Chart: days between the first day of testing and the delivery of results, by state.]

Figure 2 shows the same pattern for high school assessments.

FIGURE 2: State Testing Time Between Testing and Reporting (High School, 2015–2016 School Year). [Chart: days between the first day of testing and the delivery of results, by state.]

Note: The District of Columbia school only served students in grades 3–8; therefore, its data are reported in Figure 1.

The opportunity cost of this lag is dramatic. While there are many differences across industries, an analogous industry is healthcare. It is difficult to imagine that customers would be satisfied with this lag between seeing a doctor for a diagnosis and receiving the diagnosis and treatment plan months later.
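The lag plotted in Figures 1 and 2 is, in essence, the elapsed time between the first day of student testing and the date results reached schools. A minimal sketch of that calculation follows; the dates used are hypothetical examples, not the schedules reported in the figures.

```python
# Minimal sketch of the testing-to-reporting lag behind Figures 1 and 2:
# elapsed time from the first day of student testing to the date scores
# were available to schools. The dates below are hypothetical examples.
from datetime import date

def days_without_data(first_test_day: date, results_available: date) -> int:
    """Days between the start of testing and the delivery of results."""
    return (results_available - first_test_day).days

# Hypothetical state: testing begins in late March, results arrive in August,
# well after the school year has ended.
lag_days = days_without_data(date(2016, 3, 28), date(2016, 8, 15))
lag_weeks = round(lag_days / 7, 1)
print(lag_days, lag_weeks)  # 140 20.0
```

A lag on this order means roughly half the calendar from spring testing to the start of the next school year passes without actionable results, which is the timing problem the surrounding text describes.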
In the 2015–2016 school year, the landscape of state testing continued to shift. As shown in Figure 3, in 2010, 19 states had joined SBAC; 13 states had joined PARCC; and 13 states had joined both SBAC and PARCC. By May of school year 2015–2016, the total number of states still in one or both consortia was down to 20. Many states switched to new state-specific assessments. Most states using a consortium assessment followed the consortium's recommendations for cut scores to determine proficiency, but some set their own cut scores. There were other challenges as well. For example, Tennessee invalidated all state assessment results due to assessment scoring issues. Additionally, some states suspended school accountability ratings.7

One can hypothesize that the driver leading to withdrawal from the consortia was funding, since the Race to the Top funds were no longer available and states had to pay the full price for development, deployment, scoring, and reporting. Another hypothesis is that the market response, especially to PARCC, was that the tests were too difficult and required too many days of testing. From a policy perspective, education has always been a state's right. Without a significant benefit to each state from the consortia, and without continued strong support for the CCSS, it is not a surprise to see states reverting to their historical practice of building their own state assessments.
This movement has been driven by strong grassroots initiatives of parents and families to take back control over their children's education.8 Families have opted out in large numbers in some states.9 One source reported that, in 2014–2015, more than 675,000 students refused to participate in state testing across the United States.10 In addition, the so-called "Mommy Lobby" has been vocal about CCSS, and this has sparked a strong political backlash across a range of voter constituencies.

FIGURE 3: The Shifting Landscape of State Testing. [Charts: number of states participating in SBAC, PARCC, or both, 2010 versus 2016; nearly half of all states have dropped the SBAC and PARCC tests. Companion charts: weeks without data between test administration and results returned (minimum, maximum, and mean), for grades 3–8 and high school, with a maximum of 37.6 weeks in both.]

K12 Public School Programs Performance Analysis 2015–2016: Introduction