Ending Campus Sexual Violence


Ending Campus Sexual Violence: Outcomes from the Culture of Respect Collective Program
Allison Tombros Korman, MHS; Jennifer E. Henkle, MSW, CSW; Alexis Wesaw, MA
SEPTEMBER 2020

Table of Contents
I. Executive Summary
II. Introduction
III. Participating Institutions
IV. Collective Program Description
V. Methodology
VI. Results
VII. Discussion
VIII. References

Copyright 2020 by the National Association of Student Personnel Administrators (NASPA), Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, now known or hereafter invented, including electronic, mechanical, photocopying, recording, scanning, information storage and retrieval, or otherwise, except as permitted under Sections 107 and 108 of the 1976 United States Copyright Act, without the prior written permission of NASPA. NASPA does not discriminate on the basis of race, color, national origin, religion, sex, age, gender identity, gender expression, affectional or sexual orientation, or disability in any of its policies, programs, and services.

Executive Summary

The problem of sexual violence in America's colleges and universities is undeniable: one in four female, nearly one in four transgender/genderqueer/nonbinary, and one in fourteen male students will experience sexual violence while pursuing a four-year degree (Cantor et al., 2019). Institutions of higher education have a legal, and many would argue a moral, obligation to prevent and respond to sexual violence. But for many institutions, addressing sexual violence upstream, thinking beyond the immediacy of the problem, is a challenge so overwhelming that they don't know where to begin.

The Culture of Respect Collective, a NASPA signature initiative, is a two-year program designed specifically to address the enormity of this issue. Grounded in a comprehensive, evidence-informed framework, the program guides institutions of higher education through a rigorous process of self-assessment and targeted organizational change. This report examines the experience and outcomes of Collective Cohorts 1 and 2, particularly the meaningful programmatic and policy changes they made that further the goal of ending campus sexual violence.

At the end of their two years in the program, participating institutions completed or made progress on 85% of the objectives they set for themselves around strengthening sexual violence prevention and response efforts, and 77% saw increased collaboration between departments and colleagues in this vital work. Ninety-two percent of participating institutions also saw an increase in required prevention programming for undergraduate students, and there was an overall rise in institution-level interventions. This report considers the factors that facilitated and impeded institutions' success in implementing the program, as well as how the growing number of Collective institutions can impact higher education's understanding of the problem, and how best to address it.

Introduction

Over the last decade, sexual violence in higher education has been a focus for colleges and universities, student activists, parents and families, and the media alike. Particularly following the release of the 2011 Dear Colleague Letter (Ali, 2011) by the Obama administration, institutions of higher education began to see an increase in lawsuits and complaints to the Department of Education's Office for Civil Rights (OCR) by both reporting and responding parties; primarily, these focused on the handling of sexual misconduct cases at the institutional level (Anderson, 2019). As institutions responded to these complaints and students demanded action and accountability, it became clear there were too few resources available to address this issue. Specifically, there was a paucity of programs or frameworks to effectively guide colleges and universities in making the organizational and cultural change necessary to prevent sexual violence at the primary level of intervention.

In response to this gap, Culture of Respect developed the Culture of Respect Collective ("the Collective"). The Collective is a two-year program that brings together institutions of higher education that are dedicated to ending campus sexual violence and guides them through a rigorous process of self-assessment and targeted organizational change. The Collective is grounded in evidence-based practices and emerging evidence, with each diverse cohort relying on an expert-developed comprehensive framework, cross-campus collaboration, and peer-led learning to make meaningful programmatic and policy changes. The Collective was designed broadly to facilitate implementation at institutions of all types. Specifically, it was developed to include tailored technical assistance that would allow the information being shared to be meaningful and adaptable based on the specific needs of each institution and the student populations they serve.

ABOUT CULTURE OF RESPECT
In 2013, Culture of Respect was founded by parents of college-aged students who were alarmed by the high rate of sexual violence on college and university campuses. With a team of public health and violence prevention researchers from New York University and Columbia University and experts in advocacy, student affairs, higher education policy, and law, they created the Culture of Respect Engagement Blueprint (CORE Blueprint; Culture of Respect, 2017a), a six-pillar strategic road map that engages students, parents, faculty, administrators, health professionals, athletes, and other campus stakeholders in implementing the leading practices to shift campus culture to one that is free from sexual violence. In 2015, Culture of Respect became part of NASPA, where they continue to execute their mission: to build the capacity of educational institutions to end sexual violence through ongoing, expansive organizational change.

This report looks closely at the data and outcomes from the first two cohorts of the Collective. Cohort 1 began the program in January 2017 and Cohort 2 in January 2018, with each cohort graduating two years following their launch, in December 2018 and 2019, respectively. Participation was monitored throughout the two-year program, and staff worked directly with institutional representatives to support the completion of each programmatic goal as outlined in this report.

Participating Institutions

Institutions were recruited for participation in the program in a variety of ways, including via outreach among NASPA members and other higher education associations and groups, and through networks of colleagues and anti-sexual violence organizations. Cohort 1 recruitment included a partnership with the nationally recognized student-driven organization It's On Us, which provided participating institutions supplemental support in engaging student activists. To apply, institutions submitted an application and letter of support from a senior-level administrator and, upon acceptance, provided remuneration for their participation in the program.

In total, Cohorts 1 and 2 of the Collective consisted of 68 institutions of higher education: 53 in Cohort 1 and the remainder in Cohort 2. Thirty-six percent of participating institutions were identified as small colleges or universities (under 5,000 students); 38% were medium-sized (5,001 to 14,999 students); 15% were large institutions (15,001 to 29,999 students); and 10% were very large institutions (over 30,000 students). Across Cohorts 1 and 2, 62% were public institutions, 38% private institutions, 20% religiously affiliated institutions, and 10% were community colleges.

Figure 1: Map of participating institutions.

Collective Program Description

The Collective program model is grounded in the CORE Blueprint, an expert-developed framework for helping institutions address sexual violence (see Figure 2). The CORE Blueprint is organized around six key areas, the six pillars, that help ensure institutions are working to create a comprehensive institutional strategy for addressing sexual violence (Culture of Respect, 2017a).

Figure 2: The CORE Blueprint's six pillars: survivor support, with options on reporting; clear policies on misconduct, investigations, adjudications, and sanctions; multitiered education for the entire campus; public disclosure of statistics; schoolwide mobilization with student groups and leaders; and ongoing self-assessment.

Collective institutions began the program by establishing a multidisciplinary team of stakeholders from across the institution, led by one to two team leads; these individuals were often positioned in Title IX or prevention offices. These Campus Leadership Teams (CLTs) were in some cases a new group formed specifically to engage in the work of the Collective, while other institutions adapted an existing sexual violence or Title IX task force or working group. Culture of Respect provided guidance on the makeup of the CLT, including a list of potential stakeholders to engage, as well as those roles for whom participation was required, namely representatives from Title IX; prevention, wellness, and/or health promotion staff; faculty; senior-level administrators (vice president for student affairs or similar); and students, both undergraduate and graduate (as applicable). CLT size and makeup varied greatly by institution, with the smallest CLT containing five members and the largest 70; the average CLT contained 22 members.

Both the team leads and CLTs engaged in an onboarding process conducted via webinar by Culture of Respect staff, and team leads completed an initial survey ("Launch Survey") to assess their current institutional climate, program orientation process, and existing knowledge about sexual violence prevention and response.

The CLTs' first task was completing the CORE Evaluation (Culture of Respect, 2017b), a robust self-assessment instrument developed by Culture of Respect. Questions in the CORE Evaluation instrument are tied to CORE Blueprint recommendations from across the six pillars. The CORE Evaluation guides institutional leaders in inventorying their efforts to address sexual violence; identifying how these efforts are codified into policy; and assessing how this information is shared with the campus community and evaluated. The tool is updated annually to incorporate new or emerging research and practices.

Culture of Respect staff requested that the instrument be completed collaboratively within each CLT. Institutions met this request in a variety of ways, including meeting with the entire group over one or two sessions, planning a series of meetings between key staff members on the working group, or having one employee complete the assessment with some level of feedback from their colleagues. Although Culture of Respect staff presented the first as the preferred approach, institutions were encouraged to adapt this process to meet their needs and make it feasible for their campus to complete the instrument. If inconsistencies were apparent in submissions, Culture of Respect staff followed up with a phone call and, in collaboration with the CLTs, made any necessary changes. On average, CLTs took four and a half hours to administer the CORE Evaluation, either in one meeting or broken up into multiple meetings.

Each institution's CORE Evaluation results were analyzed electronically (see "Methodology") and by Culture of Respect staff, which informed the creation of a comprehensive baseline report provided to the institution's team leads. Each baseline report included:

- A numeric baseline score for each of the six pillars, as well as a cumulative score;(1)
- Qualitative feedback, organized by pillar, identifying institutional strengths in sexual violence prevention and response as indicated by the responses to the CORE Evaluation, as well as opportunities for growth; and
- A checklist summarizing consistency with key federal laws and guidance, as indicated by responses to the CORE Evaluation.(2)

Institutions also received an annotated copy of their CORE Evaluation results.

Each institution's baseline report was designed to guide the CLTs and team leads in creating an Individualized Implementation Plan (IIP): an actionable plan to improve their efforts to prevent and respond to campus sexual violence. IIPs were composed of a series of objectives developed by the CLT, informed by their report's opportunities for growth and checklist(s). Culture of Respect coached team leads in developing objectives that were specific, measurable, attainable, realistic, and timebound (SMART) and provided feedback on the initial draft via a rubric to inform the final IIP.

Year two of the program was dedicated to implementing the IIP. Team leads were charged with supporting their CLTs in operationalizing the objectives included in the IIP, either individually or via subcommittees (often organized by pillar). CLTs were encouraged to continue to meet monthly in their second year to foster accountability to the IIP process and to each other. In spring of year two, team leads submitted a midpoint IIP update to allow CLTs to take stock of progress to date and assess what objectives could realistically be achieved by the end of the second year, and also completed a Midpoint Survey to assess satisfaction with the program to that point.

(1) To establish a scoring schema for the CORE Evaluation, responses were assigned a point value based on how closely they aligned with best practice recommendations from the CORE Blueprint. Select responses were weighted because of their impact or difficulty of implementation.

(2) In September 2017, the Trump administration rescinded the 2011 Dear Colleague Letter and the 2014 Questions and Answers on Title IX and Sexual Violence, thereby impacting what fell under the umbrella of federal law and guidance (Ali, 2011; Lhamon, 2014). When Cohort 2 received their baseline reports in spring 2018, the checklist was adapted to include federal laws and guidance that still applied, while practices that were no longer required but continued to be considered good practice were moved to a second checklist, entitled "Recommendations Checklist." Cohort 2's baseline report also included sample comparison data to Cohort 1.

Figure 3: Collective program model. Program launch and CLT formation (January through March, year 1); baseline CORE Evaluation (spring, year 1); development of the action plan ("IIP") and IIP feedback (summer through fall, year 1); implementation, technical assistance, professional development, and peer-led learning (fall of year 1 through fall of year 2); and endpoint CORE Evaluation (November, year 2).

Throughout the program, but particularly in year two, team leads, CLT members, and other campus stakeholders were encouraged to participate in technical assistance, educational, and networking opportunities provided by Culture of Respect and NASPA (see Figure 4). These offerings were designed to increase knowledge around the prevention of and response to campus sexual violence and related topics so participants could more successfully operationalize their IIPs, and to connect program participants to learn and benefit from each other's knowledge and experience.

Figure 4: Professional development opportunities provided and participation.

- 22 professional development webinars (not including onboarding webinars). Examples: A Social Marketing Approach to Addressing the Normalization of Stalking on College Campuses; Blurred Lines: Student Led Discussions of Rape Culture in the Black Community at Historically White Institutions; How the Arts Break Barriers and Help Us Heal; and Community College, First Gen, and Military-Connected Students: Approaches to Sexual Violence Prevention & Response. Average participation: six webinars per institution throughout the Collective.
- 21 roundtable discussions. Topics: Certified Peer Education (CPE); prevention programming for the "Red Zone"; creating visual depictions (flow charts, infographics) of the reporting process; and discussions of Title IX staffing structures within various institutions. Average participation: three roundtables per institution throughout the Collective.
- Six national conferences, including the NASPA Annual Conference and NASPA Strategies Conferences. Average participation: one national conference per institution throughout the Collective.
- A group listserv for crowdsourcing innovative practices and solutions to problems faced in the field.

At the conclusion of their second and final year in the program, participating institutions readministered the CORE Evaluation using the same process and participants as at baseline (to the extent possible). Participants also submitted a final IIP update, documenting the extent to which they accomplished the objectives they created, and an endpoint survey ("Closeout Survey") to evaluate the change in knowledge and campus climate, as well as their program satisfaction.

Methodology

The data presented in this report were collected from several surveys: the CORE Evaluation, the Collective Launch Survey, the Collective Midpoint Survey, and the Collective Closeout Survey.

Each cohort completed their baseline CORE Evaluation at the start of the program, with Cohort 1 completing the CORE Evaluation 2nd Edition in early 2017 and Cohort 2 completing the CORE Evaluation 3rd Edition in early 2018. Team leads received a PDF copy of the evaluation, as well as a link to the evaluation instrument in Qualtrics. Instructions for completion included definitions of key terms and a list of data sources and documents needed. Raw response data from Qualtrics were exported to Stata (a data analysis and statistical software) for coding and analysis. Culture of Respect staff translated the data analysis into a comprehensive baseline report provided back to the institution's team leads, in addition to providing an annotated copy of the institution's evaluation responses, as submitted to Qualtrics.

Each cohort completed their endpoint CORE Evaluation at the end of the program, with Cohort 1 completing the CORE Evaluation 2nd Edition in late 2018 and Cohort 2 completing the CORE Evaluation 3rd Edition in late 2019. Endpoint evaluation raw response data were coded and analyzed in Stata, using exactly the same method that was used for the baseline data.

CORE EVALUATION: CHANGES FROM 2ND TO 3RD EDITION
The CORE Evaluation was designed as an instrument that would continually evolve to reflect the best and emerging practices in the field. As such, updates were made between the second and third editions, and continue to be made for subsequent cohorts. Between the two editions, questions were edited for clarity, as needed, and examples of new content added included:

- Employees as survivors of sexual violence (i.e., access to supportive services and information provided in employee trainings and materials);
- Accessibility of sexual misconduct policies;
- Use of informal resolution processes, including restorative justice;
- Availability of sexual health promotion services and programs; and
- Use of tools and processes to inform and standardize issuing of timely warnings.

This report features CORE Evaluation data from institutions that completed both their baseline and endpoint evaluations: 23 institutions from Cohort 1 and eight institutions from Cohort 2 (see Figure 5).(3) For the analysis presented in this report, aggregate baseline numeric scores for each pillar were compared to aggregate endpoint numeric scores for each pillar. Additionally, aggregate baseline scores for select questions were compared to aggregate endpoint numeric scores for select questions.

Figure 5: Collective methodology and timeline.

- Cohort 1 (53 new institutions): administered baseline CORE Evaluation, March through November 2017 (n = 42); administered endpoint CORE Evaluation, November 2018 through March 2019 (n = 23); rolled over to Cohort 2 (n = 3); did not complete program (n = 27).
- Cohort 2 (15 new institutions plus 3 rolled over from Cohort 1): administered baseline CORE Evaluation, May through June 2018 (n = 14); administered endpoint CORE Evaluation, December 2019 through May 2020 (n = 8); rolled over to Cohort 3 (n = 1); did not complete program (n = 9).

(3) Please see the "Discussion" section for additional information about factors that facilitated or impeded successful Collective completion and the most commonly cited reasons that institutions were not able to complete the program. Figure 5 includes institutions that were unable to keep pace with programmatic deadlines of their original cohort but did not wish to withdraw from the program; these institutions "rolled over" to the next cohort.

The CORE Evaluation instrument presents two limitations. First, because it relies on self-reported information, social desirability bias is a concern: institutions may have been hesitant to report any noncompliance with federal laws or admit to any fractures in their approach to addressing sexual violence on campus. Yet, the varied responses demonstrate that institutions were willing to be honest about their current practices as part of an effort to make meaningful programmatic changes. Additionally, because institutions differed in their approaches for administering the instrument, there was variability in how questions were answered. If relevant stakeholders were not consulted, some responses could have been recorded inaccurately.

In addition to the data from the CORE Evaluation, this report also features data from the Collective Launch Survey, the Collective Midterm Survey, and the Collective Closeout Survey, all designed by Culture of Respect staff (see Table 1 for survey distribution and response rates). Participating institutions' team leads and CLT members were invited via Qualtrics to participate in the surveys. The Launch Survey collected data on the current conditions on participants' campuses related to capacity to enact organizational change related to campus sexual violence, participants' experience with the Collective orientation, and participants' requests for professional development and training. The Midterm Survey, administered approximately one year after each cohort began, collected data on program implementation on campus and participants' experience with Collective program components. As the cohorts were coming to an end, participants were asked to complete a Closeout Survey, which collected data on campus climate related to efforts to address sexual violence and participants' experience with Collective program components. Ideally, the same participants from each cohort would complete all three surveys and their responses would be tracked over time; however, staffing changes at Collective institutions made it impossible to conduct a longitudinal analysis of data from these three surveys. For the analysis presented in this report, responses from individuals in Cohort 1 and Cohort 2 were aggregated by question, for each survey.

Table 1: Survey distribution and response rate.

Launch Survey (Cohort 1 / Cohort 2):
- Date administered: early 2017 / early 2018
- Participants invited: 149 / 33
- Responses: 75 / 10
- Response rate: 50% / 30%

Midterm Survey (Cohort 1 / Cohort 2):
- Date administered: early 2018 / early 2019
- Participants invited: 67 / 60
- Responses: 30 / 16
- Response rate: 45% / 27%

Closeout Survey (Cohort 1 / Cohort 2):
- Date administered: late 2018 / late 2019
- Participants invited: 125 / 50
- Responses: 26 / 10
- Response rate: 21% / 38%

Results

Changes in CORE Evaluation Scores and Successful Completion of Objectives

Cumulatively, participating institutions' CORE Evaluation scores increased from baseline to endpoint in all six pillars, ranging from an increase of 11% to 22% per pillar (see Figure 6). Individually, institutions' scores increased in five of the six pillars, on average, and by an average of more than fifty points from baseline compared to endpoint.

Figure 6: Change in CORE Evaluation scores from baseline to endpoint by pillar.
- Pillar 1, Survivor Support: 11%
- Pillar 2, Clear Policies: 17%
- Pillar 3, Multitiered Education: 19%
- Pillar 4, Public Disclosure: 14%
- Pillar 5, Schoolwide Mobilization: 22%
- Pillar 6, Ongoing Self-Assessment: 22%

IIPs included an average of 22 objectives across the six pillars; the minimum number of objectives was eight, and the maximum 67. On average, ten different individuals, offices, or departments were assigned responsibility for implementation of these objectives. This diffusion of responsibility was intentional in the program design and IIP development and helped to ensure that the work of creating institutional change was shared across the CLT. At the conclusion of the program, institutions had completed or made progress on an average of 85% of their objectives. Cumulatively, the greatest number of objectives completed occurred in Pillar 1, Survivor Support (71 objectives), and Pillar 3, Multitiered Education (87 objectives).

Changes in Federal Compliance and Recommended Practices

Baseline CORE Evaluation reports for Cohort 1 included a checklist of compliance with federal laws and guidance, including the Clery Act, the Family Educational Rights and Privacy Act (FERPA), and guidance from the Department of Education. Cohort 2 reports included a checklist for federal laws and guidance, but also a "Recommendations Checklist," which included practices that were no longer required after the 2017 guidance rescissions but were still considered good practice. Checklists were updated based on the responses provided in the institution's endpoint CORE Evaluation. At the conclusion of the program, institutions were newly compliant with an average of three federal law or guidance practices (Cohorts 1 and 2) and an average of two recommended practices (Cohort 2).

Key Findings by Pillar

SURVIVOR SUPPORT

The score for the Survivor Support pillar increased 11% from baseline to endpoint, with the greatest amount of change occurring around improvements related to creating a supportive environment for survivors to report sexual violence. Notable improvements included: expanding the content and clarity of survivor-centered processes and procedures in the institution's sexual misconduct policy (25%); establishing or clarifying amnesty policies (35%); clearly explaining the responsibilities of the Title IX coordinator/officer in policy (45%); and ensuring there was a team to coordinate the provision of services to survivors of sexual violence (43%). Also notable was a 200% increase in informing survivors how and when timely warnings about their assaults would be distributed to the campus community, thus reducing the likelihood of potential retraumatization.

CLEAR POLICIES

The score for the Clear Policies pillar increased 17% from baseline to endpoint, with the greatest increase related to changes in sanctions and sanctions language in policy. Institutions noted a 48% increase in the frequency with which they reviewed or revised their sexual misconduct policy: at endpoint, 25 institutions were reviewing them at least annually. Cohort 2 also saw a 100% increase in the formats in which policies related to sexual misconduct were accessible for those with disabilities and/or for whom English is not a first language. Across both cohorts, participants indicated an 89% increase in the use of the less legalistic and more neutral "reporting party" and "responding party" language in reference to pending or potential investigations.

MULTITIERED EDUCATION

The score for the Multitiered Education pillar increased 19% from baseline to endpoint; increases in this pillar were most strongly associated with prevention education provided to students. While some improvements in this pillar are attributed to increased completion rates for existing prevention education, institutions also added at least one additional dose of prevention education programming for incoming undergraduate (three institutions) and graduate students (two institutions), which increased endpoint scores by 83% and 175%, respectively. Institutions also indicated an increase in the frequency with which primary prevention and awareness programming was required for continuing undergraduate students (92%) and graduate students (57%). This increase for graduate students is particularly notable given the gaps in education for graduate students identified in the baseline CORE Evaluations. Other increases can be tied to increased campus-wide prevention (see Table 2).

Table 2: Increased campus-wide prevention efforts (score increase from baseline to endpoint, by institution-level intervention).
- Commemorates Stalking Awareness Month: 350%
- Primary prevention and awareness programming for students and training for employees recognizes the intersection of sexual violence and marginalized identities: 94%
- Hosts a campus-wide primary prevention and awareness campaign: 100%

PUBLIC DISCLOSURE

The score for the Public Disclosure pillar increased 14% from baseline to endpoint, with increases most notably attributed to increased communication with campus stakeholders about prevention and response, as well as sharing more information about the institutional strategy to address sexual violence. At endpoint, institutions more proactively shared the results of their climate surveys and incorporated discussions of sexual violence prevalence and institutional response into annual reports (57% and 100% increases, respectively). Participating institutions also expanded data collection and reporting on the prevalence of sexual violence among specific student demographics and identities, in the hope that these would inform additional support and prevention education accordingly.

SCHOOLWIDE MOBILIZATION

The score for the Schoolwide Mobilization pillar increased 22% from baseline to endpoint. The bulk of this increase was tied to increased student engagement. Schools offered peer educators compensation for the firs

