Sharing Justice Information - Bureau Of Justice Assistance


Sharing Justice Information: A Capability Assessment Toolkit

Anthony M. Cresswell
Theresa A. Pardo
Donna S. Canestraro
Sharon S. Dawes
Dubravka Juraga

Center for Technology in Government
University at Albany, SUNY
187 Wolf Road
Albany, NY 12205
Phone: (518) 442-3892
Fax: (518) 442-3886
E-mail: info@ctg.albany.edu
http://www.ctg.albany.edu

August 2005

This material is based upon work supported by the U.S. Department of Justice, Office of Justice Programs under NIJ award #2002-LD-BX-0004. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the Office of Justice Programs. © 2005 Center for Technology in Government, University at Albany-SUNY

Table of Contents

Participants
Executive Summary
Getting Started
Overview
  Why Assess Information Sharing Capability?
  Understanding Information Sharing Capability
  Critical Success Factors
  Using the Capability Assessment Toolkit
  Cycle of Planning and Capability Assessment Activities
  Collecting and Combining Data for Capability Assessment
  Dimensions of Information-Sharing Capability
  Planning and Organizing a Capability Assessment
Implementation Guide
  Applying the Capability Assessment Toolkit
  Phase One: Preliminary Planning
  Phase Two: Authorizing the Assessment
  Phase Three: Operational Planning
  Alternative Approach
  What Method Will Be Used to Review and Combine Ratings?
  Phase Four: Conducting the Assessment
  Phase Five: Developing Action Plans
Conclusion
Appendices
  Appendix 1. Case Example: Reducing the Number of Parole Violators Not in Custody
  Appendix 2. Memos to Leaders and Participants
  Appendix 3. Capability Assessment Workshop Materials*
  Appendix 4. Glossary
  Appendix 5. Related Links
  Appendix 6. Selected Publications
  Appendix 7. Summary Sheets

Tables and Figures

Tables
Table 1. Five Phases of Work
Table 2. Dimensions and Descriptions of Information-Sharing Capability
Table 3. A Sample Assignment of Specific Dimensions to Types of Participants

Figures
Figure 1. Cycle of Planning and Capability Assessment Activities
Figure 2. Capability Assessment Process Flowchart
Figure 3. Collaboration Readiness Dimension Description
Figure 4. Example of Subdimension Statements
Figure 5. Example of Subdimension Evidence Statement
Figure 6. Confidence Level
Figure 7. Format for Dimension Displays
Figure 8. Example of Dimension Summary Display
Figure 9. Alternative Dimension Worksheet for Weighted Ratings
Figure 10. Spreadsheet for Weighted Ratings

Participants

Workshop
Denise Baer, University of New Orleans
Mary Barnett, New York County District Attorney's Office
Theresa Brandorff, Colorado Integrated Criminal Justice Information System
David Clopton, System Planning Corporation
Matthew D'Alessandro, Motorola (attending on behalf of Integrated Justice Information Systems)
F. Michael Donovan, New York State Police Department
Christopher Gahan, Massachusetts District Attorney's Association
Jack Gallt, National Association of State Chief Information Officers
Kenneth Gill, US Department of Justice, Office of Justice Programs
Kristin Gonzenbach, DeKalb County, Georgia
Kristine Hamann, New York County District Attorney's Office
Thomas Henderson, National Center for State Courts
Thomas Herzog, New York State Division of Parole
Gwen Kelly-Holden, National Governors Association
C. Ted Hom, New York City Police Department
Robert Koberger, New York State Department of Correctional Services
Thomas Kooy, CRIMNET, State of Minnesota
Fern Laethem, Sacramento County Public Protection Agency
Jenine Larsen, National Criminal Justice Association
Erin Lee, National Governors Association
J. Patrick McCreary, US Department of Justice, Office of Justice Programs
Holly Bockbrader Mathews, Toledo/Lucas County Criminal Justice Coordinating Council
Leigh Middleditch, State Attorney's Office for Baltimore City
Mark Myrent, Illinois Criminal Justice Information Authority
Paul O'Connell, Iona College
Liz Pearson, Integrated Justice Information Systems Institute
Mark Perbix, Colorado Integrated Criminal Justice Information System
Donald Price, Washington State Department of Corrections
Brian Richards, Sacramento County Integrated Justice
Linda Rosenberg, Pennsylvania Justice Network (JNET)
Moira O'Leary Rowley, ACS
Joseph Scagluiso, New York City Police Department
Peter Scharf, Center for Society, Law & Justice, University of New Orleans
Valerie Shanley, New York State Division of Criminal Justice Services
James Shea, New York State Division of Criminal Justice Services
Tammy Woodhams, Kalamazoo Criminal Justice Council
Michael Zimmerman, Hennepin County (Minnesota) Information Technology Department

Review Teams

Pennsylvania JNET:
Dave Woolfenden, Pennsylvania Justice Network
Bruce Baicar, US Department of Justice
Thomas MacLellan, NGA Center for Best Practices

Justice Information Sharing Professionals (JISP):
Dwayne Campbell, Mecklenburg County Court Services
Paul Embley, Practitioners Resource Group
Bonnie Locke, Wisconsin Office of Justice Assistance
John Nanni, Tennessee State Integrated Criminal Justice Information Project
Catherine Plummer, SEARCH
Pamela Scanlon, Automated Regional Justice Information System
Laurie Smith, Kalamazoo Criminal Justice Council

Integrated Justice Information Systems Institute:
Susan Bates, Justice Management Inc.
Steve Mednick, Law Offices of Steven G. Mednick (Current President of Integrated Justice Information Systems Institute)
Stephanie Rondenell, ACG Inc.

Executive Summary

The justice enterprise faces many performance challenges that can be addressed more successfully through better information-sharing initiatives. These challenges differ widely in their scope and complexity. Enterprise-level initiatives, such as the creation of a statewide crime communications network, may consist of many organizations at several levels of government pursuing related but somewhat different objectives. These organizations are engaged in diverse but overlapping business processes and depend on similar, if not identical, information. They generally interact with the same population, but at different points in time. At the other extreme, smaller initiatives, such as linking the different databases and case management processes in a District Attorney's office, may involve the units of a single organization, operating under one executive leader, working together to achieve a common organization-level goal.

Regardless of their size, all these initiatives are made less difficult when participating organizations have high levels of information-sharing capability. Therefore, decisions to invest in information sharing initiatives must be grounded in a full understanding of the ability of those involved to identify and fill the gaps between current and required capability.

This toolkit is designed for justice professionals to use when considering or planning a justice information-sharing initiative. It provides a process for assessing where capability for information sharing exists and where it must be developed in order to achieve public safety goals. Assessment results provide a basis for action planning to fill capability gaps both within and across organizations. This is a self-assessment tool, based on the idea that the persons involved in an information-sharing initiative are best equipped, by their knowledge and experience, to make judgments and supply evidence about these capabilities.

The toolkit facilitates discussion within individual organizations as well as across organizations involved in an information-sharing initiative; guides assessment along 16 dimensions of capability; and guides analysis toward a collective understanding of how to help a specific initiative succeed. It produces results that:

- inform planning and design of integrated justice initiatives;
- identify both strengths and weaknesses;
- focus investments in specific capability-building efforts;
- help identify risk and risk mitigation strategies; and
- highlight what additional information is needed to make sound decisions.

The toolkit is divided into five sections:

1. Getting Started
This section orients the manager of the assessment to the material in the toolkit and the key phases of work that it entails.

2. Overview of Capability Assessment
The overview briefly describes information-sharing capability and the costs and benefits of a capability assessment. It also presents the approach to capability assessment used in this toolkit, including brief summaries of the methods and the kinds of results that can be expected. It was designed to be shared with executives or used as talking points when seeking support for an assessment. It should also be used in orientation sessions for organizers, participants, and other stakeholders.

3. Implementation Guide
The implementation guide provides guidance for conducting a capability assessment; introduces the process of gathering, analyzing, and using assessment data; and offers process and analysis options for different situations. It is designed to assist the person or team responsible for managing the assessment.

4. Capability Dimension Worksheets
This section includes data collection worksheets for the 16 dimensions of capability and their associated subdimensions. They address such topics as governance, collaboration readiness, security, project management, technology knowledge, and stakeholders. These worksheets are used to record specific ratings, evidence for those ratings, and confidence levels. Alternative worksheets and analysis tools can be accessed on the web, including worksheets that use numeric scores and weighting.¹

5. Appendices
These include a case example, sample correspondence and work plans, workshop facilitation guides and exercises, and reference material.

¹ These tools can be found at http://www.ctg.albany.edu/ [need the rest of the URL here]
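The dimension worksheets capture three things per subdimension: a rating, the evidence behind that rating, and a confidence level. As a rough illustration of that structure only, a worksheet entry could be modeled as a simple record; the field names, the Low/Medium/High scales, and the sample entry below are assumptions for the sketch, not the toolkit's actual worksheet format.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: field names and the Low/Medium/High scales are
# assumptions, not the toolkit's actual worksheet format.
SCALE = ("Low", "Medium", "High")

@dataclass
class SubdimensionEntry:
    statement: str    # the subdimension statement being rated
    rating: str       # capability rating on the assumed scale
    evidence: str     # evidence supporting the rating
    confidence: str   # rater's confidence in the rating

@dataclass
class DimensionWorksheet:
    dimension: str    # e.g., "Collaboration Readiness"
    entries: List[SubdimensionEntry] = field(default_factory=list)

    def add(self, statement: str, rating: str, evidence: str, confidence: str) -> None:
        if rating not in SCALE or confidence not in SCALE:
            raise ValueError(f"rating and confidence must be one of {SCALE}")
        self.entries.append(SubdimensionEntry(statement, rating, evidence, confidence))

# Hypothetical example entry for one dimension
ws = DimensionWorksheet("Collaboration Readiness")
ws.add("Sharing partners have a history of working together",
       "Medium", "Two prior joint projects completed on schedule", "High")
print(len(ws.entries))  # 1
```

Keeping the evidence and confidence fields alongside each rating, rather than the rating alone, is what lets later group discussion weigh how well grounded each judgment is.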

Getting Started

The Capability Assessment Toolkit was devised to be used by the person or team responsible for managing the assessment. It contains the information needed to plan and carry out the work as well as material that participants will use during the process. A good way to get started is to read the overview and case study (appendix 1). Together they present the rationale, summarize the methodology, and provide a practical example of capability assessment.

The assessment manager can select different parts of the kit to share with various participants at different points in the assessment. For example, the overview might be a useful way to introduce assessment concepts to top executives (either as a handout, or as a guide for a presentation). The overview plus one or two dimension worksheets would help orient the participants from the various agencies or organizational units to how they can rate capability. The implementation guide and material in the appendices (such as the sample correspondence, facilitation plans, and how-tos) will help the assessment manager plan and carry out the assessment.

The toolkit has been tested by justice professionals around the country. Their advice and practical ideas are included throughout.

Capability assessment links planning and action as shown in the figure below. An effective capability assessment will be aligned with strategic plans, program goals, and policy priorities, and the results will lead to investments and actions that help achieve them.

[Figure 1 shows three linked stages: Preparation (scan environment; set goals & scope; situation & gap analysis), the capability assessment itself (using the toolkit through the five phases of work), and Using Results (new action plans; investments in improved capabilities; investments in the initiative).]

Figure 1. Cycle of Planning and Capability Assessment Activities

The capability assessment itself has five overlapping phases:

1. Preliminary planning
2. Authorizing the assessment
3. Operational planning
4. Conducting the assessment
5. Developing action plans

Table 1 below summarizes the key activities and decisions associated with each phase; it is a rough checklist or a guide to preparing a detailed plan. The implementation guide and appendices offer much more information.

Table 1. Five Phases of Work

Phase 1. Preliminary planning
- Identify the organizing team who will plan and implement the assessment
- Identify goals of the assessment
- Orient organizers to the toolkit and process
- Begin to consider assessment implementation options in terms of goals
- Identify timeline for conducting the assessment
- Identify milestones for communicating with participants and leaders about the assessment and resulting plans

Phase 2. Authorizing the assessment
- Identify necessary authorizing bodies
- Develop business cases targeted to the necessary authorizing bodies, including approach, costs, and benefits
- Obtain approval to proceed

Phase 3. Operational planning
- Decide who should participate
- Decide how dimensions will be assigned
- Decide what method will be used to review and combine ratings

Phase 4. Conducting the assessment
- Conduct orientation workshops with all participants
- Conduct as many ratings collection and analysis workshops as necessary using selected methods
- Share results with participants and leaders

Phase 5. Developing action plans
- Integrate results with ongoing strategic planning or create new planning processes as necessary
- Determine where investments in the specific information sharing initiative must be made and where more general investments must be made in organizational capability
- Identify short-term investments to build capability
- Identify long-term investments to build capability

Overview

Why Assess Information Sharing Capability?

Capability assessment improves justice information sharing in order to improve the overall performance of the justice enterprise. The assessment is designed to enhance the prospects for success in information sharing initiatives² that improve public safety and the administration of justice. These initiatives can involve different levels of government, various combinations of justice agencies, and a wide range of information types and technologies. The JNET Project in Pennsylvania, for example, is a statewide effort that has developed a secure network infrastructure, web-based information sharing access, and information sharing relationships among the justice agencies. Current functionality includes a portal for access to driver license photos, mug shots, rap sheets, and court case data; advanced photo imaging for investigations; and capacity for email and pager notification of security events or arrests.

Some more extensive integration examples are found at the county level. The Harris County (Texas) Justice Information Management System (JIMS) is a highly integrated information sharing system that involves 281 public agencies in the county (which includes the city of Houston) and covers most aspects of both criminal and civil justice functions, including jury management and payroll. Some local projects have narrower information sharing objectives. The Jacksonville (Florida) Sheriff's Department implemented a web-based portal for information sharing and coordination among the 48 law enforcement agencies providing security for the 2005 Super Bowl.

Initiatives like these are typically complex, difficult, and prone to failure. They are more likely to succeed if they are based on a comprehensive and systematic assessment of organizational and technical capabilities. Using this toolkit generates comprehensive information about those capabilities. The results are useful in planning integrated justice initiatives because they focus attention on the particular capabilities needed and on the strategic selection of sharing partners. The assessment results also help identify risks and risk mitigation strategies.

Understanding Information Sharing Capability

The concept of information sharing capability used in this toolkit comes from a combination of research and consultation with justice professionals and balances two different notions of capability. One notion is that capability is composed of a set of generic dimensions that apply in practically any integrated justice situation. The other is that these dimensions may be applied or interpreted differently, depending on the nature of a particular initiative. Because each initiative has its own goals, resources, and capability issues, the toolkit provides a means to assess all the important dimensions of capability in a way that can be adapted to a wide range of situations.

This approach is reflected in the following assumptions about information sharing capability. Capability is:

² The term initiative refers to the collection of organizations and activities that are involved in justice information sharing improvements. These initiatives range from a single IT project in one justice agency to a multistate effort composed of several separate projects. Since the toolkit may be used in any of these settings, we use this general term to cover all situations.

- multidimensional—it is made up of several dimensions (in this framework there are 16), all of which contribute to overall information sharing capability.
- complementary—high or low levels can result from different combinations of factors; high capability in some dimensions can often compensate for lower levels in others.
- dynamic—it can increase or diminish due to changes within an initiative or in its external environment.
- specific to its setting—some elements of capability apply to all settings, but capability for any particular project must be assessed relative to its own specific objectives and environment.

The interorganizational nature of most information sharing efforts suggests two additional ideas for capability assessment. First, the success of information sharing depends on the combination of capabilities that exist among the sharing partners. Not all organizations need the same capability profile. Instead, the combination of capability profiles across a set of agencies sharing information determines the effectiveness of an initiative. And, second, the knowledge and experience required for effective assessment can be found in the people working on the effort. The necessary combinations of knowledge and experience may not exist in a single organization, but may be available as a result of joining forces across the multiple organizations involved in a cross-boundary sharing initiative.

Critical Success Factors

The elements of the toolkit all work together to support capability assessment, but to be effective they should be used in an atmosphere of commitment, learning, and trust. Effective use of the toolkit therefore requires careful attention to the following critical success factors:

1. Trust and candor
2. High levels of individual and organizational commitment
3. The right mix of participants
4. Willingness to repeat the assessment as needed

Trust and Candor

The success of the assessment depends in large part on the willingness of users to make assessments and decisions based on solid evidence. Participants must be willing to freely share information about their own organizations and about the capabilities of their sharing partners. Such a willingness helps build an accurate assessment of the initiative as a whole. It also helps identify gaps in capability and strategies for addressing them.

The information and judgments on which the assessments are based must be as accurate and honest as possible. Accurate assessment depends on letting the "warts and wrinkles" in operations show. Without candor, the assessments will not be a useful guide for improving information sharing capability and creating action plans. Threats to accuracy and honesty, such as low-quality information, unconscious bias, and distortion of the status quo, can lead to invalid or badly skewed capability assessments.

Biased information can come from many sources. Participants may inflate ratings to avoid embarrassment or sanction by management. Or, conversely, they may downgrade their own unit's ratings to make a stronger case for new resources or other organizational benefits. In either case,

the value of the assessment is diminished. The risk of inflated capability assessments can be greatly reduced by explicit assurances from executives, and accompanying actions demonstrating that assessment results will not be used to penalize any individual or unit. These assurances must be credible and be reinforced by adequate trust relationships. If the necessary levels of trust and credibility do not exist, efforts to establish them should precede the capability assessment.

Individual and Organizational Commitment

Using the toolkit requires a high level of commitment from all participants and organizations to carry out a labor- and time-intensive endeavor. Considerable effort and time are needed to gather the necessary information, make capability judgments, participate in group discussions, resolve differences, reach decisions, and implement action plans. The endeavor also requires logistical support from participating organizations.

The Right Mix of Participants

Assessing information sharing capability requires specific knowledge and experience. The selection of participants should result in teams with the right mix of knowledge for the situation at hand. It is not necessary (or possible) for every participant to be an expert on every aspect or dimension of capability. What matters is to get the right expertise by putting together the right team. This team should include program specialists, IT specialists, and program and agency leaders from each participating organization. Collectively, the participants must have knowledge of the program environment, existing systems, and possible future strategies and technologies. In addition, they will need to form accurate judgments about the capacity for change in management, policy, and technology, and about new investments of resources. The team must bring to the task a solid institutional memory and innovative spirit as well as an appreciation for interdependencies. Diversity among participants helps ensure that differences both within and across organizations are considered. Broad involvement throughout the process helps assure that different perspectives are made explicit and taken into account.

Willingness to Repeat the Assessment as Needed

The complexity of information sharing initiatives and the changing nature of information needs and technologies suggest that assessments should be repeated over the life of an initiative. Through repeated assessments, emerging requirements can be taken into consideration, and new capabilities and problems can be identified. Likewise, action plans can be refined in light of new requirements and resources that are identified through repeated assessments.

Using the Capability Assessment Toolkit

This toolkit provides a framework and methods for collecting capability assessment ratings from knowledgeable individuals and using that information to inform decision-making and planning about information sharing initiatives. It uses simple data analysis tools and extensive discussion opportunities to assemble overall capability assessment ratings. The toolkit helps participants share their individual knowledge and build a well-grounded, collective understanding of areas of high and low capability. This shared understanding helps the participants identify positive steps to enhance capability and thus the prospects for a successful initiative.

While the toolkit provides assessment criteria and methods, it does not require outside evaluators or consultants. Rather, the process works by collecting and organizing local knowledge and experience in a systematic way. External assistance in facilitating or supporting the assessment can often be helpful, but is not required. Decisions about whether and how to use external assistance can be made by the organizers of the assessment.
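The toolkit's alternative worksheets use numeric scores and weighting (figures 9 and 10). One common way to combine such weighted numeric ratings into an overall dimension score is a weighted average. The sketch below illustrates that idea only; the 1-to-5 rating scale and the example weights are assumptions for illustration, not the toolkit's prescribed method.

```python
# Illustrative sketch: combine subdimension ratings into an overall
# dimension score with a weighted average. The 1-to-5 rating scale
# and the weights are assumptions, not the toolkit's own method.
def weighted_dimension_score(ratings, weights):
    """Return the weighted average of subdimension ratings."""
    if not ratings or len(ratings) != len(weights):
        raise ValueError("ratings and weights must be equal-length and non-empty")
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

# Three hypothetical subdimension ratings, weighted by importance
score = weighted_dimension_score([4, 2, 5], [0.5, 0.3, 0.2])
print(round(score, 2))  # 3.6
```

Dividing by the sum of the weights means the weights need not add to exactly 1, which keeps the calculation robust when weights are assigned by group discussion rather than computed.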

An assessment effort includes:

- preparation—obtaining authorization, mobilizing support and resources, and planning the details of the activities
- assessment—collecting, analyzing, and reporting assessment data
- using results—designing and implementing actions to enhance capability

A summary and examples of these activities are presented in this section of the toolkit. The details of how to implement the assessment and work with assessment data are presented in the next section, the Implementation Guide. The Dimension Worksheets section contains the data collection worksheets used to collect the assessment data. The appendix presents a case example along with sample work plans and references.

Cycle of Planning and Capability Assessment Activities

The activities described above should be understood as part of a larger set of planning activities shown in figure 1 and illustrated in the case example provided in appendix 1. Use of the toolkit should begin only after careful preparation, including developing a clear, if preliminary, understanding of the goals and scope of the information sharing initiative. This understanding is based on existing plans and responses to environmental demands. Preparation also requires describing the current situation and identifying the gaps between it and the desired situation. These preparation activities set the stage for use of the capability assessment toolkit, shown as the central activity in figure 1. The results of an assessment lead to action plans that lead in turn to investment decisions: investments in the specific initiative and investments in the general improvement of information sharing capability.

The dashed arrows indicate that this process is almost never linear; instead, it progresses through multiple iterations as information and analysis from one set of activities feed back into and modify earlier conditions and understandings. Over the long term, as indicated in the links from Using Results to Preparation, the investments made in one initiative will change the status quo and shape future initiatives.

[Figure 1, repeated: Preparation (scan environment; set goals & scope; situation & gap analysis) feeds the capability assessment (using the toolkit through the five phases of work), whose results feed Using Results (new action plans; investments in improved capabilities; investments in the initiative).]

Figure 1. Cycle of Planning and Capability Assessment Activities

Collecting and Combining Data for Capability Assessment

The most complete data come from a process that begins with the individual organizational units engaged in the initiative assessing themselves and producing unit-specific results. These are t

