OUP Data Stewardship Governance And Management Plan 200731


OUP DATA STEWARDSHIP, GOVERNANCE AND MANAGEMENT PLAN (DSGMP)

ABSTRACT

Overview. The OUP Data Stewardship, Governance, and Management Plan (DSGMP) provides:
- A practical, comprehensive approach to achieving state-of-the-art operational analytics use of OUP's data, based on industry-accepted guidance
- A data management approach ensuring data quality and adequacy, and enabling data-driven, analytics-based, Machine Learning/Artificial Intelligence (ML/AI)-derived insights into current and past Center of Excellence (COE) R&D results, informing emerging needs and new potential directions, and augmenting decision-making
- An initial set of recommended metrics and measures of performance (MMoPs), dashboards and visualizations in an OUP context, permitting practical application to OUP operations
- An initial set of processes, methods and models for assessing OUP research transition to operational impact, including Technology Readiness Level (TRL) and Return-on-Investment/Benefit-Cost Analysis (ROI/BCA) assessment

Recommendations. The OUP DSGMP process flow is shown in Figure A-1.
Recommendations include:
- Establishment of an OUP Data Stewardship Council (DSC) to oversee, address and resolve data-related issues
- Establishment of an OUP Data Governance Committee (DGC) to establish and quantitatively monitor data-related processes, DSGMP implementation progress and usage
- Formalizing and adopting an OUP Data Management Framework (DMF) to prescribe data management procedures that will ensure data uniformity, quality and adequacy
- Review, comment, refinement and acceptance of OUP stakeholder roles and responsibilities, activities and metrics in the implementation of the OUP DSGMP and in deriving its benefits
- Division of labor among OUP-associated stakeholders versus information sciences and technology-associated (IT) personnel
- Development of an initial OUP data lexicon/glossary, an evaluation of its suitability for use in analytics assessments, and recommended changes in data collection practices
- An initial set of MMoPs for assessing effectiveness of OUP's DSGMP implementation, stakeholder activities, data quality, and actual OUP operational use of data-driven, analytics-based methods
- Example dashboards and visualizations to guide and assist OUP stakeholder evolution from current static, manual processes to data-driven, analytics-based work flow analysis and reporting

Implementation Strategy. The recommended OUP DSGMP implementation strategy is a three-phase approach, shown in Figure A-2:

Phase I: Draft DSGMP Pre-Rollout and Stakeholder Preparation -- Engage in draft OUP DSGMP socialization/buy-in with stakeholders, followed by further OUP DSGMP refinement before a pilot rollout in Phase II.
a. Refine the OUP stakeholder identification and analysis
b. Refine initial OUP data assessment and lexicon/glossary, data quality and uniformity needs assessment and validation
c. Refine initial set of metrics, specify data collection to enable assessment of measures reinforcing desired outcomes and driving toward data-driven analytics utilization goals and objectives
d. Develop array of initial dashboards and visualizations for select OUP operational use cases
e. Engage in OUP stakeholder outreach, review, iteration and acceptance of roles, responsibilities, and MMoPs in the DSGMP's DSC, DGC and data management activities
f. Review initial case studies and models for TRL and ROI/BCA assessment

Figure A-1. OUP Data Stewardship, Governance and Management Plan (DSGMP) Overview
Figure A-2. Three-Phase OUP Data Stewardship, Governance and Management Plan (DSGMP) Rollout

Phase II: OUP DSGMP Rollout and Transition -- Initiate pilot OUP DSGMP rollout, including clarifications, expectations and MMoPs for each stakeholder.
a. Conduct individual stakeholder discussions in operational DSGMP implementation, learning/training, and feedback activities
b. Convene and formalize OUP Data Stewardship Council and Data Governance Committee
c. Initiate OUP Data Management Framework Implementation -- Data entry training/examples, data governance compliant data preparation, input/roll-out
d. Develop processes, methods and models for OUP DSGMP Data Analytics Usage/Benefits Comparisons -- Implement data-driven, analytics-based processes, methods and models for using analytics to improve reporting, data call response, and assessing operational impact (TRL assessment, Risk, BCA/ROI) and pathways from research transition to operational impact
e. Conduct Extended Case Study Evaluations and Analysis Model Development -- Conduct extended retroactive/forward-looking reviews of transition success case studies, apply lessons-learned assessment to plans and processes, revise analysis methodologies and models of TRL and BCA

Phase III: OUP DSGMP Full Scale Implementation, Sustainment and Assessment of Benefits/Value -- Revise OUP DSGMP based on lessons-learned during Phase II pilots, and initiate full scale, routine operational use/feedback/maintenance/evolution/growth projections of OUP DSGMP.

Metrics and Measures of Performance (MMoPs).
The OUP DSGMP incorporates a widely recognized key to success of new process implementation efforts: the development, customization and evolution of metrics driving and aligning the desired people-centric activities and behaviors with processes and outcomes, for each of the three phases of 1) the initial pre-rollout activities, 2) the post-rollout pilot transition, and 3) routine operational use and long-term sustainment and integration with other S&T units. An initial set of metrics is recommended for:
- People activities -- Data Stewards, Governance Committee, Data Analysts, OUP PMs/PCs, COE Directors, COE PMs/PIs, etc., during each of the three phases
- Process implementation activities and outcomes, and operational usage activities and outcomes
- Long-term DSGMP operational use, maintenance/sustainment and integration within S&T

As measures for the metrics are collected, the data will be used to gauge appropriateness, and MMoPs adjusted in collaboration and consultation with stakeholders to better serve programmatic objectives.

Initial OUP-Relevant Data Assessment

OUP has a diverse collection of internal data from the outputs of its R&D, education and minority-serving programs since the first COE was established in 2004. Current OUP data assets include COE project reports, publications and presentations, comprehensive statistical data on COE performance measures, cross-walks of Notice of Funding Opportunities (NOFO), education and Minority Serving Institution (MSI) outcomes, etc. These data sets would benefit from a comprehensive review by the DSC and DGC, in conformance with the DMF, to derive greater uniformity and quality, thus enabling the data-driven analytics insight goals and objectives of the DSGMP.

OUP operations can further benefit by leveraging additional external data: some currently available but not readily accessible, such as data from relevant contracts and legal units; some readily extractable from its repository of reports, publications and presentations using ML/AI techniques; and some generated by applying analysis models, such as Technology Readiness Levels (TRLs) and Benefit-Cost Analysis/Return-on-Investment (BCA/ROI). Example data sources and data-related initiatives and systems of value both to OUP internal operations and to external parties interested in OUP's R&D results and transition efforts are shown in Figure A-3.

Figure A-3. Example Data Flows To/From OUP

Conformance with and Leverage of External (non-OUP) Data Assets and Data Management Efforts

The OUP DSGMP was developed to be consistent with, and is intended to leverage, other government agency (OGA), DHS, and S&T policies and frameworks, enabling seamless integration of OUP data and analytics into/from other government systems.
In particular, the recommendations for the DSGMP are consistent with Recommendation Three of the DHS Final Report: Enhancing Component Data Management Winter Study, June 2019, which calls for components to "establish enterprise data governance by reforming and streamlining existing DHS data governance." This reference also explicitly calls out the Data Management Association (DAMA) International's "DAMA Wheel," from the Data Management Body of Knowledge, Second Edition, Technics Publications, 2017, as a model for data governance; it is one of the key basis references for the DSGMP. This consistency and leverage across the government are applicable to both data sources and data management frameworks and systems.

OUP DSGMP-Enabled Data Analytics, ML/AI-Based Process Flow, Methods and Models

The ultimate goal of the DSGMP is to enable OUP to apply robust data analytics and ML/AI advancements in its core operational processes. A data analytics, ML/AI-based process flow, integrating models for operational assessment, is shown in Figure A-4. The process begins with R&D needs, along with baseline Concepts-of-Operations (ConOps) and TRLs of the current technologies. Information on OUP R&D that can address the needs is provided to analytical ML/AI models that then provide individual project assessments of impact, BCA/ROI, etc., aiding decision-making in the evaluation and selection of projects to be funded, new data needs, etc. The process continues through execution using project performance data, aiding in decision-making on when to continue projects and when to redirect funds. For success, this effort will necessarily consist of technical, business and educational tracks, as described herein.

Figure A-4. OUP Analytics/ML/AI-Based Process for Operational Assessments
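The project-assessment step in this process flow can be illustrated with a minimal sketch, assuming only the standard definitions of ROI ((benefits - costs) / costs) and benefit-cost ratio (benefits / costs). All names here (Project, assess, the sample projects and their figures) are hypothetical illustrations for this plan, not OUP systems, models or data.

```python
# Minimal sketch of a project-level BCA/ROI assessment, assuming standard
# definitions: ROI = (benefits - costs) / costs, BCR = benefits / costs.
# All names and numbers are hypothetical illustrations, not OUP data.
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    costs: float      # total investment, same units as benefits
    benefits: float   # estimated operational benefit

def assess(p: Project) -> dict:
    """Return the standard ROI and benefit-cost ratio for one project."""
    if p.costs <= 0:
        raise ValueError("costs must be positive")
    return {
        "name": p.name,
        "roi": (p.benefits - p.costs) / p.costs,
        "bcr": p.benefits / p.costs,
    }

# Rank candidate projects by ROI to aid funding decisions.
candidates = [
    Project("Sensor analytics pilot", costs=2.0e6, benefits=5.0e6),
    Project("Data fusion study", costs=1.0e6, benefits=1.5e6),
]
ranked = sorted((assess(p) for p in candidates),
                key=lambda a: a["roi"], reverse=True)
for a in ranked:
    print(f"{a['name']}: ROI={a['roi']:.2f}, BCR={a['bcr']:.2f}")
```

In practice the benefit and cost inputs would come from the governed OUP data assets described above, with TRL and risk handled by their own models feeding the same decision step.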

OUP DATA STEWARDSHIP, GOVERNANCE AND MANAGEMENT PLAN (DSGMP)

EXECUTIVE SUMMARY

OBJECTIVES AND IMPACT

Plan for better operational use of OUP data. The Office of University Programs (OUP) Data Stewardship, Governance, and Management Plan (DSGMP) provides a comprehensive approach to achieving an organizational data maturity level that will enable up-to-date, robust, trusted and reliable data-driven, analytics-based visualizations for insight and decision-making in OUP management and operations, Center of Excellence (COE) program management and direction, research and development (R&D) project selection and management, and operationalizing OUP and COE program results.

Enable efficient and routine data-driven, analytics-based insight into current and past COE R&D results, informing emerging needs and direction. The research conducted by the DHS Science and Technology Directorate's (S&T) university-led COEs constitutes a valuable national asset. Efficient, insightful access to the results of this body of work is essential in effectively developing operationally relevant homeland security applications, report-outs, identifying emerging needs for R&D, avoiding duplication while facilitating collaboration, leveraging and building on prior work.

Develop data-analytics processes, methods and models for assessing OUP research transition to operational impact. The DSGMP also develops the elements that support implementation of data-driven, analytics-based processes, methods and models for assessing the operational impact of potential COE R&D results, and the elements that then support effective execution of the pathways from research transition to operational impact.
These are demonstrated with case studies of prior efforts, for which results and outcomes are known, and forward-looking case studies to evaluate the operational performance of the recommended processes, methods and models themselves.

RECOMMENDATIONS

Establish an OUP Data Stewardship Council (DSC), OUP Data Governance Committee (DGC), and an OUP Data Management Framework (DMF), organized in accordance with Figure ES-1. The DSGMP is led and managed by an OUP Data Stewardship Council (DSC), responsible for ensuring that OUP data-related work is performed according to policies and practices as formulated by the Data Governance Committee (DGC). The DSC has final accountability and responsibility for the data and the processes that ensure effective control and use of data assets.

Data management activities are then conducted by all OUP stakeholders in accordance with an OUP Data Management Framework that provides the uniform mechanisms for data entry, quality, consistency, etc., enabling their robust and confident usage in data analytics, machine learning (ML) and artificial intelligence (AI) methods.

The recommended organizational structure for the OUP DSGMP shown in Figure ES-1 is based on best practices as described and documented in references in the corresponding sections of this report. The practices recommended for the OUP DSGMP are aligned with the DHS-wide data management concept of operations[1], and will enable integration and roll-up of OUP's data and its data governance framework into future S&T-wide enterprise data management initiatives and strategies.

[1] Enterprise Data Management Office, "Enterprise Data Management Concept of Operations," Version 2.0, Department of Homeland Security, 10/1/2012.

Figure ES-1. Recommended OUP Data Stewardship, Governance, and Management Overview

Characterize and engage OUP stakeholders at an individual level for specific OUP examples of operationalizing data-driven, analytics-based processes, training and demonstrations of effectiveness. Given the small size of the OUP staff, it is plausible to develop customized examples of OUP dashboards and visualizations to guide and assist stakeholder evolution from their current processes to data-driven, analytics-based work flow analysis and reporting. It is also possible to identify, characterize and engage all OUP stakeholders in discovery of common data sharing and access needs, and ways to facilitate the use of S&T-wide data in OUP analytics efforts. Tables ES-1 and ES-2 provide a list of OUP stakeholders, comprising eight stakeholder groups and 33 total individual stakeholder types. The tables also include a brief summary of each stakeholder constituent's characteristics and examples of their OUP data-related needs. In addition, it is important to assess each stakeholder's project impact level and relative priority, which can be as simple as low/med/high impact and 1-2-3 priority. The initial impact and priority assignments in Table ES-2 should be further refined as part of stakeholder engagement, review and comments. Specific proposed assignments of responsibility at a more granular level, e.g., DSC, DGC, PMs, Directors, etc., are provided in Table ES-3.

Table ES-1. OUP Data Stewardship, Governance, and Management Stakeholders Summary

Stakeholder Group Categories | Constituents
1. OUP Director, PMs & PCs, and Communications, Education and Minority Program Directors | 6
2. COE Leadership and Management Staff, PIs and Researchers | 9
3. S&T Office of Innovation (OIC) Leadership and Communications/Data Calls Personnel | 2
4. Other S&T Offices & Matrixed Divisions, e.g., contracts, tech centers, program/project management, data and analytics initiatives | 12
5. DHS Leadership and Components | 1*
6. Other Government Organization (OGA) Leadership and Components (DOD/DARPA, DOE, etc.) | 1*
7. Congress | 1*
8. General Public | 1*
Total Number of Stakeholder Constituents | 33
*Ex Officio and Notional Representation

Table ES-2. OUP Data Stewardship, Governance, and Management Stakeholders Characteristics

Stakeholder Group Category | Brief Summary of Stakeholder Constituent Characteristics Relative to OUP Data-Related Needs; Impact; Relative Priority

1. OUP Director, PMs & PCs, and Communications, Education and Minority Program Directors
1.1. Director, Deputy Director | Responsible for high level management of OUP and the research, MSI and education programs of all the COEs; Overall status of COEs and their budgets and spending projections, ability to query and visualize COE and project domains, supported components, geographic distribution, relationships and collaborations among COEs/projects, key/impactful results, TRLs and BCAs/ROIs; High; 1
1.2. Program Managers | Responsible for monitoring status of one or more individual COEs, ensuring progress focused on operational needs; Overall status of COE progress, budgets and spending projections, rapid visualization of COE performance, schedule, budget, quality and risks; High; 1
1.3. Program Coordinators | Assist COE PMs in the conduct of their responsibilities; Overall status of COE progress, budgets and spending projections, rapid visualization of COE performance, schedule, budget, quality and risks; High; 1
1.4. Communications | Responsible for developing newsletters, fact sheets, announcements, etc., accurately portraying OUP progress and results; Responses to data calls, Congressional inquiries, news outlet requests for information; Med; 2
1.5. Education | Responsible for developing education and workforce development programs to be executed by COEs; Must be able to visualize number of participants and their operational participation/interest areas, demographics and other statistical information, status of participants along educational pipeline; Med; 2
1.6. Minority Program | Responsible for developing MSI engagement programs to be executed by COEs; Must be able to visualize number of research participants, budget allocations, operational participation/interest areas, demographics and other statistical information; Med; 2
Subtotal, OUP Stakeholder Constituents | 6

2. COE Leadership and Management Staff, PIs and Researchers
2.1. Directors, Executive Directors | Responsible for high level management of COEs and their research and education programs; Overall status of projects and budgets, and spending projections, identification of projects either producing valuable/impactful results or in need of further attention; High; 2

2.2. Project Managers | Responsible for monitoring status of one or more individual projects, ensuring progress focused on operational needs; Rapid visualization of project(s)' performance, schedule, budget, quality and risk; High; 2
2.3. Communications Lead | Responsible for developing newsletters, fact sheets, announcements, etc., accurately portraying COE progress and results; Responses to data calls, news outlet requests for information; Low; 3
2.4. Transition Lead | Responsible for ensuring research results are useful to the intended operational component, engaging industrial participation, exploring other HSE application and commercialization opportunities and pathways; Assessments of TRL, BCA/ROI; Med; 2
2.5. PIs, Co-PIs | Responsible for conducting leading, state-of-the-art research projects addressing HSE issues of specific need by targeted operational unit, identification and reporting of all IP generated by all participants; Must know source, pedigree and full details of all data used and/or generated; High; 2
2.6. Other (non-PI) Researchers, Subcontractor Leads | In collaboration with lead PI or Co-PI, responsible for conducting leading, state-of-the-art research projects addressing HSE issues of specific need by targeted operational unit, identification and reporting of all IP generated; Must know source, pedigree and full details of all data used and/or generated; Med; 3
2.7. Post-Doctoral Fellows / 2.8. Graduate Students / 2.9. Undergraduate Students | Under the direction of supervisory researchers, responsible for conducting leading, state-of-the-art research projects addressing HSE issues of specific need by targeted operational unit, identification and reporting of all IP generated; Must know source, pedigree and full details of all data used and/or generated; High; 2
Subtotal, COE Stakeholder Constituents | 9

3. S&T Office of Innovation (OIC) Leadership and Communications/Data Calls Personnel
3.1. OIC Director, CoS | Must be able to report on the status of its OUP unit's activities, progress and accomplishments, its integration with other units, and respond quickly to requests for information from/to its hierarchy; Med; 2
3.2. OIC FFRDCs, National Labs, Industry and International Partnerships | Could be a key supplier of execution partners in the transfer and/or transition of OUP R&D results to operational components; Low; 3
Subtotal, OIC Stakeholder Constituents | 2

4. Other S&T Offices & Matrixed Divisions, e.g., contracts, tech centers, program/project management, analytics initiatives
4.1. US Leadership, CoS | Must be able to report on the status of OUP's activities, progress and accomplishments, and its integration with other S&T units, and respond quickly to requests for information from/to its hierarchy; Med; 3
4.2. US Strategy & Policy, CoS | Should be able to leverage OUP's activities, progress and accomplishments, and its integration with other S&T units, to inform strategy- and policy-making; Med; 3
4.3. MCS Leadership, CoS | Should be able to leverage OUP's activities, progress and accomplishments, and its integration with other MCS projects; Med; 3
4.4. MCS PMs with projects engaging universities | Must demonstrate awareness of peer-conducted R&D, explore collaborations; Low; 3
4.5. OSE Leadership, CoS | Demonstrate awareness of OIC's/OUP's efforts related to its Tech Centers, Tech Scouting and Transition, etc., and demonstrate awareness of and collaboration with OUP/COE universities and partners; Low; 3
4.6. OSE Operations & Requirements Analysis, Tech Centers, Technology Scouting & Transition | Enable technological collaboration, engagement and integration with parallel OUP efforts in advanced technology development, scouting and transition of OUP COE R&D; Med; 3

4.7. OES Leadership, CoS | Demonstrate awareness of OIC's/OUP's efforts related to its divisions, and demonstrate collaboration with OUP data-related needs; Low; 3
4.8. OES, CIO | Support IST-related responsibilities in implementation of OUP DSGMP; High; 1
4.9. OES, Communications & Outreach | Responsible for developing external communications; Responses to data calls, Congressional inquiries, news outlet requests for information; Low; 3
4.10. OES, Compliance | Potentially high beneficiary of ready access to compliance-related data to confirm security requirements; High; 2
4.11. OES, Executive Secretariat | Responsible for data calls issued to OUP and collecting responses; Med; 2
4.12. OES, Finance & Budget | Source of contract-related language and budget information necessary for value-added benefit of OUP DSGMP; Med; 2
Subtotal, Other Stakeholder Constituents | 12

5. DHS Leadership and Components | Interested in knowing/highlighting COE successes, leveraging S&T COE R&D project results with impact to Department and specific components' respective domains; High-Med; 4
Subtotal, DHS Leadership & Components | 1*

6. Other Government Organization (OGA) Leadership and Components | Interested in knowing about and leveraging DHS R&D results in their respective domains; Low; 1
Subtotal, OGA Leadership & Components | 1*

7. Congress / 8. General Public | Interested in demonstrating value of investments in the COE program, inventions and innovations, operational impacts and ROIs, technology transfer, licensing and commercialization success stories, geographic distribution of investments by Congressional districts, States, localities; Med-Low; 2
Subtotal, Congress and General Public | 2*

Total Number of Stakeholder Constituents | 33
*Ex Officio and Notional Representation

Table ES-3.
Summary of Recommended Designations of Stakeholder Membership and Responsibilities in OUP DSC, DGC and Data Management

Stakeholder | Representative on Data Stewardship Council | Representative on Data Governance Committee | Data Management Responsibilities

1. OUP Director, Program Managers & Coordinators, Communications, Education and Minority Program Directors
Director, Deputy Director | 1 (Chair) | 1* | 1
Program Managers | 1 (Rotating) | 1 (Chair) | 1
Program Coordinators ... Minority Programs | 1 | 1 | 1
Subtotal, OUP Stakeholders | 4 [5*] | 4 [6*] | 6

2. COE Leadership and Management Staff, PIs and Researchers
Directors, Executive Directors | 1 (Rotating) | 1* (Rotating) | 1
Project Managers | | 1 (Rotating) | 1
Communications Lead | | | 1
Transition Lead | | | 1
PIs, Co-PIs | | | 1
Other (non-PI) Researchers | | | 1

Post-Doctoral Fellows | | | 1
Graduate Students | | | 1
Undergraduate Students | | | 1
Subtotal, COE Stakeholders | 1 | 1 [2*] | 9

3. S&T Office of Innovation (OIC) Leadership and Communications/Data Calls Personnel
OIC Director, CoS | 1* | 1* | 1
OIC FFRDCs, National Labs, Industry and International Partnerships | | 1* | 1
Subtotal, OIC Stakeholders | 0 [1*] | 0 [2*] | 2

4. Other S&T Offices & Matrixed Divisions, e.g., engaging in contracts, data governance, management & analytics initiatives
US Leadership, CoS | 1* | 1* | 1
US Strategy & Policy | | | 1
MCS Leadership, CoS | 1* | 1* | 1
MCS PMs for projects engaging universities | | | 1
OSE Leadership, CoS | 1* | 1* | 1
OSE Operations & Requirements Analysis, Tech Centers, Technology Scouting and Transition | | | 1
OES Leadership, CoS | 1* | 1* | 1
OES, CIO | 1* | 1* | 1
OES, Communications & Outreach | | | 1
OES, Compliance | | | 1
OES, Executive Secretariat | | | 1
OES, Finance & Budget | | | 1
Subtotal, Other Stakeholders | 0 [5*] | 0 [5*] | 12

5. DHS Leadership and Components
DHS Leadership | | |
DHS Components | | |
Subtotal, DHS Leadership and Components | | | [1*]

6. Other Government Organization (OGA) Leadership and Components
OGA Leadership (DOD, DOE, NASA, etc.) | | |
OGA Components (AFRL, DARPA, etc.) | | |
Subtotal, OGA Stakeholders | | | [1*]

7. Congress, and 8. General Public
Congress | | |
General Public | | |
Subtotal, Congress and Public Stakeholders | 0 [1*] | 0 [2*] | [2*]

Total Number of Stakeholders | 5 [12*] | 5 [15*] | 29 [33*]
*Ex Officio/Notional Representation. Numbers in brackets ([*]) are totals including Ex Officio & Notional members.

Incorporate metrics and measures of performance (MMoPs) that drive and align people-centric behaviors/activities and process-centric outcomes. A key success factor for DSGMP implementation, compliance and usage, and thus for achieving its operational value, is the active involvement and engagement of stakeholders in vigorous discussion, understanding, acceptance, feedback and iteration in the development of the metrics. This outreach process will serve to develop metrics that encourage, assess and align the effectiveness of people, processes and outcomes, including such factors as stakeholder behavior and activities, data quality, and actual, practical use of data-driven, analytics-based methods, and thus the desired outcomes of the OUP DSGMP implementation. As measures for the metrics are collected, the data will be used to gauge appropriateness, and MMoPs adjusted in collaboration and consultation with stakeholders to better serve programmatic objectives.

OUP DSGMP IMPLEMENTATION STRATEGY -- A THREE-PHASE APPROACH

The recommended OUP DSGMP implementation strategy is the three-phase approach shown in Figure ES-2.

Figure ES-2. Three-Phase OUP Data Stewardship, Governance and Management Plan (DSGMP) Rollout

Phase I, Draft DSGMP Pre-Rollout and Stakeholder Preparation: Engage in draft OUP DSGMP review and socialization/buy-in with stakeholders, followed by further OUP DSGMP refinement before a pilot rollout in Phase II.

I-1. Confirm/Refine Initial Stakeholder Identification and Analysis -- A key to success in an initiative that will include operational change is to engage the affected stakeholders in the development and implementation planning and execution processes. Eight groups of stakeholders have been identified, containing a total of 33 stakeholder types, and their characteristics, data-related needs, impact and priority to the project described.
These assignments should be refined as necessary in preparation for distribution to stakeholders for their review and comments.

I-2. Draft DSGMP Review and Stakeholder Engagement and Outreach -- Engage stakeholders in iterative review/comment/feedback discussions of the draft DSGMP, data assessments, metrics

and analytics dashboards and visualizations (D&Vs), with the goal of their understanding the characteristics and benefits of the DSGMP. Confirm understanding of the guiding rollout philosophy of:
a. Stakeholder reference set of roles, responsibilities, activities and metrics and measures of performance (MMoPs), summarized in Table ES-4
b. MMoPs in the context of value to the OUP organization and each stakeholder's participation in the DSC, DGC and DMF
c. Learn while implementing, including training needs, providing feedback in real-time, "while" an issue is being experienced, better than "later"
d. Opportunity for input into DSGMP, data needs, metrics, design of analytics D&Vs
e. OUP-wide review of initial case studies and process analysis models, preparing to conduct additional case studies (past and current/future), and developing additional models

Phase II, OUP DSGMP Rollout and Transition: Initiate pilot OUP DSGMP rollout, including clarifications, expectations and MMoPs for and by each stakeholder.

II-1. Conduct individual stakeholder discussions in operational DSGMP implementation, learning/training, and feedback activities
- What do I have to do first?
  o Expectations
  o Responsibilities
- What will I be able to do better?
  o Current process
  o New process
- How will I have to do things differently?
  o Old way
  o New way
- How will the effectiveness of my activities be measured?
  o Metrics
  o Measures
  o Conduct data-based analysis of MMoPs to refine data collection to better address performance objectives
- Who will be my "upstream" person(s), and who will be my "downstream" person(s)?
  o Who will I depend on to

