Information Risk and Data Quality Management: Operational Risk


Information Risk and Data Quality Management
Operational Risk
July 2015
Alberto Ferreras Salagre
Alberto.ferreras.salagre@gmail.com

Anthony Tarantino and Deborah Cernauskas, Risk Management in Finance: Six Sigma and Other Next Generation Techniques (Hoboken, NJ: John Wiley & Sons, 2009), Chapter 3, "Information Risk and Data Quality Management."
Learning objectives:
- Assess the potential negative impact poor data quality may have on a business.
- Identify the most common issues which result in data errors.
- Identify some key dimensions of data quality.
- Describe the operational data governance process and differentiate between data quality inspection and data validation.
- Summarize the process of creating a data quality scorecard and compare three different viewpoints for reporting data via a data quality scorecard.

"Principles for Effective Risk Data Aggregation and Risk Reporting" (Basel Committee on Banking Supervision Publication, January 2013).
Learning objectives:
- Explain the potential benefits of having effective risk data aggregation and reporting.
- Describe key governance principles related to risk data aggregation and risk reporting practices.
- Identify the data architecture and IT infrastructure features that can contribute to effective risk data aggregation and risk reporting practices.
- Describe characteristics of a strong risk data aggregation capability and demonstrate how these characteristics interact with one another.
- Describe characteristics of effective risk reporting practices.

John Hull, Risk Management and Financial Institutions, 3rd Edition (Boston: Pearson Prentice Hall, 2012), Chapter 20.
Learning objectives:
- Compare three approaches for calculating regulatory capital.
- Describe the Basel Committee's seven categories of operational risk.
- Derive a loss distribution from the loss frequency distribution and loss severity distribution using Monte Carlo simulations.
- Describe the common data issues that can introduce inaccuracies and biases in the estimation of loss frequency and severity distributions.
- Describe how to use scenario analysis in instances when data is scarce.
- Describe how to identify causal relationships and how to use risk and control self-assessment (RCSA) and key risk indicators (KRIs) to measure and manage operational risks.
- Describe the allocation of operational risk capital and the use of scorecards.
- Explain how to use the power law to measure operational risk.
- Explain the risks of moral hazard and adverse selection when using insurance to mitigate operational risks.

Information risk and data quality management
- Information Risk and Data Quality Management
- Principles for Effective Risk Data Aggregation and Risk Reporting

Information Risk and Data Quality Management
Principles for Effective Risk Data Aggregation and Risk Reporting

Information risk and data quality management
- If successful business operations rely on high-quality data, then the opposite is likely to be true as well: flawed data will delay or obstruct the successful completion of business processes.
- No enterprise risk management program is complete without measuring, reporting, reacting to, and controlling the risks associated with poor data quality.

Assess the potential negative impact poor data quality may have on a business.
1. Financial impacts: lower revenues or higher expenses
   - Increased operating costs
   - Decreased revenues
   - Missed opportunities
   - Reduction or delays in cash flow
   - Increased penalties, fines, or other charges
2. Confidence-based impacts: managers may make incorrect business decisions based on faulty data
   - Decreased organizational trust
   - Low confidence in forecasting
   - Inconsistent operational and management reporting
   - Delayed or improper decisions
3. Satisfaction impacts
   - Customers may become dissatisfied when the business processes faulty data (e.g., billing errors).
   - Employees may become dissatisfied when they are unable to properly perform their job due to flawed data.

4. Productivity impacts
   - Increased workloads
   - Decreased throughput
   - Increased processing time
   - Decreased end-product quality
5. Risk impacts
   - Underestimating credit risks due to inaccurate documentation, thereby exposing a lender to potential losses.
   - Underestimating investment risk, thereby exposing an investor to potential losses.
6. Compliance impacts: compliance is jeopardized, whether that compliance is with government regulations, industry expectations, or self-imposed policies (such as privacy policies).
   - A business may no longer be in compliance with regulations (e.g., Sarbanes-Oxley) if financial reports are inaccurate.
Despite the natural tendency to focus on financial impacts, in many environments the risk and compliance impacts are the ones most heavily compromised by data quality issues.

Identify the most common issues which result in data errors.
- Data entry errors
- Missing data
- Duplicate records
- Inconsistent data
- Nonstandard formats
- Complex data transformations
- Failed identity management processes
- Undocumented, incorrect, or misleading metadata
All of these types of errors can lead to inconsistent reporting, inaccurate aggregation, invalid data mappings, incorrect product pricing, and failures in trade settlement, among other process failures. Typical business exposures include:
- Employee fraud and abuse
- Underbilling and revenue assurance
- Credit risk
- Development risk
- Compliance risk

Identify some key dimensions of data quality.
DATA QUALITY EXPECTATIONS
The first step toward managing the risks associated with the introduction of flawed data into the environment is articulating the business user expectations for data quality and asserting specifications that can be used to monitor organizational conformance to those expectations.
- Accuracy: the degree to which data correctly reflects the real-world object. Measurement of accuracy can occur by manually comparing the data to an authoritative source of correct information.
- Completeness: specifies the expectations regarding the population of data attributes, i.e., the extent to which the expected attributes of data are provided (e.g., a phone number). Completeness does not necessarily imply accuracy.

- Consistency: measures reasonable comparison of values in one data set to those in another data set. Note that consistency does not necessarily imply accuracy. There are three types of consistency:
  1. Record level: consistency between one set of data values and another set within the same record.
  2. Cross-record level: consistency between one set of data values and another set in different records.
  3. Temporal level: consistency between one set of data values and another set within the same record at different points in time.
- Reasonableness: used to measure conformance to consistency expectations relevant within specific operational contexts.
- Currency: measures the degree to which information is current with the world that it models.
- Uniqueness: measures the number of inadvertent duplicate records that exist within a data set or across data sets.
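
These dimensions lend themselves to simple, rule-based measurement. The following Python sketch (the record layout, attribute names and sample values are invented for illustration, not taken from the reading) computes base-level scores for completeness, uniqueness and record-level consistency over a small set of records.

```python
from collections import Counter

records = [
    {"id": "C001", "phone": "555-0101", "start": "2014-01-01", "end": "2015-01-01"},
    {"id": "C002", "phone": None, "start": "2014-06-01", "end": "2015-06-01"},
    {"id": "C002", "phone": "555-0102", "start": "2014-06-01", "end": "2013-06-01"},
]

def completeness(recs, attr):
    """Share of records where the expected attribute is populated."""
    return sum(1 for r in recs if r.get(attr) not in (None, "")) / len(recs)

def uniqueness(recs, key):
    """Share of records that are not inadvertent duplicates on the key."""
    counts = Counter(r[key] for r in recs)
    return sum(1 for r in recs if counts[r[key]] == 1) / len(recs)

def record_consistency(recs):
    """Record-level consistency: the end date must not precede the start date."""
    return sum(1 for r in recs if r["end"] >= r["start"]) / len(recs)

print(f"completeness(phone): {completeness(records, 'phone'):.2f}")
print(f"uniqueness(id):      {uniqueness(records, 'id'):.2f}")
print(f"record consistency:  {record_consistency(records):.2f}")
```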

Describe the operational data governance process and differentiate between data quality inspection and data validation.
- Operational data governance is the manifestation of the processes and protocols necessary to ensure that an acceptable level of confidence in the data effectively satisfies the organization's business needs.
- Operational data governance refers to the collective set of rules and processes regarding data that allow an organization to have sufficient confidence in the quality of its data.
- A data governance program defines the roles, responsibilities, and accountabilities associated with managing data quality. A data quality scorecard could be used to monitor the success of such a program.
- Operational data governance combines the ability to identify data errors as early as possible with the process of initiating the activities necessary to address those errors, to avoid or minimize any downstream impacts.

Data Quality Inspection vs. Data Validation
While the data validation process (a one-time step) reviews and measures conformance of data with a set of defined business rules, inspection is an ongoing process to:
- Reduce the number of errors to a reasonable and manageable level.
- Enable the identification of data flaws along with a protocol for interactively making adjustments to enable the completion of the processing stream.
- Institute a mitigation or remediation of the root cause within an agreed-to time frame, i.e., solve the cause of the errors and flaws in a timely manner.
The goal of data quality inspection is to catch issues early on, before they have a substantial negative impact on business operations.
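
A minimal Python sketch of the distinction, using hypothetical business rules and trade records (none of the rule names or fields come from the reading): validate() measures conformance of a batch once, while inspect() applies the same rules continuously and routes flawed records for remediation without halting the processing stream.

```python
RULES = {
    "notional_positive": lambda rec: rec["notional"] > 0,
    "currency_known": lambda rec: rec["currency"] in {"USD", "EUR", "GBP"},
}

def validate(batch):
    """One-time validation: measure conformance of a data set with the defined rules."""
    return {name: sum(rule(r) for r in batch) / len(batch) for name, rule in RULES.items()}

def inspect(stream, on_error):
    """Ongoing inspection: flag flawed records as they arrive so the processing
    stream can continue while remediation of the root cause is initiated."""
    for rec in stream:
        failed = [name for name, rule in RULES.items() if not rule(rec)]
        if failed:
            on_error(rec, failed)   # e.g. route to a data steward's remediation queue
        yield rec                   # downstream processing still completes

trades = [{"notional": 1e6, "currency": "USD"}, {"notional": -5e4, "currency": "JPY"}]
print(validate(trades))
for _ in inspect(trades, on_error=lambda rec, rules: print("flagged:", rec, rules)):
    pass
```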

Summarize the process of creating a data quality scorecard and compare three different viewpoints for reporting data via a data quality scorecard.
Essentially, the need to present higher-level data quality scores introduces a distinction between two types of metrics:
- Base-level metrics: simple metrics based on measuring against defined dimensions of data quality. They quantify specific observance of acceptable levels of defined data quality rules.
- Complex metrics: a rolled-up score computed as a function (such as a sum) of applying specific weights to a collection of existing metrics, both base-level and complex (a weighted roll-up of this kind is sketched below).
Complex data quality metrics can be accumulated for reporting in a scorecard in one of three different views: by issue, by business process, or by business impact.
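
The weighted roll-up of base-level metrics into a complex metric can be illustrated as follows; the metric names, scores and weights are assumptions chosen for the example rather than prescribed values.

```python
base_metrics = {   # base-level scores in [0, 1], one per defined data quality rule
    "accuracy": 0.97,
    "completeness": 0.92,
    "consistency": 0.88,
    "uniqueness": 0.99,
}

weights = {        # assumed business weights; they must sum to 1.0
    "accuracy": 0.40,
    "completeness": 0.25,
    "consistency": 0.20,
    "uniqueness": 0.15,
}

def complex_metric(scores, w):
    """Rolled-up score computed as a weighted sum of existing metrics."""
    assert abs(sum(w.values()) - 1.0) < 1e-9
    return sum(scores[name] * w[name] for name in w)

print(f"rolled-up data quality score: {complex_metric(base_metrics, weights):.3f}")
```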

Data Quality Issues View
- Evaluating the impacts of a specific data quality issue across multiple business processes demonstrates the diffusion of pain across the enterprise caused by specific data flaws.
- This scorecard scheme, which is suited to data analysts attempting to prioritize tasks for diagnosis and remediation, provides a rolled-up view of the impacts attributed to each data issue. Drilling down through this view sheds light on the root causes of impacts of poor data quality, as well as identifying "rogue processes" that require greater focus for instituting monitoring and control processes.
Business Process View
- A scorecard view by business process: for each business process, this scorecard scheme consists of complex metrics representing the impacts associated with each issue. The drill-down in this view can be used for isolating the source of the introduction of data issues at specific stages of the business process, as well as informing the data stewards in diagnosis and remediation.

Business Impact View
- Business impacts may have been incurred as a result of a number of different data quality issues originating in a number of different business processes.
- This reporting scheme displays the aggregation of business impacts rolled up from the different issues across different process flows. For example, one scorecard could report rolled-up metrics documenting the accumulated impacts associated with credit risk, compliance with privacy protection, and decreased sales.
- Drilling down through the metrics will point to the business processes from which the issues originate; a deeper review will point to the specific issues within each of the business processes. This view is suited to a more senior manager seeking a high-level overview of the risks associated with data quality issues and how that risk is introduced across the enterprise.
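
A sketch of how the same measured impacts can be accumulated under each of the three views; the issues, processes, impacts and scores are hypothetical.

```python
from collections import defaultdict

# Each observation: (data quality issue, business process, business impact, impact score).
observations = [
    ("missing counterparty id", "trade settlement", "credit risk", 0.30),
    ("missing counterparty id", "client onboarding", "compliance", 0.20),
    ("duplicate records", "billing", "decreased sales", 0.25),
    ("nonstandard formats", "trade settlement", "credit risk", 0.15),
]

def roll_up(view_index):
    """Accumulate impact scores under one view: 0 = issue, 1 = process, 2 = impact."""
    totals = defaultdict(float)
    for obs in observations:
        totals[obs[view_index]] += obs[3]
    return dict(totals)

print("by data quality issue:", roll_up(0))
print("by business process:  ", roll_up(1))
print("by business impact:   ", roll_up(2))
```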

1. Which of the following viewpoints regarding data quality scorecards is best described as providing a high-level understanding of the risks embedded in data quality problems?
A. Business impact view.
B. Business process view.
C. Data quality issues view.
D. Data process issues view.

SUMMARY OF KEY IDEAS
- Flawed data will delay or obstruct the successful completion of business processes.
- Negative impacts of poor data quality include financial, confidence, satisfaction, productivity, risk and compliance impacts.
- Data errors can lead to inconsistent reporting and inaccurate aggregation.
- Key dimensions of data quality: accuracy, completeness, consistency, reasonableness, currency, uniqueness.
- Operational data governance.
- Data quality inspection vs. data validation.
- Scorecard views: data quality issues / business process / business impact.

Information Risk and Data Quality Management
Principles for Effective Risk Data Aggregation and Risk Reporting

Identify principles for effective risk data aggregation and risk reporting.
- One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks' information technology (IT) and data architectures were inadequate to support the broad management of financial risks.
- Many banks lacked the ability to aggregate risk exposures and identify concentrations quickly and accurately at the bank group level, across business lines and between legal entities.
- Some banks were unable to manage their risks properly because of weak risk data aggregation capabilities and risk reporting practices. This had severe consequences for the banks themselves and for the stability of the financial system as a whole.
- In response, the Basel Committee issued supplemental Pillar 2 (supervisory review process) guidance to enhance banks' ability to identify and manage bank-wide risks.

- "Risk data aggregation" (RDA) means defining, gathering and processing risk data according to the bank's risk reporting requirements to enable the bank to measure its performance against its risk tolerance/appetite. This includes sorting, merging or breaking down sets of data.
- The paper presents a set of principles to strengthen banks' risk data aggregation capabilities and internal risk reporting practices (the Principles). In turn, effective implementation of the Principles is expected to enhance risk management and decision-making processes at banks.
- The adoption of these Principles will enable fundamental improvements to the management of banks. The Principles are expected to support a bank's efforts in the areas described in the following slides.

- The Principles are initially addressed to systemically important banks (SIBs) and apply at both the banking group level and on a solo basis. Banks identified as G-SIBs by the FSB in November 2011 or November 2012 must meet these Principles by January 2016; G-SIBs designated in subsequent annual updates will need to meet the Principles within three years of their designation.
- It is strongly suggested that national supervisors also apply these Principles to banks identified as D-SIBs by their national supervisors, three years after their designation as D-SIBs.
- The Principles and supervisory expectations contained in this paper apply to a bank's risk management data. This includes data that is critical to enabling the bank to manage the risks it faces. Risk data and reports should provide management with the ability to monitor and track risks relative to the bank's risk tolerance/appetite.

- These Principles also apply to all key internal risk management models, including but not limited to Pillar 1 regulatory capital models (e.g., internal ratings-based approaches for credit risk and advanced measurement approaches for operational risk), Pillar 2 capital models and other key risk management models (e.g., value-at-risk).
- All the Principles included in this paper are also applicable to processes that have been outsourced to third parties.
- The Principles cover four closely related topics:
  - Overarching governance and infrastructure
  - Risk data aggregation capabilities
  - Risk reporting practices
  - Supervisory review, tools and cooperation
- Banks should develop forward-looking reporting capabilities to provide early warnings of any potential breaches of risk limits that may exceed the bank's risk tolerance/appetite.
- These risk reporting capabilities should also allow banks to conduct flexible and effective stress testing which is capable of providing forward-looking risk assessments. Supervisors expect risk management reports to enable banks to anticipate problems and provide a forward-looking assessment of risk.

Explain the potential benefits of having effective risk data aggregation and reporting.
- Enhance the infrastructure for reporting key information, particularly that used by the board and senior management to identify, monitor and manage risks;
- Improve the decision-making process throughout the banking organisation;
- Enhance the management of information across legal entities, while facilitating a comprehensive assessment of risk exposures at the global consolidated level;
- Reduce the probability and severity of losses resulting from risk management weaknesses;
- Improve the speed at which information is available and hence decisions can be made; and
- Improve the organisation's quality of strategic planning and the ability to manage the risk of new products and services.

- An increased ability to anticipate problems. In times of financial stress, effective risk data aggregation enhances a bank's ability to identify routes to return to financial health. For example, a bank may be better able to identify a suitable merger partner in order to restore the bank's financial viability.
- Improved resolvability.
- By strengthening a bank's risk function, the bank is better able to make strategic decisions, increase efficiency, reduce the chance of loss, and ultimately increase profitability.

Describe key governance principles related to risk data aggregation and risk reporting practices.
Principle 1: Governance
A bank's risk data aggregation capabilities and risk reporting practices should be subject to strong governance arrangements consistent with other principles and guidance established by the Basel Committee.
- The governance principle suggests that risk data aggregation should be part of the bank's overall risk management framework.
- To ensure that adequate resources are devoted, senior management should approve the framework before implementation.
A bank's risk data aggregation capabilities and risk reporting practices should be:
- Fully documented and subject to high standards of validation. This validation should be independent, using staff with specific IT, data and reporting expertise.

- Considered as part of any new initiatives, including acquisitions and/or divestitures, new product development, as well as broader process and IT change initiatives. When considering a material acquisition, a bank's due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.
- Unaffected by the bank's group structure. The group structure should not hinder risk data aggregation capabilities at a consolidated level or at any relevant level within the organisation (e.g., sub-consolidated level, jurisdiction-of-operation level). In particular, risk data aggregation capabilities should be independent from the choices a bank makes regarding its legal organisation and geographical presence.
A bank's senior management should be fully aware of and understand the limitations that prevent full risk data aggregation, in terms of coverage (e.g., risks not captured or subsidiaries not included), in technical terms (e.g., model performance indicators or degree of reliance on manual processes) or in legal terms (legal impediments to data sharing across jurisdictions). The board should also be aware of the bank's implementation of, and ongoing compliance with, the Principles set out in this document.

Identify the data architecture and IT infrastructure features that can contribute to effective risk data aggregation and risk reporting practices.
Principle 2: Data architecture and IT infrastructure
A bank should design, build and maintain data architecture and IT infrastructure which fully supports its risk data aggregation capabilities and risk reporting practices not only in normal times but also during times of stress or crisis, while still meeting the other Principles.
Principle 2 requires that:
- Risk data aggregation capabilities and risk reporting practices should be given direct consideration as part of a bank's business continuity planning processes and be subject to a business impact analysis.

- A bank should establish integrated data taxonomies and architecture across the banking group, which includes information on the characteristics of the data (metadata), as well as use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts. Multiple data models may be used as long as there are robust automated reconciliation measures in place.
- Roles and responsibilities should be established as they relate to the ownership and quality of risk data and information for both the business and IT functions. The owners (business and IT functions), in partnership with risk managers, should ensure there are adequate controls throughout the lifecycle of the data and for all aspects of the technology infrastructure. The role of the business owner includes ensuring data is correctly entered by the relevant front office unit, kept current and aligned with the data definitions, and also ensuring that risk data aggregation capabilities and risk reporting practices are consistent with the firm's policies.
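
A minimal illustration of an integrated taxonomy entry with metadata and a single group-wide identifier; the field names, owners and the dummy identifier below are invented for the example.

```python
# Integrated data dictionary: concepts defined once, with metadata, for group-wide use.
DATA_DICTIONARY = {
    "counterparty_id": {
        "definition": "Single group-wide identifier for a legal counterparty",
        "format": "20-character LEI (ISO 17442)",
        "owner": "Client reference data (business) / group IT (systems)",
    },
    "gross_exposure": {
        "definition": "Exposure before netting and collateral, in reporting currency (millions)",
        "owner": "Credit risk",
    },
}

# Two source systems referencing the same counterparty through the single identifier
# (the identifier below is a dummy placeholder, not a real LEI).
loan_system = {"counterparty_id": "5299001EXAMPLE000000", "gross_exposure": 120.0}
trading_system = {"counterparty_id": "5299001EXAMPLE000000", "gross_exposure": 35.5}

# Aggregation across systems is straightforward because both share the identifier.
total = loan_system["gross_exposure"] + trading_system["gross_exposure"]
print(f"group-wide exposure to the counterparty: {total}")
```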

Describe characteristics of a strong risk data aggregation capability and demonstrate how these characteristics interact with one another.
Principle 3: Accuracy and Integrity
A bank should be able to generate accurate and reliable risk data to meet normal and stress/crisis reporting accuracy requirements. Data should be aggregated on a largely automated basis so as to minimise the probability of errors.
- Controls surrounding risk data should be as robust as those applicable to accounting data.
- Where a bank relies on manual processes and desktop applications (e.g., spreadsheets, databases) and has specific risk units that use these applications for software development, it should have effective mitigants in place (e.g., end-user computing policies and procedures) and other effective controls that are consistently applied across the bank's processes.

- Risk data should be reconciled with the bank's sources, including accounting data where appropriate, to ensure that the risk data is accurate.
- A bank should strive towards a single authoritative source for risk data for each type of risk.
- A bank's risk personnel should have sufficient access to risk data to ensure they can appropriately aggregate, validate and reconcile the data to risk reports.
As a precondition, a bank should have a "dictionary" of the concepts used, such that data is defined consistently across the organisation.
There should be an appropriate balance between automated and manual systems. Where professional judgements are required, human intervention may be appropriate. For many other processes, a higher degree of automation is desirable to reduce the risk of errors.
Banks must document and explain all of their risk data aggregation processes, whether automated or manual (judgement-based or otherwise). Documentation should include an explanation of the appropriateness of any manual workarounds, a description of their criticality to the accuracy of risk data aggregation and proposed actions to reduce the impact.
Supervisors expect banks to measure and monitor the accuracy of data and to develop appropriate escalation channels and action plans to rectify poor data quality.
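
A small sketch of the reconciliation idea, using hypothetical business lines, balances and a materiality tolerance: aggregated risk figures are compared with the accounting source and breaks above tolerance are escalated.

```python
risk_exposures = {"corporate lending": 1250.0, "retail": 842.5, "trading": 410.0}
accounting_balances = {"corporate lending": 1250.0, "retail": 845.0, "trading": 410.0}
TOLERANCE = 1.0  # assumed materiality threshold, in millions

def reconcile(risk, accounting, tolerance):
    """Return the reconciliation breaks that should be escalated for remediation."""
    breaks = {}
    for line in risk.keys() | accounting.keys():
        diff = risk.get(line, 0.0) - accounting.get(line, 0.0)
        if abs(diff) > tolerance:
            breaks[line] = diff
    return breaks

for line, diff in reconcile(risk_exposures, accounting_balances, TOLERANCE).items():
    print(f"escalate: {line} differs from the accounting source by {diff:+.1f}")
```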

Principle 4: Completeness
A bank should be able to capture and aggregate all material risk data across the banking group. Data should be available by business line, legal entity, asset type, industry, region and other groupings, as relevant for the risk in question, that permit identifying and reporting risk exposures, concentrations and emerging risks.
Principle 4 requires that:
- A bank's risk data aggregation capabilities should include all material risk exposures, including those that are off-balance sheet. Both on- and off-balance-sheet risks should be aggregated.
- A banking organisation is not required to express all forms of risk in a common metric or basis, but risk data aggregation capabilities should be the same regardless of the choice of risk aggregation systems implemented. However, each system should make clear the specific approach used to aggregate exposures for any given risk measure, in order to allow the board and senior management to assess the results properly.

- Supervisors expect banks to produce aggregated risk data that is complete and to measure and monitor the completeness of their risk data. Where risk data is not entirely complete, the impact should not be critical to the bank's ability to manage its risks effectively. Supervisors expect banks' data to be materially complete, with any exceptions identified and explained.
Principle 5: Timeliness
A bank should be able to generate aggregate and up-to-date risk data in a timely manner while also meeting the principles relating to accuracy and integrity, completeness and adaptability. The precise timing will depend upon the nature and potential volatility of the risk being measured as well as its criticality to the overall risk profile of the bank. The precise timing will also depend on the bank-specific frequency requirements for risk management reporting, under both normal and stress/crisis situations, set based on the characteristics and overall risk profile of the bank.

Principle 5 requires that a bank's risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements. Critical risks include, but are not limited to:
- The aggregated credit exposure to a large corporate borrower. By comparison, groups of retail exposures may not change as critically in a short period of time but may still include significant concentrations;
- Counterparty credit risk exposures, including, for example, derivatives;
- Trading exposures, positions, operating limits, and market concentrations by sector and region;
- Liquidity risk indicators such as cash flows/settlements and funding; and
- Time-critical operational risk indicators (e.g., systems availability, unauthorised access).

Principle 6: Adaptability
A bank should be able to generate aggregate risk data to meet a broad range of on-demand, ad hoc risk management reporting requests, including requests during stress/crisis situations, requests due to changing internal needs and requests to meet supervisory queries.
Adaptability will enable banks to conduct better risk management, including forecasting information, as well as to support stress testing and scenario analyses. Adaptability includes:
- Data aggregation processes that are flexible and enable risk data to be aggregated for assessment and quick decision-making;
- Capabilities for data customisation to users' needs (e.g., dashboards, key takeaways, anomalies), to drill down as needed, and to produce quick summary reports;
- Capabilities to incorporate new developments in the organisation of the business and/or external factors that influence the bank's risk profile; and
- Capabilities to incorporate changes in the regulatory framework.
Supervisors expect banks to be able to generate subsets of data based on requested scenarios or resulting from economic events, as sketched below.
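
One way to picture the expectation of generating subsets of risk data on demand, using invented positions and filter criteria:

```python
positions = [
    {"desk": "rates", "region": "EU", "exposure": 120.0},
    {"desk": "credit", "region": "US", "exposure": 340.0},
    {"desk": "rates", "region": "US", "exposure": 75.0},
]

def ad_hoc_subset(data, **criteria):
    """Filter risk data on any requested attributes and aggregate the exposure."""
    subset = [p for p in data if all(p.get(k) == v for k, v in criteria.items())]
    return subset, sum(p["exposure"] for p in subset)

subset, total = ad_hoc_subset(positions, region="US")                # supervisory query
print(f"US exposure across {len(subset)} positions: {total}")
subset, total = ad_hoc_subset(positions, desk="rates", region="EU")  # stress-scenario cut
print(f"EU rates exposure: {total}")
```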

Describe characteristics of effective risk reporting practices.
- Accurate, complete and timely data is a foundation for effective risk management.
- However, data alone does not guarantee that the board and senior management will receive appropriate information to make effective decisions about risk.
- To manage risk effectively, the right information needs to be presented to the right people at the right time. Risk reports based on risk data should be accurate, clear and complete.

Principle 7: Accuracy
Risk management reports should accurately and precisely convey aggregated risk data and reflect risk in an exact manner. Reports should be reconciled and validated.
- Risk management reports should be accurate and precise to ensure a bank's board and senior management can rely with confidence on the aggregated information to make critical decisions about risk. Accuracy requires, among other things:
  - Defined requirements and processes to reconcile reports to risk data;
  - Automated and manual edit and reasonableness checks, including an inventory of the validation rules that are applied to quantitative information. The inventory should include explanations of the conventions used to describe any mathematical or logical relationships that should be verified through these validations or checks; and
  - Integrated procedures for identifying, reporting and explaining data errors or weaknesses in data integrity via exception reports.
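
An illustrative sketch of an inventory of validation rules applied as automated reasonableness checks, with failures surfaced through an exception report; the rules and figures are invented for the example.

```python
report = {"gross_exposure": 980.0, "net_exposure": 1020.0, "limit": 1000.0}

RULE_INVENTORY = [
    # (rule id, description of the logical relationship verified, check)
    ("R1", "net exposure must not exceed gross exposure",
     lambda r: r["net_exposure"] <= r["gross_exposure"]),
    ("R2", "exposures must be non-negative",
     lambda r: r["gross_exposure"] >= 0 and r["net_exposure"] >= 0),
    ("R3", "net exposure must be within the approved limit",
     lambda r: r["net_exposure"] <= r["limit"]),
]

def exception_report(r):
    """List every validation rule the report fails, for explanation and sign-off."""
    return [(rule_id, description) for rule_id, description, check in RULE_INVENTORY
            if not check(r)]

for rule_id, description in exception_report(report):
    print(f"exception {rule_id}: {description}")
```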

