Transcript of OpRiskHJF (8/6/2019)
Quantifying Operational Risk: Possibilities and Limitations

Hansjörg Furrer, Swiss Life

Risk-Based Supervision, ETH Zurich Summer 2004, 1 June 2004
Quantifying OpRisk
Contents
A. The New Basel Capital Accord (Basel II)
B. The OpRisk management process
C. OpRisk measurement for regulatory purposes
D. Conclusions
E. References
A. The New Basel Capital Accord (Basel II)
1988: Basel Accord (Basel I): minimum capital requirements against credit risk. One standardized approach.

1996: Amendment to Basel I: market risk.

1999: First Consultative Paper on the New Basel Capital Accord (Basel II).

April 2003: Third Consultative Paper (CP3) on Basel II (www.bis.org/bcbs/bcbscp3.htm)

Aug 2003: Publication of the comments received on CP3:
- criticism of credit risk approach
- some comments on OpRisk methodology
Oct 2003: Committee meeting in Madrid. Press release:
- credit risk approach will be reviewed
- no mention of OpRisk aspects?!

May 2004: Committee meeting in Basel: consensus achieved on the remaining technical issues regarding Basel II (e.g. specification of LGD parameters in credit risk)

June 2004: Publication of the new Basel II framework

Year-end 2006: Implementation of the standardized approaches

Year-end 2007: Implementation of the advanced approaches
What's new?
Standards on capital adequacy substantially revised
Rationale for the New Accord:
- improve risk management
- enhance financial stability
- provide incentives to adopt the more advanced risk-sensitive approaches
Structure of the New Accord (compare with Solvency II), a three-pillar framework:
- Pillar 1: minimum capital requirements (risk measurement)
- Pillar 2: supervisory review of capital adequacy
- Pillar 3: public disclosure
What's new? (cont'd)
Two options for the measurement of credit risk:
- standard approach
- internal ratings-based approach (IRB)
Pillar 1 sets out the minimum capital requirements:

\[ \frac{\text{total amount of capital}}{\text{risk-weighted assets}} \geq 8\% \]

MRC (minimum regulatory capital) := 8% of risk-weighted assets
What's new? (cont'd)
New regulatory capital for OpRisk:
OpRisk: the risk of losses resulting from inadequate or failed internal processes, people and systems, or from external events.
Note: This definition excludes
- strategic risk
- reputational risk
- systemic risk
Rationale for the explicit treatment of OpRisk
Banks' activities have become more complex and diverse (due to globalization, deregulation, improved technology; specifically: e-commerce, large-volume transactions, outsourcing, ...)

Risk mitigation techniques for credit and market risk may produce other forms of risk

Growing number of operational loss event types:
* Credit Suisse Chiasso affair (1997)
* Nick Leeson/Barings Bank, 1.3b (1995)
* Enron (largest US bankruptcy so far) (2001)
* Banque Cantonale Vaudoise, KBV Winterthur (2003)
Rationale for the treatment of OpRisk (contd)
OpRisk management viewed as a comprehensive, inclusive discipline (comparable to credit and market risk management)

7 loss event types:
- Internal fraud
- External fraud
- Employment practices and workplace safety
- Clients, products & business practices
- Damage to physical assets
- Business disruption and system failures
- Execution, delivery & process management
Basel II ≠ Solvency II
"The difference between the two prudential regimes goes further in that their actual objectives differ. The prudential objective of the Basel Accord is to reinforce the soundness and stability of the international banking system. To that end, the initial Basel Accord and the draft New Accord are directed primarily at banks that are internationally active. The draft New Accord attaches particular importance to the self-regulating mechanisms of a market where practitioners are dependent on one another. In the insurance sector, the purpose of prudential supervision is to protect policyholders against the risk of (isolated) bankruptcy facing every insurance company. The systemic risk, assuming that it exists in the insurance sector, has not been deemed to be of sufficient concern to warrant minimum harmonization of prudential supervisory regimes at international level; nor has it been the driving force behind European harmonization in this field."

(EU Insurance Solvency Sub-Committee (2001))
Risks within the SST framework (slide shows a figure only; not reproduced)
B. The OpRisk management process
Reference: Basel Sound Practices [4] for the management and supervision of OpRisks

The Sound Practices principles form the basis for Pillars 2 and 3 of the Basel proposals

OpRisk management process:
- Identification
- Assessment
- Monitoring
- Control/Mitigation
The OpRisk management process (contd)
Previous wording: identification, measurement, monitoring andcontrol.
Compulsory for those banks that want to adopt the standardized or AMA approach
Possibly intended for all banks
The OpRisk management process (contd)
Responsibilities:
- Banking supervisors/board of directors: ensure that an OpRisk management framework is in place
- Senior management: responsible for the implementation

Identification and assessment of OpRisk:
- identify and assess the OpRisks inherent in all existing products, activities, processes and systems
- assess OpRisks before new products, activities, processes and systems are introduced or undertaken
The OpRisk management process (contd)
How to identify/assess OpRisk?
- Self- or risk assessment (questionnaires, checklists, workshops, ...)
- Risk mapping
- KRIs (key risk indicators): state variables, to be assessed. For example # failed trades, staff turnover, system downtime, etc.
- Scorecards, KRDs (key risk drivers): control variables, translating qualitative assessments into quantitative assessments. Examples of KRDs: # transactions, system quality, ...

\[ \mathrm{KRI} = \alpha + \sum_k \beta_k \, \mathrm{KRD}_k + \epsilon \]

- Thresholds, limits (tied to risk indicators)
- Measurement (systematically tracking and recording the frequency and severity of individual loss events)
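The KRI/KRD relation above is just a linear model and can be sketched in a few lines; the driver names and all coefficient values below are invented for the illustration, not taken from the slides.

```python
# Illustrative linear KRI model: KRI = alpha + sum_k beta_k * KRD_k + eps
# (driver names and coefficients are made up for this sketch)

def kri(krd_values, alpha, betas, eps=0.0):
    """Key risk indicator as a linear function of key risk drivers."""
    assert len(krd_values) == len(betas)
    return alpha + sum(b * x for b, x in zip(betas, krd_values)) + eps

# KRD_1: # transactions (thousands), KRD_2: system quality score
failed_trades_kri = kri([120.0, 0.7], alpha=2.0, betas=[0.05, -3.0])
print(failed_trades_kri)  # ~5.9
```

The point of such a scorecard model is that the KRDs are controllable, so management can see how changing a driver (e.g. improving system quality) feeds through to the indicator.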
Example: Questionnaire of the Erste Bank (translated from the German)

Question 38: How would you rate the workload of the staff in your department with respect to overtime worked?
(a) The workload is barely manageable; staff regularly work on weekends as well.
(b) The workload can mostly be handled only with a considerable number of overtime hours.
(c) The workload can mostly be handled within normal working hours, but overtime occurs at peak times. (20%)
(d) The workload can be handled within normal working hours. (80%)

Question 22: How is the availability of qualified staff? Is it easy or difficult to fill vacancies?
(a) It is almost impossible to consistently find the right staff for open/new positions; a vacancy can usually be filled only through time-consuming internal training.
(b) It is very difficult and time-consuming to fill an open/new position, since suitable staff are very hard to find and often deployable only after intensive initial training. (100%)
(c) It is not always easy to find the right person for an open/new position right away, but in the end this is regularly achieved without excessive time effort.
(d) Finding suitable qualified staff is not a problem.
C. OpRisk measurement for regulatory purposes
Recall:

\[ \frac{\text{total amount of capital}}{\text{risk-weighted assets}} \geq 8\% \]

Under Basel II: risk-weighted assets must include a capital charge for OpRisk
Notation: $C_{OP}$: capital charge for OpRisk

Three distinct approaches for determining $C_{OP}$:
- Basic Indicator Approach
- Standardized Approach
- Advanced Measurement Approaches (AMA)
Basic Indicator Approach
Capital charge:

\[ C_{OP}^{BIA} = \alpha \cdot GI \]

$C_{OP}^{BIA}$: capital charge under the Basic Indicator Approach

$GI$: average annual gross income over the previous three years

$\alpha = 15\%$ (set by the Committee)
No risk mitigation via insurance allowed
Standardized Approach
Similar to the BIA, but on the level of each business line:

\[ C_{OP}^{SA} = \sum_{i=1}^{8} \beta_i \, GI_i, \qquad \beta_i \in [12\%, 18\%], \quad i = 1, 2, \ldots, 8 \]

8 business lines:
- Corporate finance
- Trading & sales
- Retail banking
- Commercial banking
- Payment & settlement
- Agency services
- Asset management
- Retail brokerage
No risk mitigation via insurance allowed
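The two capital-charge formulas are simple enough to sketch directly. The beta values below are the Basel II business-line betas; the gross-income figures (in CHF m) are invented for the illustration.

```python
# Capital charges under the Basic Indicator and Standardized Approaches.
# Gross income figures are invented for illustration.

ALPHA = 0.15  # set by the Committee

def c_bia(gross_incomes):
    """Basic Indicator Approach: alpha times the average annual gross
    income over the previous three years."""
    return ALPHA * sum(gross_incomes) / len(gross_incomes)

def c_sa(gi_by_line, betas):
    """Standardized Approach: sum over the 8 business lines of
    beta_i * GI_i, with each beta_i in [12%, 18%]."""
    return sum(b * gi for b, gi in zip(betas, gi_by_line))

gi_three_years = [900.0, 1000.0, 1100.0]   # total GI, last three years
print(c_bia(gi_three_years))               # 0.15 * 1000 = 150.0

# Basel II betas: corp. finance, trading & sales, retail banking,
# commercial banking, payment & settlement, agency, asset mgmt, brokerage
betas = [0.18, 0.18, 0.12, 0.15, 0.18, 0.15, 0.12, 0.12]
gi_lines = [100.0, 200.0, 300.0, 150.0, 50.0, 50.0, 75.0, 75.0]
print(c_sa(gi_lines, betas))
```

Note that with the same total gross income, the SA charge differs from the BIA charge only through the business-line mix.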
Advanced Measurement Approaches (AMA)
Allows banks to use their own methods for assessing their exposure to OpRisk; a significant capital reduction should result compared to BIA and SA

Preconditions: the bank must meet
- qualitative and
- quantitative
standards before using the AMA

Qualitative standards: an OpRisk management framework must be in place, see Section B.
Quantitative standards
The Committee does not stipulate which (analytical) approach banks must use ...

... however, banks must demonstrate that their measurement system is sufficiently granular to capture severe tail loss events, e.g. 99.9% VaR

Capital charge defined in the following way:

\[ MRC = EL + UL \]

If ELs are already captured in a bank's internal business practices, then the capital charge may be set at the unexpected loss alone:

\[ MRC = UL \]
Quantitative standards (contd)
In a normally distributed world with $\alpha = 0.999$:

\[ \mathrm{VaR}_\alpha(X) = \mu + 3.09\,\sigma, \qquad X \sim N(\mu, \sigma^2), \]

so that $UL_\alpha = \mathrm{VaR}_\alpha(X) - \mu = 3.09\,\sigma$.
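The factor 3.09 is just the standard normal 99.9% quantile; a quick numerical check (assuming SciPy is available, and with illustrative values of mu and sigma):

```python
from scipy.stats import norm

mu, sigma = 10.0, 4.0          # illustrative N(mu, sigma^2) loss
z = norm.ppf(0.999)            # standard normal 99.9% quantile
var_alpha = mu + z * sigma     # VaR_0.999(X)
ul = var_alpha - mu            # unexpected loss = z * sigma

print(round(z, 2))             # 3.09
```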
Internally generated OpRisk measures must be based on a 5-year observation period

Internal data: banks must record internal loss data. Problem: the data for many (business line/loss event type) cells is sparse. Therefore: aggregation with

External data: a bank's OpRisk measurement system must also include external data
External consortium data
Data consortia (e.g. the Operational Risk data eXchange, ORX) exchange loss data from member banks anonymously
ORX member banks: ABN Amro, BNP Paribas, Commerzbank,Deutsche Bank, JP Morgan Chase,. . .(number of member banks increasing)
Mathematical problem: how to scale severity and frequency of the external data to fit into the internal loss data collection?
Scaling of OpRisk Data
Example:
- Bank ABC: B/S CHF 250b
- Bank ORX peer: B/S CHF 415b, OpRisk loss CHF 82m for cell (i, k)
- Scaling process:

\[ 82 \cdot \frac{250}{415} = 49.4 \]
Is this reasonable? Does a relationship between size and risk exist?
If so, is this relationship linear?
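The linear scaling step above can be written out directly; note that linearity in balance-sheet size is exactly the assumption the slide is questioning, not an established fact.

```python
def scale_loss(external_loss, own_size, peer_size):
    """Scale an external OpRisk loss linearly by balance-sheet size.
    The linear size/risk relationship is an assumption."""
    return external_loss * own_size / peer_size

# Bank ABC (B/S CHF 250b) scales a CHF 82m peer loss (peer B/S CHF 415b):
scaled = scale_loss(82.0, 250.0, 415.0)
print(round(scaled, 1))  # 49.4
```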
OpRisk Loss Data
QIS2 exercise ([3]): gathering information concerning OpRisk loss events

Loss data mapped according to the 8 × 7 business line/loss event type cells
- # of participating banks: 30 (11 countries)
- time span: three years (1998-2000)
- # reported loss events: 11,300
- total loss amount EUR 2.6b (only events exceeding EUR 10,000)
- largest contributions:
  (Retail banking / Clients, products and business practices)
  (Trading and sales / Execution, delivery, and process management)
  (Commercial banking / External fraud)
- two cells have no reported losses
- extraordinary losses presumably absent from this database
OpRisk Loss Data (contd)
LDCE exercise ([5]): Loss Data Collection Exercise for OpRisk
extension and refinement of the QIS2-exercises
Characteristics of the LDCE:
- # of participating banks: 89 (19 countries)
- time span: 1 year (2001)
- # reported loss events: 47,200
- total loss amount EUR 7.8b (only events exceeding EUR 10,000)
- considerable clustering around certain business lines/event type cells
Comparison of the LDCE with the QIS2 data
Figure 1a: Percent frequency by business line. (Bar chart comparing 2000 and 2001 data across Corporate finance, Trading and sales, Retail banking, Commercial banking, Payment and settlement, Agency and custody services, Asset management, Retail brokerage, and "No information"; y-axis 0-70%. Figure not reproduced.)
Comparison of the LDCE with the QIS2 data
Figure 1b: Percent frequency by event type. (Bar chart comparing 2000 and 2001 data across the seven loss event types plus "No information"; y-axis 0-50%. Figure not reproduced.)
Comparison of the LDCE with the QIS2 data
Figure 2a: Percent severity by business line. (Bar chart comparing 2000 and 2001 data across the eight business lines plus "No information"; y-axis 0-45%. Figure not reproduced.)
Comparison of the LDCE with the QIS2 data
Figure 2b: Percent severity by event type. (Bar chart comparing 2000 and 2001 data across the seven loss event types plus "No information"; y-axis 0-40%. Figure not reproduced.)
Comparison of the LDCE with the QIS2 data
Note:
- the QIS2 data shown in the previous graphs (highlighted in orange) onlydepicts the data collected in 2000 (enabling an annual comparison)
- significant differences in the samples of participating banks!- Frequency of loss events: roughly no changes by business line (Figure 1a)
by event type (Figure 1b), we observe an increase in External Fraud and Employment
Practices, whereas the share in Execution, Delivery, and Process Mgmnt decreased
- Severity of loss events: Striking changes by event types (Figure 2b): the distribution of loss amounts is sensitive
to low frequency/high severity events (e.g. 09/11)
Comparison of the LDCE with the QIS2 data
To assess the extent of risk, it is necessary to assess the extent ofvariability of both number and amount of loss events.
The RMG is undertaking internal analysis to address such issues,see also, among others, the papers by Pezier [14].
Some internal data: time series of operational loss amounts, 1992-2002, for loss types 1, 2 and 3 and for the pooled operational losses. (Four plots; figures not reproduced.)
Modeling issues
Stylized facts about OpRisk losses:
- Loss occurrence times are irregularly spaced in time (selection bias, economic cycles, regulation, management interactions, ...)
- Loss amounts show extremes

Large losses are of main concern!

Repetitive vs non-repetitive losses ([14])

Repetitive losses: loss events that may occur more than once a week (e.g. settlement risk, minor external fraud, human error in transaction processing)
Modeling issues (contd)
Non-repetitive losses:
- ordinary: from once a week to once in a generation (about 2/3 of the risk categories reported in the QIS studies)
- extraordinary: large but rare

Immaterial losses: to be ignored (both EL and UL are negligible)

Red flag: are observations in line with modeling assumptions?

Example: the iid assumption implies
- NO structural changes in the data as time evolves
- irrelevance of which loss is denoted X1, which one X2, ...
A mathematical (actuarial) model
OpRisk loss database:

\[ X_\ell^{(t,k)}, \quad t \in \{T-n+1, \ldots, T-1, T\} \text{ (years)}, \quad k \in \{1, 2, \ldots, 7\} \text{ (loss event type)}, \quad \ell \in \{1, 2, \ldots, N^{(t,k)}\} \text{ (number of losses)} \]

7 loss event types: internal fraud; external fraud; employment practices and workplace safety; clients, products & business practices; damage to physical assets; business disruption and system failures; execution, delivery & process management
A mathematical (actuarial) model (contd)
Truncation:

\[ \tilde{X}_\ell^{(t,k)} = X_\ell^{(t,k)} \, \mathbf{1}_{\{X_\ell^{(t,k)} > d^{(t,k)}\}} \]

A further index $j \in \{1, \ldots, 8\}$ indicating the business line can be introduced (suppressed in the sequel).

Estimate a risk measure for $F_{L^{(T+1,k)}}(x) = \mathbb{P}\big[L^{(T+1,k)} \le x\big]$, like

\[ \text{Op-VaR}_\alpha^{(T+1,k)} = F^{-1}_{L^{(T+1,k)}}(\alpha) \]
\[ \text{Op-ES}_\alpha^{(T+1,k)} = \mathbb{E}\big[ L^{(T+1,k)} \,\big|\, L^{(T+1,k)} > \text{Op-VaR}_\alpha^{(T+1,k)} \big] \]

where

\[ L^{(T+1,k)} = \sum_{\ell=1}^{N^{(T+1,k)}} X_\ell^{(T+1,k)} \]

is the OpRisk loss for loss event type k over the period $[T, T+1]$.
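A minimal Monte Carlo sketch of the aggregate loss and both risk measures. The Poisson frequency and lognormal severity, and all parameter values, are illustrative assumptions for the sketch, not distributions prescribed by the framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_losses(n_sims, lam, mu, sigma):
    """Simulate L = sum_{l=1}^{N} X_l with N ~ Poisson(lam) and
    X_l iid lognormal(mu, sigma) -- an illustrative choice."""
    counts = rng.poisson(lam, size=n_sims)
    return np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

L = aggregate_losses(50_000, lam=25.0, mu=0.0, sigma=2.0)

alpha = 0.999
op_var = np.quantile(L, alpha)      # empirical Op-VaR_alpha
op_es = L[L > op_var].mean()        # empirical Op-ES_alpha
print(op_var, op_es)
```

Even this toy version shows the practical difficulty discussed next: the 99.9% quantile is driven by a handful of simulated scenarios, so its sampling error is large.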
A mathematical (actuarial) model (contd)
Discussion: recall the stylized facts
- the X's are heavy-tailed
- N shows non-stationarity

Conclusions:
- $F_X(x)$ and $F_{L^{(t,k)}}(x)$ are difficult to estimate
- in-sample estimation of $\mathrm{VaR}_\alpha$ ($\alpha$ large) is almost impossible!
- actuarial tools may be useful:
  - approximation (translated gamma/lognormal)
  - inversion methods (FFT)
  - recursive methods (Panjer)
  - simulation
  - Extreme Value Theory (EVT)
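Of these tools, the Panjer recursion is easy to sketch: for N ~ Poisson(λ) and a severity pmf f discretized to the integer grid, the compound probabilities satisfy g_0 = e^{−λ(1−f_0)} and g_n = (λ/n) Σ_{j=1}^n j f_j g_{n−j}. The discretization and parameters below are illustrative.

```python
import math

def panjer_poisson(lam, f, n_max):
    """Panjer recursion for a compound Poisson distribution: severity
    pmf f[j] on the grid j = 0, 1, ...; returns g[n] = P[L = n]."""
    g = [0.0] * (n_max + 1)
    g[0] = math.exp(-lam * (1.0 - f[0]))
    for n in range(1, n_max + 1):
        s = sum(j * f[j] * g[n - j] for j in range(1, min(n, len(f) - 1) + 1))
        g[n] = lam * s / n
    return g

# N ~ Poisson(2), severity 1 or 2 with probability 1/2 each
g = panjer_poisson(2.0, [0.0, 0.5, 0.5], n_max=40)
print(g[0])       # P[L = 0] = exp(-2)
print(sum(g))     # ~1 (mass beyond n_max is negligible here)
```

The recursion gives the full aggregate distribution exactly (up to discretization), which is why it is attractive when the severity can be discretized reliably; for heavy tails the discretization itself becomes the problem.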
How accurate are VaR-estimates?
Make inference about the tail decay of the aggregate loss $L^{(t,k)}$ via the tail decay of the individual losses $X^{(t,k)}$:

\[ L = \sum_{\ell=1}^{N} X_\ell, \qquad 1 - F_X(x) \sim x^{-\alpha} h(x), \; x \to \infty \;\; \Longrightarrow \;\; 1 - F_L(x) \sim \mathbb{E}[N] \, x^{-\alpha} h(x), \; x \to \infty. \]

Assumptions: $(X_m)$ iid $\sim F$ and, for some $\xi$, $\beta$ and $u$ large,

\[ F_u(x) := \mathbb{P}[X - u \le x \,|\, X > u] \approx G_{\xi, \beta(u)}(x), \]

where

\[ G_{\xi,\beta}(x) = \begin{cases} 1 - \left(1 + \xi x / \beta \right)^{-1/\xi}, & \xi \neq 0, \\ 1 - e^{-x/\beta}, & \xi = 0. \end{cases} \]
How accurate are VaR-estimates? (contd)
Tail and quantile estimate:

\[ 1 - \hat{F}_X(x) = \frac{N_u}{n} \left( 1 + \hat{\xi} \, \frac{x - u}{\hat{\beta}} \right)^{-1/\hat{\xi}}, \qquad x > u. \]

\[ \widehat{\mathrm{VaR}}_\alpha = \hat{q}_\alpha = u + \frac{\hat{\beta}}{\hat{\xi}} \left( \left( \frac{n}{N_u} (1 - \alpha) \right)^{-\hat{\xi}} - 1 \right) \tag{1} \]
Idea: comparison of estimated quantiles with the corresponding theoretical ones by means of a simulation study ([13]).
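Formula (1) can be sketched with SciPy's generalized Pareto fit. The Pareto sample and the threshold choice below are illustrative, not part of the slides.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Illustrative sample: Pareto with 1 - F(x) = x**(-2), x >= 1
n = 5_000
x = rng.pareto(2.0, size=n) + 1.0

u = np.quantile(x, 0.90)        # threshold = empirical 90% quantile
exc = x[x > u] - u              # excesses over u
n_u = len(exc)

xi, _, beta = genpareto.fit(exc, floc=0)  # MLE of (xi, beta)

alpha = 0.999
q_hat = u + beta / xi * ((n / n_u * (1 - alpha)) ** (-xi) - 1)  # formula (1)
print(q_hat)   # true q_0.999 for this Pareto is 1000**0.5 ~ 31.6
```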
How accurate are VaR-estimates? (contd)
Simulation procedure:
- Choose F and fix $0 < \alpha < 1$, $N_u \in \{25, 50, 100, 200\}$ ($N_u$: # of data points above u)
- Calculate $u = q_{\alpha_0}$ and the true value of the quantile $q_\alpha$
- Sample $N_u$ independent points of F above u by the rejection method; record the total number n of sampled points this requires
- Estimate $\xi$, $\beta$ by fitting the GPD to the $N_u$ exceedances over u by means of MLE
- Determine $\hat{q}_\alpha$ according to (1)
- Repeat N times the above to arrive at estimates of $\mathrm{Bias}(\hat{q})$ and $\mathrm{SE}(\hat{q})$
How accurate are VaR-estimates? (contd)
Accuracy of the quantile estimate expressed in terms of bias and standard error:

\[ \mathrm{Bias}(\hat{q}) = \mathbb{E}[\hat{q} - q], \qquad \mathrm{SE}(\hat{q}) = \mathbb{E}\big[(\hat{q} - q)^2\big]^{1/2}, \]

estimated by

\[ \widehat{\mathrm{Bias}}(\hat{q}) = \frac{1}{N} \sum_{j=1}^{N} (\hat{q}_j - q), \qquad \widehat{\mathrm{SE}}(\hat{q}) = \left( \frac{1}{N} \sum_{j=1}^{N} (\hat{q}_j - q)^2 \right)^{1/2} \]

For comparison purposes (different distributions) introduce

\[ \text{Percentage Bias} := \frac{\mathrm{Bias}(\hat{q})}{q}, \qquad \text{Percentage SE} := \frac{\mathrm{SE}(\hat{q})}{q} \]
How accurate are VaR-estimates? (contd)
Criterion for a good estimate: Percentage Bias and Percentage SE should be small, e.g.

                    alpha = 0.99    alpha = 0.999
Percentage Bias     <= 0.05         <= 0.10
Percentage SE       <= 0.30         <= 0.60
Example: Pareto distribution, $1 - F_X(x) = x^{-\alpha}$, $\alpha = 2$

Threshold $u = q_{0.7}$:
- $\mathrm{VaR}_{0.99}$: a minimum of 100 exceedances (corresponding to 333 observations) is required to ensure accuracy wrt bias and standard error.
- $\mathrm{VaR}_{0.999}$: a minimum of 200 exceedances (corresponding to 667 observations) is required to ensure accuracy wrt bias and standard error.

Threshold $u = q_{0.9}$:
- $\mathrm{VaR}_{0.99}$: full accuracy can be achieved with the minimum number of 25 exceedances (corresponding to 250 observations).
- $\mathrm{VaR}_{0.999}$: a minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy wrt bias and standard error.
Example: Pareto distribution, $1 - F_X(x) = x^{-\alpha}$, $\alpha = 1$

Threshold $u = q_{0.7}$:
- $\mathrm{VaR}_{0.99}$: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations) the VaR estimates fail to meet the accuracy criteria.
- $\mathrm{VaR}_{0.999}$: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations) the VaR estimates fail to meet the accuracy criteria.

Threshold $u = q_{0.9}$:
- $\mathrm{VaR}_{0.99}$: a minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy wrt bias and standard error.
- $\mathrm{VaR}_{0.999}$: a minimum of 200 exceedances (corresponding to 2000 observations) is required to ensure accuracy wrt bias and standard error.
How accurate are VaR-estimates? (contd)
A large number of observations is necessary to achieve the targeted accuracy.

The minimum number of observations increases as the tails become thicker ([13]).

Remember: the simulation study was done under idealistic assumptions. OpRisk losses, however, typically do NOT fulfill these assumptions.
Pricing risk under incomplete information
Recall: $L^{(T+1,k)}$ is the OpRisk loss for loss event type k over the period $[T, T+1]$:

\[ L^{(T+1,k)} = \sum_{\ell=1}^{N^{(T+1,k)}} X_\ell^{(T+1,k)} \]

Question: suppose we have calculated risk measures $\rho^{(T+1,k)}$, $k \in \{1, \ldots, 7\}$, for each loss event type. When can we consider

\[ \sum_{k=1}^{7} \rho^{(T+1,k)} \]

a good risk measure for the total loss $L_{T+1} = \sum_{k=1}^{7} L^{(T+1,k)}$?
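A toy illustration of why summing per-type risk measures can go wrong: VaR is not subadditive, so for two loss types the stand-alone VaRs can both be zero while the VaR of the total is large. The two-point loss distributions below are invented purely for the illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def var(losses, alpha):
    """Empirical Value-at-Risk at level alpha."""
    return np.quantile(losses, alpha)

# Two independent loss types: loss of 100 with probability 0.03, else 0
n = 200_000
l1 = np.where(rng.random(n) < 0.03, 100.0, 0.0)
l2 = np.where(rng.random(n) < 0.03, 100.0, 0.0)

alpha = 0.95
print(var(l1, alpha) + var(l2, alpha))  # 0.0: each type alone is below the level
print(var(l1 + l2, alpha))              # 100.0: P[L1 + L2 > 0] ~ 5.9% > 5%
```

This is exactly the coherence issue (Artzner-Delbaen-Eber-Heath): for a coherent risk measure such as expected shortfall, the sum of the stand-alone measures is always an upper bound.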
Pricing risk under incomplete information (contd)
Answer: ingredients
- (non-)coherence of risk measures (Artzner-Delbaen-Eber-Heath framework)
- optimization problem: given $\rho^{(T+1,k)}$, $k \in \{1, \ldots, 7\}$, what is the worst case for the overall risk of $L_{T+1}$? Solution: using copulas, see [9] and references therein
- aggregation of banking risks [1]
A ruin-theoretic problem motivated by OpRisk
OpRisk process:

\[ V_t^{(k)} = u^{(k)} + p^{(k)}(t) - L_t^{(k)}, \qquad t \ge 0, \]

for some initial capital $u^{(k)}$ and a premium function $p^{(k)}(t)$ satisfying

\[ \mathbb{P}\big[ L_t^{(k)} - p^{(k)}(t) \to -\infty \big] = 1. \]

Given $\varepsilon > 0$, calculate $u^{(k)}(\varepsilon)$ such that

\[ \mathbb{P}\Big[ \inf_{T \le t \le T+1} \big( u^{(k)}(\varepsilon) + p^{(k)}(t) - L_t^{(k)} \big) < 0 \Big] \le \varepsilon \tag{2} \]

$u^{(k)}(\varepsilon)$ is an (internal) risk capital charge.
Ruin-theoretic problem (contd)
Solving (2) is difficult:
- complicated loss process $L_t^{(k)}$
- heavy-tailed case
- finite time horizon $[T, T+1]$
Classical risk theory revisited:

\[ Y(t) = u + ct - \sum_{k=1}^{N(t)} X_k = u + ct - S_{N(t)}^X \]

\[ \psi(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( S_{N(t)}^X - ct \big) > u \Big] \]
Ruin-theoretic problem (contd)
Pro memoria, the Cramér-Lundberg Theorem (light-tailed case): assume that there exists a constant $\kappa > 0$ fulfilling

\[ h(\kappa) = \frac{\kappa c}{\lambda}, \qquad h(r) := \mathbb{E}\big[ e^{r X_1} \big] - 1, \]

and that

\[ \int_0^\infty x \, e^{\kappa x} \, d\big( \lambda G_I(x)/c \big) =: \kappa_1 < \infty, \]

where $G_I$ denotes the integrated tail distribution. Then

\[ \psi(u) \sim \frac{1 - \lambda \mu / c}{\kappa \, \kappa_1} \, e^{-\kappa u}, \qquad u \to \infty. \]
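For exponential claim sizes the Lundberg exponent κ can be computed numerically and checked against the textbook closed form κ = 1/μ − λ/c. The parameter values are illustrative; SciPy is assumed available.

```python
from scipy.optimize import brentq

lam, mu, c = 1.0, 1.0, 1.5   # claim rate, mean claim size, premium rate
assert c > lam * mu          # net-profit condition

def h(r):
    """h(r) = E[exp(r*X1)] - 1 for X1 ~ Exp(mean mu), valid for r < 1/mu."""
    return r * mu / (1.0 - r * mu)

# Lundberg equation h(kappa) = kappa * c / lam, positive root
kappa = brentq(lambda r: h(r) - r * c / lam, 1e-9, 1.0 / mu - 1e-6)
print(kappa)                 # closed form: 1/mu - lam/c = 1/3
```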
Ruin-theoretic problem (contd)
Important net-profit condition:

\[ \mathbb{P}\Big[ \lim_{t \to \infty} \big( S_{N(t)}^X - ct \big) = -\infty \Big] = 1 \]

Heavy-tailed case: let the claim size distribution be such that $\mathbb{P}[X_1 > x] \sim x^{-(\alpha+1)} L(x)$, with L slowly varying. Then

\[ \psi(u) \sim u^{-\alpha} L(u), \qquad u \to \infty. \]

Now assume, for some general loss process $(S(t))$,

\[ \mathbb{P}\Big[ \lim_{t \to \infty} \big( S(t) - ct \big) = -\infty \Big] = 1, \qquad \psi_1(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( S(t) - ct \big) > u \Big] \sim u^{-\alpha} L(u), \quad u \to \infty. \tag{3} \]
Ruin-theoretic problem (contd)
Question: how much can we change S while keeping (3)?

Solution: use a time change $\tilde{S}(t) = S(\Delta(t))$ and set

\[ \tilde{\psi}(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( \tilde{S}(t) - ct \big) > u \Big]. \]

Under some technical conditions on $\Delta$ and S, general models are given so that

\[ \lim_{u \to \infty} \frac{\tilde{\psi}(u)}{\psi_1(u)} = 1, \]

i.e. ultimate ruin behaves similarly under the time change.
Ruin-theoretic problem (contd)
Example:
- start from the homogeneous Poisson case (classical Cramér-Lundberg, heavy-tailed case)
- use the time change to accommodate changes in intensity motivated by operational risk, see [11]
D. Conclusions
OpRisk ≠ market risk, credit risk

All risk types (market, credit, operational) must be considered simultaneously, as the reduction in one risk type may come at the expense of an increase in another type

(Internal) OpRisk loss databases must grow

Standard actuarial methods (including EVT) aiming to derive capital charges for OpRisk are of limited use due to
- lack of data
- inconsistency of the data with the modeling assumptions
Conclusions (contd)
Interesting source of mathematical problems

Challenges: choice of risk measures, aggregation of risk measures

Pillars 2 and 3 (cf. Basel Sound Practices [4]) play an important part in the OpRisk management process
E. References
[1] Alexander, C., and Pezier, J. (2003). Assessment and aggregation of banking risks. 9th IFCI Annual Risk Management Round Table, International Financial Risk Institute (IFCI).

[2] Basel Committee on Banking Supervision. Working Paper on the Regulatory Treatment of Operational Risk. September 2001. BIS, Basel, Switzerland, www.bis.org/publ/bcbs_wp8.htm

[3] Basel Committee on Banking Supervision. The Quantitative Impact Study for Operational Risk: Overview of Individual Loss Data and Lessons Learned. January 2002. BIS, Basel, Switzerland, www.bis.org/bcbs/qis/qisopriskresponse.pdf

[4] Basel Committee on Banking Supervision. Sound Practices for the Management and Supervision of Operational Risk. July 2002. BIS, Basel, Switzerland, www.bis.org/publ/bcbs91.htm
[5] Basel Committee on Banking Supervision. The 2002 Loss Data Collection Exercise for Operational Risk: Summary of the Data Collected. March 2003. BIS, Basel, Switzerland, www.bis.org/bcbs/qis/ldce2002.pdf

[6] Basel Committee on Banking Supervision. Third Consultative Paper on The New Basel Capital Accord. 29 April 2003. BIS, Basel, Switzerland, www.bis.org/bcbs/bcbscp3.htm

[7] Center for Financial Studies. Latest Developments in Managing Operational Risk. March 2004. CFS Research Conference, Eltville. www.ifk-cfs.de/English

[8] Embrechts, P., Furrer, H.J., and Kaufmann, R. (2003). Quantifying Regulatory Capital for Operational Risk. Derivatives Use, Trading and Regulation, Vol. 9, No. 3, 217-233. Also available on www.bis.org/bcbs/cp3comments.htm

[9] Embrechts, P., Hoeing, A., and Juri, A. (2003). Using copulae to bound the Value-at-Risk for functions of dependent risks. Finance and Stochastics, Vol. 7, No. 2, 145-167.
[10] Embrechts, P., Kaufmann, R., and Samorodnitsky, G. (2002). Ruin theory revisited: stochastic models for operational risk. Submitted.

[11] Embrechts, P., and Samorodnitsky, G. (2003). Ruin problem and how fast stochastic processes mix. Annals of Applied Probability, Vol. 13, 1-36.

[12] Geiger, H. (2000). Regulating and Supervising Operational Risk for Banks. Working paper, University of Zurich.

[13] McNeil, A.J., and Saladin, T. (1997). The peaks over thresholds method for estimating high quantiles of loss distributions. Proceedings of the XXVIIth International ASTIN Colloquium, Cairns, Australia, 23-43.

[14] Pezier, J. (2002). Operational Risk Management. ISMA Discussion Papers in Finance 2002-21. To appear in Mastering Operational Risk, FT-Prentice Hall, 2003.