
In this paper, we present a multi-name incomplete information structural model that possesses a contagion mechanism, together with an efficient Monte Carlo algorithm based on Interacting Particle Systems. In line with CreditGrades, a single-name credit model used in industry, we suppose that investors can observe firm values and defaults but are not informed of the threshold level at which a firm is deemed to default. Additionally, in order to model the possibility of crisis normalization, we introduce the concept of a memory period after default. During the memory period after a default, public investors remember when the previous default occurred and directly reflect that information when updating their beliefs. When the memory period after a default ends, investors forget about that default and shift their attention to more recent defaults, if any exist. A variance reduction technique relying upon Interacting Particle Systems is combined with the standard Monte Carlo method to address the rare but critical events represented by the tail of the loss distribution of a portfolio.

The interaction of default events plays a central role in systemic risk measurement as well as in credit risk management and portfolio credit derivative valuation. The recent financial crisis has revealed the necessity of quantitative methodologies to analyze default contagion effects, which are observed in several financial markets. Default contagion is a phenomenon in which a default by one firm has a direct impact on the health of other surviving firms. Since contagion effects heavily influence the correlations of defaults, capturing them in quantitative models is crucial. Existing dynamic credit risk models which deal with default contagion include, among others, Davis and Lo [

In this paper, we present a multi-name incomplete information structural model which possesses a default contagion mechanism in the sense that sudden changes of default probabilities arise from investors revising their perspectives on unobserved factors which characterize the joint density of default thresholds. Here, in our model, default thresholds are assumed to be unobservable to public investors, and a firm is deemed to default when its firm value touches this threshold level for the first time. This formulation is a slight generalization of Giesecke and Goldberg [

The rest of this paper is organized as follows. Section 2 introduces our model and deduces an expression for the conditional joint distribution of the default thresholds. Section 3 develops a standard Monte Carlo simulation algorithm. Section 4 gives an overview of the Feynman-Kac path measure, which plays a central role in the Interacting Particle System, and shows how the algorithm can be applied to the model. Section 5 provides numerical examples and Section 6 concludes.

Uncertainty is modeled by a probability space

information flow over time. We impose two additional technical conditions, often called the usual conditions. The first is that

Let

following stochastic differential equation

where

asset value processes have correlations, i.e.,

covariation. Filtrations generated by observed asset values are denoted by

random default threshold

Here random variables

lights to the multi-name setting and model the so-called default contagion. Let

process which indicates the default status of the firm

With a view to analyzing how the period of past default memories affects succeeding defaults, we consider an incomplete information framework, which is known to represent default contagion. In order to depict the incomplete information structure more concretely, in addition to the assumption of randomness of the default threshold, we postulate the following assumptions.

Assumption 1 Public investors can observe firm values and default events although they cannot directly

observe the firm’s default thresholds

Define the set of survived firms

investors know that the surviving firms have default barriers lower than the running minima of their firm value processes, it is natural to suppose the next assumption.

Assumption 2 At time

inequality

Let

some constants. We further assume that

. We denote.

Remark 1 The definition of the mean vector

original CreditGrades model. Finger, Finkelstein, Pan, Lardy and Ta [

rate

Assumption 3 There is a consensus on the prior joint distribution of the firms' default thresholds among the public investors. More concretely, the investors' uncertainty about the default thresholds

where

Assumption 4 For each default time

Remark 2 Since public investors observe the entire history of the firm values, they know that the unobservable thresholds must be located below the running minima of the firm values. Despite this knowledge, we assume that public investors treat the logarithm of the recovery rate

Assumption 1, Assumption 3 and Assumption 4 provide the default contagion mechanism: the default of a firm reveals information about its default threshold, and public investors then update their beliefs about the joint distribution of the surviving firms' thresholds. From the public investors' perspective, this naturally causes sudden changes in the default probabilities of surviving firms, which is exactly what we wanted to model. A situation of contagious defaults can be interpreted as a recession; however, it will not continue forever. In our model, we further assume that public investors expect the crisis to return to normal conditions after some finite time interval.

Assumption 5 The covariance parameter jumps from

constants

and then assume that the elements of the variance-covariance matrix

Thus the mean vector

Assumption 6

Define the set

elements in the set

come after the elements of

submatrix formed by selecting the rows and columns from the subset

ponding

Assumption 6 implies that during the memory period, public investors remember the firm values at which the defaults occurred. We note that

By virtue of Assumption 3, we can deduce the conditional joint distribution of the default thresholds as follows. Here we do not exclude the possibility of simultaneous defaults, i.e., we do not need to assume

Proposition 1 Let

where

where

Proof 1 From the continuity of the asset process

Here, whenever defaults occur, let the order of the firms be rearranged in such a way that the elements of

with the special case

to be the possible range of the recovery rate vector

for

from Bayes’ Theorem,

The last equality holds because

logarithms of the recovery rates are given by the conditional distribution of

well known. See for example [

Therefore the conditional distribution
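Although the explicit formulas appear above, the Gaussian conditioning step that underlies Proposition 1 can be sketched in code. The following minimal Python sketch computes the conditional mean and covariance of a multivariate normal given that some components are observed; it deliberately ignores the truncation by the running minima, and the function name and parameter choices are illustrative rather than taken from the paper.

```python
import numpy as np

def conditional_gaussian(mu, sigma, obs_idx, obs_val):
    """Conditional mean and covariance of a multivariate normal when the
    components listed in obs_idx are observed to equal obs_val.
    Truncation effects (e.g., by running minima) are ignored here."""
    n = len(mu)
    free = [i for i in range(n) if i not in obs_idx]
    s11 = sigma[np.ix_(free, free)]      # covariance of the free block
    s12 = sigma[np.ix_(free, obs_idx)]   # cross-covariance
    s22 = sigma[np.ix_(obs_idx, obs_idx)]
    w = s12 @ np.linalg.inv(s22)         # regression weights
    mu_c = mu[free] + w @ (obs_val - mu[obs_idx])
    sigma_c = s11 - w @ s12.T
    return mu_c, sigma_c
```

For a bivariate standard normal with correlation 0.5, observing the second component at 1.0 shifts the conditional mean of the first to 0.5 and shrinks its variance to 0.75, which is the familiar Gaussian regression formula.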

Remark 3 In the case

showed that the counting process

based on the total hazard with the case

Let

defaults with

the unconditional joint density of

For example, square bracket

Sequence of defaults and the corresponding memory periods

condition remains effective until

indicate that the random vector

at time

third default time

completely, i.e., the memory period of 5 after

sampled under the condition

In this subsection, we see how the conditional distribution of the default thresholds changes at a default time.

Suppose that the first default occurred at time

and let

to

0.95. We see that the

Remark 4 Giesecke [

where

where

where

Consequently, by introducing the density function

which implies that

g_{i}(x) and g_{i}(x|T_{1} = T_{j}) with r_{ij} = 0.01

g_{i}(x) and g_{i}(x|T_{1} = T_{j}) with r_{ij} = 0.04

Lebesgue measure 0. Therefore

we work with default probabilities instead of default intensities in our model.

This section develops a numerical method to compute the distribution of the number of defaults via Monte Carlo simulation. Complicating matters is the fact that new information from defaults changes the mean and covariance of the joint distribution of the thresholds. At each moment, the covariance matrix must be calculated depending on whether the memory period has terminated or not. Therefore, the simulation depends on the path, i.e., the order of occurrence of sequential defaults.

The time interval

in analogy with its continuous time version (0.3). As mentioned in Carmona, Fouque and Vestal [

Algorithm 1 To generate one sample path of the total default

Step 0. Initialize

Step 1. Generate the

Step 2. Determine whether a default occurred at time

If

Else, set

Let

If

Step 3. Determine

firms and store

Step 4. Renew the matrix

surviving firms

ed above

Step 5. Set
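As a rough illustration of Steps 0 to 5, the following Python sketch generates one path of the number of defaults for correlated lognormal firm values with unobserved Gaussian default thresholds. It omits the belief updating and memory-period logic of the full algorithm, and all parameter values (number of firms, volatility, correlation, threshold distribution) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_defaults(n_firms=5, n_steps=250, dt=1.0 / 250, sigma=0.25,
                      rho=0.4, mu_thr=-1.2, sd_thr=0.3):
    """One path of the number of defaults: correlated log firm values are
    simulated by Euler steps, and a firm defaults when its log value first
    crosses an unobserved Gaussian threshold.  Belief updating and the
    memory period of the full model are deliberately omitted."""
    corr = rho * np.ones((n_firms, n_firms)) + (1 - rho) * np.eye(n_firms)
    chol = np.linalg.cholesky(corr)
    log_v = np.zeros(n_firms)                   # log firm values, V_0 = 1
    thr = rng.normal(mu_thr, sd_thr, n_firms)   # unobserved log thresholds
    alive = np.ones(n_firms, dtype=bool)
    for _ in range(n_steps):
        z = chol @ rng.standard_normal(n_firms)         # correlated shocks
        log_v += -0.5 * sigma ** 2 * dt + sigma * np.sqrt(dt) * z
        alive &= log_v > thr                            # first-passage check
    return n_firms - alive.sum()

losses = [simulate_defaults() for _ in range(500)]
```

Repeating this over many trials yields the empirical loss distribution; it is exactly the tail of this distribution that the standard method estimates poorly, which motivates the IPS scheme of the next section.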

In credit risk management, it is important to measure how often rare but crucial events occur. However, the standard Monte Carlo algorithm of Section 3 may be inefficient in some situations, for instance when the portfolio constituents have small default probabilities. This is because, in order to estimate the probability of rare events accurately, a large number of trials may be required with Algorithm 1.

In an effort to estimate accurate probabilities within a reasonable computational time, we embed the IPS into the standard Monte Carlo simulation algorithm. In the following two subsections, we provide a quick overview of the IPS, inspired by the pioneering work of Del Moral and Garnier [

Let

the discrete time Markov chain given by

Markov chain

which will be discussed later. In general, the random element

Next, we let

finally let

The Interacting Particle System consists of a set of

For each

transitions

formula

We also introduce the corresponding normalized measure

Note that

Therefore, for any given bounded measurable function

The above relationship has the merit of relating the unnormalized expectations on the left-hand side to normalized expectations on the right-hand side. Furthermore, for any path

weighted indicator function

Then we see that

This formula can be applied to the situation where the

computed via normalized measures. It is known (see Del Moral and Garnier [

starting from

which can be easily seen by substituting

and

into the integrand of the right hand side of (0.15).

For the purpose of numerical computations of the rare events of the form (0.14), we introduce an interacting particle system. We choose a large integer

a Markov chain

particles with respect to the measure

We start with an initial configuration

distributed random samples from the distribution

i.e.,

distribution

where

and

of

The selection stage is performed by choosing randomly and independently

according to the Boltzmann-Gibbs measure

During the mutation stage, each selected particle

In other words, we set

where

A result of [

Mimicking (0.13), unbiased particle approximation measures

and then, by (0.14), we can obtain the particle approximation of the rare event probabilities. More precisely, if we let

then

We refer to Del Moral [

We introduce special notations

According to Del Moral and Garnier [

for some

This regularity condition ensures that the normalizing constants

Then our potential function is given by

The first term of (0.21) reflects the fact that high weights are assigned to the particles which have renewed their running minimum during the period

observable, it is known that the IPS is effective in simulating the counting process

accuracy by Carmona, Fouque and Vestal [

Algorithm 2 Assume that we have a set of

define the Markov process

where

To generate an estimate of

Step 0. Initialize the particles

firm value processes, to be some small value. We start with a set of

chosen according to the initial distribution of

Set

Step 1. For each step

• Selection.

Compute the normalizing constant

and choose independently

The new particles are denoted by

• Mutation.

For every

and set

Step 2. The estimator of the probability

It is known that this estimator is unbiased in the sense that

theorem. (Refer to [

Instead of explicitly calculating the asymptotic variance, we note that the approximate variance

defined by

can easily be calculated within the above IPS algorithm. This provides a criterion for choosing the parameter
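As a self-contained illustration of the selection and mutation stages and the unbiased estimator, the following Python sketch applies an IPS in the spirit of Del Moral and Garnier to a toy rare event: the tail probability of a Gaussian random walk, with an exponential potential built from one-step increments. The walk, the threshold `a`, and the tilting parameter `alpha` are hypothetical stand-ins for the default-counting setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def ips_tail_prob(a=15.0, T=15, M=5000, alpha=1.0):
    """IPS estimate of P(X_T >= a) for the walk X_n = X_{n-1} + N(0,1),
    X_0 = 0, using the potential G_n = exp(alpha * (X_n - X_{n-1})).
    The product of the potentials telescopes to exp(alpha * X_{T-1}),
    so each final particle carries the weight exp(-alpha * X_{T-1})."""
    x = np.zeros(M)          # particle positions, X_0 = 0
    prev = x.copy()
    log_const = 0.0          # accumulates the log normalizing constants eta_p
    for n in range(1, T + 1):
        prev = x.copy()
        x = x + rng.standard_normal(M)                  # mutation
        if n < T:
            g = np.exp(alpha * (x - prev))              # potential weights
            log_const += np.log(g.mean())               # empirical eta_n
            idx = rng.choice(M, size=M, p=g / g.sum())  # selection
            x = x[idx]
    # unbias the twisted final sample with the telescoped inverse weight
    return np.exp(log_const) * np.mean((x >= a) * np.exp(-alpha * prev))
```

Setting `alpha = 0` recovers plain Monte Carlo. Increasing `alpha` pushes more particles toward the rare region at the cost of larger reweighting factors, which mirrors the trade-off involved in the parameter choice discussed above.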

This section demonstrates the performance of the IPS algorithm through numerical examples with a sample portfolio consisting of 25 firms. We consider a portfolio consisting of high credit quality names with high correlations

•

•

These parameters are set with the intention of observing rare default events. As Carmona, Fouque and Vestal [

First we compare the results of the IPS algorithm to the results obtained by the standard Monte Carlo algorithm in case of

all the defaults by the time horizon.

Default probabilities

Default probabilities in log-scale

These numerical results show that the standard Monte Carlo method with 10,000 trials cannot capture the occurrence of rare default events such as more than 20 defaults; however, one sees that very small probabilities exist for such events via 500,000 trials, as indicated by the solid blue line in

Next, we investigate how the variance is reduced by the IPS in the following two cases

• Case 1:

• Case 2:

to see the difference with respect to the time horizon with the same memory period. A preliminary version of this paper, Takada [

In contrast, the current study focuses on how the variance is reduced with the IPS algorithm compared to the standard Monte Carlo method. Due mainly to the computation of sampling with replacement according to the distribution (0.23) in the selection stage, the IPS algorithm generally requires more time than the standard Monte Carlo method. Although it obviously depends on the input parameters, with the above parameter set of 25 names and in the case of

of the 1000 outcomes of the probabilities

outcome of

differences between

Figures 8 and 9 illustrate the differences between

A remarkable feature is that the IPS algorithm reduces the variance for rare events, i.e., more than 10 defaults in our example, while it demonstrates weaker performance for

Case 1: 1 ≤ k ≤ 7

Case 1: 8 ≤ k ≤ 25

Case 2: 1 ≤ k ≤ 7

Case 2: 8 ≤ k ≤ 25

This paper proposed an incomplete information multi-name structural model and an efficient Monte Carlo algorithm for it based on the Interacting Particle System. We naturally extend the CreditGrades model in the sense that we consider more than two firms in the portfolio and their asset correlations as well as the dependence structure of the default thresholds. For this purpose, we introduced a prior joint distribution of default thresholds among the public investors, described by a truncated normal distribution. Numerical experiments demonstrated that the IPS algorithm can generate rare default events which would normally require numerous trials with a simple Monte Carlo simulation. Finally, we verified that the IPS algorithm reduces the variance for rare events.

The author is grateful to participants of the RIMS Workshop on Financial Modeling and Analysis (FMA2013) at Kyoto and participants of the Quantitative Methods in Finance 2013 Conference (QMF 2013) at Sydney for valuable comments which helped to improve this paper.