This paper studies the asymptotic normality of the Nelson-Aalen and the Kaplan-Meier estimators in a competing risks context in the presence of independent right-censoring. To prove our results, we use Rebolledo's theorem, which makes it possible to apply the central limit theorem to certain classes of martingales. From the results obtained, confidence bounds for the hazard and the survival functions are provided.

The model of competing risks has been widely studied in the literature; see, e.g., Heckman and Honoré [ ]. An individual is exposed to m competing causes of failure, and the failure time due to the j-th cause is a non-negative random variable (r.v.) τ_j. The competing risks model postulates that only the smallest failure time is observable; it is given by the r.v. T = min(τ_1, ⋯, τ_m) with distribution function (d.f.) denoted by F. The cause of failure associated with T is then indicated by a r.v. η which takes the value j if the failure is due to the j-th cause, for j ∈ {1, ⋯, m}, i.e. η = j if T = τ_j. The following observation scheme is taken from Njamen and Ngatchou [ ]: with C a censoring time, independent of T, with d.f. G, one observes the pair

( Z = min ( T , C ) , ξ = η δ ) ,

where δ = 1 1 { T ≤ C } and where 1 1 ( . ) denotes the indicator function. As T and C are independent, the r.v. Z has d.f. H given by 1 − H = ( 1 − F ) ( 1 − G ) . Let τ H = sup { t : H ( t ) < 1 } denote the right-endpoint of H beyond which no observation is possible. The subdistribution functions F ( j ) pertaining to the different risks or causes of failure are defined for j = 1 , ⋯ , m and t ≥ 0 by

F ( j ) ( t ) = ℙ [ T ≤ t , η = j ] , j = 1 , ⋯ , m (1)

When the independence of the different competing risks may not be assumed, the functions F ( j ) for j = 1 , ⋯ , m are the basic estimable quantities.

The Kaplan-Meier estimator was developed for situations in which only one cause of failure and independent right-censoring are considered. Aalen and Johansen [ ] extended this approach to the competing risks setting, which leads to the Aalen-Johansen estimator of the subdistribution functions recalled below.

The asymptotic properties of the Kaplan-Meier estimator of the distribution function have been studied by several authors (see, e.g., Peterson [

In this paper, in a region where there is at least one observation, we are interested in providing asymptotic properties of the Nelson-Aalen and Kaplan-Meier nonparametric estimators of the functions Λ^{*(j)} and S^{*(j)}, for j = 1, ⋯, m, in the presence of independent right censoring in the competing risks framework set out in Njamen and Ngatchou ( [

The rest of the paper is organized as follows: Section 2 describes preliminary results and recalls notions used in the paper. In Section 3, we establish two limit laws: in Section 3.1, we give the limit law of the Nelson-Aalen nonparametric estimator for competing risks as defined in Njamen and Ngatchou [

For t ≥ 0 , we introduce the following subdistribution functions H ( 0 ) and H ( 1 ) of H by:

H ( 0 ) ( t ) = ℙ [ Z ≤ t , ξ = 0 ] ,

and

H ( 1 ) ( t ) = ℙ [ Z ≤ t , ξ ≠ 0 ]

and for j = 1 , ⋯ , m

H ( 1 , j ) ( t ) = ℙ [ Z ≤ t , ξ = j ] .

The relations F ( t ) = ∑ j = 1 m F ( j ) ( t ) and H ( 1 ) ( t ) = ∑ j = 1 m H ( 1 , j ) ( t ) hold for t ≥ 0 since the different risks are mutually exclusive. The relation H ( t ) = H ( 0 ) ( t ) + H ( 1 ) ( t ) is also valid for t ≥ 0 . The relations that connect the observable distribution functions H ( 0 ) , H ( 1 ) and H ( 1 , j ) to the unobservable distributions F, G and F ( j ) are given by:

$$H^{(0)}(t)=\int_0^t (1-F)\,dG,$$

$$H^{(1)}(t)=\int_0^t (1-G_-)\,dF,$$

and

$$H^{(1,j)}(t)=\int_0^t (1-G_-)\,dF^{(j)}.$$

The cumulative hazard function of T and the partial cumulative hazard function of T related to cause j for j ∈ { 1, ⋯ , m } are given for t ≥ 0 respectively by the following expressions:

$$\Lambda(t)=\int_0^t \frac{dF}{1-F_-}=\int_0^t \frac{dH^{(1)}}{1-H_-},\qquad (2)$$

$$\Lambda^{(1,j)}(t)=\int_0^t \frac{dF^{(j)}}{1-F_-}=\int_0^t \frac{dH^{(1,j)}}{1-H_-}.\qquad (3)$$

Let us now define estimators of the different quantities. Let (Z_i, ξ_i), i = 1, ⋯, n, be n independent copies of the random vector (Z, ξ). We define the empirical counterparts of H^{(0)}, H^{(1)}, H^{(1,j)} and H, for j ∈ {1, ⋯, m}, by:

$$H_n^{(0)}(t)=\frac{1}{n}\sum_{i=1}^n \mathbf{1}_{\{Z_i\le t,\,\xi_i=0\}},\qquad H_n^{(1)}(t)=\frac{1}{n}\sum_{i=1}^n \mathbf{1}_{\{Z_i\le t,\,\xi_i\ne 0\}},$$

$$H_n^{(1,j)}(t)=\frac{1}{n}\sum_{i=1}^n \mathbf{1}_{\{Z_i\le t,\,\xi_i=j\}},\qquad H_n(t)=\frac{1}{n}\sum_{i=1}^n \mathbf{1}_{\{Z_i\le t\}}.$$
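As a concrete illustration, these empirical counterparts can be computed on simulated data, and the additivity of the subdistribution functions can be checked exactly. This is a minimal sketch; the exponential distributions, their rates and the sample size are illustrative assumptions, not part of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 2

# Hypothetical simulated data: two exponential competing risks and
# independent exponential censoring (all rates are illustrative).
tau = rng.exponential(scale=[1.0, 2.0], size=(n, m))   # latent failure times
T = tau.min(axis=1)                                    # observed failure time
eta = tau.argmin(axis=1) + 1                           # cause of failure
C = rng.exponential(scale=1.5, size=n)                 # censoring times
Z = np.minimum(T, C)
xi = np.where(T <= C, eta, 0)                          # xi = eta * 1{T <= C}

def H_n(t):      return np.mean(Z <= t)
def H_n0(t):     return np.mean((Z <= t) & (xi == 0))
def H_n1(t):     return np.mean((Z <= t) & (xi != 0))
def H_n1j(t, j): return np.mean((Z <= t) & (xi == j))

# H_n = H_n^(0) + H_n^(1) and H_n^(1) = sum_j H_n^(1,j) hold at every t.
for t in (0.5, 1.0, 2.0):
    assert np.isclose(H_n(t), H_n0(t) + H_n1(t))
    assert np.isclose(H_n1(t), sum(H_n1j(t, j) for j in (1, 2)))
```

The assertions hold exactly because the events {ξ = 0}, {ξ = 1}, {ξ = 2} partition the sample.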

The relations H n ( t ) = H n ( 0 ) ( t ) + H n ( 1 ) ( t ) and H n ( 1 ) ( t ) = ∑ j = 1 m H n ( 1, j ) ( t ) are valid for t ≥ 0 . As T is independently randomly right-censored by C, a well-known estimator for F is the Kaplan-Meier estimator defined for t ≥ 0 by:

$$\hat F_n(t)=1-\prod_{i=1}^n\left(1-\frac{\mathbf{1}_{\{Z_i\le t,\,\xi_i\ne 0\}}}{n\,\big(1-H_{n-}(Z_i)\big)}\right),$$

where the left-continuous modification of any d.f. L is denoted by L − . The Nelson-Aalen estimators of Λ and of Λ ( 1, j ) for j = 1 , ⋯ , m respectively are defined for t ≥ 0 by:

$$\Lambda_n(t)=\int_0^t \frac{dH_n^{(1)}}{1-H_{n-}},\qquad (4)$$

$$\Lambda_n^{(1,j)}(t)=\int_0^t \frac{dH_n^{(1,j)}}{1-H_{n-}}.\qquad (5)$$

The Aalen-Johansen estimator for F ( j ) is defined for t ≥ 0 by:

$$\hat F_n^{(j)}(t)=\int_0^t \frac{1-\hat F_{n-}}{1-H_{n-}}\,dH_n^{(1,j)}.$$
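On simulated data, the Nelson-Aalen, Kaplan-Meier and Aalen-Johansen estimators can all be computed by sorting the observations; when there are no ties, the subdistribution estimators sum exactly to the Kaplan-Meier estimator, ∑_j F̂_n^{(j)} = F̂_n. A minimal sketch (the distributions, rates and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Hypothetical simulated competing risks data (rates are assumptions).
tau = rng.exponential(scale=[1.0, 0.7], size=(n, 2))
T, eta = tau.min(axis=1), tau.argmin(axis=1) + 1
C = rng.exponential(scale=2.0, size=n)
Z, xi = np.minimum(T, C), np.where(T <= C, eta, 0)

order = np.argsort(Z)                 # observations are distinct a.s.
xi_s = xi[order]
at_risk = n - np.arange(n)            # n*(1 - H_{n-}(Z_(k))) = n - k

# Nelson-Aalen increments dLambda_n = dH_n^(1) / (1 - H_{n-})
dLam = (xi_s != 0) / at_risk
# Kaplan-Meier survival 1 - F_hat as a product over uncensored points
S = np.cumprod(1.0 - dLam)
F_hat = 1.0 - S
S_minus = np.concatenate(([1.0], S[:-1]))   # left limit 1 - F_hat_-

# Aalen-Johansen: F_hat^(j)(t) = int (1 - F_hat_-)/(1 - H_{n-}) dH_n^(1,j)
def aalen_johansen(j):
    return np.cumsum((xi_s == j) * S_minus / at_risk)

# The subdistribution estimators sum exactly to the Kaplan-Meier estimator.
total = aalen_johansen(1) + aalen_johansen(2)
assert np.allclose(total, F_hat)
```

The exact agreement follows from dF̂_n = (1 − F̂_{n-}) dΛ_n together with ∑_j dH_n^{(1,j)} = dH_n^{(1)}.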

For all t ≥ 0 , the following equalities hold:

$$1-H_n(t)=\big(1-\hat F_n(t)\big)\big(1-\hat G_n(t)\big),$$

$$\Lambda_n(t)=\int_0^t \frac{d\hat F_n}{1-\hat F_{n-}},$$

where G ^ n , the Kaplan-Meier estimator of G, is defined for t ≥ 0 by:

$$\hat G_n(t)=1-\prod_{i=1}^n\left(1-\frac{\mathbf{1}_{\{Z_i\le t,\,\xi_i=0\}}}{n\,\big(1-H_{n-}(Z_i)\big)}\right).$$
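The identity 1 − H_n = (1 − F̂_n)(1 − Ĝ_n) can be verified numerically; with continuous distributions ties occur with probability zero and the identity is exact at every observed point, by telescoping of the two product-limit factors. A minimal sketch (rates are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 250

# Hypothetical simulated censored data (exponential rates are assumptions).
T = rng.exponential(1.0, n)
C = rng.exponential(1.5, n)
Z, delta = np.minimum(T, C), (T <= C)     # delta = 1{T <= C}

order = np.argsort(Z)                     # ties have probability zero here
delta_s = delta[order]
at_risk = n - np.arange(n)                # n*(1 - H_{n-}) at ordered points

one_minus_F = np.cumprod(1.0 - delta_s / at_risk)       # Kaplan-Meier for F
one_minus_G = np.cumprod(1.0 - (~delta_s) / at_risk)    # Kaplan-Meier for G
one_minus_H = 1.0 - (np.arange(n) + 1) / n              # empirical 1 - H_n

# With distinct observations, 1 - H_n = (1 - F_hat)(1 - G_hat) exactly.
assert np.allclose(one_minus_F * one_minus_G, one_minus_H)
```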

In this section, we continue the works of Njamen and Ngatchou [

In what follows, we study the asymptotic normality of the Nelson-Aalen nonparametric estimator in competing risks. To this end, for all j ∈ {1, ⋯, m} and t ≥ 0, consider the Nelson-Aalen type cumulative hazard function estimator (Nelson, [

$$\hat\Lambda_n(t)=\int_0^t \frac{J(u)}{Y(u)}\,dN(u),\qquad (6)$$

where $J(t)=\mathbf{1}_{\{Y(t)>0\}}$.

The cumulative risk in a region where there is at least one observation is given for all j ∈ { 1, ⋯ , m } , by (see Njamen, [

$$\Lambda^{*(j)}(t)=\int_0^t L^{*(j)}(s)\,\lambda^{*(j)}(s)\,ds,\qquad (7)$$

with $L_i^{*(j)}(t)=\mathbf{1}_{\{Z_i\ge t\}}$, which indicates whether individual i is still at risk just before time t (the individual has not yet undergone the event). Its estimator was defined in Njamen and Ngatchou ( [

The following theorem gives the limit law of the Nelson-Aalen estimator Λ̂_n^{*(j)} in the competing risks framework of Njamen (2017, p. 9). This is the first fundamental result of this article.

Theorem 1.

In a region where there is at least one observation, it is assumed that F i ∗ ( j ) ( t ) < 1 for i ∈ { 1, ⋯ , n } and j ∈ { 1, ⋯ , m } . Then, for all t ≥ 0 ,

$$\sqrt{n}\,\big(\hat\Lambda_n^{*(j)}(t)-\Lambda^{*(j)}(t)\big)\ \xrightarrow{\ \mathcal{L}\ }\ U_i^{*(j)}(t),\qquad (8)$$

where U_i^{*(j)} is a centered Gaussian martingale whose variance is such that:

$$\begin{cases} U_i^{*(j)}(0)=0,\\[4pt] \mathbb{V}\big(U_i^{*(j)}(t)\big)=\displaystyle\int_0^t \frac{\alpha_i^{*(j)}(u)}{y_i^{*(j)}(u)}\,du, \end{cases}\qquad (9)$$

where for all s ≥ 0 ,

$$y_i^{*(j)}(s)=\big[1-F_i^{*(j)}(s)\big]\big[1-G_i^{*(j)}(s-)\big],\qquad (10)$$

with G i ∗ ( j ) standing for the distribution function of C i ∗ ( j ) and α i ∗ ( j ) the instant risk function.
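Theorem 1 can be illustrated by Monte Carlo in a single-risk exponential special case: with T ~ Exp(λ) and C ~ Exp(μ), one has α(u) = λ and y(u) = e^{−(λ+μ)u}, so the limit variance in (9) is λ(e^{(λ+μ)t} − 1)/(λ + μ). The sketch below is illustrative only; the rates, sample size and number of replications are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, mu = 1.0, 0.5          # illustrative hazard rates (assumptions)
n, reps, t = 400, 500, 1.0

def nelson_aalen(Z, delta, t):
    """Nelson-Aalen estimator at time t for distinct observations."""
    order = np.argsort(Z)
    Z_s, d_s = Z[order], delta[order]
    at_risk = n - np.arange(n)
    keep = (Z_s <= t) & d_s
    return np.sum(1.0 / at_risk[keep])

vals = np.empty(reps)
for r in range(reps):
    T = rng.exponential(1.0 / lam, n)
    C = rng.exponential(1.0 / mu, n)
    Z, delta = np.minimum(T, C), (T <= C)
    vals[r] = np.sqrt(n) * (nelson_aalen(Z, delta, t) - lam * t)

# Theoretical limit variance: int_0^t alpha(u)/y(u) du with
# alpha = lam and y(u) = exp(-(lam + mu) u).
v_theory = lam * (np.exp((lam + mu) * t) - 1.0) / (lam + mu)

assert abs(vals.mean()) < 0.25                  # approximately centered
assert abs(vals.var() / v_theory - 1.0) < 0.3   # variance matches (9)
```

The tolerances account for Monte Carlo sampling error at these (assumed) sample sizes.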

To prove this theorem, we need Rebolledo's theorem, stated below, which makes it possible to apply the central limit theorem to certain classes of martingales.

Theorem 2. (Rebolledo’s Theorem)

Let M_n = ∑_{i=1}^n M_i be a sequence of martingales, where M_i = K_i − A_i, K_i denotes a counting process and A_i its compensator. Consider the processes I_n(t) = ∫_0^t f_n(s) dM_n(s) and, for all ε > 0, I_{n,ε}(t) = ∫_0^t f_n(s) 𝟙{|f_n(s)| > ε} dM_n(s). Suppose that f_n and f are predictable, locally bounded F_s-processes such that

$$\sup_s |f_n(s)-f(s)|\to 0\quad(n\to\infty).$$

Suppose also that the processes K_i, A_i and f_n are bounded. Set, for all t > 0, α(t) = ∫_0^t f²(s) ds. If

1) 〈 I n 〉 t → ℙ α ( t ) , ( n → ∞ ) ;

2) for all ε > 0 , 〈 I n , ε 〉 t → ℙ 0, ( n → ∞ ) .

Then,

( I n ( t ) , t > 0 ) ⇒ ( ∫ 0 t f ( s ) d W ( s ) , t > 0 ) , ( n → ∞ ) ,

where ⇒ denotes weak convergence in the space of right-continuous functions with left limits (càdlàg functions) endowed with the Skorokhod topology, and where W is a Brownian motion.

To prove Theorem 1, it suffices to check that the conditions of Rebolledo's Theorem above are satisfied:

Proof. For all j ∈ {1, ⋯, m} and t ≥ 0, M_i^{*(j)}(t) decomposes into

$$M_i^{*(j)}(t)=K_i^{*(j)}(t)-\int_0^t d\Lambda_i^{*(j)}(s),$$

which in turn can be written in terms of α_i^{*(j)} as

$$M_i^{*(j)}(t)=K_i^{*(j)}(t)-\int_0^t \alpha_i^{*(j)}(s)\,L_i^{*(j)}(s)\,ds,$$

which finally, can be rewritten as

$$dK_i^{*(j)}(t)=\alpha_i^{*(j)}(t)\,L_i^{*(j)}(t)\,dt+dM_i^{*(j)}(t),$$

where dM_i^{*(j)}(t) can be seen as a random noise process. The martingale M_i^{*(j)}(t) above represents the difference between the number of failures due to a specific cause j observed in the time interval [0, t], i.e. K_i^{*(j)}(t) (see Njamen, [ ]), and the expected number of failures due to the j-th cause. This decomposition is precisely the Doob-Meyer decomposition.

This martingale is used in Fleming and Harrington ( [

Now, to explain the asymptotic nature of the results, we define, for all t ≥ 0 and j ∈ {1, ⋯, m}:

$$N^{(n)}(t)=\sum_{i=1}^n K_i^{*(j)}(t),\qquad Y^{(n)}(t)=\sum_{i=1}^n L_i^{*(j)}(t),\qquad J^{(n)}(t)=\mathbf{1}_{\{Y^{(n)}(t)>0\}}.$$

In a subgroup A ( j ) , where there is at least one observation, the survival function of Z i = min ( T i , C i ) is defined for all t ≥ 0 by:

S Z ∗ ( j ) ( t ) = ( 1 − F i ∗ ( j ) ( t ) ) ( 1 − G i ∗ ( j ) ( t − ) ) .

Recall also that F_i^{*(j)} is the distribution function of the T_i's, G_i^{*(j)} is that of the C_i's, and 1 − [1 − F_i^{*(j)}][1 − G_i^{*(j)}] is that of the Z_i's. From the Glivenko-Cantelli theorem, one has:

$$\sup_{s\in[0,t]}\left|\frac{Y^{(n)}(s)}{n}-\big[1-F_i^{*(j)}(s)\big]\big[1-G_i^{*(j)}(s-)\big]\right|\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty).\qquad (11)$$

On the other hand, since

J ( n ) ( t ) = 1 1 { Y ( n ) ( t ) > 0 } ,

one has:

$$1-J^{(n)}(t)=\mathbf{1}_{\{Y^{(n)}(t)=0\}}=\mathbf{1}_{\big\{B\big(n,\,[1-F_i^{*(j)}(t)][1-G_i^{*(j)}(t-)]\big)=0\big\}}\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty),$$

from which one obtains (see Theorem 3, p. 11 of Njamen, [

J ( n ) ( t ) → ℙ 1 ( n → ∞ ) .

Differentiating the martingale M i ∗ ( j ) ( t ) = K i ∗ ( j ) − ∫ 0 t L i ∗ ( j ) ( s ) α i ∗ ( j ) ( s ) d s , one has:

d M i ∗ ( j ) ( t ) = d K i ∗ ( j ) ( t ) − L i ∗ ( j ) ( t ) α i ∗ ( j ) ( t ) d t ,

and from

d 〈 M i ∗ ( j ) 〉 t = V a r ( d M i ∗ ( j ) ( t ) / F t − ) ,

one obtains

d 〈 M i ∗ ( j ) 〉 t = V a r ( d K i ∗ ( j ) ( t ) − L i ∗ ( j ) ( t ) α i ∗ ( j ) ( t ) d t / F t − ) = V a r ( d K i ∗ ( j ) ( t ) / F t − ) = L i ∗ ( j ) ( t ) α i ∗ ( j ) ( t ) d t .

Consequently, the increasing process of

$$D_t=\int_0^t \frac{J^{(n)}(u)}{Y^{(n)}(u)}\,dM_i^{*(j)}(u),\qquad t\ge 0,$$

is given by

$$\langle D\rangle_t=\int_0^t \frac{(J^{(n)})^2(u)}{(Y^{(n)})^2(u)}\,d\langle M_i^{*(j)}\rangle_u,\qquad t\ge 0.$$

Next, for all t ≥ 0 and j ∈ {1, ⋯, m}, one has

$$\begin{aligned}
\left\langle \sqrt{n}\sum_{i=1}^n\int_0^t \frac{J^{(n)}(u)}{Y^{(n)}(u)}\,dM_i^{*(j)}(u)\right\rangle_t
&=\sum_{i=1}^n n\int_0^t \frac{(J^{(n)})^2(u)}{(Y^{(n)})^2(u)}\,L_i^{*(j)}(u)\,\alpha_i^{*(j)}(u)\,du\\
&=\int_0^t \frac{n\,(J^{(n)})^2(u)}{(Y^{(n)})^2(u)}\sum_{i=1}^n L_i^{*(j)}(u)\,\alpha_i^{*(j)}(u)\,du\\
&=\int_0^t \frac{n\,(J^{(n)})^2(u)}{(Y^{(n)})^2(u)}\,Y^{(n)}(u)\,\alpha_i^{*(j)}(u)\,du\\
&=\int_0^t \frac{n\,J^{(n)}(u)}{Y^{(n)}(u)}\,\alpha_i^{*(j)}(u)\,du.
\end{aligned}$$

Also, for all t ≥ 0 and for all j ∈ { 1, ⋯ , m } , the process

$$\sqrt{n}\,\big(\hat\Lambda_n^{*(j)}(t)-\Lambda^{*(j)}(t)\big)=\sqrt{n}\sum_{i=1}^n\int_0^t \frac{J^{(n)}(u)}{Y^{(n)}(u)}\,dM_i^{*(j)}(u)=R_n(t),$$

is a martingale. We apply the central limit theorem for martingales (Rebolledo's Theorem). To this end, we show that the conditions of this theorem are satisfied by R_n(t).

One has, for all j ∈ {1, ⋯, m},

$$\langle R_n\rangle_t=\int_0^t \frac{n\,J^{(n)}(u)}{Y^{(n)}(u)}\,\alpha_i^{*(j)}(u)\,du,$$

and also, by the proof of Theorem 3 of Njamen ( [

$$\frac{Y^{(n)}(u)}{n}\ \xrightarrow{\ \mathbb{P}\ }\ \big(1-F_i^{*(j)}(u)\big)\big(1-G_i^{*(j)}(u-)\big),\qquad J^{(n)}(u)\ \xrightarrow{\ \mathbb{P}\ }\ 1\quad(n\to\infty).$$

So that, for all j ∈ { 1, ⋯ , m } , when n → ∞ ,

$$\langle R_n\rangle_t=\int_0^t \frac{J^{(n)}(u)}{Y^{(n)}(u)/n}\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ \int_0^t \frac{\alpha_i^{*(j)}(u)}{\big(1-F_i^{*(j)}(u)\big)\big(1-G_i^{*(j)}(u-)\big)}\,du=\beta(t)\quad(n\to\infty),$$

which is deterministic. Thus, the first condition of Rebolledo's Theorem holds.

To check the second condition, for all ϵ > 0 and t ≥ 0 , define

$$R_{n,\varepsilon}(t)=\int_0^t \frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,dM^{(n)}(u),$$

where for all j = 1 , ⋯ , m , M ( n ) ( u ) = ∑ i = 1 n M i ∗ ( j ) ( u ) .

We have to show that, as n → ∞, ⟨R_{n,ε}⟩_t converges to 0 in probability.

One has, for all t ≥ 0 ,

$$\begin{aligned}
\langle R_{n,\varepsilon}\rangle_t
&=\int_0^t \frac{n\,J^{(n)}(u)}{(Y^{(n)}(u))^2}\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,d\langle M^{(n)}\rangle_u\\
&=\int_0^t \frac{n\,J^{(n)}(u)}{(Y^{(n)}(u))^2}\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,Y^{(n)}(u)\,\alpha_i^{*(j)}(u)\,du\\
&=\int_0^t \frac{n\,J^{(n)}(u)}{Y^{(n)}(u)}\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty),
\end{aligned}$$

because

$$\frac{n\,J^{(n)}(u)}{Y^{(n)}(u)}\ \xrightarrow{\ \mathbb{P}\ }\ \frac{1}{\big(1-F_i^{*(j)}(u)\big)\big(1-G_i^{*(j)}(u-)\big)}\quad(n\to\infty).$$

Then

$$\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}=\frac{1}{\sqrt{n}}\cdot\frac{n\,J^{(n)}(u)}{Y^{(n)}(u)}\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty).$$

Thus, the second condition of Rebolledo's Theorem holds.

The conditions of Rebolledo's Theorem are thus verified and, consequently, for all t ≥ 0,

$$\big(R_n(t),\,t>0\big)\ \Rightarrow\ \left(\int_0^t f(s)\,dW(s),\,t>0\right)\quad(n\to\infty),$$

with ∫_0^t f²(s) ds = β(t). Finally, for all t > 0, R_n(t) = √n(Λ̂_n^{*(j)}(t) − Λ^{*(j)}(t)) converges in law to the centered Gaussian martingale U_i^{*(j)}(t) of Theorem 1.

This ends the proof of Theorem 1.

The following subsection gives the asymptotic law of the nonparametric Kaplan-Meier estimator of the survival function in the competing risks framework of Njamen and Ngatchou ( [

The Kaplan-Meier estimator of the survival function (Kaplan and Meier, [

$$\hat S_n(t)=\prod_{s\le t}\big(1-\Delta\hat\Lambda_n(s)\big)=\prod_{s\le t}\left(1-\frac{J^{(n)}(s)\,\Delta N^{(n)}(s)}{Y^{(n)}(s)}\right),$$

where Λ̂_n(t) is the Nelson-Aalen estimator and where, for a right-continuous process X(t) with left limits,

ΔX(t) = X(t) − X(t−).

For all j = 1 , ⋯ , m , an estimator of the variance of S ^ n ( j ) ( t ) / S ∗ ( j ) ( t ) , where S ∗ ( j ) is the survival function associated with the subgroup A ( j ) is given by

$$\hat\sigma^{(j)2}(t)=\int_0^t \frac{J^{(n)}(s)}{(Y^{(n)})^2(s)}\,dN^{(n)}(s).$$

The variance of Ŝ_n^{(j)}(t)/S^{(j)}(t), approximated by that of Ŝ^{(j)}(t)/S^{*(j)}(t), is:

$$\mathbb{V}\left[\frac{\hat S_n^{(j)}(t)}{S^{*(j)}(t)}-1\right]=\mathbb{E}\left[\left\langle \frac{\hat S_n^{(j)}}{S^{*(j)}}-1\right\rangle(t)\right]=\int_0^t \left\{\frac{\hat S_n^{(j)}(s-)}{S^{*(j)}(s)}\right\}^2\frac{J^{(n)}(s)}{Y^{(n)}(s)}\,\alpha_i^{*(j)}(s)\,ds.\qquad (12)$$

The estimator of the corresponding variance of S ^ n ( j ) ( t ) is given by

$$\hat{\mathbb{V}}\big(\hat S_n^{(j)}(t)\big)=\big[\hat S_n^{(j)}(t)\big]^2\,\hat\sigma_i^{(j)2}(t).\qquad (13)$$
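The variance estimator (13) can be computed alongside the Kaplan-Meier estimator on simulated data. This sketch uses the Poisson-type estimator σ̂² = ∫ J/Y² dN defined above; the distributions and rates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200

# Hypothetical simulated censored data (rates are assumptions).
T = rng.exponential(1.0, n)
C = rng.exponential(1.2, n)
Z, delta = np.minimum(T, C), (T <= C)

order = np.argsort(Z)
d_s = delta[order]
Y = n - np.arange(n)                  # at-risk process at ordered points

S_hat = np.cumprod(1.0 - d_s / Y)     # Kaplan-Meier survival
sigma2 = np.cumsum(d_s / Y**2)        # sigma_hat^2 = int J/Y^2 dN
V_hat = S_hat**2 * sigma2             # variance estimate as in (13)

assert np.all(V_hat >= 0)
assert np.all(np.diff(sigma2) >= 0)   # sigma_hat^2 is nondecreasing
```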

The following result concerns the asymptotic law of the nonparametric Kaplan-Meier estimator and constitutes the second fundamental result of this paper:

Theorem 3.

In an area where there is at least one observation, if we assume that for all j ∈ { 1, ⋯ , m } and i ∈ { 1, ⋯ , n } ,

1) for all s ∈ [ 0, t ] ,

$$n\int_0^s \frac{J^{(n)}(u)}{Y^{(n)}(u)}\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ \sigma_i^{*(j)2}(s)\quad(n\to\infty),$$

2) for all ε > 0 ,

$$n\int_0^t \frac{J^{(n)}(u)}{Y^{(n)}(u)}\,\alpha_i^{*(j)}(u)\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,du\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty),$$

3) for all t > 0 ,

$$\sqrt{n}\int_0^t \big(1-J^{(n)}(u)\big)\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty).$$

Then, for all t > 0 and j ∈ {1, ⋯, m}, the nonparametric estimator Ŝ_n^{*(j)} satisfies

$$\sqrt{n}\,\big(\hat S_n^{*(j)}(t)-S^{*(j)}(t)\big)\ \Rightarrow\ -\,U_i^{*(j)}(t)\times S^{*(j)}(t)\quad(n\to\infty),$$

where U_i^{*(j)} is the centered Gaussian martingale of Theorem 1 and where ⇒ denotes weak convergence in the space of right-continuous functions with left limits (càdlàg functions) endowed with the Skorokhod topology.

Proof. To prove this theorem, it suffices to verify the conditions of Rebolledo's Theorem.

In an area where there is at least one observation, define, for all j = 1, ⋯, m and i = 1, ⋯, n,

$$\tilde S_n^{*(j)}(t)=\exp\big(-\tilde\Lambda_n^{*(j)}(t)\big),\qquad\text{where}\quad \tilde\Lambda_n^{*(j)}(t)=\int_0^t J^{(n)}(u)\,\alpha_i^{*(j)}(u)\,du.$$

For t ∈ [ 0, τ [ and τ > 0 , we have for all j = 1 , ⋯ , m and i = 1 , ⋯ , n ,

$$n\left\langle \frac{\hat S_n^{(j)}}{\tilde S_n^{*(j)}}-1\right\rangle_t=n\int_0^t \frac{\hat S_n^{(j)}(u-)^2}{\tilde S_n^{*(j)}(u)^2}\,\frac{J^{(n)}(u)}{Y^{(n)}(u)}\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ \sigma_i^{*(j)2}\quad(n\to\infty).$$

By the proof of Theorem 3 of Njamen ( [

$$\frac{\hat S_n^{*(j)}(u-)}{\tilde S_n^{*(j)}(u)}\ \xrightarrow{\ \mathbb{P}\ }\ 1\quad(n\to\infty).$$

Hence the first condition of Rebolledo's Theorem holds.

For the second condition of Rebolledo's Theorem, proceeding as in the proof of Theorem 1 above, we find that for all ε > 0,

$$n\int_0^t \frac{\hat S_n^{(j)}(u-)^2}{\tilde S_n^{*(j)}(u)^2}\,\frac{J^{(n)}(u)}{Y^{(n)}(u)}\,\mathbf{1}_{\left\{\frac{\sqrt{n}\,J^{(n)}(u)}{Y^{(n)}(u)}>\varepsilon\right\}}\,\alpha_i^{*(j)}(u)\,du\ \to\ 0\quad(n\to\infty).$$

So, for each t > 0 ,

$$\sqrt{n}\int_0^t \frac{\hat S_n^{(j)}(u-)}{\tilde S_n^{*(j)}(u)}\,\frac{J^{(n)}(u)}{Y^{(n)}(u)}\,dM^{(n)}(u)\ \Rightarrow\ U_i^{*(j)}(t),$$

where M^{(n)}(u) = ∑_{i=1}^n M_i^{*(j)}(u).

Finally,

$$\sqrt{n}\left(\frac{\hat S_n^{(j)}(t)}{\tilde S_n^{*(j)}(t)}-1\right)\ \Rightarrow\ -\,U_i^{*(j)}(t).$$

The fact that S^{*(j)}(u) ≤ S̃_n^{*(j)}(u) for all u ∈ [0, s[, together with condition 3), implies:

$$\sqrt{n}\left|\frac{S^{*(j)}(s)}{\tilde S_n^{*(j)}(s)}-1\right|\le \sqrt{n}\int_0^t \frac{S^{*(j)}(u)}{\tilde S_n^{*(j)}(u)}\,d\big(\Lambda^{*(j)}-\tilde\Lambda^{*(j)}\big)(u)\le \sqrt{n}\int_0^t \big(1-J^{(n)}(u)\big)\,\alpha_i^{*(j)}(u)\,du\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty).$$

As S ˜ n ∗ ( j ) ( t ) → S ∗ ( j ) ( t ) when n → ∞ , we deduce that:

$$\sqrt{n}\,\big(\tilde S_n^{*(j)}(t)-S^{*(j)}(t)\big)\ \xrightarrow{\ \mathbb{P}\ }\ 0\quad(n\to\infty).$$

It follows that:

$$\begin{aligned}
\sqrt{n}\,\big(\hat S_n^{*(j)}(t)-S^{*(j)}(t)\big)
&=\sqrt{n}\,\big(\hat S_n^{*(j)}(t)-\tilde S_n^{*(j)}(t)\big)+\sqrt{n}\,\big(\tilde S_n^{*(j)}(t)-S^{*(j)}(t)\big)\\
&=\sqrt{n}\left(\frac{\hat S_n^{*(j)}(t)-\tilde S_n^{*(j)}(t)}{\tilde S_n^{*(j)}(t)}\right)\tilde S_n^{*(j)}(t)+\sqrt{n}\,\big(\tilde S_n^{*(j)}(t)-S^{*(j)}(t)\big)\\
&\Rightarrow\ -\,U_i^{*(j)}(t)\,S^{*(j)}(t)\quad(n\to\infty).
\end{aligned}$$

This ends the proof of the theorem.

For α ∈ ( 0,1 ) , we wish to find two random functions b L and b U such that ∀ t > 0 ,

ℙ [ b U ( t ) ≥ S ( t ) ≥ b L ( t ) ] = 1 − α .

Recall from the previous sections that, for all j ∈ {1, ⋯, m}, √n(Ŝ_n^{*(j)}(t) − S^{*(j)}(t))/S^{*(j)}(t) converges in distribution to a centered Gaussian martingale (see Theorem 3 above). As a consequence, Ŝ_n^{*(j)}(t) is asymptotically Gaussian, centered at S^{*(j)}(t). Given the above results, the squared estimated standard deviation of Ŝ_n^{*(j)}(t)/S^{*(j)}(t), denoted σ̂²_{S_t*}, is given for all t ≥ 0 by:

$$\hat\sigma_{S_t^*}^{*2}(t)=\frac{\hat{\mathbb{V}}\big(\hat S_n^{*(j)}(t)\big)}{\big[\hat S_n^{*(j)}(t)\big]^2}.\qquad (14)$$

Therefore a confidence interval of level 100(1 − α)% can be built for all t ≥ 0 and j ∈ {1, ⋯, m} by:

$$\left[\hat S_n^{*(j)}(t)-z_{1-\alpha/2}\,\hat\sigma_{S_t^*}^{*(j)}(t)\,\hat S_n^{*(j)}(t),\ \ \hat S_n^{*(j)}(t)+z_{1-\alpha/2}\,\hat\sigma_{S_t^*}^{*(j)}(t)\,\hat S_n^{*(j)}(t)\right].\qquad (15)$$

Here z_{1−α/2} is the (1 − α/2)-quantile of the standard normal distribution.
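Given Ŝ_n^{*(j)}(t) and σ̂ from (14), interval (15) is immediate to compute. A minimal numerical sketch (the values of Ŝ, σ̂ and α below are hypothetical); note that for Ŝ close to 1 the upper bound can exceed 1, which motivates the transformations discussed below:

```python
# Hypothetical values at a fixed time t (illustrative assumptions).
S_hat = 0.95           # Kaplan-Meier estimate at t
sigma = 0.08           # relative standard deviation from (14)
z = 1.959963984540054  # z_{1 - alpha/2} for alpha = 0.05

lower = S_hat - z * sigma * S_hat   # interval (15)
upper = S_hat + z * sigma * S_hat

assert lower < S_hat < upper
assert upper > 1.0     # the plain interval can leave [0, 1]
```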

A confidence interval of level 100(1 − α)% can also be written, for all j ∈ {1, ⋯, m}, as:

$$\hat S_n^{*(j)}(t)\pm z_{\alpha/2}\,\hat\sigma_{S_t^*}^{*(j)}(t),\qquad (16)$$

where z_{α/2} is the upper 100 × α/2 percentage point of the standard normal distribution.

A disadvantage of the construction of the confidence interval (CI) with the previous formula is that its bounds may fall outside the interval [0, 1]. A solution is to transform S^{*(j)}(t) (j ∈ {1, ⋯, m}) via a continuous, differentiable and invertible function g such that g(S^{*(j)}(t)) takes values in a wider, ideally unbounded, range and is better approximated by a Gaussian random variable. The delta method then allows for the estimation of the standard deviation of the transformed quantity, denoted σ̂_{g(S_t*)}^{*(j)} and defined by σ̂_{g(S_t*)}^{*(j)}(t) = g′(Ŝ_n^{*(j)}) σ̂_{S_t*}^{*(j)}(t). The confidence interval associated with the risk level α is then built, for all j ∈ {1, ⋯, m}, as

g − 1 ( g ( S ^ n ∗ ( j ) ) ± z α / 2 g ′ ( S ^ n ∗ ( j ) ) σ ^ S t * ∗ ( j ) ( t ) ) .

The most common transformation is g(S_t*) = log[−log(S_t*)], and in this case we have, for all j ∈ {1, ⋯, m}:

$$\hat\sigma_{\log[-\log(S_t^*)]}^{*(j)}=\frac{\hat\sigma_{S_t^*}^{*(j)}}{\hat S_n^{*(j)}\log \hat S_n^{*(j)}}\qquad\text{and}\qquad \big(\hat S_n^{*(j)}\big)^{\exp\left(\pm z_{\alpha/2}\,\hat\sigma_{S_t^*}^{*(j)}\big/\big(\hat S_n^{*(j)}\log \hat S_n^{*(j)}\big)\right)}.$$
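A numerical sketch of the log(−log) interval above (the values of Ŝ, the estimated standard deviation and α are hypothetical): the transformed bounds Ŝ^{exp(± z σ̂/(Ŝ log Ŝ))} always remain in (0, 1), unlike the untransformed interval.

```python
import math

# Hypothetical values at a fixed time t (illustrative assumptions).
S_hat = 0.95
sd_S = 0.05            # estimated standard deviation of S_hat at t
z = 1.959963984540054  # standard normal quantile for alpha = 0.05

# Plain interval: can leave [0, 1].
plain = (S_hat - z * sd_S, S_hat + z * sd_S)

# log(-log) interval: S_hat ** exp(+/- z * sd_S / (S_hat * log(S_hat)))
theta = z * sd_S / (S_hat * math.log(S_hat))
loglog = tuple(sorted((S_hat ** math.exp(theta), S_hat ** math.exp(-theta))))

assert plain[1] > 1.0                       # plain upper bound exceeds 1
assert 0.0 < loglog[0] < loglog[1] < 1.0    # transformed bounds stay in (0, 1)
```

Since the back-transform is a power of Ŝ ∈ (0, 1), the bounds are automatically in (0, 1); this is the design reason for the log(−log) choice.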

Remark 1. It is also possible to use log, square-root or logit-type transformations in most software defined respectively by for all j ∈ { 1, ⋯ , m } ,

$$g\big(S_t^{*(j)}\big)=\log\big[S_t^{*(j)}\big],\qquad g\big(S_t^{*(j)}\big)=\sin^{-1}\left[\sqrt{S_t^{*(j)}}\right],\qquad g\big(S_t^{*(j)}\big)=\log\left[\frac{S_t^{*(j)}}{1-S_t^{*(j)}}\right].$$

The challenge now is to find a band containing the survival function with probability 1 − α, i.e. a set of bounds b_L(t) and b_U(t) which, with probability 1 − α, contains S^{*(j)}(t) for all t ∈ [t_L, t_U] and j ∈ {1, ⋯, m}. Among the proposed solutions, the two most commonly used are, firstly, that of Hall and Wellner ( [

Remark 2. The starting point uses the fact that, for all j ∈ {1, ⋯, m}, √n(Ŝ_n^{*(j)}(t)/S^{*(j)}(t) − 1) converges to a centered Gaussian martingale. One then applies a transformation which makes a Brownian bridge {W^0(x), x ∈ [0,1]} appear, weighted by 1/√(x(1−x)) in Nair's approach, in order to retrieve the suitable critical value.

In particular, because of their simultaneous character, for a given t the width of these bands is larger than that of the corresponding pointwise CI. In what follows we give the expressions obtained in the absence of transformation.

Under the assumption of continuity of the survival functions S^{*(j)}(t) and C^{*(j)}(t), respectively related to the event time and to the censoring time, Hall and Wellner show that for every t ∈ [t_L, t_U], the joint confidence band at risk level α is given, for all j = 1, ⋯, m and t ≥ 0, by:

$$\hat S_n^{*(j)}(t)\pm h_\alpha(x_L,x_U)\,n^{-1/2}\left[1+n\,\hat\sigma_{S_t^*}^{*(j)2}(t)\right]\hat S_n^{*(j)}(t),\qquad (17)$$

where x L and x U are given by

$$x_i=\frac{n\,\hat\sigma_{S_{t_i}^*}^{*(j)2}(t_i)}{1+n\,\hat\sigma_{S_{t_i}^*}^{*(j)2}(t_i)},\qquad i=L,U,$$

and h_α(x_L, x_U) is the bound satisfying

$$\alpha=\mathbb{P}\left[\sup_{x_L\le x\le x_U}\big|W^0(x)\big|>h_\alpha(x_L,x_U)\right].$$

Using a weighted Brownian bridge notably modifies the bounds of the confidence band. For α ∈ (0,1), t ∈ [t_L, t_U] and all j ∈ {1, ⋯, m}, they are then given by:

$$\hat S_n^{*(j)}(t)\pm e_\alpha(x_L,x_U)\,\hat\sigma_{S_t^*}^{*(j)}(t),\qquad (18)$$

where e α ( x L , x U ) satisfies

$$\alpha=\mathbb{P}\left[\sup_{x_L\le x\le x_U}\frac{\big|W^0(x)\big|}{\sqrt{x(1-x)}}>e_\alpha(x_L,x_U)\right].$$

If we compare (17) and (18), we see that the bounds relating to Nair ( [

In this paper, we have studied the asymptotic normality of Nelson-Aalen and Kaplan-Meier type estimators in the presence of independent right-censoring, as defined in Njamen and Ngatchou ( [

As a perspective, obtaining real data would allow us to perform numerical simulations to gauge the robustness of the estimators obtained.

We thank the publisher and the referees for their comments, which allowed us to considerably improve the quality of this article.

The author declares no conflicts of interest regarding the publication of this paper.

Njomen, D.A.N. (2019) Asymptotic Normality of the Nelson-Aalen and the Kaplan-Meier Estimators in Competing Risks. Applied Mathematics, 10, 545-560. https://doi.org/10.4236/am.2019.107038