Statistically Dual Distributions and Estimation

The reconstruction of a parameter by the measurement of a random variable depending on the parameter is one of the main tasks in statistics. In statistical inference, the concept of a confidence distribution and, correspondingly, confidence density has often been loosely referred to as a distribution function on the parameter space that can represent confidence intervals of all levels for a parameter of interest. In this short note, the notion of statistically dual distributions is discussed. Based on properties of statistically dual distributions, a method for reconstructing the confidence density of a parameter is proposed.


Introduction
Let n denote the observed number of events in a simple Poisson process. Its distribution can be described by a gamma random variable, where $\mu$ is a variable and n is a parameter (in the case of the Poisson distribution, $\mu$ is a parameter and n is a variable). It means, as shown below, that we can estimate the value and the error of the Poisson distribution parameter by the mean of a Poisson random variable and by using the corresponding gamma distribution. This approach is also valid in the case of the normal, Cauchy, Laplace and inverse gamma distributions.
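As a minimal numerical sketch of this claim (the function name and the use of SciPy are our own choices, not part of the original text): the observed count n is dual to a gamma distribution with shape n + 1 and unit scale in the parameter $\mu$, so a point estimate and error for $\mu$ can be read off as the gamma mean and standard deviation.

```python
# Sketch: estimating the Poisson parameter from its statistically dual
# gamma distribution (shape n + 1, scale 1), as described in the text.
from scipy import stats

def poisson_parameter_estimate(n: int) -> tuple[float, float]:
    """Return (estimate, error) of the Poisson mean given n observed events."""
    g = stats.gamma(n + 1, scale=1.0)  # dual gamma distribution for mu
    return g.mean(), g.std()           # mean = n + 1, std = sqrt(n + 1)

est, err = poisson_parameter_estimate(10)
print(est, err)  # 11.0 and sqrt(11) ≈ 3.3166
```

The gamma mean n + 1 (rather than n) reflects that the shape parameter of the dual distribution is n + 1.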
Let us name such distributions, which allow one to exchange the parameter and the variable while conserving the same formula for the distribution of probabilities, statistically dual distributions [1]. In many cases statistical duality of this type can be used for the construction of confidence intervals for parameters.
In the next section, we show that the Poisson and gamma distributions are statistically dual and that the normal and Cauchy distributions are statistically self-dual. An application of statistical duality to the estimation of parameters is discussed in Section 3.

Statistically Dual Distributions
Definition: If a function $f(x, y)$ can be expressed both as a family of pdfs $p(x; y)$ for the variable $x$ given the parameter $y$ and as a family of pdfs $p(y; x)$ for the variable $y$ given the parameter $x$, then $p(x; y)$ and $p(y; x)$ are said to be statistically dual. This definition is purely probabilistic (and, in this sense, frequentist). Nevertheless, the statistically dual distributions considered here also belong to conjugate families as defined in the Bayesian framework (see footnote 1 and, for example, [2]).
The statistical duality of the Poisson and gamma distributions follows from a simple example. Let us consider the gamma distribution $\Gamma_{\alpha,\beta}$ with pdf

$$f(x; \alpha, \beta) = \frac{x^{\beta-1} e^{-x/\alpha}}{\alpha^{\beta}\,\Gamma(\beta)}, \qquad x > 0,$$

where $\alpha > 0$ is a scale parameter and $\beta > 0$ is a shape parameter. Replacing $\alpha$, $\beta$ and $x$ by $a$, $n+1$ and $\mu$, respectively, we obtain the pdf of $\Gamma_{a,\,n+1}$, where $a$ is a scale parameter and $n + 1 > 0$ is a shape parameter. If $a = 1$, then the pdf of $\Gamma_{1,\,n+1}$ looks like a Poisson probability:

$$f(\mu; n+1) = \frac{\mu^{n} e^{-\mu}}{n!}, \qquad \mu > 0. \tag{1}$$

The Poisson distribution is a popular model for counts. For instance, if there are $n$ events of a certain kind, then it is reasonable to write

$$f(n; \mu) = \frac{\mu^{n} e^{-\mu}}{n!}, \qquad n = 0, 1, 2, \ldots \tag{2}$$

One can see that the parameter and the variable in Equations (1) and (2) are exchanged; in all other respects the formulae are identical. As a result, these distributions (gamma and Poisson) are statistically dual. They are connected by the identity [3] (this identity arises in other forms in [4]-[6])

$$\sum_{i=n+1}^{m} \frac{\mu_1^{\,i} e^{-\mu_1}}{i!} + \int_{\mu_1}^{\mu_2} \frac{\mu^{n} e^{-\mu}}{n!}\, d\mu = \sum_{i=n+1}^{m} \frac{\mu_2^{\,i} e^{-\mu_2}}{i!} + \int_{\mu_1}^{\mu_2} \frac{\mu^{m} e^{-\mu}}{m!}\, d\mu \tag{3}$$

for any $\mu_1 \ge 0$, $\mu_2 \ge 0$ and integer $m > n \ge 0$.

Another example of a statistically dual distribution is the normal distribution with mean $a$ and variance $\sigma^2$:

$$\varphi(x; a, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-a)^2}{2\sigma^2}},$$

where $x$ is a real variable and $-\infty < a < \infty$ and $\sigma > 0$ are parameters. Here, we can exchange the parameter $a$ and the variable $x$ without changing the formula for the pdf. It allows one to estimate the parameter $a$ by the mean value of $x$. In this case, the new pdf has the variable $a$ and the parameters $-\infty < x < \infty$ and $\sigma > 0$. Hence, the normal distribution can be named a statistically self-dual distribution. The identity analogous to (3) is

$$\int_{b}^{\infty} \varphi(x; c, \sigma)\, dx + \int_{c}^{d} \varphi(a; b, \sigma)\, da + \int_{-\infty}^{b} \varphi(x; d, \sigma)\, dx = 1 \tag{4}$$

for any real $b$, $c$ and $d$.
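The identity connecting the gamma and Poisson distributions can be verified numerically; the following sketch (names and the use of SciPy are our own) evaluates both sides of the identity (3) as reconstructed above, using the fact that the integral of $\mu^{n}e^{-\mu}/n!$ is a difference of gamma CDFs.

```python
# Numerical check of identity (3): Poisson partial sums plus gamma integrals.
import numpy as np
from scipy import stats

def poisson_sum(mu: float, lo: int, hi: int) -> float:
    """sum_{i=lo}^{hi} mu^i e^{-mu} / i!"""
    i = np.arange(lo, hi + 1)
    return stats.poisson.pmf(i, mu).sum()

def gamma_integral(n: int, mu1: float, mu2: float) -> float:
    """integral_{mu1}^{mu2} mu^n e^{-mu} / n! d(mu), via the Gamma(n+1, 1) CDF."""
    return stats.gamma.cdf(mu2, n + 1) - stats.gamma.cdf(mu1, n + 1)

mu1, mu2, n, m = 1.5, 4.0, 2, 6
lhs = poisson_sum(mu1, n + 1, m) + gamma_integral(n, mu1, mu2)
rhs = poisson_sum(mu2, n + 1, m) + gamma_integral(m, mu1, mu2)
print(abs(lhs - rhs))  # agrees to machine precision
```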
The Cauchy distribution also possesses statistical self-duality, like the normal distribution. The pdf of the Cauchy distribution is

$$f(x; a) = \frac{1}{\pi}\,\frac{1}{1 + (x - a)^2},$$

where $x$ is a real variable and $a$ is a real parameter. Here, we can exchange the parameter $a$ and the variable $x$ without altering the pdf: in this case,

$$f(a; x) = \frac{1}{\pi}\,\frac{1}{1 + (a - x)^2}.$$

Hence, the Cauchy distribution is also a statistically self-dual distribution. The identity analogous to (4) is

$$\int_{b}^{\infty} f(x; c)\, dx + \int_{c}^{d} f(a; b)\, da + \int_{-\infty}^{b} f(x; d)\, dx = 1$$

for any real $b$, $c$ and $d$. The same property applies to several other distributions, for example, the Laplace distribution.
Estimation of Parameters

Letting $m \to \infty$ in the identity (3) yields a second identity connecting the gamma and Poisson distributions:

$$\sum_{i=n+1}^{\infty} \frac{\mu_1^{\,i} e^{-\mu_1}}{i!} + \int_{\mu_1}^{\mu_2} \frac{\mu^{n} e^{-\mu}}{n!}\, d\mu + \sum_{i=0}^{n} \frac{\mu_2^{\,i} e^{-\mu_2}}{i!} = 1 \tag{5}$$

for any $\mu_1 \ge 0$ and $\mu_2 \ge 0$. The definition of the confidence interval $(\hat\mu_1, \hat\mu_2)$ of confidence level $1 - \varepsilon_1 - \varepsilon_2$ for a Poisson parameter $\mu$ is [7]

$$\int_{\hat\mu_1}^{\hat\mu_2} \frac{\mu^{n} e^{-\mu}}{n!}\, d\mu = 1 - \varepsilon_1 - \varepsilon_2, \tag{6}$$

where

$$\varepsilon_1 = \sum_{i=n+1}^{\infty} \frac{\hat\mu_1^{\,i} e^{-\hat\mu_1}}{i!} \qquad \text{and} \qquad \varepsilon_2 = \sum_{i=0}^{n} \frac{\hat\mu_2^{\,i} e^{-\hat\mu_2}}{i!}.$$

This definition is consistent with the identity (5). It contrasts with other frequentist definitions of confidence intervals: the right hand side of (6) represents the frequentist definition.
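An equal-tailed interval of this kind can be computed directly from gamma quantiles, since the tails in the definition above are gamma CDF values. A sketch under these conventions (the function name and the equal-tails choice $\varepsilon_1 = \varepsilon_2$ are our own):

```python
# Central confidence interval for the Poisson mean from the dual gamma
# distribution g_n(mu) = mu^n e^{-mu} / n!, i.e. Gamma(n + 1, 1).
from scipy import stats

def poisson_central_interval(n: int, cl: float = 0.90) -> tuple[float, float]:
    """Equal-tailed interval (mu1, mu2) of confidence level cl for n observed events."""
    eps = (1.0 - cl) / 2.0
    g = stats.gamma(n + 1)        # unit scale
    mu1 = g.ppf(eps)              # lower bound: upper Poisson tail equals eps
    mu2 = g.ppf(1.0 - eps)        # upper bound: lower Poisson tail equals eps
    return mu1, mu2

mu1, mu2 = poisson_central_interval(5, cl=0.90)
g = stats.gamma(6)
# the dual gamma density integrates to the confidence level over the interval
print(g.cdf(mu2) - g.cdf(mu1))  # 0.90
```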
Let us suppose that $g_n(\mu)$ is the pdf of the Poisson parameter $\mu$ if the number of observed events is equal to $n$; it is a conditional pdf. It follows from the formulae (1) and (5) that $g_n(\mu)$ is a gamma pdf:

$$g_n(\mu) = \frac{\mu^{n} e^{-\mu}}{n!}.$$
On the other hand, if $g_n(\mu)$ were not equal to this pdf and the pdf of the Poisson parameter were some other function $\tilde g_n(\mu)$, then the corresponding identity would read

$$\sum_{i=n+1}^{\infty} \frac{\mu_1^{\,i} e^{-\mu_1}}{i!} + \int_{\mu_1}^{\mu_2} \tilde g_n(\mu)\, d\mu + \sum_{i=0}^{n} \frac{\mu_2^{\,i} e^{-\mu_2}}{i!} = 1. \tag{7}$$

This identity is correct for any $\mu_1 \ge 0$ and $\mu_2 \ge 0$. The sums on the left hand side determine the boundary conditions on the confidence interval.
If we subtract Equation (7) from Equation (5), then we have

$$\int_{\mu_1}^{\mu_2} \left( \frac{\mu^{n} e^{-\mu}}{n!} - \tilde g_n(\mu) \right) d\mu = 0$$

for any $\mu_1 \ge 0$ and $\mu_2 \ge 0$, so that $\tilde g_n(\mu)$ coincides with the gamma pdf $\mu^{n} e^{-\mu} / n!$ (except, possibly, for a finite set of points). As a result, we can mix Bayesian probabilities without logical inconsistencies. The identity (5) leaves no place for any prior except for the uniform prior $\pi(\mu) = \mathrm{const}$ (see footnote 2). Actually, it shows that the pdf $g_n(\mu)$ is the "confidence density" [9] (see footnote 3).

Statistically dual distributions allow one to exchange the parameter and the random variable. It means that one can also construct the confidence interval $(\hat n_1, \hat n_2)$ for the parameter $n$ of the gamma distribution (Equation (1)). For the normal distribution, the identity (4) can be written as

$$\int_{x}^{\infty} \varphi(t; x - c, \sigma)\, dt + \int_{x-c}^{x+d} \varphi(a; x, \sigma)\, da + \int_{-\infty}^{x} \varphi(t; x + d, \sigma)\, dt = 1 \tag{8}$$

for any $c \ge 0$ and $d \ge 0$.
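The three-term normal identity can be checked numerically; the sketch below (names ours) evaluates the two observation tails and the middle "confidence density" term as reconstructed here and confirms that they sum to one.

```python
# Numerical check of the normal identity: two tails of the observable x
# (with parameters a = x - c and a = x + d) plus the integral of the
# confidence density of a over (x - c, x + d).
from scipy import stats

def normal_identity(x: float, c: float, d: float, sigma: float = 1.0) -> float:
    upper = stats.norm.sf(x, loc=x - c, scale=sigma)         # tail with a = x - c
    middle = (stats.norm.cdf(x + d, loc=x, scale=sigma)
              - stats.norm.cdf(x - c, loc=x, scale=sigma))   # confidence density of a
    lower = stats.norm.cdf(x, loc=x + d, scale=sigma)        # tail with a = x + d
    return upper + middle + lower

print(normal_identity(x=2.3, c=1.0, d=0.7, sigma=1.5))  # 1.0
```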
The identity (8) also shows that the conditional distribution (if the observed value is $x$) of the true parameter $a$ obeys the normal distribution with mean $x$ and variance $\sigma^2$ (here, in contrast to the previous example, $x$ is an unbiased estimator of the parameter $a$). As a result, we can construct the distribution of errors and confidence intervals for $a$ (see footnote 4), taking into account systematic and statistical uncertainties in accordance with the standard analysis of errors [14].

Footnotes:
1. The definition of conjugate families as stated in [8] is: given a family $F$ of pdfs $f(x \mid \theta)$ indexed by a parameter $\theta$, a family $\Pi$ of prior distributions is said to be conjugate for the family $F$ if the posterior distribution of $\theta$ is in the family $\Pi$ for all $f \in F$, all priors $\pi \in \Pi$ and all possible data $x$.
2. Bayesian methods suppose that $\tilde g_n(\mu) \propto \dfrac{\mu^{n} e^{-\mu}}{n!}\,\pi(\mu)$, where $\pi(\mu)$ is the prior pdf for $\mu$.
3. The confidence density is a natural notion in the frequentist concept of confidence distributions [10]-[12]. Confidence distributions are a way to represent all possible confidence intervals: the area under the confidence distribution between any two points gives the confidence that the parameter value lies between those points. In Bayesian notation, the confidence density can be considered as a posterior pdf under the assumption of a uniform prior.
4. In this case, the definition of confidence intervals is analogous to Equation (6).
In the case of the Cauchy distribution,

$$\int_{x}^{\infty} f(t; x - c)\, dt + \int_{x-c}^{x+d} f(a; x)\, da + \int_{-\infty}^{x} f(t; x + d)\, dt = 1$$

for any $c \ge 0$ and $d \ge 0$. This implies that the "confidence density" of the Cauchy distribution parameter is itself a Cauchy pdf.
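The analogous check for the Cauchy case (unit scale, names ours) follows the same pattern as the normal one:

```python
# Numerical check of the Cauchy analogue: two observation tails plus the
# integral of the Cauchy confidence density of a over (x - c, x + d).
from scipy import stats

def cauchy_identity(x: float, c: float, d: float) -> float:
    upper = stats.cauchy.sf(x, loc=x - c)                    # tail with a = x - c
    middle = (stats.cauchy.cdf(x + d, loc=x)
              - stats.cauchy.cdf(x - c, loc=x))              # confidence density of a
    lower = stats.cauchy.cdf(x, loc=x + d)                   # tail with a = x + d
    return upper + middle + lower

print(cauchy_identity(x=0.4, c=2.0, d=1.3))  # 1.0
```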

Conclusions
We have discussed the notion of statistically dual distributions. The relation between the measurement of a random variable and the estimation of the corresponding distribution parameter has been discussed for three pairs of statistically dual distributions.
The proposed approach allows one to construct the distribution of the estimator of a distribution parameter by using statistically dual distributions. For example, the confidence density of the Poisson distribution parameter can be built by Monte Carlo methods using the properties of statistically dual distributions [15]. This notion is also used for the construction of a unified approach to measurement error and missing data [16]. A numerical example of the use of statistical duality for the construction of confidence intervals is considered in the paper [17].
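One simple Monte Carlo construction consistent with this picture (a hypothetical sketch, not necessarily the procedure of [15]): scan $\mu$ uniformly, generate a Poisson count for each trial value, and keep the trial values that reproduce the observed count; the accepted values then follow the dual gamma confidence density.

```python
# Monte Carlo sketch of the confidence density of the Poisson parameter:
# accepted mu values should follow Gamma(n_obs + 1, 1), whose mean is n_obs + 1.
import numpy as np

rng = np.random.default_rng(1)
n_obs = 4                                    # observed number of events

mu_trials = rng.uniform(0.0, 30.0, size=2_000_000)  # flat scan over mu
counts = rng.poisson(mu_trials)                      # one Poisson count per trial
accepted = mu_trials[counts == n_obs]                # keep trials reproducing n_obs

print(accepted.mean())  # close to n_obs + 1 = 5
```

The upper scan limit (30 here) only needs to be large enough that the gamma density is negligible beyond it.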
In summary, statistical duality gives a frequentist interpretation of the "confidence density" of a parameter and allows one to construct confidence intervals easily.
