Evaluation of some non-elementary integrals involving some exponential, hyperbolic and trigonometric elementary functions and derivation of new probability measures generalizing the gamma-type and normal distributions

The non-elementary integrals involving elementary exponential, hyperbolic and trigonometric functions, $ \int x^\alpha e^{\eta x^\beta}dx, \int x^\alpha \cosh\left(\eta x^\beta\right)dx, \int x^\alpha \sinh\left(\eta x^\beta\right)dx, \int x^\alpha \cos\left(\eta x^\beta\right)dx$ and $\int x^\alpha \sin\left(\eta x^\beta\right)dx$, where $\alpha, \eta$ and $\beta$ are real or complex constants, are evaluated in terms of the confluent hypergeometric function $_1F_1$ and the hypergeometric function $_1F_2$. The hyperbolic and Euler identities are used to derive some identities involving exponential, hyperbolic, trigonometric functions and the hypergeometric functions $_1F_1$ and $_1F_2$. Having evaluated these non-elementary integrals, some new probability measures generalizing the gamma-type and normal distributions are also obtained. The obtained generalized distributions may, for example, allow one to perform better statistical tests than those already known (e.g. chi-square ($\chi^2$) statistical tests and those based on the central limit theorem (CLT)).

An elementary function is a function of one variable constructed from that variable and constants by a finite number of repeated algebraic operations, exponentials and logarithms. An indefinite integral which can be expressed in terms of elementary functions is an elementary integral. If, on the other hand, it cannot be evaluated in terms of elementary functions, then it is non-elementary [6,13].
One of the goals of this work is to show how non-elementary integrals of the types $\int x^\alpha e^{\eta x^\beta}dx$, $\int x^\alpha \cosh\left(\eta x^\beta\right)dx$, $\int x^\alpha \sinh\left(\eta x^\beta\right)dx$, $\int x^\alpha \cos\left(\eta x^\beta\right)dx$ and $\int x^\alpha \sin\left(\eta x^\beta\right)dx$, where $\alpha, \eta$ and $\beta$ are real or complex constants, can be evaluated in terms of the special functions $_1F_1$ and $_1F_2$. These integrals are generalizations of the non-elementary integrals evaluated by Nijimbere [7][8][9], and have not been evaluated before. For instance, if $\alpha = -1$ and $\beta = 1$, the integrals in (4) become, respectively, the (indefinite) sine and cosine integrals, which are evaluated in Nijimbere [8,9]. If, on the other hand, $\alpha = 0$, the non-elementary integrals in (3) and (4) reduce to the non-elementary integrals evaluated in Nijimbere [7]. However, it is important to observe that the integrals in (3) and (4) may be elementary or non-elementary depending on the values of the constants $\alpha$ and $\beta$. If, for instance, $\alpha = \beta - 1$, then the integral is elementary because it can be expressed in terms of the elementary function $e^{\eta x^\beta}$. In that case, the other integrals in (3) and (4) are also elementary, since they can be expressed as linear combinations of integrals such as that in (5) using the hyperbolic identities $\cosh\left(\eta x^\beta\right) = \left(e^{\eta x^\beta} + e^{-\eta x^\beta}\right)/2$, $\sinh\left(\eta x^\beta\right) = \left(e^{\eta x^\beta} - e^{-\eta x^\beta}\right)/2$ and the Euler identities $\cos\left(\eta x^\beta\right) = \left(e^{i\eta x^\beta} + e^{-i\eta x^\beta}\right)/2$, $\sin\left(\eta x^\beta\right) = \left(e^{i\eta x^\beta} - e^{-i\eta x^\beta}\right)/(2i)$. Using Liouville's 1835 theorem, it can readily be shown that if $\alpha$ is not an integer and $\alpha \neq \beta - 1$, then the integrals in (4) and (5) are non-elementary [6,13]. Another goal of this work is to obtain some identities (or formulas) involving exponential, hyperbolic, trigonometric functions and the hypergeometric functions $_1F_1$ and $_1F_2$ using the Euler and hyperbolic identities. Other interesting identities involving hypergeometric functions may be found, for example, in [2,5,[7][8][9],12].
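As a concrete numerical sanity check on evaluations of this type, the sketch below compares a quadrature of $\int_0^x t^\alpha \cos\left(\eta t^\beta\right)dt$ with its $_1F_2$ representation obtained by integrating the cosine series term by term. The helper names (`hyp1f2`, `simpson`) and the parameter values are illustrative choices, not taken from the paper.

```python
import math

def hyp1f2(a, b, c, z, terms=100):
    """Truncated power series for the hypergeometric function 1F2(a; b, c; z)."""
    s, term = 1.0, 1.0
    for n in range(terms):
        term *= (a + n) / ((b + n) * (c + n) * (n + 1)) * z
        s += term
    return s

def simpson(f, lo, hi, n=4000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

# Termwise integration of the cosine series gives
#   int_0^x t^alpha cos(eta t^beta) dt
#     = x^(alpha+1)/(alpha+1) * 1F2(a; 1/2, a+1; -eta^2 x^(2 beta)/4),
# with a = (alpha+1)/(2 beta).
alpha, eta, beta, x = 2.0, 1.5, 2.0, 1.2
a = (alpha + 1) / (2 * beta)
lhs = simpson(lambda t: t**alpha * math.cos(eta * t**beta), 0.0, x)
rhs = x**(alpha + 1) / (alpha + 1) * hyp1f2(a, 0.5, a + 1, -(eta**2) * x**(2 * beta) / 4)
print(lhs, rhs)
```

The two values agree to high accuracy, which is exactly what the termwise integration argument predicts for convergent parameter choices.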
Having evaluated these integrals, it is then possible to define probability measures in terms of the Lebesgue measure $dx$ as $\mu(dx) = A\, g(x)dx$, $x \in \Omega$, $\Omega \subseteq \mathbb{R}$, satisfying the integrability condition $\int_{\Omega \subseteq \mathbb{R}} |X|^\alpha \mu(dx) < \infty$, $\alpha \neq 0$, $\alpha > -\beta - 1$, where $A$ is a (normalization) constant. In that case, probability measures (or distributions) that generalize the gamma-type and Gaussian-type distributions may be constructed, and the corresponding distribution functions and moments can be evaluated as well. For example, it can be shown using the results in this paper that the $n$th moments of the Gaussian random variable are given by the formula $$M(X^n) = \sum_{k=0}^{\lfloor n/2 \rfloor} \binom{n}{2k}\, \theta^{\,n-2k}\, \sigma^{2k}\, (2k-1)!!,$$ where $\theta \in \mathbb{R}$ is the mean of the Gaussian random variable and $\sigma^2 > 0$ its variance. It is also shown, for instance, that the inverse gamma distribution, frequently used in signal processing in wireless communications (see for instance [4,14]), is as well a particular case of the generalized gamma-type distribution derived in this study. The paper is organized as follows. In section 2, the integrals in (4)-(5) are evaluated, and some identities (or formulas) that involve the exponential, hyperbolic, trigonometric functions and the hypergeometric functions $_1F_1$ and $_1F_2$ are obtained. In section 3, probability measures that generalize the gamma-type and Gaussian-type distributions are constructed, and their corresponding distribution functions are written in terms of the confluent hypergeometric function. Formulas to evaluate the $n$th moments are also derived in section 3. A general discussion is given in section 4.
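The Gaussian moment formula that follows from the substitution $u = (x-\theta)/\sigma$ and the binomial theorem, namely $M(X^n) = \sum_{k=0}^{\lfloor n/2 \rfloor} \binom{n}{2k}\theta^{n-2k}\sigma^{2k}(2k-1)!!$ with $E[Z^{2k}] = (2k-1)!!$ for standard normal $Z$, can be checked against direct quadrature. The sketch below uses illustrative helper names and parameter values.

```python
import math

def gaussian_moment(n, theta, sigma):
    """n-th moment of N(theta, sigma^2) via the binomial expansion of
    (theta + sigma*Z)^n, using E[Z^(2k)] = (2k-1)!! for standard normal Z."""
    total = 0.0
    for k in range(n // 2 + 1):
        double_fact = math.factorial(2 * k) // (2**k * math.factorial(k))  # (2k-1)!!
        total += math.comb(n, 2 * k) * theta**(n - 2 * k) * sigma**(2 * k) * double_fact
    return total

def simpson(f, lo, hi, n=8000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

theta, sigma, n = 0.7, 1.3, 5
pdf = lambda x: math.exp(-(x - theta)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
# Quadrature of x^n times the normal density over a wide interval
numeric = simpson(lambda x: x**n * pdf(x), theta - 12 * sigma, theta + 12 * sigma)
print(gaussian_moment(n, theta, sigma), numeric)
```

For $\theta = 0$ the formula collapses to the familiar $0, 1, 0, 3, 0, 15, \dots$ sequence of standard normal moments.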

Evaluation of the non-elementary integrals
Let us first prove an important lemma which will be used throughout the paper.
Having evaluated (15), the following results hold.

Theorem 2.3. Let $\alpha$ be an arbitrary real or complex constant, $\beta$ a nonzero real or complex constant ($\beta \neq 0$), and $\eta$ a nonzero real or complex constant with a positive real part ($\operatorname{Re}(\eta) > 0$).

(i) Then $$\int_0^{+\infty} x^\alpha e^{-\eta x^\beta}\, dx = \frac{\Gamma\left(\frac{\alpha+1}{\beta}\right)}{\beta\, \eta^{\frac{\alpha+1}{\beta}}}. \quad (19)$$

(ii) Moreover, if the integrand is even, then $$\int_{-\infty}^{+\infty} x^\alpha e^{-\eta x^\beta}\, dx = \frac{2\, \Gamma\left(\frac{\alpha+1}{\beta}\right)}{\beta\, \eta^{\frac{\alpha+1}{\beta}}}. \quad (20)$$

Proof. It can readily be shown, using Proposition 2.2 and the asymptotic expansion of the confluent hypergeometric function, that $$\lim_{x \to +\infty} \int_0^x t^\alpha e^{-\eta t^\beta}\, dt = \frac{\Gamma\left(\frac{\alpha+1}{\beta}\right)}{\beta\, \eta^{\frac{\alpha+1}{\beta}}}, \quad (21)$$ which gives (19). If the integrand is even, then $\int_{-\infty}^{+\infty} x^\alpha e^{-\eta x^\beta}\, dx = 2 \int_0^{+\infty} x^\alpha e^{-\eta x^\beta}\, dx$, and this gives (20).
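The Gamma-function evaluation $\int_0^{+\infty} x^\alpha e^{-\eta x^\beta}dx = \Gamma\left(\frac{\alpha+1}{\beta}\right)\big/\left(\beta\, \eta^{(\alpha+1)/\beta}\right)$ (for real parameters where the integral converges) is easy to verify numerically; the sketch below uses illustrative parameter values.

```python
import math

def simpson(f, lo, hi, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

alpha, eta, beta = 2.0, 1.5, 3.0
# Closed form via the Gamma function
closed = math.gamma((alpha + 1) / beta) / (beta * eta ** ((alpha + 1) / beta))
# Numerical integral; e^{-eta x^beta} decays so fast that [0, 5] captures the mass
numeric = simpson(lambda x: x**alpha * math.exp(-eta * x**beta), 0.0, 5.0)
print(closed, numeric)
```

With $\alpha = 2$, $\beta = 3$, $\eta = 3/2$ both sides equal $\Gamma(1)/(3 \cdot 3/2) = 2/9$, the substitution $t = \eta x^\beta$ reducing the integral to the Gamma function directly.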
As it will shortly be shown (see section 3), Theorem 2.3 can be used to obtain new probability distributions that generalize the gamma-type and Gaussian-type distributions, which may lead to better statistical tests than those already known based on the central limit theorem (CLT) [3].
Proof. The change of variable $u^\beta = \eta x^\beta$ yields an integral in $u$. Successive integration by parts that increases the power of $u$, followed by an application of (7) and (8) in Lemma 2.1, and finally the substitution $u^\beta = \eta x^\beta$ with a rearrangement of terms, gives (22).

New probability measures that generalize the gamma-type and Gaussian-type distributions
In this section, Theorem 2.3 is used to generalize the gamma-type distributions (e.g. the $\chi^2$ and inverse gamma distributions) and the Gaussian-type distributions.

Generalization of the gamma-type distributions
Define a probability measure $\mu$ in terms of the Lebesgue measure $dx$ as [3] $$d\mu = \mu(dx) = A\, g(x; \alpha, \eta, \beta)dx = f_X(x; \alpha, \eta, \beta)dx, \quad x \in [0, +\infty),$$ where $f_X(x; \alpha, \eta, \beta)$ is the probability density function (p.d.f.) of some random variable $X$, and $A$ is a normalization constant which can be obtained using formula (19) in Theorem 2.3. After normalization, the p.d.f. of $X$ is found; the distribution function of $X$ can then be obtained using Proposition 2.2; and the $n$th moments $M(X^n)$ can, as well, be evaluated using formula (19) in Theorem 2.3. These results are summarized in the following theorem.
Theorem 3.1. Let $X$ be a random variable with the generalized gamma-type p.d.f. $$f_X(x; \alpha, \eta, \beta) = \frac{\beta\, \eta^{\frac{\alpha+1}{\beta}}}{\Gamma\left(\frac{\alpha+1}{\beta}\right)}\, x^\alpha e^{-\eta x^\beta}, \quad x \in [0, +\infty). \quad (49)$$ Then, the distribution function $F_X(x; \alpha, \eta, \beta)$ of the random variable $X$ is given by $$F_X(x; \alpha, \eta, \beta) = \frac{\beta\, \eta^{\frac{\alpha+1}{\beta}}}{\Gamma\left(\frac{\alpha+1}{\beta}\right)}\, \frac{x^{\alpha+1}}{\alpha+1}\; {}_1F_1\left(\frac{\alpha+1}{\beta}; \frac{\alpha+\beta+1}{\beta}; -\eta x^\beta\right), \quad (50)$$ and the $n$th moments $M(X^n)$ of $X$ are given by $$M(X^n) = \frac{\Gamma\left(\frac{\alpha+n+1}{\beta}\right)}{\Gamma\left(\frac{\alpha+1}{\beta}\right)\, \eta^{\frac{n}{\beta}}}. \quad (51)$$

If $Y$ is a gamma-distributed random variable, then the random variable $X = 1/Y$ is said to be an inverse gamma distributed random variable. The inverse gamma distribution finds applications in wireless communications, see for example [4,14]. Its distribution function may be evaluated using Theorem 3.1.
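For the normalized density proportional to $x^\alpha e^{-\eta x^\beta}$ on $[0, +\infty)$, Proposition 2.2 yields the distribution function in $_1F_1$ form, $F_X(x) = \frac{\beta \eta^{a}}{\Gamma(a)} \frac{x^{\alpha+1}}{\alpha+1}\, {}_1F_1\left(a; a+1; -\eta x^\beta\right)$ with $a = (\alpha+1)/\beta$. The sketch below checks this against a quadrature of the density; helper names and parameter values are illustrative.

```python
import math

def hyp1f1(a, b, z, terms=120):
    """Truncated Kummer series for the confluent hypergeometric function 1F1(a; b; z)."""
    s, term = 1.0, 1.0
    for n in range(terms):
        term *= (a + n) / ((b + n) * (n + 1)) * z
        s += term
    return s

def simpson(f, lo, hi, n=4000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

alpha, eta, beta, x = 2.0, 1.5, 3.0, 0.9
a = (alpha + 1) / beta
norm = beta * eta**a / math.gamma(a)                 # normalization constant of the p.d.f.
pdf = lambda t: norm * t**alpha * math.exp(-eta * t**beta)
cdf_series = norm * x**(alpha + 1) / (alpha + 1) * hyp1f1(a, a + 1, -eta * x**beta)
cdf_numeric = simpson(pdf, 0.0, x)
print(cdf_series, cdf_numeric)
```

With these parameters ($a = 1$) the distribution function also has the elementary closed form $1 - e^{-\eta x^\beta}$, via $_1F_1(1; 2; z) = (e^z - 1)/z$, which the series value reproduces.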
Corollary 3.2. Let $X$ be a random variable with the inverse gamma distribution, $X \sim IG(\theta, \eta)$. Then, the distribution function $F_X(x; \theta, \eta)$ is given by $$F_X(x; \theta, \eta) = 1 - \frac{(\eta/x)^\theta}{\Gamma(\theta+1)}\; {}_1F_1\left(\theta; \theta+1; -\frac{\eta}{x}\right),$$ while the $n$th moments $M(X^n)$ are given by $$M(X^n) = \frac{\Gamma(\theta-n)}{\Gamma(\theta)}\, \eta^n, \quad n < \theta.$$ Proof. Setting $\alpha = -(\theta + 1)$, $\beta = -1$ in (49) and using the fundamental theorem of calculus gives $$f_X(x; \theta, \eta) = \frac{\eta^\theta}{\Gamma(\theta)}\, x^{-\theta-1} e^{-\eta/x}, \quad x > 0,$$ which is the p.d.f. of the inverse gamma distribution. The $n$th moments $M(X^n)$ of $X$ are obtained by setting $\alpha = -(\theta + 1)$ and $\beta = -1$ in (51).
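The standard inverse gamma moments, $E[X^n] = \eta^n\, \Gamma(\theta - n)/\Gamma(\theta)$ for $n < \theta$, can be checked numerically through the gamma-distributed reciprocal $Y = 1/X$, since $E[X^n] = E[Y^{-n}]$. Parameter values below are illustrative.

```python
import math

def simpson(f, lo, hi, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

theta, eta, n_mom = 6.0, 2.0, 2
# Closed form for the n-th inverse gamma moment (requires n < theta)
closed = eta**n_mom * math.gamma(theta - n_mom) / math.gamma(theta)
# E[X^n] = E[Y^{-n}] with Y ~ Gamma(theta, rate eta): integrand y^(theta-n-1) e^(-eta y)
integrand = lambda y: eta**theta / math.gamma(theta) * y**(theta - n_mom - 1) * math.exp(-eta * y)
numeric = simpson(integrand, 0.0, 40.0)
print(closed, numeric)
```

For $\theta = 6$, $\eta = 2$, $n = 2$ this gives $4 \cdot \Gamma(4)/\Gamma(6) = 24/120 = 0.2$, i.e. the second moment of $IG(6, 2)$.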
Generalization of the Gaussian-type distributions

It is important to note that $f_X$ in this case is even, and so a factor of 2 has to appear in the denominator of the normalization constant. The distribution function $F_X$ can also be obtained using Proposition 2.2, and the moments $M(X^n)$ can be evaluated using formula (20) in Theorem 2.3. These results are summarized in the following theorem.
Theorem 3.3. Let $X$ be a random variable with the generalized Gaussian-type p.d.f. $$f_X(x; \alpha, \eta, \beta) = \frac{\beta\, \eta^{\frac{\alpha+1}{\beta}}}{2\, \Gamma\left(\frac{\alpha+1}{\beta}\right)}\, x^\alpha e^{-\eta x^\beta}, \quad x \in (-\infty, +\infty), \quad (60)$$ where the integrand $x^\alpha e^{-\eta x^\beta}$ is even. Then, the distribution function $F_X(x; \alpha, \eta, \beta)$ of the random variable $X$ is given by $$F_X(x; \alpha, \eta, \beta) = \frac{1}{2} + \frac{\beta\, \eta^{\frac{\alpha+1}{\beta}}}{2\, \Gamma\left(\frac{\alpha+1}{\beta}\right)}\, \frac{x^{\alpha+1}}{\alpha+1}\; {}_1F_1\left(\frac{\alpha+1}{\beta}; \frac{\alpha+\beta+1}{\beta}; -\eta x^\beta\right), \quad (61)$$ and the $n$th moments $M(X^n)$ of $X$ are given by $$M(X^n) = \frac{\Gamma\left(\frac{\alpha+n+1}{\beta}\right)}{\Gamma\left(\frac{\alpha+1}{\beta}\right)\, \eta^{\frac{n}{\beta}}} \quad (62)$$ if $n$ is even, and $M(X^n) = 0$ if $n$ is odd.
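For a symmetric density proportional to $x^\alpha e^{-\eta x^\beta}$ with even integrand, the even moments take the form $M(X^n) = \Gamma\left(\frac{\alpha+n+1}{\beta}\right)\big/\left(\Gamma\left(\frac{\alpha+1}{\beta}\right)\eta^{n/\beta}\right)$, which should reduce to the standard normal moments $1, 3, 15, \dots$ when $\alpha = 0$, $\beta = 2$, $\eta = 1/2$. A quick sanity check (function name is illustrative):

```python
import math

def even_moment(n, alpha, eta, beta):
    """n-th moment (n even) of the symmetric density proportional to x^alpha e^(-eta x^beta)."""
    return math.gamma((alpha + n + 1) / beta) / (math.gamma((alpha + 1) / beta) * eta ** (n / beta))

# Sanity check against the standard normal: alpha = 0, beta = 2, eta = 1/2
for n, expected in [(2, 1.0), (4, 3.0), (6, 15.0)]:
    print(n, even_moment(n, 0.0, 0.5, 2.0), expected)
```

The values recover the double-factorial sequence $(n-1)!!$ of standard normal even moments, as expected from the duplication identity $\Gamma\left(\frac{n+1}{2}\right)/\Gamma\left(\frac{1}{2}\right) = (n-1)!!/2^{n/2}$ for even $n$.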
More general results can be achieved by introducing two additional parameters.
Formula (65) is obtained by making the substitution u = (x − θ)/σ, and by applying the binomial theorem and Theorem 2.3.
A generalized Gaussian-type distribution may be derived by setting α = 0 in Theorem 3.4.