Generalized Discrete Entropic Uncertainty Relations on Linear Canonical Transform

The uncertainty principle plays an important role in physics, mathematics, signal processing, and related fields. In this paper, based on the definition and properties of the discrete linear canonical transform (DLCT), we introduce a discrete Hausdorff–Young inequality. Building on it, we derive a generalized discrete Shannon entropic uncertainty relation and a discrete Rényi entropic uncertainty relation. In addition, the equality conditions are developed via Lagrange optimization, which shows that if the two conjugate variables have constant amplitudes equal to the inverse of the square root of the numbers of their non-zero elements, then the uncertainty relations attain their lower bounds. On one hand, these new uncertainty relations enrich the ensemble of uncertainty principles; on the other hand, the derived bounds yield new understanding of discrete signals in the new transform domain.

In this paper, we make the following contributions. First, we extend the classical Hausdorff–Young inequality to the DLCT domain for signals of finite support; the resulting bounds depend on the support lengths and on the LCT parameters. Second, we derive the Shannon entropic uncertainty principle in the LCT domain for discrete data and, via Lagrange optimization, obtain the conditions under which the uncertainty relation holds with equality. Third, we derive the Rényi entropic uncertainty principle in the DLCT domain. To the best of our knowledge, these generalized discrete entropic uncertainty relations on the LCT have not been reported before.

LCT and DLCT
Before discussing the uncertainty principle, we introduce some relevant preliminaries. Here we first briefly review the definition of the LCT. For a given analog signal $f(t)$, its LCT with parameter matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, $ad - bc = 1$, $b \neq 0$, is defined as in [5,6]:
$$F_A(u) = \int_{-\infty}^{\infty} f(t)\, K_A(t,u)\, dt, \qquad K_A(t,u) = \sqrt{\frac{1}{j 2\pi b}}\, e^{\,j\frac{a t^2 - 2tu + d u^2}{2b}}, \qquad (1)$$
where $j$ is the complex unit and $A$ is the transform parameter defined as in [5,6]. In addition, let $x(n)$, $n = 0, 1, \ldots, N-1$, be a discrete time series with length $N$ and $\|x\|_2 = 1$. Its DLCT (discrete LCT) $X_A(m)$ [5] can then be written as
$$X_A(m) = \sum_{n=0}^{N-1} x(n)\, K_A(n, m), \qquad m = 0, 1, \ldots, N-1, \qquad (2)$$
where $K_A(n,m)$ denotes the sampled LCT kernel. We can also rewrite the definition (2) in matrix form as $X_A = \mathbf{F}_A x$, where $\mathbf{F}_A$ is the $N \times N$ DLCT matrix. Clearly, for the DLCT we have the following (Parseval) property [5]: $\|X_A\|_2 = \|x\|_2$. In the following, we will assume that the transform parameter satisfies $b \neq 0$. Note that the main difference between the discrete and analog definitions is the support: one is finite and discrete while the other is infinite and continuous.
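As a concrete illustration, the matrix form of the DLCT can be sketched in a few lines of NumPy. The discretization below (a chirp-modulated DFT-type kernel, unitary when $b = N/(2\pi)$) is one common choice and is our assumption; the paper's exact sampling scheme may differ:

```python
import numpy as np

def dlct_matrix(N, a, b, d):
    """N x N DLCT matrix with entries proportional to
    exp(j (a n^2 - 2 n m + d m^2) / (2 b)) / sqrt(N).

    This chirp-modulated DFT-type discretization is an assumption;
    it is unitary when b = N / (2 * pi)."""
    n = np.arange(N).reshape(-1, 1)  # time index (rows)
    m = np.arange(N).reshape(1, -1)  # transform index (columns)
    phase = (a * n**2 - 2 * n * m + d * m**2) / (2.0 * b)
    return np.exp(1j * phase) / np.sqrt(N)

N = 8
F = dlct_matrix(N, a=1.0, b=N / (2 * np.pi), d=1.0)
# Parseval / unitarity: ||F x||_2 = ||x||_2 for every x
print(np.allclose(F.conj().T @ F, np.eye(N)))  # True
```

With $a = d = 0$ and $b = N/(2\pi)$, the matrix reduces to the unitary DFT, matching the fact that the Fourier transform is a special case of the LCT.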

Shannon Entropy and Rényi Entropy
For any discrete random variable taking $N$ values with probability density function $p_n$, the Shannon entropy [9] and the Rényi entropy [10] are defined, respectively, as
$$H(p) = -\sum_{n=0}^{N-1} p_n \ln p_n, \qquad H_\alpha(p) = \frac{1}{1-\alpha} \ln \sum_{n=0}^{N-1} p_n^{\alpha}, \quad \alpha > 0,\ \alpha \neq 1.$$
Hence, in this paper, for any DLCT $X_A(m)$ of a unit-energy signal, the Shannon entropy and the Rényi entropy [13] associated with the DLCT are defined, respectively, as
$$H(X_A) = -\sum_{m=0}^{N-1} |X_A(m)|^2 \ln |X_A(m)|^2, \qquad H_\alpha(X_A) = \frac{1}{1-\alpha} \ln \sum_{m=0}^{N-1} |X_A(m)|^{2\alpha},$$
as shown in [13].
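Both entropies are straightforward to compute from the squared DLCT magnitudes. A minimal sketch (the function names are ours):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_n p_n ln p_n, with the convention 0 ln 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    """H_alpha(p) = ln(sum_n p_n^alpha) / (1 - alpha), alpha > 0, alpha != 1."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# For a unit-energy DLCT spectrum X, the probability vector is p_m = |X(m)|^2.
p_uniform = np.full(8, 1.0 / 8.0)
print(shannon_entropy(p_uniform))  # ln 8, the maximum over 8 outcomes
```

For the uniform distribution both entropies equal $\ln N$, the maximum value, which is why flat-amplitude spectra appear in the equality conditions below.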

Discrete Hausdorff-Young Inequality on DLCT
Lemma 1: For any given discrete time series $x(n)$, $n = 0, 1, \ldots, N-1$, with $\|x\|_2 = 1$, let $X_A = \mathbf{F}_A x$ and $X_B = \mathbf{F}_B x$ ($\mathbf{F}_A$ and $\mathbf{F}_B$ being the DLCT transform matrices associated with the transform parameters $A$ and $B$, respectively). Then for $1 \leq p \leq 2$ and $1/p + 1/q = 1$ we obtain the generalized discrete Hausdorff–Young inequality
$$\|X_B\|_q \leq N^{\frac{1}{q} - \frac{1}{2}} \|X_A\|_p.$$
Proof: The matrix $\mathbf{F}_B \mathbf{F}_A^{-1}$ mapping $X_A$ to $X_B$ is again of DLCT type; provided its effective parameter satisfies $b \neq 0$, it is unitary and its entries have modulus $N^{-1/2}$. Hence, from the Parseval property of the DLCT [5], $\|X_B\|_2 = \|X_A\|_2$, and clearly $\|X_B\|_\infty \leq N^{-1/2} \|X_A\|_1$. Applying the Riesz–Thorin interpolation theorem [11,12] between these two bounds yields the discrete Hausdorff–Young inequality above. Since the value of $x$ can be taken arbitrarily among unit-energy series, $X_A$ can also be taken arbitrarily, and we obtain the lemma.
Clearly, this lemma is the discrete version of the Hausdorff–Young inequality. In the next sections, we will use this lemma to prove the new uncertainty relations.
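Lemma 1 can be checked numerically. The sketch below uses the unitary DFT as a stand-in for a DLCT matrix (its entries also have modulus $N^{-1/2}$); this substitution is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16
F = np.fft.fft(np.eye(N), norm="ortho")  # unitary stand-in for a DLCT matrix

p = 1.5
q = p / (p - 1.0)                        # conjugate exponent: 1/p + 1/q = 1
const = N ** (1.0 / q - 0.5)             # Hausdorff-Young constant N^(1/q - 1/2)

for _ in range(200):
    x = rng.normal(size=N) + 1j * rng.normal(size=N)
    X = F @ x
    lhs = np.sum(np.abs(X) ** q) ** (1.0 / q)          # ||X||_q
    rhs = const * np.sum(np.abs(x) ** p) ** (1.0 / p)  # N^(1/q-1/2) ||x||_p
    assert lhs <= rhs + 1e-9
print("Hausdorff-Young inequality held for all trials")
```

Note that $1/q - 1/2 \leq 0$ for $1 \leq p \leq 2$, so the constant never exceeds one.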

Shannon Entropic Principle
Theorem 1: For any given discrete time series $x(n)$, $n = 0, 1, \ldots, N-1$, with $\|x\|_2 = 1$, let $X_A$ ($X_B$) be the DLCT series associated with the transform parameter $A$ ($B$), and let $N_A$ ($N_B$) count the non-zero elements of $X_A$ ($X_B$, respectively). Then we obtain the generalized discrete Shannon entropic uncertainty relation
$$H(X_A) + H(X_B) \geq \ln N, \qquad (3)$$
where $H(\cdot)$ is the Shannon entropy associated with the DLCT. Proof: From Lemma 1, we have $\|X_B\|_q \leq N^{1/q - 1/2} \|X_A\|_p$. Taking the natural logarithm of both sides, we obtain a function of $p$ that is non-negative for $1 \leq p \leq 2$ and, by the Parseval equality $\|X_A\|_2 = \|X_B\|_2 = 1$, vanishes at $p = 2$. Differentiating with respect to $p$ and setting $p = 2$, we obtain the final result of Theorem 1.
Now consider when the equality holds. From Theorem 1, equality in (3) implies that $|X_A(m)|$ is constant on its support, and likewise for $|X_B(m)|$. Maximizing the Shannon entropy under the constraint $\sum_m |X_A(m)|^2 = 1$ via Lagrange optimization, we have $|X_A(m)| = 1/\sqrt{N_A}$ on the $N_A$ non-zero elements and, similarly, $|X_B(m)| = 1/\sqrt{N_B}$. Hence $H(X_A) = \ln N_A$ and $H(X_B) = \ln N_B$, so equality further requires $N_A N_B = N$.
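The relation and its equality case can be illustrated numerically. Again we use the unitary DFT as a stand-in for a pair of DLCT domains (time and frequency), which is an assumption for illustration:

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 1e-15]
    return -np.sum(p * np.log(p))

N = 16
F = np.fft.fft(np.eye(N), norm="ortho")   # stands in for a unitary DLCT

rng = np.random.default_rng(1)
x = rng.normal(size=N) + 1j * rng.normal(size=N)
x /= np.linalg.norm(x)                    # unit energy

H_time = shannon_entropy(np.abs(x) ** 2)
H_freq = shannon_entropy(np.abs(F @ x) ** 2)
assert H_time + H_freq >= np.log(N) - 1e-10   # relation (3)

# Equality case: a unit impulse is maximally concentrated in one domain
# (N_A = 1) and flat in the other (N_B = N), so N_A * N_B = N.
delta = np.zeros(N); delta[0] = 1.0
H1 = shannon_entropy(np.abs(delta) ** 2)      # = 0
H2 = shannon_entropy(np.abs(F @ delta) ** 2)  # = ln N
print(np.isclose(H1 + H2, np.log(N)))         # True
```

The impulse example shows both equality conditions at once: constant amplitudes on the supports and $N_A N_B = N$.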

Rényi Entropic Principle
Solving the above equations, we finally obtain that equality in (3) holds exactly when the amplitudes are constant on their supports. From the definition of the Shannon entropy, we also see that $X_A$ and $X_B$ may be complex-valued: only their amplitudes, not their phases, must be constant for the equality to hold. We can state the following corollary out of the above analysis.
Theorem 2: For any given discrete time series $x(n)$, $n = 0, 1, \ldots, N-1$, with $\|x\|_2 = 1$, let $X_A$ ($X_B$) be the DLCT series associated with the transform parameter $A$ ($B$), and let $\alpha, \beta > 0$ satisfy $1/\alpha + 1/\beta = 2$ with $\alpha > 1$. Then we obtain the generalized discrete Rényi entropic uncertainty relation
$$H_\alpha(X_A) + H_\beta(X_B) \geq \ln N. \qquad (4)$$
Proof: Apply Lemma 1 with $q = 2\alpha$ and $p = 2\beta$, so that $1/p + 1/q = 1$ is equivalent to $1/\alpha + 1/\beta = 2$. Taking the square of the resulting inequality, we obtain
$$\Big( \sum_m |X_A(m)|^{2\alpha} \Big)^{1/\alpha} \leq N^{\frac{1}{\alpha} - 1} \Big( \sum_m |X_B(m)|^{2\beta} \Big)^{1/\beta}. \qquad (5)$$
Taking the power $\frac{1}{1-\alpha}$ of both sides of (5), which reverses the inequality since $\alpha > 1$, then taking the natural logarithm of both sides and using the relation $\beta = \alpha/(2\alpha - 1)$, we obtain (4). Clearly, as $\alpha \to 1$ and $\beta \to 1$, the Rényi entropy reduces to the Shannon entropy; thus the Rényi entropic uncertainty relation (4) reduces to the Shannon entropic uncertainty relation (3), and the proof of the equality condition in Theorem 2 is trivial according to the proof of Theorem 1. Note that although the Shannon entropic uncertainty relation can be obtained from the Rényi entropic uncertainty relation, we still discuss them separately for the sake of completeness.
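Theorem 2 can also be checked numerically with a unitary DFT standing in for the DLCT pair (an assumption for illustration), using a conjugate pair of Rényi orders:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """H_alpha(p) = ln(sum p^alpha) / (1 - alpha)."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

N = 16
F = np.fft.fft(np.eye(N), norm="ortho")   # stands in for a unitary DLCT

alpha = 2.0
beta = alpha / (2 * alpha - 1)            # conjugate order: 1/alpha + 1/beta = 2

rng = np.random.default_rng(2)
for _ in range(200):
    x = rng.normal(size=N) + 1j * rng.normal(size=N)
    x /= np.linalg.norm(x)                # unit energy
    Ha = renyi_entropy(np.abs(F @ x) ** 2, alpha)
    Hb = renyi_entropy(np.abs(x) ** 2, beta)
    assert Ha + Hb >= np.log(N) - 1e-10   # relation (4)
print("Renyi uncertainty relation held for all trials")
```

With $\alpha = 2$ the conjugate order is $\beta = 2/3$; pushing $\alpha \to 1$ drives $\beta \to 1$ and recovers the Shannon relation (3).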

Another Shannon Entropic Principle via Sampling
The discrete Shannon entropy can be defined as $H(s) = -\sum_i p(s_i) \ln p(s_i)$, where $p$ is the density function of the variable $s$. The discrete Rényi entropy can be defined as follows: $H_\alpha(s) = \frac{1}{1-\alpha} \ln \sum_i p(s_i)^{\alpha}$. As $\alpha \to 1$, the discrete Rényi entropy tends to the discrete Shannon entropy.
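The $\alpha \to 1$ limit can be seen numerically with a small probability vector (the values below are chosen arbitrarily for illustration):

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
H = shannon_entropy(p)
# The gap shrinks as alpha approaches 1 from above.
for alpha in (1.5, 1.1, 1.01, 1.001):
    print(alpha, abs(renyi_entropy(p, alpha) - H))
```

Since $H_\alpha$ is non-increasing in $\alpha$, the printed gaps decrease monotonically toward zero.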
In order to obtain the discrete spectrum, sampling must be performed. For the LCTs of two continuous functions (with transform parameters $A$ and $B$, respectively), we set the sampling periods $T_1$ and $T_2$ and assume that they satisfy the Shannon sampling theorem [16]. Since $s^{\alpha}$ is a convex function of $s$ when $\alpha > 1$ and a concave function when $\alpha < 1$, we obtain inequalities between the sampled sums and the corresponding continuous integrals. Taking the power $\frac{1}{1-\alpha}$ of both sides of the resulting inequality and using the relation between $\alpha$ and $\beta$, and then taking the logarithm of both sides, we arrive at a sampled Shannon entropic uncertainty relation whose bound involves the sampling periods $T_1$, $T_2$ and the transform parameters. In the special case of the fractional Fourier transform, the parameter matrix takes the form
$$A = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}.$$

Conclusion
In this article, we extended the entropic uncertainty relations to DLCT domains. We first introduced the generalized discrete Hausdorff–Young inequality. Based on this inequality, we derived the discrete Shannon entropic uncertainty relation and the discrete Rényi entropic uncertainty relation. Interestingly, equality holds in the uncertainty relation when each variable's amplitude is constant, namely the inverse of the square root of its number of non-zero elements; in addition, equality requires the product of the two numbers of non-zero elements to equal $N$. On one hand, these new uncertainty relations enrich the ensemble of uncertainty principles; on the other hand, the derived bounds yield new understanding of discrete signals in the new transform domain.