Discrete Entropic Uncertainty Relations Associated with FRFT *

Based on the definition and properties of discrete fractional Fourier transform (DFRFT), we introduced the discrete Hausdorff-Young inequality. Furthermore, the discrete Shannon entropic uncertainty relation and discrete Rényi entropic uncertainty relation were explored. Also, the condition of equality via Lagrange optimization was developed, as shows that if the two conjugate variables have constant amplitudes that are the inverse of the square root of numbers of non-zero elements, then the uncertainty relations reach their lowest bounds. In addition, the resolution analysis via the uncertainty is discussed as well.


Introduction
Uncertainty principle not only holds in analog signals, but also in discrete signals [1,2].Recently, with the development of fractional Fourier transform (FRFT), analog generalized uncertainty relations associated with FRFT have been carefully explored in some papers such as [3,4,16], which effectively enrich the ensemble of FRFT.However, up till now there has been no reported article covering the discrete generalized uncertainty relations associated with discrete fractional Fourier transform (DFRFT).From the viewpoint of engineering application, discrete data are widely used and seem to be more profitable than the analog ones.Hence, there is enough need to explore discrete generalized uncertainty relations.DFRFT is the discrete version of FRFT [5,6], which is applied in practical engineering fields.In this article we will discuss the entropic uncertainty relations [7,8] associated with DFRFT.In this article, we made some contributions such as follows.The first contribution is that we extend the traditional Hausdorff-Young inequality to the FRFT domain with finite supports.It is shown that these bounds are connected with lengths of the supports and FRFT parameters.The second contribution is that we derived the Shannon entropic uncertainty principle in FRFT domain for discrete case, based on which we also derived the conditions when these uncertainty relations have the equalities via Lagrange optimi-zation.The third contribution is that we derived the Renyi entropic uncertainty principle in FRFT domain for discrete case.The final contribution is that we discussed the resolution in multiple FRFT domains as a succession of above derivative, including new proofs.In a word, there have been no reported papers covering these generalized discrete entropic uncertainty relations on FRFT.

FRFT and DFRFT
First, confirm that you have the correct template for your paper size.This template has been tailored for output on the custom paper size (21 cm * 28.5 cm).
Before discussing the uncertainty principle, we introduce some relevant preliminaries.Here we first briefly review the definition of FRFT.For given analog signal and , its FRFT [5,6] is defined as ) where Z  n and is the complex unit, i  is the transform parameter defined as that in [5,6].In addition, ), 2 ( ), 1 ( Let N be a discrete time series with cardinality N and . Assume its DFRFT under the transform parameter Then the DFRFT [5] can be written as Also, we can rewrite the definition (2) as where .
Clearly, for DFRFT we have the following properties [5]: In the follows, we will assume that the transform parameter   2 0   and    .The max difference between the discrete and analog definitions is the support: one is finite and discrete and the other one is infinite and analog.

Shannon Entropy and Rényi Entropy
For any discrete random variable ( ) and its probability density function , the Shannon entropy [9] and the Rényi Entropy [10] is defined as, respectively n x ) Hence, in this article, we know that for any DFRFT , the Shannon entropy and the Rényi Entropy associated with DFRFT is defined as, respectively

Discrete Hausdorff-Young Inequality Associated with DFRFT
Let be a discrete time series with cardinality N and its DFRFT with and the transform parameter . Clearly, we can ob- Then from Riesz's theorem [11,12], we can obtain the discrete Hausdorff-Young inequality , then , we obtain , then .In addition, from the property of DFRFT we can have Since the value of X can be taken arbitrarily in , can also be taken arbitrarily in .Therefore, we can obtain the following lemma.
Clearly, this is the discrete version of Hausdorff-Young inequality.In the next sections, we will use this lemma to prove the new uncertainty relations.

Shannon Entropic Principle
Theorem 1: For any given discrete time series is the DFRFT series associated with the transform parameter respectively),  (  N ) counts the non-zero elements of  (  , respectively), then we can obtain the generalized discrete Shannon entropic uncertainty relation where , where ˆ( ) we can obtain the final result in theorem 1.Now consider when the equality holds.From theorem 1, that the equality holds in (3) implies that To solve this problem let us consider the following Lagrangian In order to simplify the computation, we set Solving the above equations, we finally obtain . From the definition of Shannon entropy, we know that if , the equality in (3) holds.In addition, we also have . From the proof we know that that can be complex, and only if their amplitudes are constants, the equality will hold.Now we can obtain the following corollary out of above analysis.
,  (  ) is the DFRFT series associated with the transform parameter x ˆx ˆ (  , respectively), ( ) counts the non-zero elements

Rényi Entropic Principle
Theorem 2: For any given discrete time series C N with cardinality N and 1 2 ,  (  ) is the DFRFT series associated with the transform parameter x ˆx ˆ (  , respectively),  (  N ) counts the non-zero elements of  (  , respectively), then we can obtain the generalized discrete Renyi entropic uncertainty relation where Copyright © 2013 SciRes.JSIP Discrete Entropic Uncertainty Relations Associated with FRFT 123 Take the square of the above inequality, we have Take the power of both sides in above inequality, we obtain Take natural logarithm in both sides of (5), we can obtain Clearly, as 1   and 1   , the Renyi entropy reduces to Shannon entropy, thus the Renyi entropic uncertainty relation in (4) reduces to the Shannon entropic uncertainty relation (3).Hence the proof of equality in theorem 2 is trivial according to the proof of theorem 1.
Note that although Shannon entropic uncertainty relation can be obtained by Rényi entropic uncertainty relation, we still discuss them separately in the sake of integrality.

Resolution Analysis in Time-Frequency Domain
In many cases [13][14][15], we often discuss the data concentration in both time domain and frequency domain, therefore, we will adopt a new measure on entropy: where  (  ) is the DFRFT series associated with the transform parameter x ˆx ˆ (  , respectively) for time series x Ĥ are the Shannon entropies for the DFRFT series ( ).

 
Theorem 3: For any given discrete time series ,  (  ) is the DFRFT series associated with the transform parameter  X x ˆx ˆ (  , respectively),  (  N ) counts the non-zero elements of  (  , respectively), then we can obtain the generalized discrete entropic uncertainty relation Proof: For variable , we have for . Therefore, we have Taking into account of the non-negative second derivative, therefore we know that q x 1 log  is convex.Therefore, we obtain and the Riesz-Thorin theorem [4], one has By applying negative logarithm to both sides of above inequality, we obtain the following relation ˆlog log .
Taking into account of 1 ~2 .Hence, ( 7) and ( 8) imply Thus the proof is completed.Note that this proof can be obtained via the similar manner with Section 3 A, however, we still give a new proof so that we can understand this point out of different aspect.On the other hand, different proofs yield the same result, as validates the conclusion derived here.

Conclusions
In

Lemma 1 :
For any given discrete time series

2 ,
 (  ) is the DFRFT transform matrix associated with the transform parameter  X U U  (  , respectively), then we can obtain the generalized discrete Hausdorff-Young inequality in both sides in above inequality, we can obtain 0 ) (  p T

Corollary 1 :
For any given discrete time series this article, we extended the entropic uncertainty relations in DFRFT domains.We first introduced the generalized discrete Hausdorff-Young inequality.Based on this inequality, we derived the discrete Shannon entropic uncertainty relation and discrete Rényi entropic uncertainty relation.Interestingly, when the variable's amplitude is equal to the constant, i.e. the inverse of the square root of number of non-zero elements, the equality holds in the uncertainty relation.In addition, the product of the two numbers of non-zero elements is equal to