R(D) Definition Domain Deriving of Limited Distortion Source Coding With Different Sources

Chunhua Zhu, Huan Mi, Mengfei Kan, Yi Song

School of Information Science and Engineering, Henan University of Technology, Zhengzhou, China.

**DOI:** 10.4236/ijcns.2017.108B028


For limited distortion source coding, it is generally considered that the minimum value of the average coding distortion is 0 and the maximum value is the smallest distortion for which *R(D)* = 0; this interval is taken as the definition domain of the information rate distortion function. In this paper, the upper and lower bounds of the definition domain of the information rate distortion function *R(D)* are derived and computed for typical sources. The results show that the lower bound of the average coding distortion *D* is determined by the single-symbol distortion function, which further improves the theory of limited distortion source coding.

Share and Cite:

Zhu, C., Mi, H., Kan, M. and Song, Y. (2017) R(D) Definition Domain Deriving of Limited Distortion Source Coding With Different Sources. *International Journal of Communications, Network and System Sciences*, **10**, 263-268. doi: 10.4236/ijcns.2017.108B028.

1. Introduction

From the point of view of information processing, distortion-free source coding is entropy preserving. Entropy preserving coding, however, is not always necessary: for visual signals received by the human eye, for example, there is no need to carry out distortion-free entropy coding. Nor is entropy preserving coding always possible. When a continuous signal is digitized, the quantization error cannot be fundamentally removed. Although the relative entropy of a continuous source is finite, its absolute amount of information is infinite [1]. Moreover, a real channel always has interference and its capacity is limited, so distortionless transmission of such continuous information is impossible. Only when a certain distortion is allowed is the required information rate $R(D)$ finite and the transmission possible.

Reducing the information rate is beneficial to transmission and processing, so entropy compression coding is needed; this is why source coding with limited distortion is introduced. A very important parameter of such source coding is the information rate distortion function $R(D)$ [2] [3]. When a certain degree of distortion is allowed, the rate distortion function of the source can be used as a measure of the performance of various compression coding methods. It is generally assumed that the minimum value of the average coding distortion is 0 and that the maximum value $D_{\max}$ is the smallest distortion for which $R(D) = 0$; that is, the definition domain of the function is $[0, D_{\max}]$. In fact, in many cases the lower bound $D_{\min}$ is not necessarily zero [4] [5]; its value is related to the single-symbol distortion function. Only when every row of the distortion matrix contains at least one zero element can the average distortion of the source reach zero. In this paper, the upper and lower bounds of the definition domain of $R(D)$ are deduced, and their values are given for different sources and different distortion functions. In this way, the theory of limited distortion source coding is further improved.

2. Derivation of the Definition Domain of the Information Rate Distortion Function

The independent variable of the information rate distortion function is the average distortion $D$ allowed by a limited distortion source coding algorithm, i.e. the upper limit placed on the average distortion. The definition domain question of the information rate distortion function asks for the minimum and maximum values of the average distortion when the source and the distortion function are known. The actual average distortion $\bar{D}$ must satisfy the fidelity criterion

$$\bar{D} \le D \tag{1}$$

The average distortion is determined by the statistical properties of the source, the coding mapping function (the test channel) and the distortion function, that is [1]

$$\bar{D} = \sum_{i=1}^{n} \sum_{j=1}^{m} p(a_i)\, p(b_j \mid a_i)\, d(a_i, b_j) \tag{2}$$
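As a numerical sketch of Equation (2), the average distortion can be computed directly; the binary source probabilities, test-channel matrix and distortion matrix below are hypothetical values chosen for illustration, not from the paper:

```python
# Average distortion D_bar = sum_i sum_j p(a_i) p(b_j|a_i) d(a_i, b_j),
# Equation (2), with hypothetical numbers.
p = [0.6, 0.4]        # source probabilities p(a_i)
q = [[0.9, 0.1],      # test-channel matrix p(b_j | a_i)
     [0.2, 0.8]]
d = [[0, 1],          # Hamming distortion d(a_i, b_j)
     [1, 0]]

D_bar = sum(p[i] * q[i][j] * d[i][j]
            for i in range(len(p)) for j in range(len(d[0])))
# D_bar = 0.6*0.1 + 0.4*0.2 = 0.14
```

The fidelity criterion of Equation (1) then requires that this value not exceed the allowed distortion $D$.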

2.1. The Lower Bound of the Definition Domain of the Information Rate Distortion Function

Each single-symbol distortion $d(a_i, b_j)$ is a nonnegative real number, and $\bar{D}$ is its mathematical expectation; therefore $\bar{D}$ is also a nonnegative real number with lower limit 0, so the smallest permissible average distortion satisfies $D_{\min} \ge 0$, and $D_{\min} = 0$ is the case in which no distortion at all is allowed. Whether the average distortion can actually reach this lower limit depends on the distortion function of the single symbols. When solving for the minimum average distortion of the source, for each $a_i$ we find the $b_j$ that makes $d(a_i, b_j)$ minimal; these minima differ for different $a_i$. This is equivalent to finding the smallest element in each row of the distortion matrix. The mathematical expectation of all these row minima is the minimum average distortion of the source [1], that is

$$D_{\min} = \sum_{i=1}^{n} p(a_i) \min_{j} d(a_i, b_j) \tag{3}$$

It can be seen from Equation (3) that when every row of the distortion matrix contains at least one zero element, the average distortion of the source can reach the lower bound $D_{\min} = 0$; otherwise $D_{\min}$ cannot equal zero. When $D = 0$, the source does not allow any distortion, which is equivalent to a noiseless channel. At this point, the amount of information transmitted by the channel equals the entropy of the source, that is

$$R(0) = H(X) \tag{4}$$

For Equation (4) to hold, each row of the distortion matrix must contain at least one zero, and each column can contain at most one zero. Otherwise $R(0)$ can be less than $H(X)$, which means that the source symbols are redundant and some symbols can be compressed and merged without introducing any distortion.
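Equation (3) can be sketched in a few lines; the three-symbol source and distortion matrix below are hypothetical, with the second row deliberately containing no zero so that $D_{\min} > 0$:

```python
# Lower bound D_min = sum_i p(a_i) * min_j d(a_i, b_j), Equation (3).
# Hypothetical source: row 2 of the distortion matrix has no zero
# element, so the minimum average distortion cannot reach 0.
p = [0.5, 0.3, 0.2]
d = [[0,   1,   2],
     [1,   0.5, 1],
     [2,   1,   0]]

D_min = sum(pi * min(row) for pi, row in zip(p, d))
# D_min = 0.5*0 + 0.3*0.5 + 0.2*0 = 0.15 > 0
```

Replacing the middle row's 0.5 with a 0 would bring $D_{\min}$ back to zero, illustrating the row-wise zero condition stated above.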

2.2. The Upper Bound of the Definition Domain of the Information Rate Distortion Function

For the maximum average distortion $D_{\max}$ of the source, the smaller the required compression rate, the greater the tolerated distortion. When $R(D) = 0$, the corresponding average distortion is maximal, and $D_{\max}$ is the upper bound of the definition domain of $R(D)$. Since the information rate distortion function is the minimum of the average mutual information, when $D \ge D_{\max}$ the minimum of the average mutual information equals 0, i.e. $I(X;Y) = 0$. Because the mutual information is a nonnegative function, $R(D)$ is also a nonnegative function, and for $D \ge D_{\max}$ it can only remain equal to 0. Therefore, the upper bound $D_{\max}$ of the definition domain is the minimum value over all $D$ that satisfy $R(D) = 0$.

$R(D) = 0$ is equivalent to statistical independence of input and output: no information sent by the source reaches the receiving end, which is equivalent to the source sending no information at all, so the information rate of the transmitted source symbols can be compressed to 0. Therefore $p(b_j \mid a_i)$ has nothing to do with $a_i$, that is

$$p(b_j \mid a_i) = p(b_j), \quad i = 1, 2, \ldots, n \tag{5}$$

Substituting Equation (5) into Equation (2), $D_{\max}$ can be solved as

$$D_{\max} = \min_{p(b_j)} \sum_{j=1}^{m} p(b_j) \sum_{i=1}^{n} p(a_i)\, d(a_i, b_j) \tag{6}$$

From Equation (6) it can be observed that in the output alphabet $B = \{b_1, \ldots, b_m\}$ we can find the $b_j$ that minimizes $\sum_{i} p(a_i)\, d(a_i, b_j)$. When the output probability $p(b_j) = 1$ for this $b_j$ and $p(b_j) = 0$ for all the rest, the right-hand side of Equation (6) is minimized. At this point, the formula above can be simplified to

$$D_{\max} = \min_{1 \le j \le m} \sum_{i=1}^{n} p(a_i)\, d(a_i, b_j) \tag{7}$$

Equation (7) says: for each $b_j$, take the mathematical expectation of $d(a_i, b_j)$ under the input distribution $p(a_i)$; the minimum of these expectations is $D_{\max}$, attained by assigning output probability $p(b_j) = 1$ to the minimizing $b_j$ and $p(b_j) = 0$ to the others.

Thus, the definition domain of $R(D)$ is $[D_{\min}, D_{\max}]$.
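The column-minimum rule of Equation (7) is easy to sketch; the source and distortion matrix below are hypothetical illustrative values:

```python
# Upper bound D_max = min_j sum_i p(a_i) d(a_i, b_j), Equation (7):
# concentrate all output probability on the column with the smallest
# expected distortion under the input distribution.
p = [0.5, 0.3, 0.2]
d = [[0,   1,   2],
     [1,   0.5, 1],
     [2,   1,   0]]

column_means = [sum(p[i] * d[i][j] for i in range(len(p)))
                for j in range(len(d[0]))]
D_max = min(column_means)
# column expectations: 0.7, 0.85, 1.3  ->  D_max = 0.7
```

Together with the $D_{\min} = 0.15$ obtained from the same matrix under Equation (3), this gives a definition domain of $[0.15,\ 0.7]$ for this hypothetical source.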

2.3. The Definition Domain of the Rate Distortion Function for Different Sources

Consider a source with $n$ equiprobable symbols and the Hamming distortion

$$d(a_i, b_j) = \begin{cases} 0, & i = j \\ 1, & i \ne j \end{cases} \tag{8}$$

Then the rate distortion function of the information source is [2]

$$R(D) = \log n - H(D) - D \log (n-1), \quad 0 \le D \le \frac{n-1}{n} \tag{9}$$

where $H(D) = -D \log D - (1-D) \log (1-D)$ is the binary entropy function.

From Equation (9), the values $D_{\min} = 0$ and $D_{\max} = (n-1)/n$ can be computed; the corresponding simulation of $R(D)$ based on MATLAB is shown in Figure 1.
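Equation (9) and its domain endpoints can be checked numerically; the paper uses MATLAB, but an equivalent Python sketch is:

```python
from math import log2

def rate_distortion_hamming(D, n):
    """R(D) = log2(n) - H(D) - D*log2(n-1) for the n-ary equiprobable
    source under Hamming distortion, Equation (9), on 0 <= D <= (n-1)/n."""
    if D <= 0:
        return log2(n)             # R(0) = H(X) = log2(n), Equation (4)
    if D >= (n - 1) / n:
        return 0.0                 # R(D) = 0 for D >= D_max = (n-1)/n
    H = -D * log2(D) - (1 - D) * log2(1 - D)   # binary entropy of D
    return log2(n) - H - D * log2(n - 1)

# endpoints: R(D_min) = R(0) = log2(n), and R(D_max) = 0 at D_max = (n-1)/n
```

Evaluating this function on a grid of $D$ values reproduces the monotonically decreasing curve of Figure 1, falling from $\log n$ at $D_{\min} = 0$ to zero at $D_{\max} = (n-1)/n$.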

For sources with unequal probability distributions, several typical sources are considered, and the corresponding upper and lower bounds obtained by the above derivations are listed in Table 1.

3. Conclusion

In this paper, the upper and lower bounds of the definition domain of the information rate distortion function are deduced and analyzed. The source type and the distortion function determine the expression of the information rate distortion function and the upper and lower bounds of its definition domain.

Figure 1. R(D) curve.

Table 1. The upper and lower bounds for typical information sources.

Ideally, one assumes that the lower bound of the definition domain of the information rate distortion function is 0, but in practical applications it is necessary to consider whether this lower bound can actually be taken as zero. Accurately speaking, the definition domain of $R(D)$ is $[D_{\min}, D_{\max}]$. The value of $D_{\min}$ is related to whether the transmission rate can reach the entropy of the source. Therefore, it is necessary to calculate the maximum information transmission rate after source coding, so as to solve various practical problems in information transmission. By precisely deducing the lower bound of the definition domain, this paper explains in detail whether $D_{\min}$ can take the value zero and under what conditions. This further perfects the theory of limited distortion source coding and has practical significance for data compression and data transmission of real sources.

Acknowledgements

This work was funded by the National Natural Science Foundation of China―Research on no proportion coding cooperative transmission method based on dynamic antenna selection in large scale MIMO systems (61601170) and the Henan Provincial Department of Science and Technology project―Study on wireless channel characteristics of bulk grain reactor (172102210230).

Conflicts of Interest

The authors declare no conflicts of interest.

[1] Cao, X.-H. and Zhang, Z.-C. (2009) Information Theory and Coding. 2nd Edition, Tsinghua University Press, Beijing.

[2] Li, Y.-L. and Zhao, J. (2006) Calculation Method of Discrete Source Rate Distortion Function. Water Conservancy Science and Technology and Economy, 12, 564-566.

[3] Lu, C.-G. (2012) The Relation between GPS Information and Error Limited Information Rate and Information Rate Distortion and Complexity Distortion. Journal of Chengdu University of Information Technology, 6, 615-622.

[4] Jiang, S.-F. (2006) Information Rate Distortion Function R(D) Prediction-Correction Calculation Method. Journal of Shanghai University of Electric Power, 2, 187-191.

[5] You, X.-X. and Wang, J.-H. (2014) Calculation Method of Information Rate Distortion Function Based on Reverse Test Channel. Journal of Hubei Normal University (Natural Science Edition), 4, 12-16.


Copyright © 2023 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.