A Form of Information Entropy

In this paper, by axiomatic way, a form of information entropy will be presented on crisp and fuzzy setting. Information entropy is the unavailability of information about a crisp or fuzzy event. It will use measure of information defined without any probability or fuzzy measure: for this reason it is called general information.


Introduction
The setting of entropy was statistical mechanics: in [1] Shannon introduced entropy of a partition A π of a set A , linked to a probability measure.Now, we recall this definition.Let Ω be an abstract space,  a σ -algebra of subsets A of Ω and P a probability measure defined on  .Moreover  is the collection of the partition A π , where  Basic notions and notations can be found in [2].Setting ( ) ( ) and it was used in many problems [5] [6].
In generalizing Bolzmann-Gibbs statistical mechanics, Tsallis's entropy was introduced [7]: , , 1 . 1 We note that all entropies above are defined through a probability measure.In 1967 J. Kampé de Feriét and B. Forte gave a new definition of information for a crisp event, from axiomatic point of view, without using probability [8]- [10].Following this theory other authors have presented measures of information for an event [11].In [12], with Benvenuti we have introduced the measure of information for fuzzy sets [13] [14] without any probability or fuzzy measure.
In this paper we propose a class of measure for the entropy of an information for a crisp or fuzzy event, without using any probability or fuzzy measure.
We think that not using probability measure or fuzzy measure in the definition of entropy of the information of an event, can be an useful generalization in the applications in which probablility is not known.
So, in this note, we use the theory explained by Khinchin in [15] and we give a new definition of entropy of information of an event.In this way it is possible to measure the unavailability of information.
The paper is organized as follows.In Section 2 there are some preliminaries about general information for crisp and fuzzy sets.The definitions of entropy and its measure are presented in Section 3. Section 4 is devoted to an application.The conclusion is considered in Section 5.

General Information
Let Ω be an abstract space and  the σ -algebra of crisp sets .C ⊂ Ω General information J for crisp sets [8] [10] is a mapping J Ω = In analogous way [12], the definition of measure of general information was introduced by Benvenuti and ourselves for fuzzy sets.Let Ω be an abstract space and  the σ -algebra of fuzzy sets.General information is a mapping

General Information Entropy
Using general information recalled in Section 2, in this paragraph a new form of information entropy will be introduced, which will be called general information entropy.Information entropy means the measure of unavailability of a given information.

Crisp Setting
In the crisp setting as in Section 2, given information J the following definition is proposed.
Definition 3.1.General information entropy for crisp sets is a mapping ( )
So, general information entropy ( ) E ⋅ is a monotone, not-increasing function with ( ) It is possible to extend the definition above to fuzzy sets.

Fuzzy Setting
Given , , J ′ Ω  as in Section 2, the following definition is considered.Definition 3.2.General information entropy for fuzzy sets is a mapping ′ ′ ⋅ +∞ → +∞ with the following properties: 1) monotonicity: The universal values can be considered a consequence of monotonicity.So, general information entropy ( ) E′ ⋅ is a monotone, not-increasing function with ( ) E′ +∞ = Assigned information J ′ on  an example of this entropy is

Application to the Union of Two Disjoint Crisp Sets
In this paragraph, an application of information entropy will be indicated: it concerns the value of information entropy for the union of two disjoint crisp sets.The procedure of solving this problem is the following: first, the presentation of the properties, second the translation of these properties in functional equations, by doing so, it will be possible to solve these systems [16].It is possible to extend this application also to the union of two disjoint fuzzy sets.On crisp setting as in Section 2, let 1 C and 2 C two disjoint sets.In order to characterize information entropy of the union, the properties of this operation are used.The approach is axiomatic.The properties used by us are classical .
( ) where , , , , 0, x y z x y ′ ′∈ +∞ the properties ( ) ( )  lead to solve the following system of functional equations: We are looking for a continuous function Ψ as an universal law with the meaning that the equations and the inequality of the system ( ) ( )  must be satisfied for all variables on every abstract space satisfying to all restrictions.Proposition 4.1.A class of the solutions of the system ( ) ( ) where [ ] [ ] : 0, 0, h +∞ → +∞ is any continuous bijective and strictly decreasing function with ( ) h +∞ → Proof.The proof is based on the application of the theorem of Cho-Hsing Ling [17] about the representation of associative and commutative function with the right element (here it is +∞ ) as unit element.
 From ( 1) and (2) information entropy of the union of two disjoint set is expressed by h +∞ →

Conclusion
By axiomatic way, a new form of information entropy has been introduced using information theory without probability given by J. Kampé De Fériet and Forte.For this measure of information entropy, called by us, general because it doesn't contain any probability or fuzzy measure, it has been given a class of measure for the union of two crisp disjoint sets.


and it is measure of uncertainty of the system A π .Shannon's entropy is the weight arithmetic mean, where the weights are i p .Many authors have studied this entropy and its properties, for example: J. Aczél, Daróczy, C. T. Ny; for the bibliography we refer to[3] [4].Another entropy was introduced by Rényi, called entropy of order α , 1 α = / : is any continuous bijective and strictly decreasing function with ( )