Wavelets and Entropy for Power Quality Assessment

In this paper, the wavelet transform and entropy are evaluated in light of three concepts from mathematical analysis: reflexivity, regularity, and series construction. These concepts motivate a selective reference framework for power quality applications. With this idea, the paper applies the same treatment to the two algorithms (multiresolution analysis and multiscale entropy). The wavelet transform is shown to be the more consistent of the two in light of reflexivity, regularity, and series construction. The paper proposes a power quality technique named MpqAT.


Introduction
The natural harmony of Fourier analysis in Hilbert space is demonstrated by the Riesz-Fischer [1] and Plancherel [2] theorems. In this harmony, three concepts are summarized: reflexivity, regularity, and series construction.
These three concepts are shown in two ways: first, through the wavelet transform, and second, through numerical entropy.
In [1], an approximation of the n-dimensional wavelet transform was shown through a heuristic treatment. Following the same methodology, this article aims to show coincidences between the two methods (wavelets and entropy), highlighting three basic concepts: reflexivity, regularity, and series construction. The orientation of these three concepts determines how engineers approach definitions of quality.
In particular, power quality identifies the health state of a power system by applying signal processing techniques to current and voltage waveforms. Power quality is therefore the concept against which the proposed definition will be tested.

Wavelet Transform
Let ψ ∈ L²(R) be a basic (admissible) wavelet. The wavelet transform of f ∈ L²(R) relative to ψ is the function

W_ψ f(a, b) = |a|^(−1/2) ∫ f(t) ψ*((t − b)/a) dt,  a ≠ 0, b ∈ R.

Sampling the scale and translation parameters dyadically, a = 2^(−j) and b = k·2^(−j), the coefficients of the wavelet series are

c_jk = 2^(j/2) ∫ f(t) ψ*(2^j t − k) dt = W_ψ f(2^(−j), k·2^(−j)).

Thus the c_jk terms of the wavelet series are obtained from the wavelet transform. The demonstration can be seen in [3].
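Since the transform above is an integral, it can be approximated numerically. The following is a minimal sketch, not from the paper: the helper `cwt_coeff` and the Mexican-hat wavelet are illustrative choices used only to show the definition in action.

```python
import numpy as np

def cwt_coeff(f, t, psi, a, b):
    """Riemann-sum approximation of W_psi f(a, b) = |a|^(-1/2) * integral f(t) psi*((t-b)/a) dt."""
    dt = t[1] - t[0]
    return np.sum(f * np.conj(psi((t - b) / a))) * dt / np.sqrt(abs(a))

def mexican_hat(u):
    # Mexican-hat wavelet (second derivative of a Gaussian, up to normalization).
    return (1.0 - u**2) * np.exp(-u**2 / 2.0)

t = np.linspace(-10.0, 10.0, 2001)
f = np.exp(-t**2)                 # a Gaussian bump centred at t = 0

# The transform responds strongly where the wavelet overlaps the bump.
near = cwt_coeff(f, t, mexican_hat, a=1.0, b=0.0)
far = cwt_coeff(f, t, mexican_hat, a=1.0, b=5.0)
print(near > far)                 # True
```

The coefficient at (a, b) = (1, 0) is large because the wavelet sits on the bump; at b = 5 the overlap, and hence the coefficient, is nearly zero.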

Entropy
From a physical point of view, entropy is the concept that measures the tendency towards disorder in nature. This concept has seen important development in its derived applications, for example, evaluation of efficiency in electric motors or power systems. Philosophically, the concept has been used for its implications in understanding natural elements and their interaction with life.
With regard to information, entropy makes information visible as a message, which must create a link between sender and receiver by means of propagation or transmission (whether physical or abstract). Entropy characterizes the degree of difficulty that nature imposes on this link: noise, interruptions, and so on.
These are disturbances in the message during transmission, and they can represent loss of information as a result of conditions in the system formed by the source, transmitter, and receiver.
In this particular case, the tendency to decrease information can be visualized as loss or disorder, and so, it is visualized as a form of entropy [4].

Information Entropy
Information entropy is also known as "Shannon entropy". The coding theorem focuses its attention on the random behavior of nature, such as disturbing elements or noise [5].
An extensive property is one that can be defined through the analysis of systems composed of subsystems: its value for the whole system is obtained from its values for the parts.
Entropy is an extensive property. The information contained in two information channels should equal the sum of the information carried by the two channels individually [6].
Entropy is defined as a measure of uncertainty for a random variable [5].
Shannon's entropy H(X) is defined as

H(X) = − Σ_{x∈Θ} p(x) log p(x),

where X is a random variable taking values in the set Θ with probability p(x) = Pr{X = x}. The logarithm is generally taken in base 2, in which case entropy is expressed in bits (for example, the entropy of throwing a fair die is log₂ 6 ≈ 2.585 bits). Note that −log p(x) ≥ 0 because 0 ≤ p(x) ≤ 1; therefore, entropy is non-negative, as observed in reference [4]. If the natural (Napierian) logarithm of base e is used instead, the entropy is measured in nats [7].
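The definition above is a direct sum over the probability mass function. A minimal sketch (the helper `shannon_entropy` is an illustrative name, not from the paper):

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum p*log(p); zero-probability outcomes contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Fair six-sided die: H = log2(6) ≈ 2.585 bits.
die = [1.0 / 6.0] * 6
print(shannon_entropy(die))                 # ≈ 2.585 (bits)
print(shannon_entropy(die, base=math.e))    # same entropy in nats, ≈ 1.792
```

A deterministic outcome (`[1.0]`) gives H = 0, the minimum, consistent with entropy being non-negative.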
For a time series representing the output of a stochastic process, that is, an ordered set of random variables X₁, …, X_n with sets of values Θ₁, …, Θ_n respectively, the n-dimensional entropy is defined as

H(X₁, …, X_n) = − Σ p(x₁, …, x_n) log p(x₁, …, x_n),

where p(x₁, …, x_n) is the joint probability for the n variables X₁, …, X_n.
The state of a system at a certain moment, X_n, is partly determined by its history X_{n−1}, X_{n−2}, …, X₁. However, each new state of the system brings a certain amount of new information with it.

Coexistence of Reflexibility
Theorem 1 (Reflexivity): Let E be a Hilbert space with dual E′, and denote by 〈·,·〉 the duality pairing between E and E′. Part B of the theorem involves demonstrating the existence of the inverse transform [2].

The Theorem for Information Entropy
This is established by definition: the entropy of a sample is a single scalar summary, so the sample is not recoverable; no inverse transformation can be obtained, and there is no recovery map. Reflexivity therefore fails for entropy.

Characterization of Regularity
The Parseval theorem and the central limit theorem: by definition, regularity indicates the variation of a quantity with respect to its mean.

The Parseval Identity as It Relates to Wavelet Transform
In the wavelet transform, the Parseval identity is used to characterize the regularity of f in L², as measured by the Sobolev norm.
In other words, applying the Parseval identity,

∫∫ |W_ψ f(a, b)|² (da db)/a² = C_ψ ∫ |f(t)|² dt.

NOTE: the change in the order of integration is performed in accordance with Fubini's theorem, and the second integral on the right-hand side is the admissibility constant C_ψ. Applying the Parseval equality again, the total energy of the signal is decomposed into the energy of each element or component in the frequency bands (wavelets).
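The energy-preservation idea behind Parseval can be checked numerically in the discrete (FFT) setting, where the identity reads Σ|x[n]|² = (1/N) Σ|X[k]|². A minimal sketch, using NumPy's FFT convention:

```python
import numpy as np

# Discrete Parseval identity: sum |x[n]|^2 == (1/N) * sum |X[k]|^2 for X = FFT(x).
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
X = np.fft.fft(x)

time_energy = np.sum(np.abs(x) ** 2)
freq_energy = np.sum(np.abs(X) ** 2) / len(x)
print(np.isclose(time_energy, freq_energy))  # True
```

The 1/N factor comes from NumPy's unnormalized forward FFT; with an orthonormal transform (e.g. `norm="ortho"`) the two sums match directly.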

Regularity Basis of Entropy
The regularity basis of entropy comes from the central limit theorem. Suppose that m is a probability measure on the data from the real signal; the convergence in (16) yields the definition of entropy, and from this definition the following is obtained:

Approximate Entropy (ApEn) and Sample Entropy (SampEn) Algorithms
Derived from Shannon's work, Pincus [8] proposed the approximate entropy algorithm (ApEn), which measures regularity from the point of view of mathematical analysis [9].
Description of the ApEn algorithm: given a time series of N samples, two input parameters, m and r, must be specified. These come from the correlation-dimension parameters postulated by Grassberger and Procaccia [10]. Parameter m is the length of the vectors u_m(i) generated from the data, that is, the number of consecutive samples taken from the series. Parameter r is the tolerance: the distance within which points are considered close to a reference point.
According to the length value m, the vectors u_m(i) = [x(i), x(i+1), …, x(i+m−1)], i = 1, …, N−m+1, are formed. These vectors represent m consecutive values of the time series x, starting with the i-th element, as shown in Figure 1.
The distance between vectors u_m(i) and u_m(j) is defined as the maximum absolute difference between their components:

d[u_m(i), u_m(j)] = max_{k=0,…,m−1} |x(i+k) − x(j+k)|.

Each element C_i^m(r) then measures the regularity, or frequency, of similar values within length m and tolerance r [11].
C_i^m(r) is constructed as in (19):

C_i^m(r) = (number of j ≤ N−m+1 with d[u_m(i), u_m(j)] ≤ r) / (N−m+1).

Φ^m(r) is the average over i of the logarithm of C_i^m(r):

Φ^m(r) = (N−m+1)^(−1) Σ_{i=1}^{N−m+1} ln C_i^m(r).

Therefore, ApEn is estimated as

ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r).

Equations (22) and (23) define the SampEn counts using the number of pairs u_m(i), u_m(j) that comply with tolerance r, with the condition i ≠ j, so that the pairing of a vector with itself is not taken into account. With B the number of matching pairs of length m and A the number of matching pairs of length m+1, Richman and Moorman defined the sample entropy, which is estimated in statistics [13] as

SampEn(m, r, N) = −ln(A/B).
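The SampEn estimate above can be written compactly. The following is a simplified sketch, not the authors' implementation: it counts matching template pairs at lengths m and m+1 with i < j (so self-matches are excluded, as required), and the tolerance r is expressed as a fraction of the standard deviation, a common convention.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Simplified SampEn(m, r) = -ln(A/B), self-matches excluded (i != j)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    N = len(x)

    def count_matches(length):
        # Overlapping template vectors of the given length.
        templates = np.array([x[i:i + length] for i in range(N - length + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (i < j avoids self-matches).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    B = count_matches(m)        # pairs matching for m points
    A = count_matches(m + 1)    # pairs still matching for m + 1 points
    return -np.log(A / B)

# A smooth periodic series is highly regular, so its SampEn is small.
val = sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 400)), m=2, r=0.2)
print(val)
```

Since every (m+1)-point match is also an m-point match, A ≤ B and the estimate is non-negative, mirroring the regularity interpretation in the text.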

Approach of the Series
With regularity defined, it is necessary to introduce a series approach: decomposing a signal into components, so that the degree of regularity of those components can be shown.

Axiomatic Definition of Multiresolution Using Wavelets
An intuitive idea of the division of the spectrum into a series of discrete bands, using filters, is represented in Figure 2.

Definition 2 [2]
A multiresolution structure is a sequence of closed subspaces {V_j} in L²(R) such that:
1) … ⊂ V_2 ⊂ V_1 ⊂ V_0 ⊂ V_{−1} ⊂ V_{−2} ⊂ …;
2) ∪_j V_j is dense in L²(R) and ∩_j V_j = {0};
3) f(t) ∈ V_j ⟺ f(2t) ∈ V_{j−1};
4) f(t) ∈ V_0 ⟹ f(t − n) ∈ V_0 for all n ∈ Z;
5) there exists a scale function φ ∈ V_0 such that {φ(t − n)}, n ∈ Z, is an orthonormal basis of V_0.
With W_{j+1} the orthogonal complement of V_{j+1} in V_j, one writes V_j = V_{j+1} ⊕ W_{j+1}; the symbol ⊕ should be interpreted as the orthogonal sum of two subspaces.
From Figure 2, one can observe that W_{j+1} is the orthogonal complement of V_{j+1} in V_j; W_j is the subspace of a band-limited portion of the spectrum. The construction begins with {ψ_{m,n}}, n ∈ Z, an orthonormal basis for W_m, where W_m is the orthogonal complement of V_m in V_{m−1}. A demonstration can be seen in [1].
By virtue of the previous theorem, the simple choice a₀ = 2 and b₀ = 1 generates an orthonormal basis of functions. Observation: from the multiresolution analysis we therefore have two families of spaces and, for each of them, a set of generating functions.
As V₁ ⊆ V₀ and W₁ ⊆ V₀, the functions of these subspaces are limits (in L²) of linear combinations of the basis functions of V₀. There is a sequence {v(k)} such that

φ(t) = Σ_k v(k) √2 φ(2t − k).

Reorganizing internal parentheses, and in accordance with the sampling theorem, the same two-scale relation results; since φ(t) satisfies this equation between two scales, it is called the scale function.
In the same way, {w(k)} is a sequence such that

ψ(t) = Σ_k w(k) √2 φ(2t − k).

The following relationships result from orthogonality:

Σ_k v(k) v(k + 2m) = δ(m),  Σ_k v(k) w(k + 2m) = 0,

where δ(m) is a generalized function, the Dirac delta [1]. The inner product between functions is symbolized by 〈·,·〉.
The V spaces are generated by the scale functions φ(t) and, similarly, the W spaces are generated by the wavelet functions ψ(t).
In other words, wavelet functions and scale functions are used as blocks with which to construct or decompose the signal at different levels of resolution. The wavelet functions generate the detail versions of the composite signal, and the scale function generates the approximate version of the signal under decomposition. This can be represented mathematically by

f(t) = Σ_k c(k) φ(t − k) + Σ_j Σ_k d_j(k) 2^{j/2} ψ(2^j t − k),   (37)

where c(k) are the scale coefficients, d_j(k) are the wavelet coefficients at scale j, φ(t) and ψ(t) are the scale and wavelet functions, respectively, and k is the translation index. Partial conclusion: this main result, proposed by the French mathematician Yves Meyer, is the core of the later (Section VII) assessment of power quality. Equation (37) exhibits all three central elements proposed in this article: it is reflexive, it characterizes regularity, and it is a series.
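The split V₀ = V₁ ⊕ W₁ can be made concrete with the simplest case, the Haar wavelet (an illustrative choice, not the paper's): one analysis step produces scale (approximation) and wavelet (detail) coefficients, and the inverse step recovers the signal exactly, which is the reflexivity claimed for Equation (37).

```python
import numpy as np

# One level of a Haar multiresolution split: V_0 = V_1 (+) W_1.
x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])

pairs = x.reshape(-1, 2)
approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)   # scale coefficients c
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)   # wavelet coefficients d_1

# Inverse step recovers the signal exactly (reflexivity of the transform).
rec = np.empty_like(x)
rec[0::2] = (approx + detail) / np.sqrt(2)
rec[1::2] = (approx - detail) / np.sqrt(2)
print(np.allclose(rec, x))  # True
```

Iterating the same step on `approx` yields the deeper levels V₂, V₃, …, i.e. the full multiresolution series of (37).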

Multiscale Entropy (MSE)
With the ApEn [14] and SampEn algorithms, the loss of regularity in a time series is measured. Madalena Costa et al. [13] proposed a reconstitution of the time series over several scales, with which they managed to increase the classification accuracy for the pathologies they study. This decomposition of the series is known as the multiscale entropy (MSE) algorithm. The decomposition process is shown in Figure 3 and is described below: for a scale factor τ, the original series is divided into non-overlapping windows of length τ, each coarse-grained point is the average of the samples in its window, and SampEn is then computed for each coarse-grained series.
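The coarse-graining step of MSE is a simple block average. A minimal sketch (the helper name `coarse_grain` is illustrative), following the standard construction of Costa et al.:

```python
import numpy as np

def coarse_grain(x, tau):
    """Coarse-grained series for scale factor tau:
    y_j = mean of x over the j-th non-overlapping window of length tau."""
    x = np.asarray(x, dtype=float)
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)

x = np.arange(1, 13)          # 1, 2, ..., 12
print(coarse_grain(x, 3))     # [ 2.  5.  8. 11.]
```

Scale τ = 1 returns the original series; larger τ progressively smooths fast fluctuations, so plotting SampEn of each coarse-grained series against τ gives the MSE curve.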

Applications for Power Quality
The three characteristics cited, reflexivity, regularity, and series, are indispensable properties for the design of a quality indicator.
Figure 3. Two series construction from entropy. Adapted from reference [13].
Quality itself can be understood as the valuation given to a physical object that results from a production process; this valuation expresses the object's degree of excellence. For this, it is necessary to have an instrument that allows measurement of a signal from the physical object to be evaluated. This signal must be placed in a comparative framework in which the degree of deviation from a reference is measurable; this deviation speaks to the degree of quality.

Definition 3-Measurement of Excellence
It is possible to normalize the workspace with a scalar value that represents the degree of excellence of a measured point versus its reference value. Since the analysis is performed on the signal, this normalized value is analogous to a signal-to-noise ratio.

Definition 4-The Quality Index
In Equation (39), x_i indicates the components that deviate from the reference x_r.
In this way, the definition of the quality index is reached. One can use a continuous-parameter or discrete-parameter space {L_n, l_n}. Since the work can be carried to the transformed frame, the measure and the integral within it are defined according to Lebesgue [2] [15], and the analysis can be performed in discrete space.

Representation of the Quality Index from the Wavelet Transform
where MSE is the multiscale entropy and k is the scale where the energy of major importance is concentrated relative to the rest. This will be the level of the quality deviation.
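One possible reading of this index, sketched below under assumptions of my own (a Haar decomposition and the helper name `haar_band_energies`, neither from the paper), is to compute the relative energy in each wavelet band and take the dominant band as the deviation scale k:

```python
import numpy as np

def haar_band_energies(x, levels):
    """Relative energy of each Haar detail band d_1..d_levels plus the
    final approximation; the dominant band flags the deviation scale."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        pairs = x.reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
        x = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
        energies.append(np.sum(detail ** 2))
    energies.append(np.sum(x ** 2))     # residual approximation energy
    e = np.array(energies)
    return e / e.sum()                  # sums to 1: Parseval for the orthonormal transform

# A 150 Hz tone sampled at 1600 Hz: its energy concentrates in one band.
t = np.arange(1024) / 1600.0
rel = haar_band_energies(np.sin(2 * np.pi * 150 * t), levels=5)
print(int(np.argmax(rel)))   # index of the dominant band
```

Because the transform is orthonormal, the relative energies sum to one, so the dominant-band index is a normalized indicator in the spirit of Definition 4.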

Framework for Evaluation-Modified pqAT Technique (MpqAT)
Much work has been done on the classification and characterization of disturbances [16]- [26].
Here a modification of the power quality Analysis Technique (pqAT) [27] is made. This is an algorithm whose objective is the characterization and classification of disturbances in the electrical system. The new technique, MpqAT, is a preliminary step towards quality maps [28].
The signal analysis method starts from the definition of the instantaneous power tensor, and the transformation is then performed in the wavelet-transform frame [1]. There, the parameters of active, reactive, and disturbance power are determined. Figure 4 shows the structure of the technique for power measurement and the classification of events in power systems.
In Figure 4, we see seven blocks, divided into three structures. One of these shows all aspects of error characterization (possibly followed by the performance of the protection system).
• These premises, accompanied by the inference engine, produce the results presented by the classification block. Block 6: in the calibration block, two parameters are set on which the entire multiresolution analysis depends, making the method entirely dependent on them. These parameters are the sampling frequency and the number of decomposition levels (in the case of wavelets, or the number of signal scales in the case of entropy).
For any number of levels, because the decomposition is dyadic (division of the frequency axis into octaves), the signal length is limited to a power-of-two number of samples, which agrees with most instruments that use the FFT.
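The two calibration parameters fix the frequency bands of the analysis: each dyadic level halves the band, so detail level d_j covers (fs/2^(j+1), fs/2^j]. A small sketch (the helper `dyadic_bands` and the example sampling frequency are illustrative assumptions):

```python
def dyadic_bands(fs, levels):
    """Frequency band covered by each detail level d_1..d_levels of a
    dyadic (octave) decomposition of a signal sampled at fs."""
    bands = []
    hi = fs / 2.0                      # Nyquist frequency
    for _ in range(levels):
        bands.append((hi / 2.0, hi))   # band for this level's detail coefficients
        hi /= 2.0
    return bands

# fs = 6400 Hz (128 samples per 50 Hz cycle), 4 decomposition levels.
for lo, hi in dyadic_bands(6400, 4):
    print(f"{lo:7.1f} - {hi:7.1f} Hz")
```

This makes the dependence explicit: changing either fs or the level count shifts every band, which is why the whole analysis hinges on these two calibration parameters.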
The most recent literature shows significant advances in the treatment of information from the point of view of classification techniques using the wavelet transform. From pioneering studies [27] to the results presented in [16] [17] [18] [19], two trends are observed: the first is analysis, and the second is classification.
With the MpqAT technique, an attempt is made to unify these two criteria and to give unity to the way of determining three typical effects of electromagnetic phenomena in three-phase systems: imbalance, harmonics, and transients.

Conclusions
Two techniques traditionally used in signal analysis, entropy and wavelets, have been compared. The comparison has focused on three criteria: reflexivity, regularity, and series construction.
The article showed that in the case of wavelet theory these three criteria are perfectly fulfilled. In the case of entropy, the concept of a series is introduced through an "artificial" signal reconstitution, but reflexivity is not fulfilled. Consequently, entropy is a valuable tool for regularity measurement only.
Additionally:
• Quality has been defined from two main points of view: as a series of attributes of a physical object, and as a degree of excellence that must be qualified according to that set of attributes.
− The first part of the definition involves decomposing the attribute into a measurable series using the property of regularity. The second proposes quantifying the degree of excellence through the definition of quality indexes, based on conservative-type parameters determined through energy definitions in the transformed frame (Parseval's theorem).
− Finally, any technique that exhibits reflexivity, regularity, and series decomposition is a candidate for use as a quality evaluation framework.
This article closes with a proposal to evaluate power quality using the structure of an expert system dedicated to the measurement and classification of perturbations, a system called MpqAT. The novelty of this technique is that, through the use of a single structure, the analysis of both transient and stationary perturbations in any frame of reference is unified. The system topology has been considered in the most general way possible and is based on the results obtained from the criteria of series, regularity, and reflexivity.