Vast segments of the frequency spectrum are reserved for primary (licensed) users. These legacy users often under-utilize their reserved spectrum, causing bandwidth waste. Unlicensed (secondary) users can take advantage of this fact and exploit the spectral holes (vacant spectrum segments). Since spectrum occupancy is transient in nature, it is imperative that spectral holes be identified as quickly as possible. To accomplish this, we propose a novel adaptive spectrum sensing procedure that scans a wideband spectrum using the Hilbert–Huang Transform and detects the spectral holes present in it.

Wireless networks, to date, are regulated by fixed spectrum allocation policies: they operate in a particular time frame and certain frequency bands, and are also constrained to geographical regions. Recent trends show that certain radio bands are overcrowded while others are only moderately or sparsely used. In order to utilize the spectrum optimally and efficiently, cognitive radio technology has been proposed as a potential communication paradigm [

Spectrum Sensing: refers to detecting unused spectrum and sharing it without harmful interference to other users, by sensing spectrum holes. Spectrum sensing techniques can be classified into three categories, namely 1) transmitter detection, which inspects the received spectrum; 2) cooperative detection, which pools information from multiple cognitive radio users; and 3) interference-based detection, which uses an interference temperature model.

Spectrum Management: Once the spectral holes are known, the cognitive radio has to decide on the best spectrum band, among all available bands, to meet its Quality of Service requirements.

Spectrum Mobility: Since the objective is to use the spectrum dynamically, secondary users should be able to transition to better spectrum as it becomes available.

Spectrum Sharing: Involves spectrum scheduling for efficient spectrum usage.

In this paper the transmitter detection method is used. Some authors have used wideband joint detection for the presence of a primary user [
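
As an illustration of the transmitter-detection category, the sketch below implements a classical FFT-based energy detector, not the HHT procedure this paper develops: it splits a synthetic wideband signal into sub-bands and flags the low-energy ones as spectral holes. The sampling rate, the two occupied tones, the four-sub-band layout, and the threshold rule are all illustrative assumptions.

```python
import numpy as np

# Illustrative energy detector (not the paper's HHT method): flag
# low-energy sub-bands of a synthetic wideband signal as spectral holes.
rng = np.random.default_rng(1)
fs, n = 8000.0, 4096
t = np.arange(n) / fs
# assumed scenario: primary users occupy ~1 kHz and ~3 kHz; rest is idle
x = (np.sin(2 * np.pi * 1000 * t)
     + np.sin(2 * np.pi * 3000 * t)
     + 0.05 * rng.standard_normal(n))

spectrum = np.abs(np.fft.rfft(x)) ** 2
edges = np.linspace(0, len(spectrum), 5, dtype=int)  # four 1 kHz sub-bands
energy = np.array([spectrum[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])
holes = energy < 0.1 * energy.max()  # low energy -> vacant sub-band
print(holes)  # sub-bands 0-1 kHz, 1-2 kHz, 2-3 kHz, 3-4 kHz
```

With this scenario the detector marks the 0–1 kHz and 2–3 kHz sub-bands as holes, since only the 1 kHz and 3 kHz bands carry primary-user energy.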

Traditional data analysis methods are based on the assumption that the data are linear and stationary. In recent years, new data analysis methods have been introduced, for example wavelet analysis [

The Hilbert–Huang Transform is an empirically based data analysis method. Its basis of expansion is adaptive, so that physical meaning can be derived from nonlinear and non-stationary processes. The Hilbert–Huang Transform consists of two parts: empirical mode decomposition (EMD) and Hilbert spectral analysis (HSA). We have incorporated a variant of EMD, called Ensemble Empirical Mode Decomposition (EEMD), to deal with input signals that are contaminated with noise.

The empirical mode decomposition method is vital for dealing with data from non-stationary and nonlinear processes. The decomposition is based on the assumption that any data consists of different simple intrinsic modes of oscillation. Each intrinsic mode, linear or nonlinear, represents a simple oscillation, which has the same number of extrema and zero-crossings. Furthermore, the oscillation is symmetric with respect to the “local mean.” At any given time, the data may have many different coexisting modes of oscillation, one superimposed on the others; the result is the final complicated data. Each of these oscillatory modes is represented by an intrinsic mode function (IMF) with the following definition:

1) In the whole dataset, the number of extrema and the number of zero-crossings must either be equal or differ by at most one, and

2) At any point, the mean value of the envelope defined by the local maxima and the envelope defined by the local minima is zero.

An IMF represents a simple oscillatory mode as a counterpart to the simple harmonic function, but it is much more general: instead of constant amplitude and frequency, as in a simple harmonic component, the IMF can have a variable amplitude and frequency as functions of time.
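
The first defining property can be checked numerically by counting sign changes; the sketch below is a minimal Python illustration, where the tolerance argument and the sign-change counting scheme are our own assumptions rather than part of the formal definition.

```python
import numpy as np

def is_imf(x, tol=1):
    """Check IMF property 1 on a sampled signal: the number of extrema
    and the number of zero-crossings are equal or differ by at most one."""
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    # local extrema: sign changes in the first difference
    n_extrema = int(np.sum(d[:-1] * d[1:] < 0))
    # zero-crossings: sign changes in the signal itself
    n_zero = int(np.sum(x[:-1] * x[1:] < 0))
    return abs(n_extrema - n_zero) <= tol

t = np.linspace(0, 1, 1000, endpoint=False)
print(is_imf(np.sin(2 * np.pi * 10 * t)))        # a pure tone passes
print(is_imf(np.sin(2 * np.pi * 10 * t) + 5.0))  # an offset tone fails
```

The offset tone fails because adding a constant removes all zero-crossings while leaving the extrema count unchanged, violating property 1.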

IMFs can be generated by following the steps:

1) Identify all the local maxima and join these points using a cubic spline to form the upper envelope.

2) Repeat the above procedure for the local minima to form the lower envelope.

3) The mean of the envelopes is designated as m_{1}.

4) The difference between the data and mean is the first component h_{1}.

h_{1} is treated as a proto-IMF. In the next step, h_{1} is treated as the input data, and this procedure continues until the new component satisfies the IMF definition. The resulting IMF is then subtracted from the data, and the sifting procedure is repeated on the remainder until we obtain a monotonic residue.
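
The sifting steps above can be sketched as follows. This is a minimal illustration, assuming SciPy cubic splines for the envelopes, a fixed sift count standing in for the formal IMF test, and a simple extrema-count check standing in for the monotonic-residue test; the thresholds are illustrative choices.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(t, x):
    """One sifting pass (steps 1-4): h1 = x - m1."""
    imax = argrelextrema(x, np.greater)[0]
    imin = argrelextrema(x, np.less)[0]
    if len(imax) < 4 or len(imin) < 4:
        return x  # too few extrema to build reliable spline envelopes
    upper = CubicSpline(t[imax], x[imax])(t)  # upper envelope
    lower = CubicSpline(t[imin], x[imin])(t)  # lower envelope
    m1 = (upper + lower) / 2.0                # mean of the envelopes
    return x - m1                             # proto-IMF h1

def emd(t, x, max_imfs=5, n_sift=10):
    """Extract IMFs by repeated sifting until the residue is (nearly) monotonic."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        # a near-monotonic residue has too few extrema to sift further
        if len(argrelextrema(residue, np.greater)[0]) < 4:
            break
        h = residue.copy()
        for _ in range(n_sift):  # fixed sift count replaces the formal IMF test
            h = sift_once(t, h)
        imfs.append(h)
        residue = residue - h
    return imfs, residue
```

By construction the decomposition is complete: summing the extracted IMFs and the final residue reproduces the input signal.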

To overcome some of the shortcomings of EMD, the noise-assisted data analysis (NADA) method EEMD was incorporated [

1) A collection of white noise realizations cancel each other out in a time-space ensemble mean; therefore, only the signal can survive and persist in the final ensemble mean of the noise-added signal.

2) Finite, not infinitesimal, amplitude white noise is necessary to force the ensemble to exhaust all possible solutions; the finite-magnitude noise makes the different scale signals reside in the corresponding IMFs, dictated by the dyadic filter banks, and renders the resulting ensemble mean more meaningful.
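
A minimal EEMD sketch built on these two principles adds finite-amplitude white noise, decomposes each noisy copy, and averages the resulting IMFs over the ensemble. The `emd_func` callback, the noise level relative to the signal's standard deviation, and the ensemble size are illustrative assumptions.

```python
import numpy as np

def eemd(t, x, emd_func, noise_std=0.2, n_ensemble=50, seed=0):
    """Ensemble EMD sketch: average IMFs of many noise-added copies.

    `emd_func(t, x)` is assumed to return (list_of_imfs, residue).
    Principle 2: the finite-amplitude noise populates the dyadic filter
    banks; Principle 1: averaging over the ensemble cancels the noise.
    """
    rng = np.random.default_rng(seed)
    sigma = noise_std * np.std(x)  # noise amplitude relative to the signal
    trials = []
    for _ in range(n_ensemble):
        noisy = x + rng.normal(0.0, sigma, size=x.shape)
        imfs, _ = emd_func(t, noisy)
        trials.append(imfs)
    # average IMF-by-IMF over the trials (truncate to the common count)
    n = min(len(tr) for tr in trials)
    return [np.mean([tr[k] for tr in trials], axis=0) for k in range(n)]
```

With a large enough ensemble, the added noise in each averaged IMF shrinks roughly as one over the square root of the ensemble size, so the signal content survives while the injected noise cancels.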

After obtaining the IMFs, the Hilbert transform is applied to each component to obtain its instantaneous amplitude and instantaneous frequency as functions of time; together these yield the time-frequency energy representation known as the Hilbert spectrum.

The combination of ensemble empirical mode decomposition and Hilbert spectral analysis is also known as the “Hilbert–Huang transform” (HHT) for short. Empirically, all tests indicate that the Hilbert–Huang transform is a superior tool for time-frequency analysis of nonlinear and non-stationary data. It is based on an adaptive basis, and the frequency is defined through the Hilbert transform. Consequently, there is no need for spurious harmonics to represent nonlinear waveform deformations, as in any of the a priori basis methods, and there is no uncertainty-principle limitation on time or frequency resolution arising from convolution pairs based on an a priori basis.
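
The second stage, Hilbert spectral analysis, can be illustrated on a single stand-in IMF: the analytic signal from `scipy.signal.hilbert` gives the instantaneous amplitude, and the derivative of the unwrapped phase gives the instantaneous frequency. The 50 Hz tone and 1 kHz sampling rate below are illustrative choices.

```python
import numpy as np
from scipy.signal import hilbert

# Instantaneous amplitude/frequency of one IMF via the Hilbert transform.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
imf = np.sin(2 * np.pi * 50 * t)  # stand-in IMF: a pure 50 Hz tone

analytic = hilbert(imf)                        # x(t) + j * H[x](t)
amplitude = np.abs(analytic)                   # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))          # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency (Hz)

# away from the signal edges the estimate sits at the tone frequency (~50 Hz)
print(round(float(np.median(inst_freq)), 1))
```

Unlike a simple harmonic component, a real IMF would produce amplitude and frequency that vary with time; plotting amplitude against time and instantaneous frequency for every IMF yields the Hilbert spectrum.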