Image registration is an important problem in medical image analysis. In this process, the spatial transformation that aligns the reference image and the floating image is estimated by optimizing a similarity metric. Mutual information (MI), a popular similarity metric, is a reliable criterion for medical image registration. In this paper, we present an improved method for multimodal image registration based on maximization of a new form of normalized MI, incorporating particle swarm optimization (PSO) as a search strategy. A new hybrid PSO algorithm is also applied to obtain more precise and robust results with better performance.

Medical image registration plays an increasingly important role in many clinical applications, including the detection and diagnosis of diseases, planning of therapy, guidance of interventions, and the follow-up and monitoring of patients [

Medical images acquired by different sensors (modalities) can essentially be grouped into two categories: first, anatomical images such as computed tomography (CT), magnetic resonance (MR) and ultrasound (US), which show body organs in their overall structure; second, functional images such as positron emission tomography (PET) and single photon emission computed tomography (SPECT), which show soft tissues and their internal activities. The aim of multimodal medical image registration is to combine data from different modalities to obtain more complete and detailed information about the patient.

The first step in registering two images is selecting some common properties of images and then matching them [

Over the last few years, mutual information (MI) has become one of the most popular and widely studied similarity criteria for intensity-based registration [

For the optimization of the similarity measure, local or global methods can be used. Local methods include steepest gradient descent, Powell’s direction set, conjugate gradient, and Levenberg-Marquardt [

On the other hand, the conventional GA and PSO often fail to find the global optimum. Therefore, a new approach named hybrid particle swarm optimization (HPSO) has been proposed, which incorporates two concepts of GA (subpopulation and crossover) into the conventional PSO [

In this paper we present brain image registration with an affine transformation by maximizing a modified logarithmic NMI (MNMI), using the proposed HPSO as the optimization algorithm. Section 2 describes the registration method and Section 3 explains PSO and HPSO. The experimental results and the conclusion are presented in Sections 4 and 5, respectively.

The required steps in image registration are shown in

In this equation, the left-hand side is the optimal transformation, x denotes the coordinates of the image, and T denotes the transformation together with its parameters, written as a single symbol for simplicity.

Image registration algorithms can be classified into two categories: rigid and non-rigid registration. A rigid transformation involves only translation and rotation parameters, whereas a non-rigid transformation contains these parameters as well as other changes.

The affine transformation is a non-rigid transformation that maps straight lines to straight lines and preserves the parallelism between lines. It estimates rotation, scaling, shear and translation parameters, which can be written as the matrices R, S, H and T, respectively, as below

The affine transformation matrix, A, is

The two-dimensional affine transformation used in this paper contains the parameters t_x, t_y, θ and s, representing the translations along the x and y axes, the rotation and the scaling, respectively. Eq.7 shows the mapping of image coordinates based on these parameters.
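As a minimal illustration of this four-parameter mapping (the function and variable names below are ours, not the paper’s), the translation-rotation-scaling case of Eq.7 can be sketched in NumPy:

```python
import numpy as np

def affine_2d(tx, ty, theta, s):
    """Build the 2x3 affine matrix for translation (tx, ty),
    rotation theta (in radians) and isotropic scaling s."""
    c, n = np.cos(theta), np.sin(theta)
    return np.array([[s * c, -s * n, tx],
                     [s * n,  s * c, ty]])

def transform_points(A, pts):
    """Apply a 2x3 affine matrix to an (N, 2) array of (x, y) coordinates."""
    homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    return homo @ A.T

# Pure translation: every point shifts by (2, -1).
A = affine_2d(tx=2.0, ty=-1.0, theta=0.0, s=1.0)
print(transform_points(A, np.array([[0.0, 0.0], [1.0, 1.0]])))
```

In a full registration these four parameters would be the unknowns that the optimizer searches over.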

Mutual information is a reliable and widely used gray-level-based method for measuring the similarity between two images. It is a concept from information theory that measures the statistical correlation between two datasets, and is based on the Shannon entropy [

Shannon entropy weights the information per outcome by the probability of that outcome occurring. The Shannon entropy can also be computed for an image, in which case we focus on the distribution of the gray values of the image. If each pixel in an image is viewed as a random event, the information the image contains can be measured by the Shannon entropy. Shannon’s entropy can thus be viewed as a measure of uncertainty, or of how much information an image contains.

For an image in which the probability of a pixel having gray level x is p(x), the Shannon entropy of the image can be defined as

Here p(x) denotes the probability of occurrence of gray level x. The Shannon entropy is also a measure of the dispersion of a probability distribution.
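To make Eq.8 concrete, here is a small NumPy sketch (our own illustration) that estimates p(x) from the gray-level histogram and evaluates the entropy in bits:

```python
import numpy as np

def shannon_entropy(image, bins=256):
    """Shannon entropy H = -sum p(x) * log2 p(x) over the gray-level
    distribution of an image; zero-probability bins are skipped."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A constant image carries no information (entropy 0), while a uniform
# spread over all 256 gray values maximizes uncertainty (log2(256) = 8 bits).
flat = np.zeros((8, 8))
spread = np.arange(256).reshape(16, 16)
print(shannon_entropy(flat), shannon_entropy(spread))
```

The dispersion interpretation is visible here: the more evenly the gray values are spread, the higher the entropy.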

In the calculation of MI the joint entropy is used. It is shown as H(A,B) in Eq.9.

In the above equation, p(a,b) is the joint probability distribution function of the pixel values a and b in the images A and B. The joint probability distribution of the two images is estimated by calculating a joint histogram of the gray values: a two-dimensional plot showing the combinations of gray values of the two images at all corresponding points. The joint probability distribution is then obtained by dividing each entry in the histogram by the total number of entries. The mutual information of the two images A and B is defined by the following equation.

In this equation, H(A) and H(B) are the entropies of the images A and B, which are obtained from the probability distribution functions of Eq.11 and Eq.12.

When the images are completely registered, the joint entropy has its lowest value and the MI becomes maximum [

Although MI is a good metric, it is sensitive to the overlap region of the images: as this region shrinks, the number of samples decreases, which weakens the estimation of the statistical probability functions. Moreover, MI can even increase as the images become more misaligned. The normalized mutual information (NMI) metric has been proposed to overcome this problem; it is less sensitive to overlap changes [
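The MI and NMI computations described above can be sketched from a joint histogram in NumPy. We assume here the common NMI form NMI(A,B) = (H(A) + H(B)) / H(A,B), which may differ in detail from the paper’s exact definition:

```python
import numpy as np

def _entropy(p):
    """Shannon entropy (bits) of a probability vector; zero bins skipped."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mi_and_nmi(a, b, bins=32):
    """Estimate MI(A,B) = H(A) + H(B) - H(A,B) and
    NMI(A,B) = (H(A) + H(B)) / H(A,B) from the joint gray-value
    histogram of two equally sized images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pab = joint / joint.sum()
    ha = _entropy(pab.sum(axis=1))   # marginal entropy H(A)
    hb = _entropy(pab.sum(axis=0))   # marginal entropy H(B)
    hab = _entropy(pab.ravel())      # joint entropy H(A,B)
    return ha + hb - hab, (ha + hb) / hab
```

For identical (perfectly registered) images the joint histogram is diagonal, so H(A,B) = H(A), MI reaches H(A), and NMI reaches its maximum value of 2; shuffling one image destroys the correlation and drives MI toward zero.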

With every change in the parameter values at each step, a new transformation is applied to the floating image, so its entropy changes. As a result, the MI measure is not a uniform function and has many fluctuations. To obtain a smoother curve, a logarithmic normalized mutual information (LNMI) has been used [

In this paper we propose a modified normalized mutual information (MNMI), which is more efficient and has a smoother curve than LNMI, given by the following expression.

As can be seen in the LNMI equation, the entropy of the floating image, H(B), is less influential than the entropy of the reference image, H(A), so it plays a smaller role in the estimation of the transformation parameters. However, as Eq.15 shows, both images contribute similarly in MNMI.

PSO has been used as the search strategy for finding the transformation parameters. It is a population-based stochastic search method inspired by the social behavior of bird flocking and fish schooling [

In PSO, a population of individuals is evolved by cooperation and competition among the individuals themselves through iterations. Each individual, called a particle, of the population, called a swarm, represents a potential solution to the problem. Each particle changes its position in the search space and updates its velocity according to its own movement experience and its neighbors’ movement experience, aiming at a better position for itself. All particles have fitness values, which are evaluated by the fitness function to be optimized.

PSO is initialized with a number of random particles as a group. The ith particle of the group is defined by a velocity vector and a position vector in a D-dimensional space. In each iteration, the best position found so far by each particle, pbest, and by all of the particles, gbest, are determined. According to these values, particles update their position and velocity by Eq.16 and Eq.17.

In these equations, k is the iteration number, d = 1, 2, ···, D, i = 1, 2, ···, N, and N is the size of the population. c1 and c2 are acceleration coefficients, usually set to the constant value 2, and r1 and r2 are random numbers between 0 and 1. w is the inertia coefficient, which varies according to Eq.18.

Here g is the maximum number of iterations. The velocity of each particle should be kept within an allowed range to ensure that the particle does not leave the search space. The process stops when it reaches a predetermined number of iterations or a minimum error.

PSO is done in following steps:

1) PSO is initialized with N random particles in the search space.

2) The fitness function is calculated for each particle in the initial population, and the pbest and gbest are determined.

3) The velocity and position vectors of the particles are updated according to Eq.16 and Eq.17.

4) The fitness function is evaluated again.

5) If the new fitness of a particle is better than the fitness of its pbest, then pbest is replaced by the current position.

6) If the best pbest is better than gbest, then gbest is replaced by it.

7) If the stopping condition is satisfied the algorithm will be terminated, else repeat from step 3.
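Steps 1 to 7 can be sketched as follows. This is our own minimal maximization variant with a linearly decreasing inertia weight and velocity clamping; the exact constants and the Eq.18 schedule are assumptions:

```python
import numpy as np

def pso(fitness, dim, n_particles=40, n_iter=40, bounds=(-10.0, 10.0),
        c1=2.0, c2=2.0, w_max=0.9, w_min=0.4, seed=0):
    """Minimal PSO (maximization) following steps 1-7 above."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # step 1: random positions
    v = np.zeros((n_particles, dim))
    v_max = 0.2 * (hi - lo)                       # velocity clamp
    pbest = x.copy()                              # step 2: initial pbest/gbest
    pbest_fit = np.array([fitness(p) for p in x])
    g = np.argmax(pbest_fit)
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    for k in range(n_iter):
        w = w_max - (w_max - w_min) * k / n_iter  # assumed linear Eq.18 schedule
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # step 3: velocity (Eq.16) and position (Eq.17) updates
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)
        fit = np.array([fitness(p) for p in x])   # step 4: re-evaluate
        better = fit > pbest_fit                  # step 5: update pbest
        pbest[better], pbest_fit[better] = x[better], fit[better]
        g = np.argmax(pbest_fit)                  # step 6: update gbest
        if pbest_fit[g] > gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest, gbest_fit                       # step 7: stop after n_iter

# Toy fitness surface standing in for a similarity metric: optimum at (3, -1).
best, val = pso(lambda p: -((p[0] - 3) ** 2 + (p[1] + 1) ** 2), dim=2)
```

In the registration setting, `fitness` would evaluate the similarity metric between the reference image and the floating image transformed by the particle’s parameters.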

In this paper, we propose a new hybrid particle swarm optimization (HPSO), which incorporates two concepts of genetic algorithms, subpopulation and crossover, into the PSO. In this algorithm, the particles are grouped into M subpopulations, each of which has its own global best particle, gbest_m, for m = 1, ···, M.

Here, the four subpopulations with the highest gbest_m fitness values are determined, and the particle corresponding to gbest_m in each of these four subpopulations becomes a candidate parent for crossover. These candidates are ranked 1 to 4 according to their gbest_m values, where rank 1 corresponds to the highest value. The four parents are then chosen among the candidates with the probability allocated to each candidate by Eq.19.

where n is the ranking number.

Each pair of parents generates two children by the arithmetic crossover shown below.

where rand is a uniformly distributed random number between 0 and 1. The velocities of the children are given by

The children then replace the worst particles of their parents’ subpopulations. If required, a candidate will be randomly chosen with equal probability and will substitute one of the parents.

The above HPSO procedure is performed after every evaluation of the fitness function in the conventional PSO algorithm.
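An arithmetic crossover of the kind described above can be sketched as follows. The velocity rule used here (children share the direction of v1 + v2 at the parents’ average speed) is one common convention and an assumption on our part, not necessarily the paper’s Eq.:

```python
import numpy as np

def arithmetic_crossover(p1, v1, p2, v2, rng):
    """Arithmetic crossover of two parent particles.

    Child positions are complementary random convex combinations of the
    parent positions, so they always lie on the segment between the parents.
    Child velocities point along v1 + v2 with the parents' mean speed
    (assumed convention)."""
    r = rng.random()                      # uniform in [0, 1)
    c1 = r * p1 + (1.0 - r) * p2          # child positions
    c2 = (1.0 - r) * p1 + r * p2
    vsum = v1 + v2
    norm = np.linalg.norm(vsum)
    if norm > 0:
        unit = vsum / norm
        speed = 0.5 * (np.linalg.norm(v1) + np.linalg.norm(v2))
        cv1 = cv2 = speed * unit          # shared child velocity
    else:
        cv1, cv2 = v1.copy(), v2.copy()   # degenerate case: keep parents' velocities
    return (c1, cv1), (c2, cv2)
```

Note that c1 + c2 = p1 + p2 by construction, so the crossover recombines the parents’ information without drifting outside their span.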

In this section, we use several experiments to show the better performance of the MNMI metric compared with the LNMI and NMI metrics, and the advantage of the proposed HPSO algorithm over GA, conventional PSO and the HPSO proposed by Chen [

In this experiment, we used a pair of MR and CT images as the data set. The MR image is the reference image, and the CT image, rotated by 12˚, is considered the floating image. The registration of these two images is performed with the three similarity metrics MNMI, LNMI and NMI individually, by an exhaustive search (without an optimization algorithm) over the range [−20˚, 20˚].

The figures plot the similarity metric values versus rotation; the maximum value of each similarity metric occurs at −12˚. As these figures demonstrate, our MNMI measure has a much smoother curve with fewer fluctuations, which keeps optimization algorithms from being trapped in local optima.

Here, the results of applying HPSO to image registration based on maximization of MNMI are presented. The algorithm has been applied to a pair of CT images of the brain as the monomodal case, and to a pair of “MR-T2, MR-PD” images and a pair of “MR-T2, CT” images as multimodal cases.

In this experiment, the first image is the reference image, and the floating image is obtained by applying known translations along the x and y axes, a rotation and a scaling to the second image. The HPSO algorithm used has a population of 40 particles, 8 subpopulations and 40 iterations. Proper registration corresponds to estimating the negatives of the applied transformation parameters.

To evaluate the performance of our algorithm, it has been compared with a GA and a PSO with a similar population size and number of iterations, and also with the HPSO introduced in [

The proposed algorithm is implemented in MATLAB and evaluated using real patient brain images from the Whole Brain Atlas, WBA, [