Hybrid Designing of a Neural System by Combining Fuzzy Logical Framework and PSVM for Visual Haze-Free Task

Brain-like computer research and development have been growing rapidly in recent years. It is necessary to design large-scale dynamical neural networks (more than 10^8 neurons) to simulate the complex processes of our brain. But such a task is not easy to achieve based only on the analysis of partial differential equations, especially for complex neural models, e.g. the Rose-Hindmarsh (RH) model. So in this paper, we develop a novel approach that combines fuzzy logical design with Proximal Support Vector Machine Classifier (PSVM) learning in the design of large-scale neural networks. In particular, our approach can effectively simplify the design process, which is crucial for both cognitive science and neural science. Finally, we apply our approach to a large artificial neural system for a haze-free task, and the experimental results show that texture features extracted by fuzzy logic can effectively increase the texture information entropy and improve the effect of haze removal to some degree.


Introduction
Driven by rapid ongoing advances in computer hardware, neuroscience and computer science, artificial brain research and development are blossoming [1]. A representative work is the Blue Brain Project, which has simulated about 1 million neurons in cortical columns and included considerable biological detail to reflect spatial structure, connectivity statistics and other neural properties [2]. A more recent large-scale model of the functioning brain, reported in the journal Science, is the work of Chris Eliasmith's group [3]. In order to bridge the gap between neural activity and biological function, this group presented a 2.5-million-neuron model of the brain (called "Spaun") that exhibits many different behaviors. Among these large-scale brain simulations, visual cortex simulations receive the most attention; the two simulations mentioned above both model the visual cortex. The number of neurons in cortex is enormous. According to [4], the total number in area 17 of the visual cortex of one hemisphere is close to 160,000,000, and over the total cortical thickness the numerical density of synapses is 276,000,000 per mm^3 of tissue. It is almost impossible to design or analyze a neural network with more than 10^8 neurons based only on partial differential equations. The nonlinear complexity of our brain prevents us from simulating useful and versatile functions of our cortical system. Many studies deal only with simple neural networks with simple functions, and their connection matrices must be simplified. The visual functions simulated by the Blue Brain Project and Spaun are so simple that they are trivial by the standards of traditional pattern recognition.
On the other hand, logical inference plays a very important role in our cognition. With the help of logical design, things become simple; this is the reason why computer science has made such great progress. There are more than 10^8 transistors in a CPU today. Why not use similar techniques to build complex neural networks? We believe we can. As our brains work in a non-Turing-computable way, fuzzy logic rather than Boolean logic should be used. For this purpose, we introduce a new concept: the fuzzy logical framework of a neural network. Fuzzy logic is not a new topic in science, but it is fundamental and useful. If the function of a dynamical neural network can be described by fuzzy logical formulas, that description can greatly help us to understand the behavior of the neural network and to design it easily.
For neural systems, the basic logic processing module to be used as a building block in the logic architecture of a neural network comes from the OR/AND neuron [3,5], also referred to by [6]. The idea of hybrid design of neural networks and fuzzy logical systems was first proposed by [7]. While neural networks and fuzzy logic have added a new dimension to many engineering fields, their weaknesses have not been overlooked: in many applications the training of a neural network requires a large amount of iterative calculation, and sometimes the network cannot adequately learn the desired function. Fuzzy systems, on the other hand, are easy to understand because they mimic human thinking and acquire their knowledge from an expert who encodes his knowledge in a series of if/then rules [7].
Neural networks can work either in a dynamical way or in a static way. The former can be described by partial differential equations and are denoted "dynamical neural networks". Static points or stable states are very important for the dynamical analysis of a neural network. Many artificial neural networks are just abstractions of the static points or stable states of dynamical neural networks, e.g. perceptron neural networks; such artificial neural networks work in a static way and are denoted "static neural networks". There is a natural relation between a static neural network and a fuzzy logical system, but for dynamical neural networks, we should extend static fuzzy logic to dynamic fuzzy logic. A novel concept, denoted the "fuzzy logical framework", is defined for this purpose.
Finally, we present an application of our hybrid design approach to a visual task related to image matting: haze removal from a single input image. Image matting refers to the problem of softly extracting the foreground object from a single image. The system designed by our novel hybrid approach has an ability comparable to the ordinary approach proposed by [8]. Texture information entropy (TIE) is introduced for roughly evaluating the effect of haze removal. Experiments show that texture features extracted by fuzzy logic can effectively increase TIE.
The main contributions of this paper include: 1) we develop a novel hybrid design approach for neural networks based on fuzzy logic and Proximal Support Vector Machine Classifier (PSVM) learning in artificial brain design, which greatly simplifies the design of a large-scale artificial brain; 2) a novel concept, the fuzzy logical framework of a neural network, is proposed for the first time; 3) instead of the linear mapping in [8], a novel nonlinear neural fuzzy logical texture feature extraction, which can effectively increase TIE, is introduced in the haze-free application. The experiments show that our approach is effective.

Hopfield Model
There are many neuron models, e.g. FitzHugh (1961), Morris and Lecar (1981), Chay (1985) and Hindmarsh and Rose (1984) [9-11]. Can the fuzzy logical approach be used in all kinds of neural networks with different neuron models? In order to answer this question, we consider a simple neuron model, the Hopfield model [12] (see Equation (1.1)), as a standard neuron model, which has a good fuzzy logical character. We have proved that the Hopfield model has universal meaning, in the sense that almost all neural models described by first-order differential equations can be simulated by Hopfield models with arbitrarily small error in an arbitrary finite time interval [13]; these neural models include all the models summarized in [14].

a_i dx_i/dt = -x_i + S(sum_j w_ij x_j + I_i)    (1.1)

where the sigmoid function S(.) can be a piecewise linear function or a logistic function. The Hopfield neuron model has a notable biological characteristic and has been widely used in visual cortex simulation. One example is described in [7,10,15-17] (see Equation (1.2)). Such a cell's membrane potential is transferred to output by a sigmoid-like function. Only the amplitude of the output pulses carries meaningful information; the rising or dropping time Delta-t of the output pulses conveys no useful information and is always neglected. According to [15], the neural networks described by Equation (1.2) are based on biological data [18-23].
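Since Equation (1.1) is damaged in this copy, a minimal simulation sketch may help. It assumes the standard continuous Hopfield form a*dx/dt = -x + S(lambda*(w.u - T)) suggested by the surrounding text; the gain, threshold and input values below are illustrative, not taken from the paper.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def hopfield_fixed_point(inputs, w, T, lam=10.0, a=1.0, dt=0.01, steps=5000):
    """Euler-integrate a single Hopfield-style cell,
        a * dx/dt = -x + S(lam * (w . inputs - T)),
    long enough to reach its fixed point, and return the final state."""
    x = 0.0
    net = lam * (np.dot(w, inputs) - T)   # constant input -> constant drive
    for _ in range(steps):
        x += (dt / a) * (-x + logistic(net))
    return x

# With a large gain lam, the fixed point saturates toward 0 or 1, so the
# cell behaves like a logic gate: with T = 1.5 it acts as a 2-input AND.
x_on = hopfield_fixed_point(np.array([1.0, 1.0]), np.array([1.0, 1.0]), T=1.5)
x_off = hopfield_fixed_point(np.array([1.0, 0.0]), np.array([1.0, 1.0]), T=1.5)
```

This also previews the role the paper later assigns to the gain coefficient: as it grows, the fixed point moves from a fuzzy value toward a binary one.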
In such neural networks, cells are arranged on a regular 2-dimensional array with image coordinates (i, j), as in Equation (1.2).

Fuzzy Logical Framework of Neural Network
The same fuzzy logical function can have several equivalent forms; these forms can be viewed as the structure of the fuzzy function. When we discuss the relationship between fuzzy logic and a neural network, we should probe not only the input-output relationship but also the corresponding structure. Section 3.1 discusses this problem, Section 3.2 discusses the question of what is a suitable fuzzy operator, and Section 3.3 proves three theorems about the relationship between fuzzy logic and neural networks.

The Structure of a Fuzzy Logical Function and Neural Network Framework
In order to easily map a fuzzy formula to a dynamical neural network, we define the concept of the structure of a fuzzy logical function.

Definition 1. [The structure of a fuzzy logical function] If Phi is a set of fuzzy logical functions (FLFs), and an FLF f can be represented as a combination of the FLFs in Phi with the fuzzy operators "∨" and "∧", without parentheses, then the set Phi is denoted the structure of f. If f(x_1, x_2, ..., x_n) has a recurrent structure, then it can be represented as Equation (1.3), where the time needed for the output is Delta-t; f(x_1, x_2, ..., x_n) can then create a time-serial output and can be written in partial differential form as Equation (1.4).

The Suitable Fuzzy Operator
After the theory of fuzzy logic was conceived by [24], many fuzzy logical systems were presented, for example the Zadeh system, the probability system, the algebraic system, and the Bounded operator system. According to the universal approximation theorem [25], it is not difficult to prove that q-value weighted fuzzy logical functions (5) can precisely simulate Hopfield neural networks with arbitrarily small error, and vice versa, i.e. every layered Hopfield neural network has a fuzzy logical framework of the q-value weighted bounded operator with arbitrarily small error. This means that if the sigmoid function used by the Hopfield neurons is a piecewise linear function, such a fuzzy logical framework is structure keeping. Unfortunately, if the sigmoid function is a logistic function, the fuzzy logical framework is usually not structure keeping; only in an approximate case (see Appendix A) may a layered Hopfield neural network have a structure-keeping fuzzy logical framework.
The Bounded operators are the Bounded product, f_∧(x, y) = max(0, x + y - 1), and the Bounded sum, f_∨(x, y) = min(1, x + y). In order to simulate neural cells, it is necessary to extend the Bounded operator to the Weighted Bounded operator. The fuzzy formulas defined by q-value weighted bounded operators are denoted q-value weighted fuzzy logical functions.
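As a concrete reference, the standard (Lukasiewicz-style) bounded operators and one natural q-value weighted extension can be sketched as follows; the paper's exact weighted formula is garbled in this copy, so the weighted function below is an assumption:

```python
def bounded_sum(x, y):
    """Bounded sum (fuzzy OR): min(1, x + y)."""
    return min(1.0, x + y)

def bounded_product(x, y):
    """Bounded product (fuzzy AND): max(0, x + y - 1)."""
    return max(0.0, x + y - 1.0)

def q_weighted_bounded_sum(xs, ws, q=1.0):
    """Assumed q-value weighted extension: clip the weighted sum at q,
    so truth values stay in the interval [0, q]."""
    return min(q, sum(w * x for w, x in zip(ws, xs)))
```

Note that both operators saturate at the ends of the truth interval, which is exactly the behavior a sigmoidal neuron approximates at its fixed point.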
In fact, if we introduce a new variable y_i = dx_i/dt, the Van der Pol generator [28] can also be changed into the form of Equation (1.16). Equation (1.17) describes a Hopfield neural circuit which has only one cell, with input I_k.
At the fixed point, every neuron works just like a neuron in a perceptron neural network. Theorem 1 establishes the condition under which Equation (1.17) can simulate a disjunctive normal form (DNF) formula. The fixed point of Equation (1.17) can easily simulate binary logical operators; on the other hand, a layered neural network can be simulated by a q-value weighted fuzzy logical function.
Theorem 1. Suppose in Equation (1.17) that a_i = 1. If the fixed point of the neural cell described by Equation (1.17) can simulate a Boolean formula which is not a constant, then for a definite binary input x_1, x_2, ..., x_k the error can be made arbitrarily small: as the gain coefficient tends to infinity, the static error defined by Equation (1.6) tends to 0.

Equation (1.18) is just a perceptron neural network, so a perceptron neural network can be viewed as an abstraction of the static points or stable states of a real neural network described by Equation (1.18). Theorem 2 shows that a continuous function can be simulated by a layered Hopfield neural network, just like a multi-layered perceptron neural network, with arbitrarily small error, and that a layered neural network can be simulated by a q-value weighted fuzzy logical function. Theorem 2 follows directly from the proof of the universal approximation theorem [25,37].

Theorem 2. For a continuous map f(x_1, ..., x_n), we can build a layered neural network defined by Equation (1.18) whose fixed point can be viewed as a continuous map approximating f, and we can find a q-value weighted fuzzy logical function that simulates this network with arbitrarily small error.
Theorem 3 proves that all recurrent neural networks described by Equation (1.16) can be simulated by Hopfield neural networks described by Equation (1.1). The ordinary differential Equation (1.16) has a strong ability to describe neural phenomena, and the neural network it describes can have feedback. Because of the feedback of a recurrent neural network, chaos can occur in such a network. As is well known, the important characteristics of chaotic dynamics, i.e. aperiodic dynamics in deterministic systems, are the apparent irregularity of time traces and the divergence over time of trajectories starting from two nearby initial conditions. Any small error in the calculation of a chaotic deterministic system will cause unpredictable divergence of the trajectories over time, i.e. such neural networks may behave very differently under different calculation precisions. So any small difference between two approximations of a trajectory of a chaotic recurrent neural network may create two totally different approximate results for this trajectory. Fortunately, all animals have only a limited lifetime and the domain of the trajectories of their neural networks is also finite, so for most neuron models the Lipschitz condition holds in a real neural system, and in this case the simulation is possible.

Hybrid Designing Based on the Fuzzy Logic and PSVM
For the sake of simplicity, only layered neural networks are discussed here. For a layered neural network N, if the number of neurons at every layer is fixed, the number of structure-keeping fuzzy logical frameworks of N is also fixed. When the coefficients of N are continuously changed, the neural network N shifts from one structure-keeping fuzzy logical framework to another. There are two different kinds of parameters in a dynamical layered neural network. The first kind are the weights, which represent the connection topology of the neural network. The second are the time coefficients, which control the timing of spikes; they should be decided according to the dynamical behavior of the whole neural network.
There are two ways to design the weights of a layered neural network: (1) according to the fuzzy logical framework of the network, we can design the weights based on the q-value weighted fuzzy logical function; (2) according to the input-output relation function f(x_1, ..., x_n), we can use machine learning approaches, e.g. the back-propagation method, to learn the weights for f(x_1, ..., x_n). In order to speed up the learning process for a layered neural network, we combine fuzzy logical design with PSVM [15]; the combination is called the "Logical Proximal Support Vector Machine (LPSVM)".
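The PSVM learning step itself is compact enough to sketch. The code below follows the linear Proximal SVM of Fung and Mangasarian (the classifier cited as [15]), which replaces the SVM inequality constraints with equalities so that training reduces to solving one small linear system; the toy data are illustrative only.

```python
import numpy as np

def psvm_train(A, d, nu=1.0):
    """Train a linear Proximal SVM (Fung & Mangasarian, 2001).

    Solves  min  nu/2 * ||D(A w - e*gamma) - e||^2 + 1/2 * (w'w + gamma^2)
    via the single linear system  (I/nu + E'E) z = E' D e,  with E = [A, -e].
    A: (m, n) data matrix; d: (m,) labels in {-1, +1}."""
    m, n = A.shape
    e = np.ones(m)
    E = np.hstack([A, -e[:, None]])
    rhs = E.T @ (d * e)                     # E' D e, since D = diag(d)
    z = np.linalg.solve(np.eye(n + 1) / nu + E.T @ E, rhs)
    return z[:n], z[n]                      # weights w, threshold gamma

def psvm_predict(A, w, gamma):
    return np.sign(A @ w - gamma)

# Toy separable data (illustrative only).
A = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
d = np.array([1.0, 1.0, -1.0, -1.0])
w, gamma = psvm_train(A, d)
```

Because training is one linear solve rather than an iterative optimization, PSVM is well suited to the role the paper gives it: quickly fitting the last layer of a network whose earlier layers were fixed by fuzzy logic.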

Hybrid Design of a Columnar Organization of a Neural Network Based on Fuzzy Logic and PSVM
In neural science, the design of nonlinear dynamic neural networks to model bioneural experimental results is an intricate task. But with the help of fuzzy logic, we can design neural network models more easily. In our hybrid design approach (LPSVM), we first design neural networks with the help of fuzzy logic, and then we use PSVM to accomplish the learning for concrete visual tasks.
Although there are already many neural models that simulate the functions of the primary visual cortex, they focus on very limited functions. Early works tried only to address the problem of combining ideas and approaches from biological and computational vision [39-41], and the most recent works [1,3] handle only very simple pattern recognition tasks. In this experiment, we hybrid-design a model of a columnar organization in the primary visual cortex which can separate haze from its background.

The Theory of Image Matting
According to Levin et al. (2008), image matting refers to the problem of softly extracting the foreground from a single input image. Formally, matting methods take an image I as input, which is assumed to be a composite of a foreground image F and a background image B in the linear form I = aF + (1 - a)B, where a is the alpha matte. After training, the neural fuzzy logical network generates the alpha matte. In the application of alpha matting, our method can remove haze using the dark channel prior as the trimap.
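The compositing model is easy to state concretely. A minimal numpy illustration, with hypothetical foreground and background colors, shows both the forward composite and the per-channel matte recovery:

```python
import numpy as np

# Matting model: each pixel is a convex combination of a foreground color F
# and a background color B, weighted by the alpha matte:  I = a*F + (1-a)*B.
F = np.array([200.0, 180.0, 160.0])   # hypothetical foreground RGB
B = np.array([80.0, 90.0, 100.0])     # hypothetical background RGB
alpha = 0.25
I = alpha * F + (1.0 - alpha) * B

# Given F and B, the matte can be recovered channel-wise: a = (I - B)/(F - B).
alpha_rec = (I - B) / (F - B)
```

In real matting, F and B are unknown, which is exactly why a trimap (here, the dark channel prior) is needed to constrain the problem.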

Neural System for Haze-Free Task with Columnar Organization
Many functions of the brain are not well known, but the columnar organization of the primary visual cortex (V1) is well understood. The lateral geniculate nucleus (LGN) transfers information from the eyes to the brain stem and to V1 [42]. The columnar organization of V1 plays an important role in the processing of visual information. V1 is composed of a grid (1 mm x 1 mm) of hypercolumns (hc). Every hypercolumn contains a set of minicolumns (mc). Each hypercolumn analyzes information from one small region of the retina, and adjacent hypercolumns analyze information from adjacent regions of the retina. The recognition performed by our hypercolumn system (see Figure 2) starts with the recognition of the orientation or simple structure of local patterns; then the trimap image is computed based on these local patterns. The hypercolumns are designed by LPSVM: the weights of the 1st and 2nd layers are designed by fuzzy logic, and the weights of the 3rd layer are learned by PSVM from the trimap image.

The 1st Layer
Every minicolumn (Figure 3) in the 1st layer tries to change a 3x3-pixel image block into a binary 3x3 texture pattern. The input image is normalized. This process needs 3x3 Hopfield neurons focusing on a 3x3 small window; every neuron focuses on only one pixel, and there are two kinds of fuzzy processing. The 1st processing directly transforms every pixel's value into a fuzzy logical one by a sigmoid function. The 2nd processing is also completed by a sigmoid function; the difference is that every boundary pixel's value has the center pixel's value subtracted from it before being sent to the sigmoid function. Such processing emphasizes the contrast of the texture, and our experiments support this fact. These two processings can be viewed as a kind of preprocessing of the input image. Every neuron in a 1st-layer minicolumn has only one input weight w_ij (Figure 3), which equals 1; as the coefficient in Equation (1.17) grows, the outputs change from fuzzy values to binary numbers (see Figure 4).
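The two fuzzy preprocessings on a 3x3 window can be sketched as below. The sigmoid gain and the 0.5 offset of the first mapping are assumptions, since the paper's exact parameters are not recoverable from this copy:

```python
import numpy as np

def sigmoid(z, lam=8.0):
    # lam plays the role of the gain coefficient: larger lam -> more binary output
    return 1.0 / (1.0 + np.exp(-lam * z))

def fuzzy_intensity(block):
    """1st processing: map each normalized pixel value directly to a fuzzy
    value (0.5 is an assumed midpoint for the sigmoid)."""
    return sigmoid(block - 0.5)

def fuzzy_contrast(block):
    """2nd processing: subtract the center pixel from every pixel before the
    sigmoid, which emphasizes the local texture contrast."""
    return sigmoid(block - block[1, 1])

block = np.array([[1.0, 0.0, 1.0],
                  [0.0, 0.5, 0.0],
                  [1.0, 0.0, 1.0]])
fc = fuzzy_contrast(block)
fi = fuzzy_intensity(block)
```

With a high gain, pixels brighter than the center saturate toward 1 and darker ones toward 0, yielding the near-binary texture pattern the 2nd layer consumes.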

The 2nd Layer
Every minicolumn in the 2nd layer works in the way described by the Hopfield neuron equation, Equation (1.18) or Equation (1.17), and can be viewed as a hypercolumn of the 1st-layer minicolumns, which focuses on the same small 3x3 window and has some ability to recognize a definite shape (see Figure 5). If there are in total q local small patterns, a hypercolumn in the 2nd layer contains q (in our system q = 256 or 512) minicolumns, which have the same receptive field and try to recognize the q local small patterns from the corresponding minicolumns of the 1st layer. The pixel value is "1" for white and "0" for black. In this mode, the 3x3 Hopfield neurons of the 1st layer output a binary pattern.

where m_j, j = 1, ..., 8, are the outputs from the 1st layer's minicolumns.

3. Hybrid LIPW and LBPW (LBIPW). In this approach, the boundary pixels' values are subtracted by the center pixel's value in a 3x3 small window, similar to LBPW, except that the center pixel's value is also sent to the input of every 2nd-layer hypercolumn. In our system, a hypercolumn in the 2nd layer contains 512 2nd-layer minicolumns for the LIPW and LBIPW ways and 256 for the LBPW way (Equation (1.21)). In order to recognize a binary pattern, an "and" neuron with index i is needed (see Figure 5) for every 2nd-layer minicolumn; the weights of this "and" neuron to the 1st-layer minicolumns are set as in Equation (1.22), with the corresponding threshold T = 5.1, the parameter 0.9, and the coefficient of Equation (1.17), where for LIPW and LBIPW j = 1, 2, 3, ..., 9, while for LBPW the center 1st-layer minicolumn is useless, so j = 1, 2, 3, ..., 8.
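The role of a 2nd-layer "and" neuron can be illustrated with a simple weight scheme: +1 on expected-ON pixels, -1 on expected-OFF pixels, and a threshold just below the pattern's ON count. The paper's actual weights and threshold T = 5.1 come from Equation (1.22), which is not recoverable here, so the scheme below is only an analogous sketch:

```python
import numpy as np

def and_neuron(window, pattern):
    """Fire (return 1) only when the binary 3x3 window exactly matches the
    target pattern: +1 weights where the pattern is ON, -1 where it is OFF,
    threshold just below the number of ON pixels."""
    w = np.where(pattern == 1, 1.0, -1.0)
    T = pattern.sum() - 0.5
    return int(np.sum(w * window) > T)

# The "+" shape from Figure 5 as a binary 3x3 pattern.
plus = np.array([[0, 1, 0],
                 [1, 1, 1],
                 [0, 1, 0]])
```

Any single flipped pixel, ON or OFF, lowers the weighted sum below the threshold, so one such neuron per pattern suffices to label the window.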

The 3rd Layer
The output of a hypercolumn in the 2nd layer is sent to the 3rd layer, whose learning target is provided by the so-called dark channel prior, which is computed by the approach mentioned in [8]. As the small windows focused on by adjacent hypercolumns overlap, the outputs of the 3rd-layer minicolumns overlap as well.

The 4th Layer
There are two kinds of minicolumns in this layer. First, every 1st-kind 4th-layer minicolumn, which is just a Hopfield neuron, computes a pixel value of the alpha matte image from the overlapped output of the minicolumns in the 3rd layer; then the 2nd-kind minicolumns try to remove the haze from the original image. A 2nd-kind minicolumn in the 4th layer computes a pixel of a haze-free image according to Equation (1.23), where q is the max gray or RGB value of a pixel, J_i is the haze-free image, I_i is the original image, A_i is the global atmospheric light, which can be estimated from the dark channel prior, a_i is the alpha matte generated by the 3rd layer, and t_0 is a threshold with a typical value of 0.1. We can use a back-propagation approach to compute the pixels' values in order to compute Equation (1.24).
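Equation (1.23) itself is garbled in this copy; the recovery step below follows the dark-channel-prior formulation of [8] (He et al.), treating the alpha matte as the transmission map, which matches the variables named above:

```python
def remove_haze_pixel(I, A, alpha, t0=0.1):
    """Invert the haze model I = J*t + A*(1 - t) for one pixel:
        J = (I - A) / max(t, t0) + A,
    where t0 keeps the division stable when the transmission is near zero
    ([8] uses t0 = 0.1)."""
    t = max(alpha, t0)
    return (I - A) / t + A

# Round trip: haze a known radiance J, then recover it.
J_true, A, t = 120.0, 220.0, 0.5
I_hazy = J_true * t + A * (1.0 - t)
J_rec = remove_haze_pixel(I_hazy, A, t)
```

The t0 floor matters in dense haze: where the estimated transmission approaches zero, unclamped division would amplify noise instead of recovering radiance.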

These two minicolumns, m0 and m1, are constructed by a fuzzy formula similar to Equation (1.21). In the experiment, the threshold changes from 0.0 to 1.0. In Figures 8(a)-(c), m0's output is opposite to m1's output in the task of recognizing a horizontal bar or a vertical bar, so the most suitable threshold for the logical operator "and" can be selected by Equation (1.26). Here f_1(m_i) is the fitting rate (output) at the place P1 of the minicolumn m_i.

The Haze-Free Experiment Results

(a) Haze removal and texture information entropy. Texture information can give a rough measure of the effect of haze removal; we use the entropy of the texture histogram to measure the effect of deleting haze from images. The entropy of the histogram is described in Equation (1.27). Haze makes the texture of an image unclear, so, theoretically speaking, haze removal will increase the entropy of the texture histogram. Table 1 gives the texture information entropy of the image; we can see that the texture information entropy is increased after haze-free processing, so our approaches have a higher ability to increase the texture information entropy than the linear approach proposed by [8]. Theoretically speaking, LBPW is a pure texture processing, so LBPW has the highest value; LIPW is much weaker than LBPW; LBIPW is the hybrid of LBPW and LIPW, so it has an intermediate ability. The texture information entropy of Area1 correctly reflects this fact. But for Area2, which already has the clearest texture structure in the original image, the deletion of haze may be overdone: the texture information is over-emphasized by LBPW in Area2, so it has the lowest texture information entropy and almost becomes a dark area. LBPW keeps the texture pattern but loses gray information, at least for the center pixel of a 3x3 small window. This fact means that overtreatment appears more easily in a nonlinear processing than in a linear one in the haze-free task.

(b) The effect of the degree of fuzziness (Figure 9). Just as Theorem 1 above indicates, the gain parameter in Equation (1.17) can control the fuzziness of a Hopfield neuron.
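The texture information entropy of Equation (1.27) is the Shannon entropy of the texture histogram; a minimal sketch (the bin count and value range are assumptions):

```python
import numpy as np

def histogram_entropy(values, bins=256, value_range=(0, 256)):
    """Shannon entropy H = -sum_k p_k * log2(p_k) of the histogram of `values`.
    A flat histogram (rich texture) maximizes H; a constant region gives H = 0,
    which is why heavy over-processing can drive the entropy down."""
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

h_uniform = histogram_entropy(np.arange(256))   # one sample per bin -> 8 bits
h_flat = histogram_entropy(np.zeros(100))       # constant region -> 0 bits
```

This also explains the Area2 result above: a region driven toward a single dark value collapses its histogram and hence its entropy, even though its texture was "emphasized".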

Discussion
It is very difficult to design or analyze a large-scale nonlinear neural network. Fortunately, almost all neural models described by first-order differential equations can be simulated by Hopfield neural models or by fuzzy logical functions with the Weighted Bounded operator. We can find fuzzy logical frameworks for almost all neural networks, so it becomes possible to debug thousands of parameters of a huge neural system with the help of fuzzy logic. Furthermore, fuzzy logic can help us to find useful features for visual tasks, e.g. haze removal.

Appendix A
A Hopfield neuron can approximately simulate the Bounded operator. If there are only two inputs x_1 and x_2, then when the gain is large enough, according to equation (a) we can select a threshold T that makes the fixed-point output match the operator. Based on the above analysis, the Bounded operator fuzzy system is suitable for neural cells described by Equation (1.1) when a_i = 1.0 and w_1 = w_2 = 1.0. For arbitrary positive a_i, w_1 and w_2, we can use the corresponding q-value weighted universal fuzzy logical function based on the Bounded operator to simulate such neural cells. If a weight is negative, an N-norm operator should be used. When a = 1.0 and w_1 = 1.0, the "errOr" and "errAnd" are less than 0.01. Our experiments show that a suitable T_i can be found, so in most cases the Bounded operator is the best fuzzy operator for simulating neural cells. By an inductive approach, we can prove that f_∧ also follows the associative condition.

where S_1 and S_2 are sigmoid-like activation functions, U is the local inhibition connection at the location (i, j), and J_{i,j} and W_{i,j} are the synaptic connections between the excitatory cells and from the excitatory cells to the inhibitory cells, respectively. If we represent the excitatory cells and inhibitory cells with the same symbol i and summarize all the connections (local U, global exciting W_{i,j}, and global inhibiting J_{i,j}) as w_ij, Equation (1.2) can be simplified to the Hopfield model, Equation (1.1).

Equation (1.13) is just a special case of Equation (1.16); for the integrate-and-fire model, if we use a logistic function, the integrate-and-fire model can also take the form of Equation (1.16). So Equation (1.16) can be viewed as a general representation of almost all neuron models; if we can prove that Equation (1.16) can be simulated by a neural network based on the Hopfield neuron model [see Equation (1.1)], then almost all neuron models can be simulated in the same way.

Figure 2 .
Figure 2. A 4-layer structure of a columnar organization of V1 for haze-background separation; in this case the index runs over the 8 neighbors of the pixel.

Figure 3. Every 1st-layer minicolumn tries to change local images into binary texture patterns; for a 3x3 small window, a minicolumn in the 1st layer contains 9 Hopfield neurons, and every neuron focuses on only one pixel.
Figure 4. The gain coefficient changes the outputs from fuzzy values to binary numbers.

Figure 5. A hypercolumn in the 2nd layer contains q minicolumns which have the same receptive field and try to recognize q definite small shapes. An "and" neuron is needed for every 2nd-layer minicolumn.
Figure 6.
The recognition works in the same way for every color R, G or B, so a hypercolumn has a 512x3-dimensional output or a 256x3-dimensional output. Recognizing the above two patterns is simple: a Hopfield neuron defined by Equation (1.17) is enough to recognize a 3x3 image. For example, the "+" shape in Figure 5 can be described by a fuzzy logical formula (Equation (1.21)). The "and" operator for the 9 inputs in Equation (1.21) can be created by a neuron mc (see Figure 5). In Equation (1.21), every pixel P_ij has two states, m_ij and not-m_ij. Suppose the unified gray value of P_ij is g_ij; an image module needs a high value of g_ij at the places of m_ij and a low value at not-m_ij, so the input to the neuron mc at m_ij is g_ij.

Figure 7. The threshold can control the function of a Hopfield neural cell from logical "or" to "and": b) using a horizontal model m0 to recognize a horizontal line L; c) using a vertical model m1 to recognize a horizontal line L. m0 and m1 have 3x3 receptive fields. At L, the focus of these two minicolumns is just upon the line; at P1, these two minicolumns focus near L; at P2, the line is out of the focus of these two minicolumns.

Based on Equation (1.1), the membrane potential's fixed point under a given input can be analyzed directly. The Bounded operator is the best fuzzy operator to simulate the neural cells described by Equation (3), and the threshold T_i can change the neural cell from the bounded operator f_∨ to f_∧, as can be seen by analyzing the output at the fixed point. Experiments done by scanning the whole region of the threshold show that the above analysis is sound. We denote the inputs in (5b) and plot them in Figure 10 as the solid line and the dotted line, respectively. In Figure 10, the threshold is scanned from 0 to 4.1 with step size 0.01. The best T_i in Equation (4) for f_∨ is 2.54, and the best T_i in Equation (4) for f_∧ gives the suitable fuzzy logical framework for the neuron defined by Equation (3). If the weight is 1, we can prove that the associative condition holds.

Figure 10. Simulating fuzzy logical and/or by changing the thresholds of neural cells. The X-axis is the threshold value divided by 0.02; the Y-axis is errG. The solid line is "errAnd" between f_∧(I_1, I_2) and the neuron output.
These neuron models are all special cases of the generalized model described by the ordinary differential Equation (1.16). A Rose-Hindmarsh (RH) neuron is much more complicated than a Hopfield neuron. To simulate a complicated neuron by simple neurons is not a difficult task, but the reverse task is almost impossible to complete, i.e., it is almost impossible to simulate a Hopfield neuron by a set of RH neurons.
Step 1: Except for the last output layer's weights, all weights are designed according to the fuzzy logical framework; the last layer's weights are then learned by PSVM.