Threshold Optimization Of Hopfield Neural Network To Gain Higher Success Rate For Image Recognition

DOI : 10.17577/IJERTV2IS4734


Shantanu Jana

ABSTRACT - A Hopfield Neural Network takes its decisions based on a predefined threshold value. In this paper I present a new threshold derivation technique which reduces false patterns and increases the success rate of image recognition.

General Terms

Optical character recognition, Hopfield network, Image correction, Threshold value deduction.

Keywords

Threshold optimization, Image correction, Hopfield Network, Threshold value deduction.

1. INTRODUCTION

The Hopfield network is very effective for image correction. It is also used for feature extraction of patterns [1]. It has been shown that a Hopfield neural network can reliably store approximately 0.15N patterns, where N is the number of neurons.

But if the number of training patterns and the image size increase, performance degrades for two reasons:

  1. The stored patterns become unstable;

  2. Spurious stable states appear (i.e., stable states which do not correspond to stored patterns).

That is why the network is not suitable for pattern recognition tasks where a large number of patterns must be stored. It works best when the number of stored patterns and the number of pixels are both small.

2. NEW THRESHOLD DERIVATION ALGORITHM

It has been observed that when the number of test patterns and the size of the pattern matrix increase, the Hopfield neural network produces false patterns. Thus the Hopfield neural network is not suitable for pattern recognition.

The new threshold derivation technique reduces false patterns and increases the success rate for large pattern sizes and large sets of training patterns.

2.1 Concepts

The Hopfield net consists of a number of nodes, each connected to every other node [2, 3]. It is a fully connected network. It is also a symmetrically weighted network: the weight on the link from one node to another is the same in both directions. The network has a weight matrix [4]

w_ji = Σ_{d=1..D} x_j^(d) x_i^(d), with w_ii = 0,

where D is the number of class patterns {x^(d)}, vectors consisting of +/-1 elements to be stored in the network, and n is the number of components (the dimension) of the class pattern vectors.

The update function for nodes in a Hopfield network is given below:

Z_i = Σ w_ji X_ki, and X_ki = 1 if Z_i >= θ, otherwise X_ki = -1.

Here k is the number of test patterns, j is the row number of the weight matrix, and the j-th test pattern is chosen from the k patterns. The threshold value θ is taken here as 0.

In this case, the weights of the connections between the neurons have to be set such that the states of the system corresponding to the patterns to be stored in the network are stable.
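As a concrete illustration of the weight construction and threshold-0 update described above, the following is a minimal NumPy sketch (not the author's original code); the function names and the synchronous update loop are illustrative assumptions, and the pattern vectors are assumed to be bipolar (+1/-1).

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix: sum of outer products of the stored
    bipolar (+1/-1) pattern vectors, with the self-connections zeroed."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)            # no self-connections
    return W

def recall(W, x, theta=0.0, steps=10):
    """Synchronous update: each node outputs +1 when its net input
    reaches the threshold (theta = 0 here) and -1 otherwise."""
    for _ in range(steps):
        x = np.where(W @ x - theta >= 0, 1, -1)
    return x
```

With theta = 0 this reproduces the baseline recall behaviour; the technique in Section 2.2 replaces this single threshold with derived boundary values.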

I have constructed an HNN classifier according to the Hopfield theory. I have used bipolar values to prevent data loss: each pattern matrix value is multiplied by 2 and then 1 is subtracted, making the input data bipolar. Bipolar is simply a representation of a binary string with -1s and 1s rather than 0s and 1s. This is done because binary has one minor flaw: 0 is not the inverse of 1; rather, -1 is the mathematical inverse of 1.
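The binary-to-bipolar mapping just described is simply 2x - 1 applied element-wise; a short NumPy illustration (the example values are arbitrary):

```python
import numpy as np

binary_pattern = np.array([0, 1, 1, 0, 1])   # example binary string
bipolar_pattern = 2 * binary_pattern - 1     # 0 -> -1, 1 -> +1
print(bipolar_pattern)                       # prints [-1  1  1 -1  1]
```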

2.2 Algorithm

Step-1: Z_i = w_ji * X_ki for i = 0 to the number of neurons, where k is the pattern number.

Step-2: Low = |Z_low|, High = |Z_high|, X = Low + High.

Step-3: Boundary value calculation (threshold assignment for the boundaries): for v = 1 to n, P_v = X / 2^v while 2 < X / 2^v.

Step-4: For all values in (-P_(2n+1) to -P_(2n+2)) set X_ki = 1, and for values in (-P_(2n) to -P_(2n+1)) set X_ki = 0, while n = 0 to X.

Step-5: For all values in (P_(2n+1) to P_(2n+2)) set X_ki = 0, and for values in (P_(2n) to P_(2n+1)) set X_ki = 1, while n = 0 to X.

Step-6: The value of X_ki remains unchanged if the value of X_ki matches one of the randomly chosen one-quarter of the Z values.
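Steps 1-6 leave several details open (exactly how the boundaries index the bands, and whether X_ki is binary or bipolar at this point). The Python sketch below is therefore only one possible reading, not the author's implementation: it assumes the boundaries come from repeatedly halving X, that the output bit alternates between neighbouring positive bands and is mirrored for negative net inputs, and that X_ki is treated as 0/1 as in Steps 4-5.

```python
import numpy as np

def derive_boundaries(Z):
    """Steps 2-3 (one reading): X = |Z_low| + |Z_high|; boundaries
    P_v = X / 2**v are generated while the halved value stays above 2."""
    X = abs(Z.min()) + abs(Z.max())
    boundaries, v = [], 1
    while X / 2 ** v > 2:
        boundaries.append(X / 2 ** v)
        v += 1
    return X, boundaries

def threshold_update(W, x, rng=None):
    """One possible reading of Steps 1-6: each net input Z_i is mapped
    to 0/1 according to the band |Z_i| falls in, alternating between
    neighbouring bands and mirrored for negative Z_i; a randomly chosen
    quarter of the nodes keep their previous value."""
    if rng is None:
        rng = np.random.default_rng(0)
    Z = W @ x                                      # Step-1: net inputs
    X, P = derive_boundaries(Z)                    # Steps 2-3
    edges = [0.0] + P[::-1] + [X]                  # ascending band edges
    new_x = x.copy()
    for i, z in enumerate(Z):
        band = np.searchsorted(edges, abs(z), side="right") - 1
        bit = band % 2                             # Step-5: alternate 0/1
        if z < 0:                                  # Step-4: mirrored bands
            bit = 1 - bit
        new_x[i] = bit
    keep = rng.choice(len(Z), size=len(Z) // 4, replace=False)
    new_x[keep] = x[keep]                          # Step-6: keep 1/4 unchanged
    return new_x
```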

2.2.1 Train phase

I have used binarized Bangla numerals in a minimum bounding box, normalized to 32 × 32 pixels (see the preprocessing sketch below). I then trained both networks with the Bengali digits 1, 2 and 3.

Fig 2. Binary representation of the numeric images 1 and 2
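The preprocessing code is not given in the paper; the following hypothetical sketch (assuming Pillow and NumPy are available, and dark ink on a light background) shows one way to perform the bounding-box normalization to 32 × 32 binary pixels described above.

```python
import numpy as np
from PIL import Image

def binarize_32x32(path, ink_threshold=128):
    """Load a digit image, crop it to the minimum bounding box of the
    ink, resize to 32x32 and binarize (1 = ink, 0 = background)."""
    grey = np.array(Image.open(path).convert("L"))
    ink = grey < ink_threshold                  # dark pixels are ink
    rows, cols = np.where(ink)
    box = ink[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
    box_img = Image.fromarray((box * 255).astype(np.uint8))
    resized = np.array(box_img.resize((32, 32)))
    return (resized > 127).astype(int)          # 32x32 binary matrix
```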

2.2.2 Test phase

I have tested both types of network with a corrupted image of the digits 1 and 2 (a sketch of the corruption step follows).

Fig 3. Binary representation of the test image
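The corrupted test image itself is not reproduced here; as a hypothetical illustration, a stored pattern (as a flattened bipolar vector) can be corrupted by flipping a fraction of its pixels before recall, reusing the train_hopfield and recall sketches from Section 2.1.

```python
import numpy as np

def corrupt(pattern, fraction=0.1, rng=None):
    """Flip the given fraction of pixels of a flattened bipolar (+1/-1)
    pattern vector to simulate a corrupted test image."""
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = pattern.copy()
    idx = rng.choice(pattern.size, size=int(fraction * pattern.size),
                     replace=False)
    noisy[idx] *= -1
    return noisy

# Hypothetical usage with the earlier sketches:
# W = train_hopfield(stored_patterns)
# recalled = recall(W, corrupt(stored_patterns[0]))
```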

3. TEST RESULTS

Case 1

With a threshold of 0, for an 8*8 pixel image with 10 different patterns and 10 training sets, the Hopfield neural network successfully recognizes the test image with no false states. The new threshold optimization algorithm also recognizes the test image successfully with no false states.

Case 2

With a threshold of 0, for a 32*32 image with 1 different pattern and 5 training sets, the Hopfield neural network failed to recognize the test image, producing a false state. With the new threshold values, the network successfully recognized the test image.

Case 3

With a threshold of 0, for a 32*32 image with 2 different patterns and 5 training sets, the Hopfield neural network failed to recognize the test image, producing a false state. With the new threshold values, it successfully recognized the test image.

Image size 8*8 Pixel

Threshold Value   Number of different patterns   Number of train sets   Recognition result
0                 10                             10                     Successful
New value         10                             10                     Successful

Image size 32*32 Pixel

Threshold Value   Number of different patterns   Number of train sets   Recognition result
0                 1                              1                      Failed with false pattern
New value         1                              1                      Successful

Image size 32*32 Pixel

Threshold Value   Number of different patterns   Number of train sets   Recognition result
0                 2                              5                      Failed with false pattern
New value         2                              5                      Successful

4. CONCLUSION

The new threshold optimization technique reduces false patterns and increases the success rate for large image sizes and large training sets.

5. ACKNOWLEDGMENTS

I am thankful to the Department of Computer Science & Engineering of Jadavpur University, Kolkata, for giving me the platform for planning and developing this work in the departmental laboratories.

6. REFERENCES

  1. A. Nag, S. Biswas, D. Sarkar, P. P. Sarkar, B. Gupta, "A Simple Feature Extraction Technique of a Pattern by Hopfield Network", International Journal of Advancements in Technology, ISSN 0976-4860.

  2. S. S. Young, P. D. Scott, N. M. Nasrabadi, "Object Recognition Using Multilayer Hopfield Neural Network", IEEE Transactions on Image Processing, 1997, Vol. 6, No. 3, pp. 357-372.

  3. J. J. Hopfield and D. W. Tank, "Neural Computation of Decisions in Optimization Problems", Biol. Cybern., No. 52.

  4. Bipul Pandey, Sushil Ranjan, Anupam Shukla and Ritu Tiwari, "Sentence Recognition Using Hopfield Neural Network", IJCSI International Journal of Computer Science Issues, Vol. 7, Issue 4, No. 6, July 2010.

Authors Profile

Shantanu Jana

B.Tech from WBUT and M.Tech from Jadavpur University, Kolkata, India. Former Assistant Professor, Computer Science and Engineering Department, Adamas Institute of Technology. Presently working at shansoft as an application developer.
