Generation of Associative Memories Using Cellular Automata

DOI : 10.17577/IJERTV2IS111161


B. Luna-Benoso

Instituto Politécnico Nacional

Escuela Superior de Cómputo

Av. Juan de Dios Bátiz, esq.

Miguel Othón de Mendizábal, Ciudad de México 07320, México

J. C. Martínez Perales

Instituto Politécnico Nacional

Escuela Superior de Cómputo

Av. Juan de Dios Bátiz, esq.

Miguel Othón de Mendizábal, Ciudad de México 07320, México

R. Flores-Carapia

Instituto Politécnico Nacional

Centro de Innovación y Desarrollo Tecnológico en Cómputo

Av. Juan de Dios Bátiz, esq.

Miguel Othón de Mendizábal, Ciudad de México 07700, México

Abstract

This paper presents a model of associative memory implemented with cellular automata. The model was applied to the Iris Plant database, one of the databases available in the UCI Machine Learning Repository, and was compared with other models through their reported performance, using k-fold cross validation.

Keywords: cellular automaton, associative memories, pattern classification.

  1. Introduction

The concept of a cellular automaton (CA) was introduced in 1951 by John von Neumann [1]. Von Neumann defined a cellular automaton as a space able to reproduce itself [2]. Cellular automata are mathematical models in which the behavior of each element of the system depends on local interactions with the others. A d-dimensional CA consists of a d-dimensional lattice, extended infinitely, that represents the "space"; each site of the lattice is called a cell and has associated a state variable, called the cell state, that ranges over a finite set called the state set. Time advances in discrete stages, and the dynamics is given by an explicit rule called the local function; at each time stage, the local function is used by each cell to determine its new state from the current states of certain cells in its neighborhood. The cells alter their states synchronously, in discrete time stages, according to the local function. The lattice is homogeneous, so that all cells operate under the same local function. The assignment of states to all the cells of the lattice is called a configuration, which is regarded as the state of the whole lattice. Cellular automata have had a variety of applications in various science disciplines [3,4,5,6,7].

Apart from the field of cellular automata, there is the development and study of pattern recognition; one of the problems of this area is pattern classification. The objective in classification is to partition the feature space into regions, each of which is assigned to a category or class: different patterns must be assigned to some of the regions created in the feature space. In general, the full description of the classes is unknown; instead, there is a finite and reduced set of patterns that provides partial information about a specific problem.

Moreover, there is the development of associative memories, which have been in force since the early 1960s. The fundamental purpose of an associative memory is to correctly recover complete patterns from input patterns, which may be altered with additive, subtractive or combined noise. Pattern classification is one of the applications given to associative memories.

Several researchers have addressed the problem of developing models of associative memories [8,9,10,11,12,13] and have achieved important results in this field of research.

  2. Associative Memories

    An associative memory can be formulated as an input-output system, divided into two phases:

    Learning phase: $(x, y) \to M$ (associative memory generation).

    Recovering phase: $x \to M \to y$ (associative memory operation).

    The input pattern is represented by a column vector denoted $x$, and the output pattern by a column vector denoted $y$. Each input pattern forms an association with its corresponding output pattern; an association is written as an ordered pair $(x, y)$.

    The associative memory $M$ is represented by a matrix whose $ij$-th component is $m_{ij}$ [14]; the matrix $M$ is generated from a finite set of previously known associations, called the fundamental set. We denote by $p$ the cardinality of the fundamental set.

    The fundamental set is represented as follows:

    $\{(x^\mu, y^\mu) \mid \mu = 1, 2, \ldots, p\}$

    The patterns that form the associations of the fundamental set are called fundamental patterns.
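    To make the two phases concrete, the following minimal sketch treats $M$ as a plain lookup structure over the fundamental set. This is illustrative only and is not the paper's model (the concrete memory, built on a CA lattice, is constructed in Section 4); all names here are hypothetical.

```python
# Minimal sketch of the two phases of an associative memory. NOT the paper's
# model: M here is a lookup table, used only to illustrate the
# learning/recovering interface over a fundamental set of p associations.
def learn(fundamental_set):
    """Learning phase: build M from the associations (x^mu, y^mu)."""
    return {tuple(x): tuple(y) for x, y in fundamental_set}

def recover(M, x):
    """Recovering phase: present an input pattern x and obtain its y."""
    return M.get(tuple(x))

M = learn([((1, 0), (0, 1)), ((0, 1), (1, 0))])  # fundamental set with p = 2
print(recover(M, (1, 0)))                        # -> (0, 1)
```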

  3. Cellular Automata

Let $A$ be a countable family of closed intervals of $\mathbb{R}$ that satisfies the following conditions:

  1. $X = [a, b]$ for some $a, b \in \mathbb{R}$, or $X = \mathbb{R}$, for every $X \in A$.

  2. $[a_i, b_i] \in A \Rightarrow b_i - a_i > 0$.

  3. $[a_i, b_i], [c_j, d_j] \in A \Rightarrow [a_i, b_i] \cap [c_j, d_j] = \emptyset$, or $[a_i, b_i] \cap [c_j, d_j] = \{b_i\}$ with $b_i = c_j$.

Definition 3.1 Let $[a, b]$ be an interval of $\mathbb{R}$ with $a < b$, and let $A$ be a family of closed intervals that satisfies conditions 1, 2 and 3. A lattice of dimension 1, or 1-dimensional lattice, is the set $L = \{x_i \subseteq [a, b] \mid x_i \in A\}$. If $A_1, A_2, \ldots, A_n$ are interval families that satisfy conditions 1, 2 and 3, then a lattice of dimension $n \geq 1$ is the set $L = \{x_1 \times x_2 \times \cdots \times x_n \mid x_i \in A_i\}$.

Definition 3.2 Let $r \in \mathbb{R}$. A 1-dimensional lattice is regular if $|[a_i, b_i]| = r$ for each $[a_i, b_i] \in A$. An $n$-dimensional lattice is regular if $|[a_{i_k}, b_{i_k}]| = r$ for each $[a_{i_k}, b_{i_k}] \in A_i$, for $i = 1, \ldots, n$.

Definition 3.3 Let $L$ be a lattice. A cellule, cell or site is an element of $L$; that is, a cell is an element of the form $[a_{1_k}, b_{1_k}] \times \cdots \times [a_{n_k}, b_{n_k}]$ with $[a_{i_k}, b_{i_k}] \in A_i$ for $i = 1, \ldots, n$.

Definition 3.4 Let $L$ be a lattice and $r$ a cell of $L$. A neighbourhood of size $n \in \mathbb{N}$ of $r$ is a set $v(r) := \{k_1, k_2, \ldots, k_n \mid k_j \text{ is a cell of } L \text{ for each } j\}$.

Definition 3.5 Let $n \in \mathbb{N}$. A cellular automaton is a tuple $(L, S, N, f)$ such that:

  1. $L$ is a regular lattice.

  2. $S$ is a finite set of states.

  3. $N$ is a neighbourhood set, defined as $N = \{N(r) \mid r \text{ is a cell and } N(r) \text{ is a neighbourhood of } r \text{ of size } n\}$.

  4. $f : N \to S$ is a function called the transition function.

Definition 3.6 Let $Q = (L, S, N, f)$ and $W = (L, S, N', g)$ be two CA. The composition of the CA $Q$ and $W$ at time $t = t_0$, denoted $W * Q$, is defined as the CA $W * Q = (L, S, N, h)$, where $h$, $f$ and $g$ are related as follows:

$C_{t_0+1}(r) = f(\{C_{t_0}(i) : i \in N(r)\})$

$C_{t_0+2}(r) = g(\{C_{t_0+1}(i) : i \in N'(r)\})$

$C_{t_0+2}(r) = h(\{C_{t_0}(i) : i \in N(r)\})$

Fig. 2. Example of a CA composition.
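As a quick illustration of the tuple $(L, S, N, f)$ and of the composition of Definition 3.6, the sketch below implements a synchronous update and the composition of two rules. The finite one-dimensional lattice with wrap-around, and all names, are assumptions made only so the example runs; they are not part of the paper.

```python
# Minimal sketch of a CA (L, S, N, f) and of the composition W*Q of
# Definition 3.6, on a finite 1-D ring (an assumption for runnability).
def step(config, neigh, f):
    """One synchronous update: each cell applies f to its neighborhood states."""
    return [f(tuple(config[i] for i in neigh(r, len(config))))
            for r in range(len(config))]

def compose(config, neigh_q, f_q, neigh_w, g_w):
    """W*Q: apply Q's transition f, then W's transition g (Definition 3.6)."""
    return step(step(config, neigh_q, f_q), neigh_w, g_w)

def two_neighbors(r, size):
    """Neighborhood N(r) of size 2: the left and right neighbors of cell r."""
    return [(r - 1) % size, (r + 1) % size]

rule90 = lambda s: s[0] ^ s[1]   # XOR of the two neighbors (elementary rule 90)
print(step([0, 0, 0, 1, 0, 0, 0, 0], two_neighbors, rule90))
# -> [0, 0, 1, 0, 1, 0, 0, 0]
```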

4. Proposed model

This section builds the associative memory by means of cellular automata.

In what follows, consider the set $A = \{0, 1\}$ and the fundamental set

$CF = \{(x^\mu, y^\mu) \mid \mu = 1, 2, \ldots, p\}$

with $x^\mu \in A^n$ and $y^\mu \in A^m$.

The lattice $L$ for the CA will be composed of the matrix of size $2m \times 2n$, with first index the couple $(0, 0)$.

The set $S = \{0, 1\}$ is the finite set of states.

Let $I = \{i \mid i = 2k \text{ for some } k = 0, 1, 2, \ldots, n-1\} = \{0, 2, 4, \ldots, 2n-2\}$ and $J = \{j \mid j = 2k+1 \text{ for some } k = 0, 1, 2, \ldots, m-1\} = \{1, 3, 5, \ldots, 2m-1\}$.

Consider the partition of $L$ formed by the family of subsets $IJ = \{v_{(i,j)} \mid (i, j) \in I \times J\}$ with

$v_{(i,j)} = \{(i, j), (i, j-1), (i+1, j), (i+1, j-1)\}$.

Inasmuch as $IJ$ is a partition of $L$, given $l \in L$ there exists a unique $v_{(i,j)} \in IJ$ such that $l \in v_{(i,j)}$; it is denoted $v_l$. For example, if $l = (3, 0)$, then $v_l = v_{(2,1)} = \{(2, 1), (2, 0), (3, 1), (3, 0)\}$.

From the previous fact, the neighbourhood set is defined as $N = \{v_l \mid l \in L\}$.

The following set is defined:

$L_{CF} = \{(2j-2+y_i^\mu,\; 2i-2+x_j^\mu) \mid 1 \leq \mu \leq p,\; 1 \leq i \leq m,\; 1 \leq j \leq n\}$.
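The construction of $L_{CF}$ can be written directly from its definition. The sketch below is illustrative rather than the authors' code; it assumes the coordinate convention $(2j-2+y_i, 2i-2+x_j)$ stated above, with 1-based $i$ and $j$.

```python
# Minimal sketch of L_CF = {(2j-2+y_i^mu, 2i-2+x_j^mu)}; i, j are 1-based
# as in the text, and patterns are 0/1 sequences (x in A^n, y in A^m).
def build_lcf(fundamental_set):
    lcf = set()
    for x, y in fundamental_set:
        for i, yi in enumerate(y, start=1):        # 1 <= i <= m
            for j, xj in enumerate(x, start=1):    # 1 <= j <= n
                lcf.add((2 * j - 2 + yi, 2 * i - 2 + xj))
    return lcf
```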

Consider the cellular automata $Q = (L, S, N, f_Q)$ and $W = (L, S, N', f_W)$ with $N' = IJ$, and $f_Q : N \to S$, $f_W : N' \to S$ defined as follows:

$f_Q(v_{(i,j)})$ assigns to the cell $(i, j)$ the state 1 if $(i, j) \in L_{CF}$, and the state 0 if $(i, j) \notin L_{CF}$.

$f_W(v_{(i,j)})$ writes a 1 in the position $(i+1, j)$ if $(i, j-1) = 1$, and a 1 in the position $(i, j-1)$ if $(i+1, j) = 1$.

Definition 4.1 Consider the set $A^k$. The projection function of the $i$-th component ($1 \leq i \leq k$), $\Pr_i : A^k \to A$, is defined as $\Pr_i(z) = z_i$, with $z = (z_1, z_2, \ldots, z_k)$.

Proposition 4.2 If $(y_i, x_j) \in \Pr_{yx} = \{(y_i, x_j) \mid y_i = \Pr_i(y),\; x_j = \Pr_j(x)\}$, then $(2j-2+y_i, 2i-2+x_j) \in v_{(2j-2,\,2i-1)}$.

Demonstration. We have

$v_{(2j-2,\,2i-1)} = \{(2j-2, 2i-1), (2j-2, 2i-2), (2j-1, 2i-1), (2j-1, 2i-2)\}$.

Inasmuch as $(y_i, x_j) \in \Pr_{yx}$, $y_i = \Pr_i(y)$ and $x_j = \Pr_j(x)$, and inasmuch as $x \in A^n$ and $y \in A^m$, it holds that $y_i, x_j \in \{0, 1\}$; then:

  1. if $y_i = x_j = 0$, then $(2j-2+y_i, 2i-2+x_j) = (2j-2, 2i-2) \in v_{(2j-2,\,2i-1)}$;

  2. if $y_i = 0$ and $x_j = 1$, then $(2j-2+y_i, 2i-2+x_j) = (2j-2, 2i-1) \in v_{(2j-2,\,2i-1)}$;

  3. if $y_i = 1$ and $x_j = 0$, then $(2j-2+y_i, 2i-2+x_j) = (2j-1, 2i-2) \in v_{(2j-2,\,2i-1)}$;

  4. if $y_i = x_j = 1$, then $(2j-2+y_i, 2i-2+x_j) = (2j-1, 2i-1) \in v_{(2j-2,\,2i-1)}$.

The associative CA (CAA) in its learning phase is defined as $W * Q = (L, S, N, f_A)$.
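A sketch of the learning phase $W * Q$ follows: $f_Q$ writes $L_{CF}$ onto the $2m \times 2n$ lattice, then $f_W$ completes each neighborhood $v_{(i,j)}$. The neighborhood-completion rule for $f_W$ and the grid[row][column] indexing are assumptions taken from the definitions above, not a verified transcription of the authors' implementation.

```python
# Minimal sketch of the learning phase W*Q. Cells are addressed as (a, b)
# with a along the 2n axis and b along the 2m axis, stored as grid[b][a].
def learn_caa(lcf, m, n):
    grid = [[0] * (2 * n) for _ in range(2 * m)]
    for a, b in lcf:                      # f_Q: state 1 iff the cell is in L_CF
        grid[b][a] = 1
    for i in range(0, 2 * n, 2):          # neighborhoods v(i,j): i in I (even),
        for j in range(1, 2 * m, 2):      # j in J (odd)
            if grid[j - 1][i]:            # f_W: (i, j-1) = 1  =>  (i+1, j) <- 1
                grid[j][i + 1] = 1
            if grid[j][i + 1]:            # f_W: (i+1, j) = 1  =>  (i, j-1) <- 1
                grid[j - 1][i] = 1
    return grid
```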

For the recovering phase, two different algorithms are presented, whose aim is to recover the output pattern associated with an input pattern; first the max algorithm for recovering patterns is presented, followed by the min algorithm. In both, $(a, b) = 1$ means that the cell $(a, b)$ of the CAA configuration is in state 1.

Algorithm for recovering patterns, max

INPUT: pattern to be recognized $\tilde{x} \in A^n$.
OUTPUT: recovered pattern $\tilde{y} \in A^m$.
PROCESS:

    for i = 1, 2, …, m
        ỹ_i ← 0
        for j = 1, 2, …, n
            if x̃_j = 1 && (2j−2, 2i−1) = 1
                continue
            if x̃_j = 0 && ((2j−2, 2i−1) = 1 || (2j−1, 2i−1) = 1)
                continue
            else
                ỹ_i ← 1
                break
        end second for
    end first for

Algorithm for recovering patterns, min

INPUT: pattern to be recognized $\tilde{x} \in A^n$.
OUTPUT: recovered pattern $\tilde{y} \in A^m$.
PROCESS:

    for i = 1, 2, …, m
        ỹ_i ← 1
        for j = 1, 2, …, n
            if x̃_j = 0 && (2j−1, 2i−2) = 1
                continue
            if x̃_j = 1 && ((2j−2, 2i−2) = 1 || (2j−1, 2i−2) = 1)
                continue
            else
                ỹ_i ← 0
                break
        end second for
    end first for
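Translated into Python, the two recovery algorithms read as follows. This is a sketch of the loops above, not the authors' implementation; membership of a cell in the set of 1-cells (see build_lcf above) stands for "the cell is in state 1", which suffices for the worked example below.

```python
# Minimal sketch of the max and min recovery algorithms. lcf is the set of
# cells in state 1; x_tilde is the input pattern; m is the output length.
def recover_max(lcf, x_tilde, m):
    y = [0] * m                                       # max: default 0
    for i in range(1, m + 1):
        for j in range(1, len(x_tilde) + 1):
            if x_tilde[j - 1] == 1 and (2*j - 2, 2*i - 1) in lcf:
                continue
            if x_tilde[j - 1] == 0 and ((2*j - 2, 2*i - 1) in lcf
                                        or (2*j - 1, 2*i - 1) in lcf):
                continue
            y[i - 1] = 1                              # else branch
            break
    return y

def recover_min(lcf, x_tilde, m):
    y = [1] * m                                       # min: default 1
    for i in range(1, m + 1):
        for j in range(1, len(x_tilde) + 1):
            if x_tilde[j - 1] == 0 and (2*j - 1, 2*i - 2) in lcf:
                continue
            if x_tilde[j - 1] == 1 and ((2*j - 2, 2*i - 2) in lcf
                                        or (2*j - 1, 2*i - 2) in lcf):
                continue
            y[i - 1] = 0                              # else branch
            break
    return y
```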

Example 4.3. Let $m = 4$, $n = 3$ and $p = 3$. The fundamental set $CF = \{(x^1, y^1), (x^2, y^2), (x^3, y^3)\}$ is given by:

$x^1 = (1, 0, 0)^T$, $y^1 = (0, 0, 0, 1)^T$

$x^2 = (0, 1, 0)^T$, $y^2 = (1, 0, 0, 0)^T$

$x^3 = (0, 0, 1)^T$, $y^3 = (0, 1, 0, 0)^T$

  • The lattice $L$ is composed of the matrix of size $2m \times 2n = 8 \times 6$.

  • The states set is $S = \{0, 1\}$.

  • The neighbourhood set is given by $N = \{v_l \mid l \in L\}$.

  • The set $L_{CF}$ is the one on which $f_Q$ takes the value 1, and 0 on its complement (Figure 3a):

$L_{CF} = \{(2j-2+y_i^\mu,\, 2i-2+x_j^\mu) \mid 1 \leq \mu \leq 3,\, 1 \leq i \leq 4,\, 1 \leq j \leq 3\}$ = {(0,0), (1,0), (2,0), (4,0), (5,0), (0,1), (3,1), (4,1), (0,2), (1,2), (2,2), (3,2), (4,2), (0,3), (2,3), (5,3), (0,4), (2,4), (4,4), (0,5), (2,5), (4,5), (0,6), (2,6), (3,6), (4,6), (5,6), (1,7), (2,7), (4,7)}

  • Applying $f_W$ to the previous CA, the CAA shown in Figure 3b is obtained.

  • Now the max and min pattern-recovery algorithms are applied to the CAA.

Figure 3. Configuration of the CA of Example 4.3: in a) after applying $f_Q$, and in b) after applying $f_W$.
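Reusing the sketches given earlier (build_lcf, recover_max, recover_min), the example can be reproduced as follows.

```python
# Reproducing Example 4.3 with the sketches above. The constructed set has
# exactly the 30 cells of L_CF listed, and both algorithms recover y^1.
CF = [((1, 0, 0), (0, 0, 0, 1)),   # (x^1, y^1)
      ((0, 1, 0), (1, 0, 0, 0)),   # (x^2, y^2)
      ((0, 0, 1), (0, 1, 0, 0))]   # (x^3, y^3)
lcf = build_lcf(CF)
print(len(lcf))                          # -> 30
print(recover_max(lcf, (1, 0, 0), 4))    # -> [0, 0, 0, 1] = y^1
print(recover_min(lcf, (1, 0, 0), 4))    # -> [0, 0, 0, 1] = y^1
```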

Consider the input pattern

$\tilde{x} = x^1 = (1, 0, 0)^T$.

Table 1 shows the initial and the final value of each component of the output vector when the max algorithm for pattern recovery is applied. The first column shows the value of the variable $i$ of the first (outer) for of the algorithm; the second column shows the default value of $\tilde{y}_i$, which is 0; the third column shows the successive values of the loop variable $j$; the fourth column, the respective value of $\tilde{x}_j$; the fifth column, the condition of the algorithm that is satisfied, which depends on whether the value of $\tilde{x}_j$ is 0 or 1; and, finally, the sixth column shows the final value of the component $\tilde{y}_i$.

Table 1. Recovery of $y^1$ from the input pattern $x^1$ with the max algorithm.

i | ỹ_i (default) | j | x̃_j | condition satisfied          | ỹ_i (final)
1 | 0             | 1 | 1   | (2j−2, 2i−1) = 1, continue   |
  |               | 2 | 0   | (2j−1, 2i−1) = 1, continue   |
  |               | 3 | 0   | (2j−2, 2i−1) = 1, continue   | 0
2 | 0             | 1 | 1   | (2j−2, 2i−1) = 1, continue   |
  |               | 2 | 0   | (2j−2, 2i−1) = 1, continue   |
  |               | 3 | 0   | (2j−1, 2i−1) = 1, continue   | 0
3 | 0             | 1 | 1   | (2j−2, 2i−1) = 1, continue   |
  |               | 2 | 0   | (2j−2, 2i−1) = 1, continue   |
  |               | 3 | 0   | (2j−2, 2i−1) = 1, continue   | 0
4 | 0             | 1 | 1   | (2j−2, 2i−1) ≠ 1, else       | 1

From Table 1 we have $\tilde{y} = (0, 0, 0, 1)^T = y^1$: the pattern $y^1$ is recovered when the input pattern $x^1$ is presented, using the max algorithm for pattern recovery.

Similarly to the above, Table 2 shows the initial and the final value of each component of the output vector when the min algorithm for pattern recovery is applied; in this case the default value of $\tilde{y}_i$ is 1.

Table 2. Recovery of $y^1$ from the input pattern $x^1$ with the min algorithm.

i | ỹ_i (default) | j | x̃_j | condition satisfied          | ỹ_i (final)
1 | 1             | 1 | 1   | (2j−2, 2i−2) = 1, continue   |
  |               | 2 | 0   | (2j−1, 2i−2) ≠ 1, else       | 0
2 | 1             | 1 | 1   | (2j−2, 2i−2) = 1, continue   |
  |               | 2 | 0   | (2j−1, 2i−2) = 1, continue   |
  |               | 3 | 0   | (2j−1, 2i−2) ≠ 1, else       | 0
3 | 1             | 1 | 1   | (2j−2, 2i−2) = 1, continue   |
  |               | 2 | 0   | (2j−1, 2i−2) ≠ 1, else       | 0
4 | 1             | 1 | 1   | (2j−2, 2i−2) = 1, continue   |
  |               | 2 | 0   | (2j−1, 2i−2) = 1, continue   |
  |               | 3 | 0   | (2j−1, 2i−2) = 1, continue   | 1

From Table 2 we have $\tilde{y} = (0, 0, 0, 1)^T = y^1$: again the pattern $y^1$ is recovered when the pattern $x^1$ is presented, using the min algorithm for pattern recovery.

  5. Experiments and results

    For the experimental part, the Iris database provided by the University of California, Irvine, Machine Learning Repository was used, available at http://www.ics.uci.edu/~mlearn/mlrepository.html. The Iris database counts 150 instances, each with 4 real-valued attributes and no missing values, divided into 3 classes: Iris Setosa, Iris Versicolour and Iris Virginica. To validate the tests, the k-fold cross-validation method with k = 10 was considered. The CAA was applied in its learning phase; for the recovering phase, the max algorithm and the min algorithm were applied. Figure 4 shows the CAA configuration in its learning phase, and Table 3 shows the results of the model compared with other results reported for the Iris Plant database.
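    The validation protocol can be sketched as follows. Since the paper does not detail how the four real-valued Iris attributes are binarized for the CAA, the training and prediction functions are left as parameters; all names here are hypothetical.

```python
# Minimal sketch of 10-fold cross validation as used in the experiments.
# train_fn and predict_fn are placeholders for the model under evaluation
# (here: the CAA in its learning and recovering phases).
import random

def k_fold_accuracy(samples, labels, train_fn, predict_fn, k=10, seed=0):
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)
    folds = [idx[f::k] for f in range(k)]           # k disjoint folds
    accuracies = []
    for f in range(k):
        held_out = set(folds[f])
        train_idx = [i for i in idx if i not in held_out]
        model = train_fn([samples[i] for i in train_idx],
                         [labels[i] for i in train_idx])
        hits = sum(predict_fn(model, samples[i]) == labels[i]
                   for i in folds[f])
        accuracies.append(hits / len(folds[f]))
    return sum(accuracies) / k                      # mean accuracy over folds
```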

    Figure 4. CAA configuration in its learning phase for the Iris Plant database.

Model                        | Iris Plant (%)
Bayesian Network (K2) [15]   | 93.20
Adaboost NB [15]             | 94.80
Bagging NB [15]              | 95.53
NBTree [15]                  | 93.53
LogitBoostDS [15]            | 94.93
K-Means [16]                 | 89
Neural Gas [16]              | 91.7
MLP [17]                     | 95.99
NBT [17]                     | 93.99
PART [17]                    | 94.66
CAA with max recovery        | 99.33
CAA with min recovery        | 99.33

Table 3. Comparison of the proposed model with other models.

  6. Conclusions

    A model of associative memory based on cellular automata, which we call CAA, has been presented. For the learning phase, the CAA is built from a fundamental set. For the recovering phase there are two algorithms: the max and the min recovery algorithms. The model was applied to the Iris Plant database, from the databases available in the UCI Machine Learning Repository, and was compared with other models through their reported performance.

  7. References

  1. Fang-Fang Chen, Fang-Yue Chen. Complex dynamics of cellular automata rule 119. Physica A: Statistical Mechanics and its Applications, Volume 388, Issue 6, 15 March 2009, Pages 984-990.

  2. Ying Zhai, Zhong Yi, Pei-min Deng. On behavior of two-dimensional cellular automata with an exceptional rule. Information Sciences, Volume 179, Issue 5, 15 February 2009, Pages 613-622.

  3. S. Wolfram. Statistical mechanics of cellular automata. Rev. Mod. Phys. 55 (3) (1983), pp. 601-644.

  4. Matthew John M. Krane, David R. Johnson, Srinivasan Raghavan. The development of a cellular automaton-finite volume model for dendritic growth. Applied Mathematical Modelling, Volume 33, Issue 5, May 2009, Pages 2234-2247.

  5. E.A. Reis, L.B.L. Santos, S.T.R. Pinho. A cellular automata model for avascular solid tumor growth under the effect of therapy. Physica A: Statistical Mechanics and its Applications, Volume 388, Issue 7, 1 April 2009, Pages 1303-1314.

  6. M. Richard, K.J. Kirkby, R.P. Webb, N.F. Kirkby. Cellular automaton model of cell response to targeted radiation. Applied Radiation and Isotopes, Volume 67, Issue 3, March 2009, Pages 443-446.

  7. Takashi Nagatani. Traffic states and fundamental diagram in cellular automaton model of vehicular traffic controlled by signals. Physica A: Statistical Mechanics and its Applications, In Press, Corrected Proof, Available online 4 January 2009.

  8. Steinbuch, K. (1961). Die Lernmatrix, Kybernetik, 1, 1, 36-45.

  9. Willshaw, D., Buneman, O. & Longuet-Higgins, H. (1969). Non-holographic associative memory, Nature, 222, 960-962.

  10. Amari, S. (1972). Learning patterns and pattern sequences by self-organizing nets of threshold elements, IEEE Transactions on Computers, C-21, 11, 1197-1206.

  11. Hopfield, J.J. (1982). Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences, 79, 2554-2558.

  12. Kosko, B. (1988). Bidirectional associative memories, IEEE Transactions on Systems, Man, and Cybernetics, 18, 1, 49-60.

  13. Ritter, G. X., Sussner, P. & Diaz-de-Leon, J. L. (1998). Morphological associative memories, IEEE Transactions on Neural Networks, 9, 281-293.

  14. Yáñez-Márquez, C., Sánchez-Fernández, L. P., López-Yáñez, I., (2006). Alpha-Beta Associative Memories for Gray Level Patterns. Lecture Notes in Computer Science, LNCS 3971, Springer-Verlag Berlin Heidelberg, ISSN: 0302-9743, DOI: 10.1007/11759966_120, pp. 818-823.

  15. Kotsiantis, S. B., Pintelas, P. E., (2005). Logitboost of Simple Bayesian Classifier. Informatica.

  16. Camastra, F., Verri, A., (2005). A Novel Kernel Method for Clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 5.

  17. Wickramasinghe, L. K., Alahakoon, L. D., Smith-Miles, K., (2007). A novel Episodic Associative Memory model for enhanced classification accuracy. Pattern Recognition Letters 28, Elsevier, pp. 1193-1202.
