Hopfield Neural Networks
A Hopfield network (named after John Hopfield) is a recurrent network, since the flow of activation through the network contains loops. These networks are trained by applying input patterns and letting the network settle into a stable state that stores those patterns.
The example code is in the file src/loving_snippets/Hopfield_neural_network.lisp.
The example we look at recognizes patterns that are similar to the patterns seen in training examples and maps each input pattern to the most similar training pattern. The following figure shows output from the example program: an original training pattern, a similar pattern with one cell turned on and another turned off, and the reconstructed pattern:
To be clear: we take one of the original input patterns the network has learned, slightly alter it, and apply it as input to the network. After cycling the network, the slightly scrambled input pattern acts as an associative memory key; the network looks up the original pattern and overwrites the input values with the learned pattern. Hopfield networks are very different from backpropagation networks: neuron activations are forced to values of -1 or +1 (and so are not differentiable), and there are no separate output neurons.
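Why does cycling the network settle at all? Each cell update can only decrease (or leave unchanged) a global energy value, so repeated updates drive the network into a local energy minimum, which is ideally one of the stored patterns. For background, the standard Hopfield energy function (notation for reference, not code from the example file) is

$$E = -\frac{1}{2} \sum_{i \neq j} w_{ij} \, s_i \, s_j$$

where $s_i \in \{-1, +1\}$ is the activation of neuron $i$ and $w_{ij}$ is the symmetric weight connecting neurons $i$ and $j$.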
The next example has three cell values modified from the original pattern, and the original is still reconstructed correctly:
This last example has four of the original cells modified:
The following example program shows a type of content-addressable memory. After a Hopfield network learns a set of input patterns, it can reconstruct the original patterns when shown similar patterns. This reconstruction is not always perfect.
The following function Hopfield-Init (in file Hopfield_neural_network.lisp) is passed a list of lists of training examples that will be remembered in the network. This function returns a list containing the data defining a Hopfield neural network. All data for the network is encapsulated in the list returned by this function, so multiple Hopfield neural networks can be used in an application program.
In lines 9-12 we allocate global arrays for data storage and in lines 14-18 the training data is copied.
The inner function adjustInput on lines 20-23 adjusts data values to -1.0 or +1.0, and the loop on lines 25-29 applies it to the training data. In lines 31-33 we initialize all of the weights in the Hopfield network to zero.
The nested loops on lines 35-47 calculate the autocorrelation weight matrix from the input test patterns, and lines 48-52 precompute partial row sums of the weight matrix in *tempStorage*; these values are used later as per-neuron thresholds in the recall function.
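For reference, the weight calculation implements the standard Hebbian autocorrelation rule (background notation, not part of the example file):

$$w_{ij} = \sum_{s=1}^{S} x_i^{(s)} \, x_j^{(s)} \quad \text{for } i \neq j, \qquad w_{ii} = 0$$

where $x^{(s)}$ is training pattern $s$ after its values have been adjusted to -1 or +1, and $S$ is the number of training examples.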
On lines 54-56, the function returns a representation of the Hopfield network that will be used later in the function HopfieldNetRecall to find the most similar “remembered” pattern given a new (fresh) input pattern.
1 (defun Hopfield-Init (training-data
2 &aux temp *num-inputs* *num-training-examples*
3 *training-list* *inputCells* *tempStorage*
4 *HopfieldWeights*)
5
6 (setq *num-inputs* (length (car training-data)))
7 (setq *num-training-examples* (length training-data))
8
9 (setq *training-list* (make-array (list *num-training-examples* *num-inputs*)))
10 (setq *inputCells* (make-array (list *num-inputs*)))
11 (setq *tempStorage* (make-array (list *num-inputs*)))
12 (setq *HopfieldWeights* (make-array (list *num-inputs* *num-inputs*)))
13
14 (dotimes (j *num-training-examples*) ;; copy training data
15 (dotimes (i *num-inputs*)
16 (setf
17 (aref *training-list* j i)
18 (nth i (nth j training-data)))))
19
20 (defun adjustInput (value) ;; this function is lexically scoped
21 (if (< value 0.1)
22 -1.0
23 +1.0))
24
25 (dotimes (i *num-inputs*) ;; adjust training data
26 (dotimes (n *num-training-examples*)
27 (setf
28 (aref *training-list* n i)
29 (adjustInput (aref *training-list* n i)))))
30
31 (dotimes (i *num-inputs*) ;; zero weights
32 (dotimes (j *num-inputs*)
33 (setf (aref *HopfieldWeights* i j) 0)))
34
35 (dotimes (j-1 (- *num-inputs* 1)) ;; autocorrelation weight matrix
36 (let ((j (+ j-1 1)))
37 (dotimes (i j)
38 (dotimes (s *num-training-examples*)
39 (setq temp
40 (truncate
41 (+
42 (* ;; 2 if's truncate values to -1 or 1:
43 (adjustInput (aref *training-list* s i))
44 (adjustInput (aref *training-list* s j)))
45 (aref *HopfieldWeights* i j))))
46 (setf (aref *HopfieldWeights* i j) temp)
47 (setf (aref *HopfieldWeights* j i) temp)))))
48 (dotimes (i *num-inputs*)
49 (setf (aref *tempStorage* i) 0)
50 (dotimes (j i)
51 (setf (aref *tempStorage* i)
52 (+ (aref *tempStorage* i) (aref *HopfieldWeights* i j)))))
53
54 (list ;; return the value of the Hopfield network data object
55 *num-inputs* *num-training-examples* *training-list*
56 *inputCells* *tempStorage* *HopfieldWeights*))
The following function HopfieldNetRecall iterates the network to let it settle into a stable pattern, which we hope will be the original training pattern most closely resembling the noisy test pattern.
The inner (lexically scoped) function deltaEnergy, defined on lines 9-12, calculates a change in energy from the current input values and the autocorrelation weight matrix. The main code on lines 14-19 iterates over the input cells, setting the cell at index i based on whether its delta energy is greater than zero. Remember that the lexically scoped inner function has access to the variables for the number of inputs, the number of training examples, the list of training examples, the input cell values, temporary storage, and the Hopfield network weights.
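In formula form, the quantity that deltaEnergy computes for the cell at row-index $i$ is

$$\Delta E_i = 2 \sum_{j} w_{ij} \, y_j \;-\; \theta_i$$

where $y$ is the current vector of input cell values and $\theta_i$ is the partial weight-row sum that Hopfield-Init stored in *tempStorage*. The main loop sets cell $i$ to 1 when $\Delta E_i > 0$ and to 0 otherwise.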
1 (defun HopfieldNetRecall (aHopfieldNetwork numberOfIterations)
2 (let ((*num-inputs* (nth 0 aHopfieldNetwork))
3 (*num-training-examples* (nth 1 aHopfieldNetwork))
4 (*training-list* (nth 2 aHopfieldNetwork))
5 (*inputCells* (nth 3 aHopfieldNetwork))
6 (*tempStorage* (nth 4 aHopfieldNetwork))
7 (*HopfieldWeights* (nth 5 aHopfieldNetwork)))
8
9 (defun deltaEnergy (row-index y &aux (temp 0.0)) ;; lexically scoped
10 (dotimes (j *num-inputs*)
11 (setq temp (+ temp (* (aref *HopfieldWeights* row-index j) (aref y j)))))
12 (- (* 2.0 temp) (aref *tempStorage* row-index)))
13
14 (dotimes (ii numberOfIterations) ;; main code
15 (dotimes (i *num-inputs*)
16 (setf (aref *inputCells* i)
17 (if (> (deltaEnergy i *inputCells*) 0)
18 1
19 0))))))
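Before moving on to the full test program with plotting, here is a minimal sketch of how these two functions fit together at a REPL. The two eight-element patterns are made up for illustration, and the sketch assumes Hopfield-Init and HopfieldNetRecall from this file are already loaded:

(let* ((net (Hopfield-Init '((1 1 1 1 0 0 0 0)
                             (0 0 0 0 1 1 1 1))))
       (patterns (nth 2 net))  ; the *training-list* 2D array
       (cells (nth 3 net)))    ; the *inputCells* 1D array
  ;; load the first stored pattern into the input cells:
  (dotimes (i 8)
    (setf (aref cells i) (aref patterns 0 i)))
  (setf (aref cells 2) -1)     ; corrupt one cell value
  (HopfieldNetRecall net 5)    ; let the network settle
  (print cells))               ; settled cells should match the first pattern as 1s and 0s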
The function test in the next listing uses three different patterns for each test. Note that only the last pattern remains visible in the output graphics PNG file used to produce the figures for this chapter. If you want to produce plots of other patterns, edit just the third pattern defined on line 5 of the listing. The following plotting functions are lexically scoped inner functions, so they have access to the data defined in the enclosing let expression on lines 16-21:
- plotExemplar - plots a vector of data
- plot-original-inputCells - plots the original input cells from training data
- plot-inputCells - plots the modified input cells (a few cells randomly flipped in value)
- modifyInput - scrambles training inputs
1 (defun test (&aux aHopfieldNetwork)
2 (let ((tdata '( ;; sample sine wave data with different periods:
3 (1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 0 0 0)
4 (0 1 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 1 0 0 1 0)
5 (0 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 1 1 0 1 1)))
6 (width 300)
7 (height 180))
8 (vecto::with-canvas (:width width :height height)
9 (plotlib:plot-string-bold 10 (- height 14) "Hopfield pattern classifier")
10
11 ;; Set up network:
12 (print tdata)
13 (setq aHopfieldNetwork (Hopfield-Init tdata))
14
15 ;; lexically scoped variables are accessible by inner functions:
16 (let ((*num-inputs* (nth 0 aHopfieldNetwork))
17 (*num-training-examples* (nth 1 aHopfieldNetwork))
18 (*training-list* (nth 2 aHopfieldNetwork))
19 (*inputCells* (nth 3 aHopfieldNetwork))
20 (*tempStorage* (nth 4 aHopfieldNetwork))
21 (*HopfieldWeights* (nth 5 aHopfieldNetwork)))
22
23 (defun plotExemplar (row &aux (dmin 0.0) (dmax 1.0) (x 20) (y 40))
24 (let ((YSize (array-dimension *training-list* 1)))
25 (plotlib:plot-string (+ x 20) (- height (- y 10))
26 "Original Training Exemplar")
27 (dotimes (j Ysize)
28 (plotlib:plot-fill-rect
29 (+ x (* j plot-size+1)) (- height y) plot-size plot-size
30 (truncate (*
31 (/ (- (aref *training-list* row j) dmin)
32 (- dmax dmin))
33 5)))
34 (plotlib:plot-frame-rect (+ x (* j plot-size+1))
35 (- height y) plot-size plot-size))))
36
37 (defun plot-original-inputCells (&aux (dmin 0.0) (dmax 1.0) (x 20) (y 80))
38 (let ((Xsize (array-dimension *inputCells* 0)))
39 (plotlib:plot-string (+ x 20) (- height (- y 10)) "Scrambled Inputs")
40 (dotimes (j Xsize)
41 (plotlib:plot-fill-rect
42 (+ x (* j plot-size+1)) (- height y) plot-size plot-size
43 (truncate (*
44 (/ (- (aref *inputCells* j) dmin) (- dmax dmin))
45 5)))
46 (plotlib:plot-frame-rect (+ x (* j plot-size+1))
47 (- height y) plot-size plot-size))))
48
49 (defun plot-inputCells (&aux (dmin 0.0) (dmax 1.0) (x 20) (y 120))
50 (let ((Xsize (array-dimension *inputCells* 0)))
51 (plotlib:plot-string (+ x 20) (- height (- y 10))
52 "Reconstructed Inputs")
53 (dotimes (j Xsize)
54 (plotlib:plot-fill-rect
55 (+ x (* j plot-size+1)) (- height y) plot-size plot-size
56 (truncate (* (/
57 (- (aref *inputCells* j) dmin)
58 (- dmax dmin))
59 5)))
60 (plotlib:plot-frame-rect
61 (+ x (* j plot-size+1)) (- height y) plot-size plot-size))))
62
63 (defun modifyInput (arrSize arr) ;; modify input array for testing
64 (dotimes (i arrSize)
65 (if (< (random 50) 5)
66 (if (> (aref arr i) 0)
67 (setf (aref arr i) -1)
68 (setf (aref arr i) 1)))))
69
70 ;; Test network on training data that is randomly modified:
71
72 (dotimes (iter 10) ;; cycle 10 times and make 10 plots
73 (dotimes (s *num-training-examples*)
74 (dotimes (i *num-inputs*)
75 (setf (aref *inputCells* i) (aref *training-list* s i)))
76 (plotExemplar s)
77 (modifyInput *num-inputs* *inputCells*)
78 (plot-original-inputCells)
79 (dotimes (call-net 5) ;; iterate Hopfield net 5 times
80 (HopfieldNetRecall aHopfieldNetwork 1) ;; calling with 1 iteration
81 (plot-inputCells)))
82
83 (vecto::save-png
84 (concatenate
85 'string
86 "output_plot_hopfield_nn_" (format nil "~5,'0d" iter) ".png")))))))
The plotting functions in lines 23-61 use the plotlib library to make the plots you saw earlier. The function modifyInput in lines 63-68 randomly flips the values of roughly ten percent of the input cells, taking an original pattern and slightly modifying it.
Hopfield neural networks, at least to some extent, seem to model some aspects of human brains in the sense that they can function as content-addressable (also called associative) memories. Ideally a partial input pattern from a remembered input can reconstruct the complete original pattern. Another interesting feature of Hopfield networks is that these memories really are stored in a distributed fashion: some of the weights can be randomly altered and patterns are still remembered, but with more recall errors.
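The distributed-storage claim is easy to try for yourself. Here is a minimal sketch (the patterns and the amount of damage are made-up values, and the functions defined earlier in this chapter are assumed to be loaded) that zeroes a few randomly chosen weights and then runs recall on a corrupted pattern:

(let* ((net (Hopfield-Init '((1 0 1 0 1 0 1 0)
                             (1 1 0 0 1 1 0 0))))
       (weights (nth 5 net))   ; the *HopfieldWeights* 2D array
       (patterns (nth 2 net))  ; the *training-list* 2D array
       (cells (nth 3 net))     ; the *inputCells* 1D array
       (n (nth 0 net)))        ; number of inputs
  (dotimes (k 6)               ; zero six randomly chosen weights
    (let ((i (random n)) (j (random n)))
      (setf (aref weights i j) 0)
      (setf (aref weights j i) 0)))  ; keep the weight matrix symmetric
  ;; note: the thresholds in *tempStorage* are left as computed at init time
  (dotimes (i n)               ; reload the first stored pattern
    (setf (aref cells i) (aref patterns 0 i)))
  (setf (aref cells 0) -1)     ; corrupt one cell
  (HopfieldNetRecall net 10)
  (print cells))               ; usually still matches the first pattern, sometimes with errors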