Semester project for 625-438.71 (Neural Networks)
This project explores learning rules for Hopfield networks, a neural network architecture conducive to pattern recognition and recall. A Hopfield network has a storage capacity limited by its size, and the learning rule used during training determines how much of that capacity can actually be exploited. The traditional learning rule for a Hopfield network is Hebbian: synaptic strengths are reinforced through repeated exposure of the network to exemplar patterns during training. Amos Storkey proposed an improvement to the Hebbian rule, now known as the Storkey learning rule, constructed specifically to allow more patterns to be stored in a network of the same size.
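For readers who prefer code, the sketch below shows one way the two rules can be written in NumPy, assuming bipolar (+1/-1) patterns stored as rows of an array. The function names and the zeroed-diagonal convention are illustrative choices for this sketch, not the project's actual API.

```python
import numpy as np


def hebbian_weights(patterns):
    """Hebbian rule: sum the outer product of every bipolar (+1/-1) pattern.

    Each pattern reinforces the connections between units that are active
    together, i.e. the repeated-exposure idea described above.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # Hopfield networks use no self-connections
    return W / n


def storkey_weights(patterns):
    """Storkey rule: store each new pattern relative to the local field
    induced by the patterns already learned, which reduces crosstalk and
    raises the usable capacity of a network of the same size.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        # Local field h[i, j] = sum over k != i, j of W[i, k] * p[k]
        full = W @ p
        h = full[:, None] - (np.diag(W) * p)[:, None] - W * p[None, :]
        # w_ij <- w_ij + (p_i p_j - p_i h_ji - h_ij p_j) / n
        W = W + (np.outer(p, p) - p[:, None] * h.T - h * p[None, :]) / n
        np.fill_diagonal(W, 0)  # keep the no-self-connection convention
    return W


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(10, 100))  # 10 patterns, 100 units
    W_hebb = hebbian_weights(patterns)
    W_stork = storkey_weights(patterns)
```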
This project and the accompanying paper serve as an introduction to Hopfield networks and the Hebbian and Storkey rules, and present experimental evidence that the Storkey rule allows for greater capacity.