Associative Neural Networks Library
The library provides implementations of various Associative Neural Network models. Most of the work was done as part of my (still ongoing) PhD project and was supported by the INTAS Young Scientist Fellowship YSF 0355795.
Associative Neural Networks can be used for:
- associative memories (Content-Addressable Memory);
- classification problems;
- optimization problems.
Key features:
- distributed storage of information;
- "graceful degradation": destruction of individual neurons or of small groups of neurons reduces performance but does not have a devastating effect;
- parallel mode of operation (hardware friendly);
- different learning rules, including a fast non-iterative one that allows data to be added or erased incrementally.
The neural network model for associative memory was first proposed by J. Hopfield in 1982. It is a dynamical system of simple threshold units (neurons). The Hopfield model is fully connected, that is, each neuron receives the outputs of all other neurons (including itself).
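For concreteness, here is a minimal sketch of one synchronous update step of such a network (an illustration only, not the library's API; the matrix and state types are assumptions):

    #include <vector>

    // Minimal illustrative sketch, not the library's API: one synchronous
    // update step of a fully connected Hopfield network with bipolar
    // (+1/-1) threshold neurons.
    std::vector<int> hopfieldStep(const std::vector<std::vector<double> >& W,
                                  const std::vector<int>& state)
    {
        const size_t n = state.size();
        std::vector<int> next(n);
        for (size_t i = 0; i < n; ++i) {
            double h = 0.0;                  // local field of neuron i
            for (size_t j = 0; j < n; ++j)
                h += W[i][j] * state[j];     // outputs of all neurons, incl. i
            next[i] = (h >= 0.0) ? 1 : -1;   // simple threshold unit
        }
        return next;  // iterated until the state stops changing (a memory)
    }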
Using a sparsely connected Hopfield network has a number of advantages:
- a biologically more plausible network structure;
- more suitable for hardware implementation;
- faster and less memory-hungry in computer simulations;
- reveals the influence of the network connectivity pattern (architecture) on its behaviour.
The architecture of the sparse Hopfield network can be chosen in a number of ways:
- Random architecture – a given number of connections is placed between randomly chosen pairs of neurons.
- Adaptive architecture – the locations of the connections are chosen so as to (sub)maximize the associative performance of the network for a particular dataset [Dekhtyarenko2004, Dekhtyarenko2005].
- Cellular architecture – a local connectivity pattern in which only neighboring neurons are connected. This architecture favors hardware implementation and is inspired by the Cellular Neural Network (CNN) paradigm [Chua1988]. It yields the smallest total connection length, which is crucial in some applications.
- Small-World architecture – takes the best of both worlds, combining the associative performance of the random architecture with the mostly local connectivity pattern of the cellular network.

Comparison of the implemented architectures (more stars = better rating):

Architecture             | Implementation class                                    | Associative performance | Total connection length (the smaller the better) | Number of weights (the smaller the better)
-------------------------|---------------------------------------------------------|-------------------------|---------------------------------------------------|--------------------------------------------
Fully connected          | FullProjectiveNet                                       | *****                   | *                                                 | *
Sparse, adaptive         | AdaptiveCellularNet                                     | ****                    | **                                                | *****
Sparse, random           | PseudoInverseNet, HebbianCellularNet, DeltaCellularNet | ***                     | **                                                | *****
Small-World              | SmallWorldNet                                           | **                      | ***                                               | *****
Cellular (regular grid)  | PseudoInverseNet, HebbianCellularNet, DeltaCellularNet | *                       | ****                                              | *****
Note: the comparison of associative performance for the sparse adaptive, sparse random, Small-World, and cellular networks assumes an equal number of weights.
A number of learning rules (LRs) can be used to train the sparse Hopfield network. Here is a brief summary of the implemented LRs.

Learning rule         | Associative performance | Iterative | Incremental learning/deletion
----------------------|-------------------------|-----------|------------------------------
Projective            | *                       | no        | yes
Perceptron (Hebbian)  | ****                    | yes       | no
Delta Rule            | ****                    | yes       | no
Pseudo-Inverse        | ****                    | no        | yes
Note: in the Projective learning rule, the weight matrix of the fully connected network is obtained as in [Personnaz1986], and then all connections that do not satisfy the architectural constraints are simply cut.
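A hedged sketch of that construction: with the m stored patterns as columns of the N x m matrix \Xi, the projective rule of [Personnaz1986] sets the weight matrix to the orthogonal projector onto their span, and the sparse variant then zeroes the forbidden entries (the inverse form assumes linearly independent patterns; in general the Moore-Penrose pseudo-inverse \Xi^{+} applies):

    W \;=\; \Xi\,\Xi^{+} \;=\; \Xi\,(\Xi^{\mathsf T}\Xi)^{-1}\Xi^{\mathsf T},
    \qquad
    w_{ij} \leftarrow 0 \ \text{ for every pair } (i,j) \text{ not connected in the architecture.}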
The given release is dated August 15, 2005.
The library allows creating and testing various Associative Neural Networks (most of them based on the Hopfield model [Hopfield1982]). All networks operate in discrete time with bipolar (+1/−1) states and a synchronous convergence mode.
Testing functions allow tracking the evolution of network properties during training or as network parameters change.
One of the most important network characteristics is the attraction radius, which quantifies the network's performance as an associative memory. It is possible to find either the absolute value of the attraction radius, measured as a Hamming distance (the [test.cpp::getRAttraction] function), or its normalized value ([test.cpp::getNormalizedRAttraction]).
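As an illustration of what such a measurement involves, here is a hypothetical Monte-Carlo sketch, not the library's code (the 'converge' callback and the trial count are assumptions of mine):

    #include <algorithm>
    #include <vector>

    // Hypothetical sketch: estimate the attraction radius of one stored
    // pattern as the largest Hamming distance from which random probes
    // still converge back to it.  'converge' is assumed to run synchronous
    // updates until a fixed point is reached.
    int attractionRadius(const std::vector<int>& pattern,
                         std::vector<int> (*converge)(std::vector<int>),
                         int trialsPerDistance)
    {
        std::vector<size_t> idx(pattern.size());
        for (size_t i = 0; i < idx.size(); ++i) idx[i] = i;

        for (int dist = 1; dist <= (int)pattern.size(); ++dist)
            for (int t = 0; t < trialsPerDistance; ++t) {
                std::random_shuffle(idx.begin(), idx.end());
                std::vector<int> probe = pattern;
                for (int k = 0; k < dist; ++k)      // distort 'dist' components
                    probe[idx[k]] = -probe[idx[k]];
                if (converge(probe) != pattern)     // recall failed here
                    return dist - 1;                // radius in Hamming distance
            }
        return (int)pattern.size();
    }

Dividing the result by the network dimension would give the normalized value.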
class FullNet – abstract base class for fully connected models.
class CellularNet – abstract base class for sparsely connected models; provides efficient weight storage and manipulation.
class FullProjectiveNet – fully connected network with the Projective learning rule [Personnaz1986]. Implements the desaturation technique [Gorodnichy1997] and various retraining methods that allow network recovery after the failure of some neurons (this implementation was used in [Reznik2003a]).
class PseudoInverseNet – sparse network implementing the Pseudo-Inverse learning rule (PI LR) [Brucoli1995], which guarantees the storage of memory patterns as stable states for (almost) any network architecture.
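A sketch of the idea behind the PI LR, stated here under the row-wise formulation of [Brucoli1995] (the notation is mine): each neuron i solves an independent linear system over its neighbourhood N(i). With X_{N(i)} holding the components of the m stored patterns visible to neuron i, and x_i the 1 x m row of neuron i's desired outputs, the minimum-norm weight row is

    w_i \;=\; x_i\,X_{N(i)}^{+},
    \qquad\text{so that}\qquad
    w_i\,X_{N(i)} = x_i \ \text{ whenever the system is solvable,}

which makes every stored pattern a stable state of neuron i regardless of the architecture.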
class AdaptiveCellularNet – sparse network with the PI LR and an architecture that changes depending on the dataset: for a specified number of connections, the network itself finds the architecture that maximizes the associative performance on a given dataset [Dekhtyarenko2004, Dekhtyarenko2005].
class SmallWorldNet – sparse network with a Small-World architecture [WattsStrogatz1998] and the PI LR. Apart from the originally proposed random rewiring, SmallWorldNet implements a new systematic rewiring procedure, which further improves the associative properties of the network using the same number of shortcut connections [Dekhtyarenko2005a].
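For intuition, here is an illustrative sketch of the original Watts-Strogatz random rewiring [WattsStrogatz1998], not the SmallWorldNet implementation itself (the function name and edge-list representation are mine):

    #include <cstdlib>
    #include <utility>
    #include <vector>

    // Watts-Strogatz construction: start from a ring of n neurons each
    // linked to its k nearest neighbours on one side, then redirect each
    // local connection with probability p to a random target, creating
    // the long-range shortcuts of a small-world graph.
    std::vector<std::pair<int, int> > smallWorldEdges(int n, int k, double p)
    {
        std::vector<std::pair<int, int> > edges;  // connection list (i -> j)
        for (int i = 0; i < n; ++i)
            for (int d = 1; d <= k; ++d) {
                int j = (i + d) % n;              // regular (cellular) part
                if (std::rand() / (double)RAND_MAX < p)
                    j = std::rand() % n;          // rewired shortcut
                if (j != i)                       // skip self-connections
                    edges.push_back(std::make_pair(i, j));
            }
        return edges;
    }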
class HebbianCellularNet – sparse network with the Perceptron (Hebbian) learning rule [Diederich1987].
class DeltaCellularNet – sparse network with the Widrow-Hoff Delta learning rule [Widrow1960].
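As a hedged illustration of how such a local rule operates, here is a generic Widrow-Hoff update for one neuron of a sparse network (not the DeltaCellularNet API; all names are mine):

    #include <vector>

    // Widrow-Hoff delta update [Widrow1960] for one neuron of a sparse
    // network.  'nbr' lists the neurons connected to neuron i; iterating
    // this over all stored patterns drives the local field h_i toward the
    // pattern component x[i].
    void deltaStep(std::vector<double>& w,        // weights of neuron i
                   const std::vector<int>& nbr,   // indices of its neighbours
                   const std::vector<int>& x,     // bipolar training pattern
                   int i, double eta)             // neuron index, learning rate
    {
        double h = 0.0;
        for (size_t k = 0; k < nbr.size(); ++k)
            h += w[k] * x[nbr[k]];                // local field of neuron i
        const double err = x[i] - h;              // Widrow-Hoff error term
        for (size_t k = 0; k < nbr.size(); ++k)
            w[k] += eta * err * x[nbr[k]];        // gradient-style correction
    }

The Perceptron (Hebbian) rule of [Diederich1987] is similar in spirit but adds a fixed Hebbian increment, and only when the pattern is not yet stable at neuron i.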
class ModularNet – growing modular associative network with a large memory capacity [Reznik2003b]. As its basic building blocks (modules) it can use any of the network types mentioned above.
class BAMCellularNet – sparse bidirectional associative memory (under construction).
To build the library, run compileBorland.bat or compileMS.bat.
The library was created under Windows using Borland C++ Builder 5.0 with the "Language compliance" compiler option set to "Borland"; therefore, a couple of tricks are required to compile it with Microsoft CL v. 12 (from MS Visual Studio 6.0):
- #define for if (0) {} else for  (fixes the scope of variables defined in "for" statements)
- the /FORCE:MULTIPLE linker option
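The first trick works around the pre-standard "for" scoping of CL v. 12; a minimal example of the kind of code it rescues:

    // MS CL v. 12 keeps a loop variable alive after its loop (pre-standard
    // scoping), so a function that reuses the same loop-variable name fails
    // to compile with "error C2374: redefinition":
    void demo(int n)
    {
        for (int i = 0; i < n; ++i) { /* first loop */ }
        for (int i = 0; i < n; ++i) { /* second loop: C2374 under CL v. 12 */ }
    }
    // Expanding each 'for' to 'if (0) {} else for' confines the declaration
    // to the scope of the 'else' branch, restoring standard behaviour
    // without touching the source code.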
Run nets.exe with the name of an ini-file as its input. The provided file "_SmallWorld.ini" does the following:
1. A SmallWorldNet network is created with the following parameters:
   - dimension = 256
   - neuron connection radius = 10 (20 neighbours out of 255 possible, a connectivity degree of about 8%)
   - no diagonal weights
2. The network is trained with the data from the file "_256x256.dat" (bipolar patterns with random, equiprobable, and independent components) for patterns #1–14 (testNum = 1, numStored = 1:15:1+). After each additionally stored vector the network is tested, and the test results (including the attraction radius, the "attrR" field) are stored in the "_report.txt" file.
In addition to the attraction radius, the output file "_report.txt" contains a lot of other useful information, such as network architecture properties (number of connections, total connection length, ...), weight matrix properties (norm, trace, asymmetry degree, ...), estimates of associative performance (kappa measure, minimal aligned local field), and the actual testing results (average number of iterations, error portion, ...).
The results in the output file are easy to analyze using any spreadsheet application (MS Excel, ...).
If you have any questions or comments, I'd be glad to hear from you.

Oleksiy K. Dekhtyarenko, PhD student
Institute of Mathematical Machines and Systems
Department of Neurotechnologies
Glushkov Ave. 42, Kiev 03187, Ukraine
Tel.: +380445266221
Fax: +380445266457
Mobile: +380672366157
Email: name@domain, name=olexii, domain=mail.ru
ICQ: 86473901
[Brucoli1995] M. Brucoli, L. Carnimeo & G. Grassi, "Discrete-time cellular neural networks for associative memories with learning and forgetting capabilities", IEEE Transactions on Circuits and Systems, vol. 42, pp. 396–399, 1995.

[Chua1988] L. Chua & L. Yang, "Cellular Neural Networks: Theory", IEEE Transactions on Circuits and Systems, vol. 35(10), pp. 1257–1272, 1988.

[Dekhtyarenko2004] O. Dekhtyarenko, A. Reznik & A. Sitchov, "Associative Cellular Neural Networks with Adaptive Architecture", in proc. of the 8th IEEE International Biannual Workshop on Cellular Neural Networks and their Applications (CNNA'04), pp. 219–224, Budapest, Hungary, July 22–24, 2004.

[Dekhtyarenko2005] O. Dekhtyarenko, V. Tereshko & C. Fyfe, "Phase transition in sparse associative neural networks", in proc. of the European Symposium on Artificial Neural Networks (ESANN'05), Bruges, Belgium, April 27–29, 2005.

[Dekhtyarenko2005a] O. Dekhtyarenko, "Systematic Rewiring in Associative Neural Networks with Small-World Architecture", in proc. of the International Joint Conference on Neural Networks (IJCNN'05), pp. 1178–1181, Montreal, Quebec, Canada, July 31 – August 4, 2005 (poster).

[Diederich1987] S. Diederich & M. Opper, "Learning of correlated patterns in spin-glass networks by local learning rules", Physical Review Letters, vol. 58(9), pp. 949–952, 1987.

[Gorodnichy1997] D. Gorodnichy & A. Reznik, "Increasing Attraction of Pseudo-Inverse Autoassociative Networks", Neural Processing Letters, vol. 5(2), pp. 123–127, 1997.

[Hopfield1982] J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities", Proc. Natl. Acad. Sci. USA, vol. 79(8), pp. 2554–2558, 1982.

[Personnaz1986] L. Personnaz, I. Guyon & G. Dreyfus, "Collective computational properties of neural networks: New learning mechanisms", Physical Review A, vol. 34, pp. 4217–4228, 1986.

[Reznik2003a] A. Reznik, A. Sitchov, O. Dekhtyarenko & D. Nowicki, "Associative memories with "killed" neurons: the methods of recovery", in proc. of the International Joint Conference on Neural Networks (IJCNN'03), Portland, Oregon, US, July 20–24, 2003.

[Reznik2003b] A. Reznik & O. Dekhtyarenko, "Modular neural associative memory capable of storage of large amounts of data", in proc. of the International Joint Conference on Neural Networks (IJCNN'03), Portland, Oregon, US, July 20–24, 2003.

[WattsStrogatz1998] D. Watts & S. Strogatz, "Collective dynamics of 'small-world' networks", Nature, vol. 393, pp. 440–442, 1998.

[Widrow1960] B. Widrow & M. E. Hoff, Jr., "Adaptive switching circuits", IRE Western Electric Show and Convention Record, vol. 4, pp. 96–104, 1960.