What is an associative self-building neural network?
An associative self-building neural network is a new class of neural network that dynamically changes its structure during training. All data samples used for training are stored in the structure of the network. This means that an associative self-building neural network "remembers" every data sample presented during a training session, and each fragment of data is stored in it only once.
An associative self-building neural network applies an inductive way of learning. During training, the network is fed descriptions of data samples. Each data sample is marked with a positive or negative flag that denotes to which part of the investigated phenomenon the sample belongs. The network compares every new sample with all previously entered samples, derives the regularities peculiar to the different samplings, and fixes them in the network structure by forming new neurons and changing connection weights.
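A minimal Python sketch of this inductive scheme is given below. It is only an illustration, not the product's actual implementation: the class name, the representation of samples as sets of fragments, and the use of positive/negative counters as connection weights are all assumptions made for the example.

```python
# Illustrative sketch only: each fragment is stored once, each new
# combination of fragments gets a single new top-level neuron, and the
# positive/negative flag is accumulated as a simple connection weight.

class AssociativeNetwork:
    def __init__(self):
        self.fragments = {}   # fragment value -> node id (each stored once)
        self.top_nodes = {}   # frozenset of fragment node ids -> node id
        self.weights = {}     # top-level node id -> [positive count, negative count]
        self.next_id = 0

    def _node(self, table, key):
        """Return the existing node id for `key`, or create a new node."""
        if key not in table:
            table[key] = self.next_id
            self.next_id += 1
        return table[key]

    def train(self, sample, positive):
        """Add one labelled sample (an iterable of fragments) to the network."""
        fragment_ids = frozenset(self._node(self.fragments, f) for f in sample)
        top = self._node(self.top_nodes, fragment_ids)
        counts = self.weights.setdefault(top, [0, 0])
        counts[0 if positive else 1] += 1   # "connection weight" update

net = AssociativeNetwork()
net.train(["cloudy", "low pressure", "humid"], positive=True)
net.train(["sunny", "low pressure", "dry"], positive=False)
print(len(net.fragments), "fragment neurons,", len(net.top_nodes), "top-level neurons")
```

In this toy example the shared fragment "low pressure" is stored only once, while each labelled sample adds a single top-level neuron whose counters record to which sampling it belongs.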
The inductive way of learning embodied in an associative self-building neural network distinguishes it from commonly used neural networks, which realize a deductive way of learning. Inductive learning can continue indefinitely without the risk that the network becomes overtrained. Network size is limited only by the capacity of the available hard disks and by PC processing power: the more processing power the PC has, the more experience can be accumulated in the associative self-building neural network for further prediction or pattern recognition.
Accumulation of experience leads to growth of the network. At the beginning the network grows very fast, because many new associations are being fixed in its structure. As the network gathers knowledge, the generation rate of new associations slows down. In a saturated network each new training sample is stored with only one neuron at the top level of the network structure. This feature allows the network to compress data enormously.
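The saturation effect described above can be illustrated with another hypothetical sketch; the fragment pool, the random samples and the way nodes are counted are invented for the example, but they show why node creation per sample approaches one as the network fills with known fragments.

```python
# Illustrative sketch only: once most fragments are already stored,
# each new training sample adds just one new (top-level) node.
import random

random.seed(0)
vocabulary = [f"fragment_{i}" for i in range(50)]   # assumed fixed pool of fragments
stored_fragments = set()                            # every fragment stored only once

for step in range(1, 201):
    sample = random.sample(vocabulary, 5)           # one training sample
    new_fragments = [f for f in sample if f not in stored_fragments]
    stored_fragments.update(new_fragments)
    nodes_created = len(new_fragments) + 1          # new fragments + one top-level node
    if step in (1, 10, 50, 200):
        print(f"sample {step:3d}: {nodes_created} new node(s)")
```

Early samples create several nodes each; once the fragment vocabulary is exhausted, every further sample creates exactly one, which is the compression behaviour claimed above.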
Network size versus the amount of data is the second, and very important, problem. An associative self-building neural network can consist of tens of thousands of nodes. During recognition or prediction the network has to compare the description of a situation with its collected knowledge in order to make a decision, and a straightforward search through the data would be very time consuming. An associative self-building neural network overcomes this problem with a mechanism of associative search for relevant data: this mechanism activates only the fragment of the network that contains the required information and does not use the other, inactive part of the network.
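One way such an associative search can work is sketched below with an inverted index; this is an assumption made for illustration rather than a description of the actual mechanism. A query activates only the nodes that share at least one fragment with it, and the rest of the stored knowledge is never examined.

```python
# Illustrative sketch only: activate just the nodes that share a fragment
# with the query; all other nodes are never touched.
from collections import defaultdict

stored_samples = {
    0: {"fever", "cough", "fatigue"},
    1: {"rash", "fever"},
    2: {"headache", "nausea"},
}

# Inverted index: fragment -> ids of the nodes (samples) containing it.
index = defaultdict(set)
for node_id, fragments in stored_samples.items():
    for fragment in fragments:
        index[fragment].add(node_id)

def associative_search(query):
    """Activate only nodes sharing fragments with the query and rank
    them by the number of shared fragments."""
    activation = defaultdict(int)
    for fragment in query:
        for node_id in index.get(fragment, ()):
            activation[node_id] += 1
    return sorted(activation.items(), key=lambda kv: -kv[1])

print(associative_search({"fever", "cough"}))   # node 0 strongest; node 2 never examined
```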
An associative self-building neural network has the following advantages:
- Inductive way of learning on a set of examples.
- Insensitivity to (moderate) noise or unreliability
in the data.
- It can evolve permanently by means of additional training sessions.
- The user cannot directly change the network structure (the number of neurons and connections), which reflects only the data used for training.
- Absorbs enormous amounts of data without
conventional programming.
- Network size is limited only by the size of the hard disk.
- Provides an enormously compressed data representation.
- Associative way of data retrieval from the network.
- The network is portable: data is stored in a format that will not become obsolete with a change of hardware.
- Users need little training.
Where can an associative self-building neural network be applied?
An associative self-building neural network can be effectively applied in areas where conventional technologies cannot be used because the relationships involved are ill-defined or very complex.
- Forecasting future changes in the prices of stocks, exchange rates and commodities.
- Solar flare prognosis.
- Risk management.
- Fault diagnosis.
- Medical diagnosis.
Please send any comments or suggestions by e-mail.