At each adaptation step a connection (edge) between the winner and the second-nearest unit is created (this is competitive Hebbian learning). Since the reference vectors are adapted according to the neural gas method, a mechanism is needed to remove edges which are no longer valid. This is done by a local edge aging mechanism. The complete neural gas with competitive Hebbian learning algorithm is the following:
Initialize the connection set C, C ⊆ A × A, to the empty set: C = ∅.
Initialize the time parameter t: t = 0.
Suitable initial values and final values have to be chosen for the time-dependent parameters (the adaptation step size, the neighborhood range, and the maximum edge age), which are decayed from their initial to their final values over the course of the simulation.
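The algorithm outlined above can be sketched as follows. This is a minimal illustration, not a reference implementation: the parameter names and default values (eps, lam, T and their initial/final settings) are assumptions in the spirit of Martinetz et al. (1993), and edges are stored as a symmetric age matrix where a negative entry means "no edge".

```python
import numpy as np

def neural_gas_chl(data, n_units=100, t_max=40000,
                   eps_i=0.5, eps_f=0.005,   # step size (assumed values)
                   lam_i=10.0, lam_f=0.01,   # neighborhood range (assumed)
                   T_i=20.0, T_f=200.0,      # maximum edge age (assumed)
                   seed=0):
    """Sketch of neural gas with competitive Hebbian learning."""
    rng = np.random.default_rng(seed)
    # Initialize reference vectors from randomly drawn data points.
    w = data[rng.integers(len(data), size=n_units)].astype(float).copy()
    # Edge ages; -1 marks "no edge", so the connection set C starts empty.
    age = -np.ones((n_units, n_units))

    def anneal(v_i, v_f, t):
        # Exponential decay from the initial to the final value.
        return v_i * (v_f / v_i) ** (t / t_max)

    for t in range(t_max):
        x = data[rng.integers(len(data))]
        eps = anneal(eps_i, eps_f, t)
        lam = anneal(lam_i, lam_f, t)
        T = anneal(T_i, T_f, t)
        # Neural gas step: rank all units by distance to the input and
        # adapt each one by an amount decaying with its rank.
        d = np.linalg.norm(w - x, axis=1)
        order = np.argsort(d)
        rank = np.empty(n_units)
        rank[order] = np.arange(n_units)
        w += (eps * np.exp(-rank / lam))[:, None] * (x - w)
        # Competitive Hebbian learning: age the winner's edges, then
        # create/refresh the edge between winner and second-nearest unit.
        s1, s2 = order[0], order[1]
        mask = age[s1] >= 0
        age[s1, mask] += 1
        age[mask, s1] += 1
        age[s1, s2] = age[s2, s1] = 0
        # Local edge aging: remove the winner's edges that exceeded T.
        expired = age[s1] > T
        age[s1, expired] = -1
        age[expired, s1] = -1
    return w, age >= 0
```

For example, running the sketch on points drawn from a ring-shaped distribution yields centers spread along the ring, with edges connecting neighboring centers.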
Figure 5.5 shows some stages of a simulation for a simple ring-shaped data distribution. Figure 5.6 displays the final results after 40000 adaptation steps for three other distributions. Following Martinetz et al. (1993) we used the parameter values proposed there. The network size N was set to 100.
Figure 5.5: Neural gas with competitive Hebbian learning simulation sequence for a ring-shaped uniform probability distribution. a) Initial state. b-f) Intermediate states. g) Final state. h) Voronoi tessellation corresponding to the final state. The centers move according to the neural gas algorithm. Additionally, however, edges are created by competitive Hebbian learning and removed if they are not ``refreshed'' for a while.
Figure 5.6: Neural gas with competitive Hebbian learning simulation results after 40000 input signals for three different probability distributions (described in the caption of figure 4.4).