

Neural Gas plus Competitive Hebbian Learning


This method (Martinetz and Schulten, 1991, 1994) is a straightforward superposition of neural gas and competitive Hebbian learning. It is sometimes denoted as ``topology-representing networks'' (Martinetz and Schulten, 1994). This term, however, is rather general and would also apply to the growing neural gas model described later.

At each adaptation step a connection between the winner and the second-nearest unit is created (this is competitive Hebbian learning). Since the reference vectors are adapted according to the neural gas method, a mechanism is needed to remove edges that are no longer valid. This is done by a local edge aging mechanism. The complete neural gas with competitive Hebbian learning algorithm is the following:

  1. Initialize the set $A$ to contain $N$ units $c_i$:
     $A = \{c_1, c_2, \ldots, c_N\}$
     with reference vectors $w_{c_i} \in \mathbb{R}^n$ chosen randomly according to $p(\xi)$.

     Initialize the connection set $C$, $C \subseteq A \times A$, to the empty set:
     $C = \emptyset$.

     Initialize the time parameter $t$:
     $t = 0$.

  2. Generate at random an input signal $\xi$ according to $p(\xi)$.
  3. Order all elements of $A$ according to their distance to $\xi$, i.e., find the sequence of indices $(i_0, i_1, \ldots, i_{N-1})$ such that $w_{i_0}$ is the reference vector closest to $\xi$, $w_{i_1}$ is the reference vector second-closest to $\xi$, and $w_{i_k}$, $k = 0, \ldots, N-1$, is the reference vector such that $k$ vectors $w_j$ exist with $\|\xi - w_j\| < \|\xi - w_{i_k}\|$. Following Martinetz et al. (1993) we denote with $k_i(\xi, A)$ the number $k$ associated with $w_i$.
  4. Adapt the reference vectors according to
     $\Delta w_i = \epsilon(t) \, h_\lambda(k_i(\xi, A)) \, (\xi - w_i) \quad (\forall i \in A)$
     with the following time-dependencies:
     $\lambda(t) = \lambda_i \, (\lambda_f/\lambda_i)^{t/t_{\max}}$,
     $\epsilon(t) = \epsilon_i \, (\epsilon_f/\epsilon_i)^{t/t_{\max}}$,
     $h_\lambda(k) = \exp(-k/\lambda(t))$.

  5. If it does not exist already, create a connection between $i_0$ and $i_1$:
     $C = C \cup \{(i_0, i_1)\}$.
     Set the age of the connection between $i_0$ and $i_1$ to zero (``refresh'' the edge):
     $\mathrm{age}_{(i_0, i_1)} = 0$.
  6. Increment the age of all edges emanating from $i_0$:
     $\mathrm{age}_{(i_0, c)} = \mathrm{age}_{(i_0, c)} + 1 \quad (\forall c \in N_{i_0})$.
     Thereby, $N_c$ is the set of direct topological neighbors of $c$ (see equation 2.5).
  7. Remove edges with an age larger than the maximal age $T(t)$, whereby
     $T(t) = T_i \, (T_f/T_i)^{t/t_{\max}}$.
  8. Increase the time parameter $t$:
     $t = t + 1$.
  9. If $t < t_{\max}$ continue with step 2.
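The steps above can be sketched in Python as follows. This is an illustrative implementation only, not the code used for the simulations; the parameter defaults (e.g. eps_i, lam_i, the maximal-age endpoints) are placeholder assumptions rather than values from the text, and sampling from a finite point list stands in for drawing from $p(\xi)$.

```python
import math
import random

def ng_chl(data, n_units=20, t_max=5000,
           eps_i=0.5, eps_f=0.005, lam_i=10.0, lam_f=0.01,
           age_i=20.0, age_f=200.0, seed=0):
    """Neural gas with competitive Hebbian learning (sketch).

    `data` is a list of input vectors (tuples); drawing from it stands in
    for sampling from p(xi). All parameter defaults are illustrative.
    """
    rng = random.Random(seed)
    # Step 1: units with reference vectors drawn from the data distribution;
    # the connection set starts empty.
    w = [list(rng.choice(data)) for _ in range(n_units)]
    edges = {}  # (i, j) with i < j  ->  age

    def decay(g_i, g_f, t):
        # Exponential interpolation g(t) = g_i * (g_f / g_i) ** (t / t_max).
        return g_i * (g_f / g_i) ** (t / t_max)

    for t in range(t_max):
        xi = rng.choice(data)  # step 2: draw an input signal
        # Step 3: rank all units by (squared) distance to xi.
        order = sorted(range(n_units),
                       key=lambda i: sum((a - b) ** 2 for a, b in zip(w[i], xi)))
        eps, lam = decay(eps_i, eps_f, t), decay(lam_i, lam_f, t)
        # Step 4: neural gas adaptation; strength falls off with the rank k.
        for k, i in enumerate(order):
            h = math.exp(-k / lam)
            w[i] = [a + eps * h * (b - a) for a, b in zip(w[i], xi)]
        # Step 5: create/refresh the edge between winner and second winner.
        s0, s1 = order[0], order[1]
        fresh = (min(s0, s1), max(s0, s1))
        edges[fresh] = 0
        # Step 6: age all other edges emanating from the winner
        # (the freshly refreshed edge keeps age 0).
        for e in edges:
            if s0 in e and e != fresh:
                edges[e] += 1
        # Step 7: prune edges older than the maximal age T(t).
        T = decay(age_i, age_f, t)
        edges = {e: a for e, a in edges.items() if a <= T}
    return w, edges
```

Because every unit is adapted at each step (with exponentially decaying strength in the rank), a single step costs O(N log N) for the sort plus O(N) for the updates.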

For the time-dependent parameters $\lambda(t)$, $\epsilon(t)$, and $T(t)$, suitable initial values ($\lambda_i$, $\epsilon_i$, $T_i$) and final values ($\lambda_f$, $\epsilon_f$, $T_f$) have to be chosen.

Figure 5.5 shows some stages of a simulation for a simple ring-shaped data distribution. Figure 5.6 displays the final results after 40000 adaptation steps for three other distributions. Following Martinetz et al. (1993) we used the parameter values proposed there. The network size N was set to 100.

Figure 5.5:   Neural gas with competitive Hebbian learning simulation sequence for a ring-shaped uniform probability distribution. a) Initial state. b-f) Intermediate states. g) Final state. h) Voronoi tessellation corresponding to the final state. The centers move according to the neural gas algorithm. Additionally, however, edges are created by competitive Hebbian learning and removed if they are not ``refreshed'' for a while.

Figure 5.6:   Neural gas with competitive Hebbian learning simulation results after 40000 input signals for three different probability distributions (described in the caption of figure 4.4).


Bernd Fritzke
Sat Apr 5 18:17:58 MET DST 1997