
 

Growing Neural Gas

This method (Fritzke, 1994b, 1995a) differs from the previously described models in that the number of units is changed (mostly increased) during the self-organization process. The growth mechanism of the earlier proposed growing cell structures (Fritzke, 1994a) and the topology generation of competitive Hebbian learning (Martinetz and Schulten, 1991) are combined into a new model. Starting with very few units, new units are inserted successively. To determine where to insert new units, local error measures are gathered during the adaptation process; each new unit is inserted near the unit which has accumulated the most error. The complete growing neural gas algorithm is the following (a programmatic sketch of the full loop is given after the listing):

1.
Initialize the set $A$ to contain two units $c_1$ and $c_2$
\[ A = \{ c_1, c_2 \} \]
with reference vectors chosen randomly according to $p(\xi)$.

Initialize the connection set $C$, $C \subset A \times A$, to the empty set:
\[ C = \emptyset \]

2.
Generate at random an input signal $\xi$ according to $p(\xi)$.
3.
Determine the winner $s_1$ and the second-nearest unit $s_2$ ($s_1, s_2 \in A$) by
\[ s_1 = \arg\min_{c \in A} \| \xi - w_c \| \]
and
\[ s_2 = \arg\min_{c \in A \setminus \{s_1\}} \| \xi - w_c \| \]
4.
If a connection between $s_1$ and $s_2$ does not exist already, create it:
\[ C = C \cup \{ (s_1, s_2) \} \]
Set the age of the connection between $s_1$ and $s_2$ to zero (``refresh'' the edge):
\[ \mathrm{age}_{(s_1, s_2)} = 0 \]

5.
Add the squared distance between the input signal and the winner to a local error variable:
\[ \Delta E_{s_1} = \| \xi - w_{s_1} \|^2 \]
6.
Adapt the reference vectors of the winner and its direct topological neighbors by fractions $\epsilon_b$ and $\epsilon_n$, respectively, of the total distance to the input signal:
\[ \Delta w_{s_1} = \epsilon_b (\xi - w_{s_1}) \]
\[ \Delta w_i = \epsilon_n (\xi - w_i) \qquad (\forall i \in N_{s_1}) \]
Thereby $N_{s_1}$ (see equation 2.5) is the set of direct topological neighbors of $s_1$.
7.
Increment the age of all edges emanating from $s_1$:
\[ \mathrm{age}_{(s_1, i)} = \mathrm{age}_{(s_1, i)} + 1 \qquad (\forall i \in N_{s_1}) \]
8.
Remove edges with an age larger than $a_{\max}$. If this results in units having no more emanating edges, remove those units as well.
9.
If the number of input signals generated so far is an integer multiple of a parameter $\lambda$, insert a new unit as follows:

- Determine the unit $q$ with the maximum accumulated error:
\[ q = \arg\max_{c \in A} E_c \]
- Determine among the neighbors of $q$ the unit $f$ with the maximum accumulated error:
\[ f = \arg\max_{c \in N_q} E_c \]
- Add a new unit $r$ to the network and interpolate its reference vector from $q$ and $f$:
\[ A = A \cup \{ r \}, \qquad w_r = (w_q + w_f)/2 \]
- Insert edges connecting the new unit $r$ with units $q$ and $f$, and remove the original edge between $q$ and $f$:
\[ C = C \cup \{ (r, q), (r, f) \}, \qquad C = C \setminus \{ (q, f) \} \]
- Decrease the error variables of $q$ and $f$ by a fraction $\alpha$:
\[ \Delta E_q = -\alpha E_q, \qquad \Delta E_f = -\alpha E_f \]
- Interpolate the error variable of $r$ from $q$ and $f$:
\[ E_r = (E_q + E_f)/2 \]
10.
Decrease the error variables of all units:
\[ \Delta E_c = -\beta E_c \qquad (\forall c \in A) \]

11.
If a stopping criterion (e.g., net size or some performance measure) is not yet fulfilled, continue with step 2.

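To make the control flow of steps 1 to 11 concrete, a minimal sketch in Python/NumPy is given below. It is an illustration rather than a reference implementation: the function name gng, the dictionary-based data structures, and the default parameter values are assumptions made for brevity and do not necessarily match the simulations reported here.

import numpy as np

def gng(sample, n_steps=40000, max_units=100, eps_b=0.05, eps_n=0.0006,
        age_max=88, lam=300, alpha=0.5, beta=0.0005, seed=0):
    """Sketch of the growing neural gas loop (steps 1-11 above).

    Parameter defaults are illustrative only; `sample(rng)` must return a
    NumPy vector drawn from the input distribution p(xi)."""
    rng = np.random.default_rng(seed)
    # Step 1: set A holds two units with random reference vectors; edge set C is empty.
    W = {0: sample(rng), 1: sample(rng)}   # reference vectors w_c
    E = {0: 0.0, 1: 0.0}                   # accumulated error variables E_c
    age = {}                               # edges stored as {frozenset({i, j}): age}
    next_id = 2

    def neighbors(i):
        # Direct topological neighbors N_i: other endpoints of edges containing i.
        return [j for e in age for j in e if i in e and j != i]

    for t in range(1, n_steps + 1):
        xi = sample(rng)                                         # step 2
        dist = {c: float(np.sum((xi - w) ** 2)) for c, w in W.items()}
        s1, s2 = sorted(dist, key=dist.get)[:2]                  # step 3
        age[frozenset((s1, s2))] = 0                             # step 4: create/refresh edge
        E[s1] += dist[s1]                                        # step 5
        W[s1] = W[s1] + eps_b * (xi - W[s1])                     # step 6: winner
        for i in neighbors(s1):
            W[i] = W[i] + eps_n * (xi - W[i])                    # step 6: neighbors
            age[frozenset((s1, i))] += 1                         # step 7
        for e in [e for e, a in age.items() if a > age_max]:     # step 8: old edges
            del age[e]
        for c in [c for c in W if not neighbors(c)]:             # step 8: isolated units
            del W[c], E[c]
        if t % lam == 0 and len(W) < max_units:                  # step 9: insertion
            q = max(E, key=E.get)
            f = max(neighbors(q), key=E.get)
            r, next_id = next_id, next_id + 1
            W[r] = 0.5 * (W[q] + W[f])
            age[frozenset((r, q))] = age[frozenset((r, f))] = 0
            del age[frozenset((q, f))]
            E[q] *= (1.0 - alpha)
            E[f] *= (1.0 - alpha)
            E[r] = 0.5 * (E[q] + E[f])
        for c in E:                                              # step 10: error decay
            E[c] *= (1.0 - beta)
    return W, age

Storing each edge as an unordered pair mapped to its age keeps the edge aging and removal of steps 7 and 8 to a few dictionary operations.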
Figure 5.7 shows some stages of a simulation for a simple ring-shaped data distribution. Figure 5.8 displays the final results after 40000 adaptation steps for three other distributions. Both simulations used the same settings of the parameters $\lambda$, $\epsilon_b$, $\epsilon_n$, $\alpha$, $\beta$, and $a_{\max}$.
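For a usage illustration with the sketch above, a ring-shaped uniform distribution similar to the one in figure 5.7 can be sampled by rejection; the function name sample_ring and the chosen radii are hypothetical and only serve as an example.

def sample_ring(rng, r_in=0.7, r_out=1.0):
    # Rejection sampling from a uniform ring (annulus) in the plane.
    while True:
        p = rng.uniform(-r_out, r_out, size=2)
        if r_in <= np.linalg.norm(p) <= r_out:
            return p

W, edges = gng(sample_ring, n_steps=40000, max_units=100)
print(len(W), "units and", len(edges), "edges")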

Figure 5.7: Growing neural gas simulation sequence for a ring-shaped uniform probability distribution. a) Initial state. b-f) Intermediate states. g) Final state. h) Voronoi tessellation corresponding to the final state. The maximal network size was set to 100.

Figure 5.8: Growing neural gas simulation results after 40000 input signals for three different probability distributions (described in the caption of figure 4.4).


