On-line Update: Basic Algorithm

In some situations the data set $\mathcal{D}$ is so huge that batch methods become impractical. In other cases the input data arrives as a continuous stream of unlimited length, which makes it impossible to apply batch methods at all. A remedy is on-line update, which can be described as follows:

1.
Initialize the set $\mathcal{A}$ to contain $N$ units $c_i$,

\[ \mathcal{A} = \{c_1, c_2, \ldots, c_N\}, \]

with reference vectors $\mathbf{w}_{c_i} \in \mathbb{R}^n$ chosen randomly according to $p(\xi)$.
2.
Generate at random an input signal $\xi$ according to $p(\xi)$.
3.
Determine the winner $s = s(\xi)$:

\[ s(\xi) = \arg\min_{c \in \mathcal{A}} \|\xi - \mathbf{w}_c\| \]

4.
Adapt the reference vector of the winner towards $\xi$:

\[ \Delta\mathbf{w}_{s} = \epsilon\,(\xi - \mathbf{w}_{s}) \]

5.
Unless the maximum number of steps is reached continue with step 2.

Here the learning rate $\epsilon$ determines the extent to which the winner is adapted towards the input signal. Depending on whether $\epsilon$ stays constant or decays over time, several different methods are possible, some of which are described in the following.
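The steps above can be sketched in a few lines of Python. This is a minimal illustration, not the author's original implementation; the function name `online_hcl`, the use of NumPy, and the choice to initialize reference vectors by sampling data points (a common stand-in for drawing from $p(\xi)$ when only a finite sample is available) are all assumptions.

```python
import numpy as np

def online_hcl(data, n_units, steps, lr=0.05, seed=0):
    """On-line hard competitive learning with a constant learning rate.

    data    : (M, n) array of input signals, treated as samples from p(xi)
    n_units : number N of units in the set A
    steps   : maximum number of adaptation steps
    lr      : learning rate epsilon
    """
    rng = np.random.default_rng(seed)
    # Step 1: initialize the N reference vectors by drawing data points at random
    w = data[rng.integers(0, len(data), size=n_units)].astype(float)
    for _ in range(steps):
        # Step 2: generate an input signal at random
        xi = data[rng.integers(0, len(data))]
        # Step 3: determine the winner s = argmin_c ||xi - w_c||
        s = np.argmin(np.linalg.norm(w - xi, axis=1))
        # Step 4: adapt the winner's reference vector towards xi
        w[s] += lr * (xi - w[s])
        # Step 5: the loop bound plays the role of the maximum step count
    return w
```

Note that only the winning unit moves at each step; units that never win remain at their initial positions, which is the "dead unit" problem addressed by later variants.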



Bernd Fritzke
Sat Apr 5 18:17:58 MET DST 1997