- If the training example is correctly classified, then (t - o) = 0,
making Δw_i = η(t - o)x_i = 0, so no weights are updated
- If the perceptron outputs -1 when the target output is +1, then,
assuming η > 0 and x_i > 0, Δw_i = η(t - o)x_i = 2ηx_i > 0,
so the weight w_i would be increased
- If the perceptron outputs +1 when the target output is -1, then
Δw_i = -2ηx_i < 0, so the weight w_i would be decreased
- This learning procedure will converge within a finite number of
applications of the perceptron training rule to a weight vector that
correctly classifies all training examples, provided
the training examples are linearly separable and a sufficiently
small learning rate η is used.
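The cases above can be sketched in code. The following is a minimal illustration (not the source's own code); names such as train_perceptron and the logical-AND data set are assumptions chosen for the example, with the bias folded in as a constant input x[0] = 1.

```python
def predict(w, x):
    """Threshold unit: output +1 if w . x > 0, else -1."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > 0 else -1

def train_perceptron(examples, eta=0.1, max_epochs=100):
    """Repeatedly apply the rule w_i <- w_i + eta * (t - o) * x_i."""
    n = len(examples[0][0])
    w = [0.0] * n
    for _ in range(max_epochs):
        errors = 0
        for x, t in examples:
            o = predict(w, x)
            if o != t:  # (t - o) = 0 for correct examples: no update
                errors += 1
                for i in range(n):
                    w[i] += eta * (t - o) * x[i]
        if errors == 0:  # converged: all training examples classified
            break
    return w

# Linearly separable data: logical AND over inputs in {-1, +1},
# with a constant x[0] = 1 serving as the bias input.
data = [([1, -1, -1], -1), ([1, -1, 1], -1),
        ([1, 1, -1], -1), ([1, 1, 1], 1)]
w = train_perceptron(data)
print(all(predict(w, x) == t for x, t in data))  # True
```

Because the AND data is linearly separable and η = 0.1 is small, the loop reaches a weight vector that classifies every example correctly, as the convergence statement above guarantees.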
Patricia Riddle
Fri May 15 13:00:36 NZST 1998