- the simple error term $(t - o)$ in the delta rule is replaced by a more general error term $\delta$
- for output unit $k$, $\delta_k = (t_k - o_k)\,o_k(1 - o_k)$: the familiar error $(t_k - o_k)$ from
the delta rule multiplied by $o_k(1 - o_k)$, which is the derivative of
the sigmoid squashing function
- for hidden unit $h$, the derivative component $o_h(1 - o_h)$ is the same, but
there is no target value directly available, so you sum the error
terms $\delta_k$ for each output unit $k$ influenced by $h$,
weighting each of the $\delta_k$ by the weight, $w_{kh}$, from the hidden
unit $h$ to the output unit $k$:
$\delta_h = o_h(1 - o_h) \sum_{k \in outputs} w_{kh}\,\delta_k$
- The weight $w_{kh}$ characterizes the degree to which each hidden unit $h$
is responsible for the error in output unit $k$.
Patricia Riddle
Fri May 15 13:00:36 NZST 1998