- Backprop - train several networks simultaneously and use a
correlation penalty in the error function, so the members' errors are
pushed to be negatively correlated
- genetic operators to generate new network topologies - a
multiplicative fitness term incorporates the diversity of the classifiers -
prune to the N best networks
- training on an auxiliary task as well as the main task - diverse
classifiers can be learned with one primary task but with
different auxiliary tasks, such as predicting one of the input features
- the network whose secondary prediction is best is the winner -
encouraging different networks to become experts at predicting the
auxiliary task in different local regions causes their errors on the
primary output to become decorrelated
- DT - option trees - equivalent to bagging but more understandable
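The correlation-penalty idea in the first item can be made concrete. The notes do not fix the form of the penalty; the sketch below assumes the negative-correlation penalty p_i = (f_i - f_bar) * sum over j != i of (f_j - f_bar), replaces the networks with linear models to stay short, and picks lam, the learning rate, and the toy data purely for illustration.

```python
# Sketch of a correlation penalty in the error function (negative
# correlation learning).  Assumptions: penalty form, lam, linear members.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 3x + noise
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.standard_normal(200)

M, lam, lr = 4, 0.5, 0.05
# "Networks" reduced to linear models w*x + b so the sketch stays short.
w = rng.standard_normal(M)
b = rng.standard_normal(M)

for _ in range(500):
    preds = w[:, None] * X[:, 0][None, :] + b[:, None]  # (M, N) member outputs
    f_bar = preds.mean(axis=0)                          # ensemble output
    for i in range(M):
        # dE_i/df_i = (f_i - y) + lam * sum_{j!=i}(f_j - f_bar)
        #           = (f_i - y) - lam * (f_i - f_bar)
        grad_f = (preds[i] - y) - lam * (preds[i] - f_bar)
        w[i] -= lr * np.mean(grad_f * X[:, 0])
        b[i] -= lr * np.mean(grad_f)

ensemble = (w[:, None] * X[:, 0][None, :] + b[:, None]).mean(axis=0)
print(np.mean((ensemble - y) ** 2))  # ensemble MSE
```

Averaged over the members, the penalty terms cancel, so the ensemble mean still descends the plain squared error while individual members are pushed away from the mean, which is what decorrelates their errors.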
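The auxiliary-task item can be sketched as a multi-head network: each ensemble member shares one hidden layer between a primary head and an auxiliary head, and each member's auxiliary head reconstructs a different input feature. The one-hidden-layer architecture, the auxiliary loss weight lam_aux, and all training constants below are illustrative assumptions, not details from the notes.

```python
# Sketch of auxiliary-task diversity: same primary task, different
# auxiliary task (reconstructing input feature i) per member.
import numpy as np

rng = np.random.default_rng(1)
N, D, H = 300, 3, 8
X = rng.standard_normal((N, D))
y = X @ np.array([0.5, -0.3, 0.2]) + 0.05 * rng.standard_normal(N)

lam_aux, lr = 0.3, 0.1
preds = []
for aux_feature in range(D):  # member i reconstructs input feature i
    W1 = 0.1 * rng.standard_normal((D, H))
    b1 = np.zeros(H)
    w_main = 0.1 * rng.standard_normal(H)
    w_aux = 0.1 * rng.standard_normal(H)
    for _ in range(2000):
        h = np.tanh(X @ W1 + b1)                       # shared hidden layer
        main, aux = h @ w_main, h @ w_aux              # two heads
        d_main = (main - y) / N                        # primary-loss grad
        d_aux = lam_aux * (aux - X[:, aux_feature]) / N  # auxiliary-loss grad
        dz = (np.outer(d_main, w_main) + np.outer(d_aux, w_aux)) * (1 - h**2)
        w_main -= lr * (h.T @ d_main)
        w_aux -= lr * (h.T @ d_aux)
        W1 -= lr * (X.T @ dz)
        b1 -= lr * dz.sum(axis=0)
    preds.append(np.tanh(X @ W1 + b1) @ w_main)

ensemble = np.mean(preds, axis=0)
print(np.mean((ensemble - y) ** 2))  # ensemble MSE on the primary task
```

Because each member's hidden layer must also serve a different reconstruction target, the learned representations differ across members even though the primary task is shared.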
Patricia Jean Riddle
Wed Jun 23 13:06:34 NZST 1999