-  The Bayes optimal classifier is quite costly to apply.  It computes the
posterior probability of every hypothesis in H and combines the
predictions of all hypotheses to classify each new instance.
-  An alternative (less optimal) method, the Gibbs algorithm
(sketched in code at the end of this section):
-  Choose a hypothesis h from H at random, according to the
posterior probability distribution over H.
-  Use h to predict the classification of the next instance x.
 
-  Under certain conditions, the expected misclassification error of the
Gibbs algorithm is at most twice the expected error of the Bayes
optimal classifier.
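
In symbols, with the expectation taken under those conditions, the bound states:

\[
E[\mathrm{error}_{\mathrm{Gibbs}}] \;\le\; 2\,E[\mathrm{error}_{\mathrm{Bayes\ optimal}}]
\]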
 
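The following is a minimal code sketch of the two steps above, assuming the
hypothesis space is a small list of Python callables and the posterior P(h|D)
is supplied as a list of weights; the names gibbs_classify, hypotheses, and
posterior are illustrative, not from the source.

import random

def gibbs_classify(hypotheses, posterior, x):
    """Classify instance x with the Gibbs algorithm.

    hypotheses -- list of callables, each mapping an instance to a label
    posterior  -- posterior probabilities P(h|D), one per hypothesis
    x          -- the new instance to classify
    """
    # Step 1: draw one hypothesis h from H according to the posterior P(h|D).
    h = random.choices(hypotheses, weights=posterior, k=1)[0]
    # Step 2: use that single hypothesis to predict the classification of x.
    return h(x)

# Illustrative usage: a toy hypothesis space of three threshold classifiers.
hypotheses = [lambda x, t=t: int(x > t) for t in (0.2, 0.5, 0.8)]
posterior = [0.1, 0.7, 0.2]          # assumed posterior over the hypotheses
print(gibbs_classify(hypotheses, posterior, 0.6))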
 
Patricia Riddle 
Fri May 15 13:00:36 NZST 1998