Most of the CBRers started Tuesday by
attending an invited talk by Robert Schapire called "A brief introduction to
boosting". The presentation was exactly what the title promised, providing a clear and
concise summary of the concepts involved in boosting, illustrated by the AdaBoost
algorithm. Boosting is a deceptively simple technique for improving the accuracy of any
learning algorithm, seemingly without the usual problem of overfitting to the training
data. Perhaps the most interesting by-product of AdaBoost for CBR is its ability to find
outliers in the data set. In a CBR system these are bound to be "interesting"
cases. I wouldn't be surprised to see boosting turning up in a CBR paper soon.
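For readers who haven't met it, the core loop of AdaBoost is short enough to sketch. The following is a minimal illustration (my own toy version, not Schapire's implementation) using decision stumps over numeric features; the outlier-finding behaviour mentioned above falls out of the example weights, which grow on cases the ensemble keeps getting wrong.

```python
import math

def train_stump(X, y, w):
    """Find the best threshold stump (feature, threshold, polarity) under weights w."""
    best, best_err = None, float("inf")
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[j] >= t else -pol) != yi)
                if err < best_err:
                    best_err, best = err, (j, t, pol)
    return best, best_err

def adaboost(X, y, rounds=10):
    """Labels y are +1/-1. Returns the ensemble and the final example weights;
    examples with large final weight are the hard (potentially outlier) cases."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        (j, t, pol), err = train_stump(X, y, w)
        err = max(err, 1e-10)                      # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)    # vote strength of this stump
        ensemble.append((alpha, j, t, pol))
        # Reweight: misclassified examples gain weight, correct ones lose it
        w = [wi * math.exp(-alpha * yi * (pol if xi[j] >= t else -pol))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble, w

def predict(ensemble, x):
    """Weighted majority vote of the stumps."""
    s = sum(alpha * (pol if x[j] >= t else -pol) for alpha, j, t, pol in ensemble)
    return 1 if s >= 0 else -1
```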
Agnar Aamodt chaired the sessions on CBR
that followed, which were moderately well attended. Certainly there were many faces
present who were not known to me as regular CBRers. The first paper was by David McSherry
called Demand driven discovery of adaptation knowledge. This work is closely
related to the paper he presented at yesterday's workshop.
It's a simple and elegant approach that again uses adaptation to generate cases that do
not yet exist in the case-base but whose solutions can be derived. In a recursive process
the system can create cases and then use those to derive further solvable cases. In this
way a large number of problems can be solved with very low case coverage. This is,
however, restricted to fairly constrained domains where the adaptation function is
additive or multiplicative.
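The recursive flavour of this can be shown with a toy sketch. It assumes a purely additive domain (my simplifying assumption for illustration, not McSherry's actual formulation): if two problems are solvable, their sum is solvable with the sum of the two solutions.

```python
def discover(seed_cases, target_problems, max_depth=3):
    """Discover solvable problems from a small seed case-base by additive adaptation.

    seed_cases: dict mapping a numeric problem description to its solution.
    Assumes (for illustration) an additive domain: if (p1, s1) and (p2, s2)
    are solvable, then p1 + p2 is solvable with solution s1 + s2.
    """
    known = dict(seed_cases)
    for _ in range(max_depth):
        new = {}
        # Combine every pair of known cases to derive new solvable cases
        for p1, s1 in known.items():
            for p2, s2 in known.items():
                p = p1 + p2
                if p not in known:
                    new[p] = s1 + s2
        if not new:
            break
        known.update(new)  # derived cases feed the next recursive round
    return {p: known[p] for p in target_problems if p in known}
```

With only two seed cases, repeated combination covers a rapidly growing set of problems, which is the "large number of problems from very low case coverage" effect.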
Qiang Yang then presented the first of
two papers called Dynamic refinement of feature weights using introspective learning.
The aim is to make a customer support CBR system responsive to change in its operating
environment. Currently this is done manually, but it is a very imprecise art. Yang's group
have adapted the reinforcement-learning technique of Bonzano et al. from Trinity College
Dublin so that feature weights are strengthened globally when retrieval succeeds and
weakened globally when it fails. The feature weights are updated using the
backpropagation algorithm of a neural network. Consequently, the system is constantly
refining its feature weights after each retrieval.
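The strengthen-on-success, weaken-on-failure idea can be sketched as a single update step. This is a rough illustration of the general introspective-learning scheme, not the actual algorithm from Yang's paper (the feature scaling and learning rate here are my assumptions):

```python
def refine_weights(weights, query, retrieved, success, lr=0.1):
    """One introspective-learning update after a retrieval episode.

    weights, query, retrieved: equal-length lists of floats, with feature
    values assumed scaled to [0, 1]. Features that matched closely are
    strengthened after a successful retrieval and weakened after a failure.
    """
    new = []
    for w, q, r in zip(weights, query, retrieved):
        closeness = 1.0 - abs(q - r)               # 1.0 = exact feature match
        delta = lr * closeness if success else -lr * closeness
        new.append(max(0.0, w + delta))            # weights never go negative
    # Renormalise so weights stay comparable across episodes
    total = sum(new) or 1.0
    return [w / total for w in new]
```

Called after every retrieval, this keeps nudging the similarity measure toward whatever the current operating environment rewards, which is the responsiveness the paper is after.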
Qiang's second paper also was
inspired by the Irish CBRers. Remembering to Add: competence-preserving
case-addition policies for case maintenance draws on Barry Smyth's Remembering to
Forget paper. Using a case-competence theorem derived from set theory, Qiang can
remove cases from a case-base, leaving just the pivotal cases required to maintain
case-base competence. This presented an interesting counterpoint to David McSherry's
paper, which basically involves expanding the number of cases until just enough exist to
solve all problems, whereas Qiang's involves contracting the case-base to a just
sufficient number. The really interesting thing here is that we are starting to be able
to predict how many cases will be needed to provide a predictable degree of case
coverage. Sounds like CBR is becoming a science!
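The contraction idea can be illustrated with a greedy set-cover sketch. This is my own illustration of the general competence-preserving principle, not the paper's theorem-based algorithm; the `solves` predicate (which cases a given case can solve, including itself) stands in for whatever coverage model the domain provides:

```python
def coverage(case, case_base, solves):
    """The set of cases in case_base whose problems `case` can solve."""
    return {c for c in case_base if solves(case, c)}

def condense(case_base, solves):
    """Greedy competence-preserving reduction: repeatedly keep the case with
    the largest marginal coverage until every original case is solvable by
    some kept case. The kept cases play the role of pivotal cases."""
    remaining = set(case_base)
    kept = []
    while remaining:
        best = max(case_base,
                   key=lambda c: len(coverage(c, case_base, solves) & remaining))
        kept.append(best)
        remaining -= coverage(best, case_base, solves)
    return kept
```

Because each case solves at least itself, the loop always terminates, and the size of `kept` gives exactly the kind of "how many cases are needed" estimate discussed above.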
After lunch Sunil Vadera (a colleague of
mine from the University of Salford)
presented PEBM: a probabilistic exemplar based model. This technique had
interesting resonances with Barry Smyth's competence models and footprint based retrieval.
Sunil uses statistics (Bayesian probability) to identify footprint (exemplar) cases
in weak, ill-structured domains where it is difficult to be certain of the coverage of an
individual case. The model can be used to incrementally decide which cases should be
retained during case-base maintenance or authoring. The system performed well on binary
classification data sets, but less well on a set with 22 classes. It would be interesting
if Sunil or Barry Smyth could directly compare their approaches using the same data set
or case-base.
Eyke Hüllermeier ended the session with a
paper called Toward a probabilistic formalisation of case-based inference, which
was really about a formal definition of similarity in CBR. I will admit to being a
little baffled by this work and still not too clear on its relevance. Perhaps I was
tired at the end of a long day.
We then went on to the
Conference Gala Dinner, which is definitely worth commenting on. After a pleasant scenic
boat cruise of about an hour we had dinner in the courtyard of the Vaxholm fortress. The
organisers had arranged some entertainment - acrobats, singers, you know the sort of
thing, which was quite diverting until the grand finale - a full Monty strip by two male
acrobats. Honest, I'm not joking! We then returned by boat to Stockholm where David Wilson
and I went to an Irish bar with some locals and then on to a night club where our internal
organs were rearranged by incredibly loud rave music.