See also: ACAI'99

Workshop on Support Vector Machine Theory and Application at ACAI'99

Chania, Greece, July 1999

by Konstantinos Veropoulos

The ACAI '99 workshop on "Support Vector Machine Theory and Applications" was organised by Theodoros Evgeniou, Massachusetts Institute of Technology. Prof. Sergios Theodoridis, University of Athens, gave a lecture on "Statistical Pattern Recognition" that preceded the workshop the same morning. This provided an ideal starting point for the invited talk on "Pattern Classification with Support Vector Machines" by Nello Cristianini, University of Bristol, who gave an introduction to the field. Cristianini started by describing Support Vector Machines (SVMs) as maximal margin hyperplanes in a feature space, building on the description of the perceptron algorithm given in the morning by Prof. Theodoridis. This led to an alternative description of the algorithm that naturally introduced the concepts of dual representation and margin, before the discussion of kernels and Vapnik-Chervonenkis (VC) bounds. Cristianini then moved on to cover new results. An interesting new idea was the extension of SVMs to the multi-class categorisation case, obtained by using a Directed Acyclic Graph (DAG) whose nodes are each associated with a maximal margin hyperplane in a feature space. For such a construction, rigorous VC bounds exist (obtained by J. Platt, N. Cristianini and J. Shawe-Taylor), and experimental results are encouraging from both the generalisation and the computational viewpoint.
The regular talks that followed were well balanced in topic, covering learning theory (N. Vayatis), implementation techniques (T. Trafalis), variations of the algorithm (R. Fernandez) and applications of SVMs (K. Veropoulos; C. Kotropoulos). The talk by Nicolas Vayatis ("How to estimate the Vapnik-Chervonenkis Dimension of Support Vector Machines through simulations". N. Vayatis & R. Azencott) described experimental techniques for estimating the effective VC dimension of a learning system, which provide new insight into how this quantity can be observed in practice for SVMs.
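The DAG construction mentioned above lends itself to a brief illustration. The following is a minimal sketch in modern Python with scikit-learn, assuming SVC as the pairwise maximal margin classifier; it illustrates the idea only and is not the implementation of Platt, Cristianini and Shawe-Taylor. One binary SVM is trained per pair of classes, and a prediction is obtained by letting each pairwise node eliminate one candidate class until a single class remains.

    from itertools import combinations

    import numpy as np
    from sklearn.svm import SVC

    def train_pairwise_svms(X, y, classes, C=1.0, kernel="rbf"):
        """Train one binary SVM (a maximal margin hyperplane in feature
        space) for every pair of classes."""
        models = {}
        for a, b in combinations(classes, 2):
            mask = np.isin(y, [a, b])
            models[(a, b)] = SVC(C=C, kernel=kernel).fit(X[mask], y[mask])
        return models

    def dag_predict(x, models, classes):
        """Evaluate the DAG: each node compares the first and last remaining
        candidates and eliminates the one it votes against."""
        candidates = list(classes)
        while len(candidates) > 1:
            a, b = candidates[0], candidates[-1]
            key = (a, b) if (a, b) in models else (b, a)
            predicted = models[key].predict(x.reshape(1, -1))[0]
            candidates.remove(b if predicted == a else a)
        return candidates[0]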
The two talks on applications of SVMs were both in the area of machine vision. The first ("The Application of Support Vector Machines to Medical Decision Support: A case study". K. Veropoulos, N. Cristianini & C. Campbell) proposed an application of SVMs to the automated diagnosis of tuberculosis from photomicrographs of sputum smears. This was the first time that SVMs had been applied to medical problems, and the objective of the talk was to show that SVM classifiers are good candidates in this area. An interesting point was the introduction of two methods that can be used to control the performance of the system on a particular class of the data. This is very useful in practice for most medical applications, where control of the performance on one or more classes is necessary, as well as for applications in other areas where the classification system is presented with unbalanced data sets (one class significantly larger than the other).
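The two-parameter idea above can be sketched very compactly in current software. The fragment below assumes scikit-learn (not the software used in the talk); its class_weight argument multiplies the regularisation parameter C per class, which corresponds to using separate parameters for the two classes so that errors on the smaller or clinically more important class are penalised more heavily. The weights shown are illustrative only.

    from sklearn.svm import SVC

    # class_weight scales C per class, giving an effective C+ and C-;
    # here errors on class 1 are penalised ten times more than on class 0.
    clf = SVC(C=1.0, kernel="rbf", class_weight={0: 1.0, 1: 10.0})
    # clf.fit(X_train, y_train)  # X_train, y_train: an unbalanced two-class set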
The second talk on applications of SVMs ("Enhancing the performance of elastic graph matching for face authentication by using support vector machines". A. Tefas, C. Kotropoulos & I. Pitas) presented a system for face recognition based on a combination of statistical pattern recognition and Support Vector Machines. Fisher linear discriminants, SVMs, and a newly proposed variation of the Fisher linear discriminant that is similar to SVMs were used to improve the performance of elastic graph matching for face authentication.
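As a small aside, the Fisher linear discriminant that the authors combine with SVMs can be written down in a few lines. The sketch below is my own illustration of the classical discriminant direction, not the authors' proposed variation: it returns the projection that maximises between-class scatter relative to within-class scatter.

    import numpy as np

    def fisher_direction(X_pos, X_neg):
        """Return w = S_w^{-1} (m_pos - m_neg) for two classes of
        feature vectors (rows are samples)."""
        m_pos, m_neg = X_pos.mean(axis=0), X_neg.mean(axis=0)
        S_w = np.cov(X_pos, rowvar=False) * (len(X_pos) - 1) \
            + np.cov(X_neg, rowvar=False) * (len(X_neg) - 1)
        return np.linalg.solve(S_w, m_pos - m_neg)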
The algorithm described by Rodrigo Fernandez ("Predicting time series with local support vector regression machine". R. Fernandez) was in the area of SVM regression. Fernandez used a variation of the standard SVM regression formulation that predicts a time series by automatically adjusting its flexibility. A novel aspect was the use of a number of SVM regression models (local SVMs), each trained on local information (a part of the initial training data), instead of a single global regression model. This approach helped reduce the complexity of the prediction problem and showed a remarkable improvement in performance over standard approaches.
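The local-model idea can be illustrated with a short sketch. The partitioning scheme below (K-means on lag vectors) and the embedding are my own illustrative assumptions, not Fernandez's exact scheme: the training set is split into regions, a separate SVM regressor is fitted to each region, and a query is routed to the model responsible for its region.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVR

    def embed(series, lags=5):
        """Turn a 1-D time series into (lag vector, next value) pairs."""
        series = np.asarray(series, dtype=float)
        X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
        return X, series[lags:]

    def fit_local_svrs(series, lags=5, n_regions=4):
        """Partition the lag vectors and fit one SVR per region."""
        X, y = embed(series, lags)
        km = KMeans(n_clusters=n_regions, n_init=10).fit(X)
        models = [SVR(kernel="rbf").fit(X[km.labels_ == r], y[km.labels_ == r])
                  for r in range(n_regions)]
        return km, models

    def predict_next(recent, km, models):
        """Predict the next value from the most recent lag vector."""
        x = np.asarray(recent, dtype=float).reshape(1, -1)
        return models[km.predict(x)[0]].predict(x)[0]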
Finally, the talk by Trafalis ("Primal-dual optimisation methods in neural networks and support vector machines training". T. Trafalis) surveyed the application of standard optimisation techniques to the particular quadratic programming problems arising in SVM implementations. Primal-dual interior-point methods were discussed in particular.
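To make the connection concrete, the dual problem behind SVM training is the quadratic programme: minimise (1/2) a^T Q a - 1^T a subject to 0 <= a_i <= C and y^T a = 0, with Q_ij = y_i y_j K(x_i, x_j). The sketch below hands this programme to cvxopt's interior-point QP solver as an illustrative stand-in for the primal-dual methods surveyed in the talk; it assumes a precomputed kernel matrix.

    import numpy as np
    from cvxopt import matrix, solvers

    def svm_dual_qp(K, y, C=1.0):
        """Solve the SVM dual for a precomputed kernel matrix K (n x n numpy
        array) and labels y in {-1, +1}; returns the optimal multipliers."""
        n = len(y)
        y = np.asarray(y, dtype=float)
        Q = matrix(np.outer(y, y) * K)                  # Q_ij = y_i y_j K_ij
        q = matrix(-np.ones(n))                         # linear term: -1^T a
        G = matrix(np.vstack([-np.eye(n), np.eye(n)]))  # -a <= 0 and a <= C
        h = matrix(np.hstack([np.zeros(n), C * np.ones(n)]))
        A = matrix(y.reshape(1, n))                     # equality: y^T a = 0
        b = matrix(0.0)
        sol = solvers.qp(Q, q, G, h, A, b)              # interior-point solver
        return np.array(sol["x"]).ravel()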
Overall, the workshop was both a source of new ideas and a lively forum for discussion. It provided a complete and interesting overview of SVM theory and applications, and the novel approaches presented ranged from efficient methods for training SVMs to variations of the standard SVM algorithm for practical use. Some of the most interesting points discussed in the workshop were:

  • the extension of SVMs to multi-class categorisation problems (as discussed by N. Cristianini), which opens up the opportunity to apply SVM theory to a wider range of applications,
  • the use of different regularisation parameters for each of the two classes (as presented by K. Veropoulos), which provides an alternative way of designing SVMs for pattern recognition,
  • and the use of ensembles of SVMs to improve their performance on regression problems (as shown by R. Fernandez).

 
