Support Vector Learning

The Support Vector Machine is a new type of learning machine for pattern recognition and regression problems that constructs its solution in terms of a subset of the training data, the Support Vectors. This page collects material that may be helpful for people interested in Support Vector Machines: a bibliography, a list of people working on Support Vectors, and a brief discussion of Support Vectors in view-based object recognition.


Support Vector Learning Bibliography

Publications on SV Machines (in chronological order)

1970-1990
  • Vapnik, V.; Chervonenkis, A. 1974. Theory of Pattern Recognition [in Russian]. Nauka, Moscow. (German translation: Wapnik, W.; Tscherwonenkis, A. 1979. Theorie der Zeichenerkennung. Akademie-Verlag, Berlin.)
  • Vapnik, V. 1979. Estimation of Dependences Based on Empirical Data [in Russian]. Nauka, Moscow. (English translation: 1982, Springer Verlag, New York)

1990-1995
  • Boser, B.; Guyon, I.; Vapnik, V. 1992. A training algorithm for optimal margin classifiers. Fifth Annual Workshop on Computational Learning Theory. ACM Press, Pittsburgh.
  • Guyon, I.; Boser, B.; Vapnik, V. 1993. Automatic Capacity Tuning of Very Large VC-Dimension Classifiers. Advances in Neural Information Processing Systems, Vol. 5. Morgan Kaufmann, San Mateo, CA.
  • Cortes, C.; Vapnik, V. 1995. Support-Vector Networks. Machine Learning 20:273-297.
  • Schölkopf, B.; Burges, C.; Vapnik, V. 1995. Extracting support data for a given task. In: U. M. Fayyad and R. Uthurusamy (eds.), Proceedings, First International Conference on Knowledge Discovery & Data Mining, AAAI Press, Menlo Park, CA.
  • Vapnik, V. 1995. The Nature of Statistical Learning Theory. Springer-Verlag, New York.

1996
  • Schölkopf, B. 1996. Künstliches Lernen [Artificial Learning; in German]. Forum für Interdisziplinäre Forschung 15:93-117. (Komplexe adaptive Systeme, S. Bornholdt & P. H. Feindt, eds., Verlag Röll, Dettelbach)
  • Burges, C. 1996. Simplified Support Vector decision rules. 13th International Conference on Machine Learning.
  • Blanz, V.; Schölkopf, B.; Bülthoff, H.; Burges, C.; Vapnik, V.; Vetter, T. 1996. Comparison of view-based object recognition algorithms using realistic 3D models. In: C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, and B. Sendhoff (eds.): Artificial Neural Networks - ICANN'96. Springer Lecture Notes in Computer Science Vol. 1112, Berlin, 251-256.
  • Schölkopf, B.; Burges, C.; Vapnik, V. 1996. Incorporating Invariances in Support Vector Learning Machines. In: C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, and B. Sendhoff (eds.): Artificial Neural Networks - ICANN'96. Springer Lecture Notes in Computer Science Vol. 1112, Berlin, 47-52.
  • Schölkopf, B.; Smola, A.; Müller, K.-R. 1996. Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Technical Report No. 44, Max-Planck-Institut für biologische Kybernetik, Tübingen.
  • Schölkopf, B.; Sung, K.; Burges, C.; Girosi, F.; Niyogi, P.; Poggio, T.; Vapnik, V. 1996. Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers. AI Memo No. 1599; CBCL Paper No. 142, Massachusetts Institute of Technology, Cambridge.

1997
  • Burges, C.; Schölkopf, B. 1997. Improving the accuracy and speed of support vector machines. In: M. Mozer, M. Jordan, and T. Petsche (eds.): Neural Information Processing Systems, Vol. 9. MIT Press, Cambridge, MA (in press).
  • Vapnik, V.; Golowich, S.; Smola, A. 1997. Support Vector Method for Function Approximation, Regression Estimation, and Signal Processing. In: M. Mozer, M. Jordan, and T. Petsche (eds.): Neural Information Processing Systems, Vol. 9. MIT Press, Cambridge, MA (in press).
  • Drucker, H.; Burges, C.; Kaufman, L.; Smola, A.; Vapnik, V. 1997. Support Vector Regression Machines. In: M. Mozer, M. Jordan, and T. Petsche (eds.): Neural Information Processing Systems, Vol. 9. MIT Press, Cambridge, MA (in press).
  • Smola, A.; Schölkopf, B. 1997. On a Kernel-based Method for Pattern Recognition, Regression, Approximation and Operator Inversion. GMD Technical Report No. 1064.
  • Osuna, E.; Freund, R.; Girosi, F. 1997. Training Support Vector Machines: An Application to Face Detection. To appear in: CVPR'97
  • Osuna, E.; Freund, R.; Girosi, F. 1997. Improved Training Algorithm for Support Vector Machines. To appear in: NNSP'97
  • Mukherjee, S.; Osuna, E.; Girosi, F. 1997. Nonlinear Prediction of Chaotic Time Series using a Support Vector Machine. To appear in: NNSP'97
  • Müller, K.-R.; Smola, A.; Rätsch, G.; Schölkopf, B.; Kohlmorgen, J.; Vapnik, V. 1997. Predicting Time Series with Support Vector Machines. Submitted to: ICANN'97.

Drop me a line if you have any additions to this list.


    People Working on Support Vectors

    Kristin Bennett (Rensselaer Polytechnic Institute, NY)

    Volker Blanz (MPI für biologische Kybernetik, Tübingen)

    Leon Bottou (AT&T Research, Holmdel, NJ)

    Chris Burges (Bell Laboratories, Holmdel, NJ) (Bell Labs SV project)

    Harris Drucker (Bell Laboratories, Holmdel, NJ)

    Federico Girosi (MIT, Cambridge, MA)

    Thorsten Joachims (Uni Dortmund)

    Ulrich Kressel (Daimler-Benz AG, Ulm)

    Klaus-Robert Müller (GMD First, Berlin)

    Edgar Osuna (MIT, Cambridge, MA)

    Bernhard Schölkopf (MPI für biologische Kybernetik, Tübingen)

    Alex Smola (GMD First, Berlin)

    Mark Stitson (Royal Holloway and Bedford College, London) (SV project)

    Vladimir Vapnik (AT&T Research, Holmdel, NJ)

    Drop me a line if you want to be included in this list.


    Support Vector Learning: a Theoretical Foundation for Exemplar-based Object Recognition?

    Learning can be thought of as inferring regularities from a set of training examples. Much research has been devoted to learning algorithms that extract these underlying regularities. If learning has been successful, the regularities are captured in the values of some parameters of a learning machine: for a polynomial classifier, these parameters are the coefficients of a polynomial; for a neural net, they are weights and biases; and for a radial basis function classifier, they are weights and centers. This variety of representations, however, conceals the fact that, however different these algorithms appear outwardly, they must all rely on the intrinsic regularities of the data. The Support Vector Learning Algorithm (Boser, Guyon & Vapnik, 1992; Cortes & Vapnik, 1995) is a promising tool for studying these regularities (i.e. for studying learning) in pattern classification:

    [Figure: 2-D classification example]
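    To make the role of the support vectors concrete, here is a minimal
    sketch (not any of the implementations cited above) of the SV
    algorithm: the dual quadratic program is handed to scipy's
    general-purpose SLSQP optimizer on toy 2-D data. The Gaussian
    kernel, the value of C and the tolerances are illustrative choices.
    Only training points with nonzero dual coefficients (the support
    vectors) enter the resulting decision function.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        # Toy 2-D data: one class around (-1, -1), the other around (1, 1).
        X = np.vstack([rng.normal(-1.0, 0.6, (20, 2)),
                       rng.normal(+1.0, 0.6, (20, 2))])
        y = np.hstack([-np.ones(20), np.ones(20)])

        def kernel(a, b):
            """Gaussian kernel matrix between the rows of a and b."""
            d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2)

        n, C = len(y), 10.0
        Q = (y[:, None] * y[None, :]) * kernel(X, X)

        # Dual problem: maximize sum(alpha) - 1/2 alpha' Q alpha
        # subject to 0 <= alpha_i <= C and sum_i alpha_i y_i = 0
        # (we minimize the negative of the objective).
        res = minimize(lambda a: 0.5 * a @ Q @ a - a.sum(),
                       np.zeros(n),
                       jac=lambda a: Q @ a - np.ones(n),
                       bounds=[(0.0, C)] * n,
                       constraints=[{'type': 'eq',
                                     'fun': lambda a: a @ y,
                                     'jac': lambda a: y}],
                       method='SLSQP')
        alpha = res.x

        # The solution is sparse: only the support vectors survive.
        sv = alpha > 1e-6
        print(f"{sv.sum()} support vectors out of {n} training points")

        # Bias from the margin support vectors (0 < alpha_i < C).
        margin = sv & (alpha < C - 1e-6)
        f_m = (alpha[sv] * y[sv]) @ kernel(X[sv], X[margin])
        b = np.mean(y[margin] - f_m)

        def decide(x_new):
            """sign(sum_i alpha_i y_i k(x_i, x) + b), support vectors only."""
            return np.sign((alpha[sv] * y[sv]) @ kernel(X[sv], x_new) + b)

        print(decide(np.array([[-1.0, -1.0], [1.0, 1.0]])))  # expected: [-1.  1.]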

    Experimental results (Schölkopf, Burges & Vapnik, 1995) showed that for the case of handwritten digit recognition, SV machines with different types of kernel (polynomial, radial basis function, sigmoid) achieve comparable accuracy and extract largely overlapping sets of support vectors: the support vector set appears to characterize the task rather than the particular type of classifier.
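    A rough way to reproduce this kind of comparison today is sketched
    below, assuming scikit-learn (which is, of course, not what the
    cited experiments used): train SV machines with two different
    kernels on the same digit data and measure how much their support
    vector sets overlap. Data set and kernel parameters are
    illustrative, not those of the original study.

        from sklearn.datasets import load_digits
        from sklearn.svm import SVC

        X, y = load_digits(return_X_y=True)

        # Train one SV machine per kernel and record its support vector indices.
        svs = {}
        for name, clf in {'poly': SVC(kernel='poly', degree=3),
                          'rbf': SVC(kernel='rbf')}.items():
            clf.fit(X, y)
            svs[name] = set(clf.support_)

        # Jaccard overlap of the two support vector sets.
        overlap = len(svs['poly'] & svs['rbf']) / len(svs['poly'] | svs['rbf'])
        print(f"poly: {len(svs['poly'])} SVs, rbf: {len(svs['rbf'])} SVs, "
              f"overlap: {overlap:.2f}")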

    The support vector set allows the incorporation of transformation invariance into SV classifiers, significantly improving accuracy (Schölkopf, Burges & Vapnik, 1996). Together with the "reduced set method" this yields fast high-accuracy classifiers (Burges & Schölkopf 1997).
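    For reference, a sketch of the reduced set objective (Burges, 1996),
    with the labels absorbed into the coefficients alpha_i: the support
    vector expansion Psi = sum_i alpha_i Phi(x_i) over the N_s support
    vectors is approximated by a shorter expansion Psi' = sum_j beta_j
    Phi(z_j) with N_z << N_s, by minimizing the feature-space distance,
    which can be evaluated purely in terms of kernel values:

        \rho = \Bigl\| \sum_{i=1}^{N_s} \alpha_i \Phi(x_i)
                       - \sum_{j=1}^{N_z} \beta_j \Phi(z_j) \Bigr\|^2
             = \sum_{i,k} \alpha_i \alpha_k \, k(x_i, x_k)
               - 2 \sum_{i,j} \alpha_i \beta_j \, k(x_i, z_j)
               + \sum_{j,l} \beta_j \beta_l \, k(z_j, z_l)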

    [Figure: SV machine architecture]


    With its principled way of extracting statistically critical examples to represent classes, support vector learning may provide a theoretical foundation for exemplar-based approaches to object recognition (Bülthoff & Edelman, Proc. Natl. Acad. Sci. 89:60-64, 1992). We have applied support vector machines to object recognition (Blanz et al., 1996); for benchmarking, our set of images of rendered chair models is available on our ftp server. Future work will include psychophysical experiments studying the relevance of support vectors in human visual learning.






    Last modified: 10 June 1997

    bs@mpik-tueb.mpg.de