Conference: STATISTICAL LEARNING, THEORY AND APPLICATIONS
  November 14–15, 2002
CNAM Paris

Speaker biographies:

Jerome H. Friedman (Stanford University, Palo Alto - USA)

Professor, Department of Statistics, Stanford University, and Leader, Computation Research Group, Stanford Linear Accelerator Center

Jerome Friedman has contributed a remarkable array of methodologies to data mining and machine learning over the last 25 years. In 1977, as leader of the numerical methods group at the Stanford Linear Accelerator Center (SLAC), he co-developed several algorithms for speeding up nearest-neighbour classifiers. Over the following seven years, he collaborated with Leo Breiman, Richard Olshen, and Charles Stone to produce a landmark work in decision tree methodology, "Classification and Regression Trees" (1984), and released the commercial product CART®. He received the SIGKDD 2002 Innovation Award.

His latest book, written with T. Hastie and R. Tibshirani, is The Elements of Statistical Learning, Springer, 2001.


David J. Hand (Imperial College, London - UK)

Professor of Statistics in the Department of Mathematics at Imperial College.

David Hand has broad research interests, including multivariate statistics, classification methods, pattern detection, the interface between statistics and computing, and the foundations of statistics. He is interested in applications in medicine, psychology, and finance. Professor Hand has wide-ranging consultancy experience with organisations ranging from banks and pharmaceutical companies to governments.

His latest book, written with H. Mannila and P. Smyth, is Principles of Data Mining, MIT Press, 2001.


Bernhard Schölkopf (Max Planck Society, Tübingen - Germany)

Director of the MPI for Biological Cybernetics; member of the KXEN Scientific Committee

Dr. Bernhard Schölkopf is an internationally acknowledged expert in pattern analysis and machine intelligence, support vector machines, and kernel-based algorithms. His thesis on Support Vector Learning won the annual dissertation prize of the German Association for Computer Science. He has worked at AT&T Bell Labs as well as Microsoft Research. In 1998, he won the prize for the best scientific project at the German National Research Center for Computer Science, and in July 2001 he was appointed Scientific Member of the Max Planck Society and Director of the MPI for Biological Cybernetics.

His latest book, written with A. Smola, is Learning with Kernels, MIT Press, 2002.


Vladimir Vapnik (NEC Research Institute, Princeton - USA)

Senior Research Scientist

Vladimir Vapnik, who holds a PhD in Mathematics, was one of the two people behind the VC dimension (Vapnik-Chervonenkis dimension), a key concept that led to the creation of the field of Computational Learning Theory. From 1965 to 1990 he worked at the Institute of Control Sciences, Moscow, where he became Head of the Machine Learning Research Department. He then moved to the United States and joined AT&T Bell Laboratories and later AT&T Labs-Research. Vapnik has taught and researched in theoretical and applied statistics for over 30 years. He has published 7 books and over a hundred research papers. His major achievements are the development of a general theory for minimizing the expected risk using empirical data, and a new type of learning machine, the Support Vector Machine, which possesses a high level of generalization ability.

His research is presented in his latest books, The Nature of Statistical Learning Theory, Springer, New York, 1995 (second edition, 2000), and Statistical Learning Theory, J. Wiley, New York, 1998.
