A Graduate Course:

Statistical Learning Theory


  • COURSE OUTLINE:

  • The course investigates tools for analysing the performance of simplified models for prediction, estimation, and classification of complex data.
    The course starts with the theory and algorithms of support vector machines and the Vapnik-Chervonenkis theory.
    The techniques of analysis are rooted in information theory (minimax compression and learning, the 'blowing-up lemma'), PAC-Bayesian theorems, and concentration inequalities; the probabilistic tools include Bennett's, Hoeffding's, Chernoff's, Azuma's, and McDiarmid's inequalities (one of which is stated below for concreteness).
    Oracle inequalities, non-asymptotic bounds on the statistical risk, self-boundedness of the Vapnik entropy, and concentration inequalities for statistical learning (the entropy method, logarithmic Sobolev inequalities) will be presented.
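
    For concreteness, a standard form of one of the concentration inequalities listed above, Hoeffding's inequality: if X_1, ..., X_n are independent random variables with each X_i taking values in [a_i, b_i], and S_n = X_1 + ... + X_n, then for every t > 0,

    \[
      \mathbb{P}\bigl( S_n - \mathbb{E}[S_n] \ge t \bigr)
      \le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).
    \]

    Bounds of this kind are the basic building blocks of the non-asymptotic risk bounds and oracle inequalities treated in the course.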

  • Prerequisites:

  • A graduate course in measure and integration theory (e.g., given by TM/MAI)
  • A graduate course in probability and stochastic processes (e.g., given by mat.stat./MAI)
  • A graduate course in statistical inference (e.g., given by stat./MAI)
  • An undergraduate/graduate course in information theory (e.g., given by the Division of Data Transmission/ISY)
  • A graduate course in Markov Chain Monte Carlo (e.g., given by mat.stat./MAI).

  • Examination

  • Presentations by participants and home assignments.
    The preliminary recommended number of credit units for 'Doktorandladok' (the Ladok registry for doctoral students) is 10 CU.

  • COURSE LITERATURE:

  • O. Catoni (2004): Statistical Learning Theory and Stochastic Optimization. Lecture Notes in Mathematics 1851. Springer-Verlag.
  • V. N. Vapnik (1998): Statistical Learning Theory, Chapters 14-16. John Wiley & Sons.
  • M. Vidyasagar (2003): Learning and Generalization. Springer.
  • Material on Support Vector Machines.

  • Supporting papers and lecture notes will be handed out during the lectures.



  • SCHEDULE


    tikos@mai.liu.se

    Last updated 2006-01-26.