The slides for the lectures can be downloaded here:

  • The First Lecture, an introduction to Big Data and Data Mining
  • The Second Lecture, the effects of Big Data on Data Analysis
  • The Third Lecture, Classification: an introduction to, and an analysis of, this venerable problem
  • The Fourth Lecture, an introduction to PAC learning
  • The Fifth Lecture, showing that finite hypothesis classes can be agnostically PAC learned, and introducing the VC dimension
  • The Sixth Lecture, computing the VC dimension of some Boolean hypothesis classes, the introduction of the growth function, and computing the VC dimension of polynomial classifiers
  • The Seventh Lecture, we prove the Fundamental Theorem: a hypothesis class is PAC learnable iff it has a finite VC dimension. We then discuss what we have achieved so far: are there still desiderata that are not met, and if so, what do we do about them?
  • The Eighth Lecture, a further discussion of the proof of the Fundamental Theorem, leading to the introduction of Rademacher Complexity, which yields error bounds that can be estimated from the data
  • The Ninth Lecture, non-uniform learning with the SRM rule, and the Minimum Description Length rule as a special case
  • The Tenth Lecture, Weak Learning, and how Boosting turns weak learning into strong learning
  • The Eleventh Lecture, introducing Frequent Itemset Mining and a first attempt at mining from a sample
  • The Twelfth Lecture, deriving sample size bounds for probably approximately correct itemset mining
  • The Thirteenth Lecture, introducing compression as the inductive method to battle the pattern set explosion
  • The Fourteenth Lecture, making AIT concrete: the Krimp algorithm