A hands-on introduction to modern pattern recognition. We begin in medias res by training two intuitive classifiers (a perceptron and a kNN classifier) to recognize digits. To determine which one is better we must state what "better" means, so we introduce "generalization." To talk about generalization we need to build the probabilistic framework in which modern pattern recognition resides. From there the plan is clear: we imagine that probabilistic descriptions of data (that is, the knowledge) are given to us and derive optimal classifiers. We then move to supervised learning from training samples, linear classification and regression, and feature design (especially the multiscale or spectral perspective, which has seen great success for images, audio, text, ..., and which has beautiful connections to physics). We apply our probabilistic models to problems in image restoration: denoising, demosaicking, deblurring, and, time permitting, segmentation. We then eschew feature design in favor of deep neural networks. Finally, we pivot back to solid ground and ask: what good are probabilistic descriptions of the world if we cannot generate samples from them? This leads us to sampling methods such as Markov chain Monte Carlo. We wrap up with an eclectic selection of contemporary topics chosen at the very last minute.
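To make the opening concrete, here is a minimal sketch of the kNN digit classifier described above, written with scikit-learn. This is purely illustrative and not the course's code (the course notebooks build the classifiers themselves); the dataset, split ratio, and choice of k = 3 are arbitrary assumptions.

```python
# Minimal k-nearest-neighbours digit classifier (illustrative sketch only).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()                        # 8x8 grayscale digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)     # label = majority vote of 3 nearest training images
knn.fit(X_train, y_train)

# Accuracy on held-out digits is one concrete way to measure "generalization".
test_accuracy = knn.score(X_test, y_test)
print(f"held-out accuracy: {test_accuracy:.3f}")
```

Evaluating on a held-out test set, rather than on the training digits, is exactly the distinction the "generalization" discussion in week 1 formalizes.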


Schedule

Week 1
Tue: Two classifiers + introduction to pattern recognition (ipynb, html, book)
Fri: Generalization via cat and dog people; modeling knowledge (ipynb, html)
Rec: Refresher on linear algebra and probability (ipynb)
Reading: Clever Hans Predictors; Chapters 1 and 2 in PPA; On Frank Rosenblatt

Week 2
Rec: Python clinic support
Assignment: sample assignment

Week 3
Tue: Knowledge modeling and likelihood tests (ipynb, book)
Rec: Sample assignment and exercise
Reading: PPA Chapter 2 and MML Section 5.7.2
Assignment: 1st assignment published

Week 4
Linear methods and the importance of good features (ipynb)
Reading: Chapter 7 in MML

Week 5
Tue: Logistic regression (ipynb, html, pdf)
Rec: Sample assignment and exercise
Reading: Chapter 8 in MML

Week 6: Signal processing, the multiscale idea, seeds of CNNs
Tue: Filtering and boundary detection (ipynb, html, pdf)
Reading: Sections 3.1–3.3 in FPS; Section 7.2 in CVAA
Assignments: 1st assignment due on 24.10; 2nd assignment published (link)

Week 7
Filtering and the Fourier transform (ipynb, html, pdf, book)
Reading: Discrete Fourier Transform (link); Chapter 3 in Foundations of Signal Processing (link)

Week 8: Deep learning bootcamp
Tue: Deep learning math basics (ipynb, pdf)
Fri: PyTorch, GPUs, and some coding tricks in ML (colab)

Week 9
Statistical framing for deblurring and denoising, image models, MMSE and LMMSE estimation (link)
Reading: Section 3.6 in FPS; Chapters 4 and 5 in SPC
Assignments: 2nd assignment due on 14.11; 3rd assignment published (link)

Week 10
Statistical framing for deblurring and denoising continued, Wiener filter (link), U-Net (link)

Week 11
Autoencoders and PCA [MML]

Week 12
Sampling [MCMC, MCMC2], Ising model (link, link)
Assignments: 3rd assignment due on 5.12; 4th assignment published (link)

Week 13
Applications of MCMC [MCMC, MCMC2]
Assignments: 4th assignment due on 19.12; 5th assignment published
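As a taste of the sampling topics in weeks 12 and 13, here is a minimal Metropolis sampler for the 2-D Ising model. This is an illustrative sketch, not course code; the lattice size, inverse temperature, and number of sweeps are arbitrary choices.

```python
import numpy as np

def metropolis_ising(n=16, beta=0.6, n_sweeps=200, seed=0):
    """Sample a 2-D Ising configuration with single-spin Metropolis updates."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(n, n))       # random initial configuration
    for _ in range(n_sweeps):
        for _ in range(n * n):                     # one sweep = n*n proposed flips
            i, j = rng.integers(0, n, size=2)
            # sum of the four nearest neighbours (periodic boundary conditions)
            nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
                  + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
            dE = 2.0 * spins[i, j] * nb            # energy change if we flip spin (i, j)
            # Metropolis rule: always accept downhill moves, uphill with prob exp(-beta*dE)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] = -spins[i, j]
    return spins

sample = metropolis_ising()
print("mean magnetization:", sample.mean())
```

The chain never evaluates the (intractable) partition function: only energy differences of single flips are needed, which is what makes MCMC practical for such models.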




We will assume familiarity with basic probability and linear algebra. We will review the most important concepts along the way, but these reviews cannot replace entire courses. All programming examples and assignments will be in Python.


The course materials will be posted to this GitLab repo. The easiest way to browse the html and ipynb slides is to clone the repo. A growing collection of lecture notes can be found in this Jupyter Book.

We will (strongly) recommend weekly reading from various freely available resources:

[PPA] Patterns, Predictions, and Actions by Moritz Hardt and Ben Recht

[AML]   Applied Machine Learning course by Volodymyr Kuleshov

[MML] Machine Learning: A Probabilistic Perspective (2012) by Kevin Murphy

[FPS] Foundations of Signal Processing by Martin Vetterli, Jelena Kovačević, and Vivek K. Goyal

[SPC] Signal Processing for Communications by Paolo Prandoni and Martin Vetterli

[CVAA] Computer Vision: Algorithms and Applications, 2nd Edition, by Richard Szeliski

[MCMC] The Markov Chain Monte Carlo Revolution, by Persi Diaconis

[MCMC2] Introduction to Monte Carlo Methods, by David MacKay

Class time and location

The Tuesday lecture takes place in Kollegienhaus, Hörsaal 118, from 8:40 am to 10:00 am. The Friday lecture takes place in Kollegienhaus, Hörsaal 115, from 10:15 am to 12:00 pm.

Exercise sessions take place on Monday from 2:15 pm to 4:00 pm and Wednesday from 4:15 pm to 6:00 pm. Both are in Spiegelgasse 5, Seminarraum 05.002.



Instructor

Prof. Dr. Ivan Dokmanić: ivan.dokmanic[at]unibas.ch


Teaching assistants

Kian Hunziker: kian.hunziker[at]unibas.ch

Cheng Shi: cheng.shi[at]unibas.ch

Vinith Kishore: vinith.kishore[at]unibas.ch

Hieu Nguyen: hieuhuu.nguyen[at]unibas.ch

Valentin Debarnot: valentin.debarnot[at]unibas.ch


Grading

Homework assignments: 50%
Final exam: 40%
Attendance and participation: 10%