Short Course Description
Bayesian decision rules, discriminant functions.
Parameter estimation using maximum likelihood and Bayesian approaches.
Non-parametric classifiers, non-parametric density estimation, Parzen windows, the nearest-neighbor classification rule.
Linear models for regression and classification, least squares, regularization, lasso, ridge regression, logistic regression.
Maximum margin classification, support vector machines, kernel functions.
Neural networks, deep learning, applications.
Unsupervised learning, clustering methods, the K-means algorithm.
The expectation maximization (EM) algorithm, applications to mixture model parameter estimation.
Markov and hidden Markov models (HMM), classification and parameter estimation, applications.
Principal component analysis (PCA).
The full syllabus will be available to registered students only.