
EEE7019 - Pattern Recognition (2025-2026 Spring)

Course Details

  • Venue: Dr Zor's Office (3rd Floor, M4B Building)

  • Date & Time: Tuesdays, 09:15-12:00

  • Textbooks:

    • C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

    • I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.

  • Reference Books:

    • Stanford CS229: Machine Learning lecture notes and videos (https://cs229.stanford.edu)

    • MIT Pattern Recognition and Analysis course materials and related MIT OpenCourseWare resources

    • K. P. Murphy, Probabilistic Machine Learning: An Introduction, MIT Press, 2022 (selected chapters)

    • Official documentation of PyTorch and/or TensorFlow for implementation details and examples

  • Objectives: The aim of this course is to provide students with a solid theoretical and practical understanding of modern pattern recognition and machine learning methods. Starting from classical statistical approaches, the course extends to contemporary techniques such as deep neural networks, convolutional networks, attention mechanisms and generative models for classification, regression, clustering and representation learning. By the end of the course, students are expected to be able to design, train, evaluate and critically analyse end‑to‑end pattern recognition systems on real‑world data.

  • Contents: The course content is structured to integrate fundamental statistical pattern recognition theory with modern deep learning methods.

    • Introduction to pattern recognition and types of pattern recognition problems

    • Review of probability theory, Bayesian decision theory, parametric and non‑parametric density estimation

    • Linear and quadratic classifiers, Fisher’s discriminant, PCA and other dimensionality reduction methods

    • Mixture models, Gaussian mixture models and the EM algorithm, statistical clustering techniques

    • Multilayer perceptrons (MLPs), backpropagation, modern optimisation and regularisation techniques

    • Convolutional neural networks (CNNs), image classification, transfer learning and pre‑trained models

    • Recurrent neural networks (RNNs), LSTMs and GRUs for time‑series and sequential data; basic hidden Markov models (HMMs)

    • Attention mechanisms and the Transformer architecture, introduction to Vision Transformers

    • Support vector machines (SVMs) and kernel methods

    • Autoencoders, Variational Autoencoders (VAEs), Generative Adversarial Networks (GANs); introduction to generative modelling

    • Decision trees and ensemble methods (Random Forest, Boosting)

    • Introductory treatment of graph neural networks; fairness, bias and ethical aspects of machine learning

    • Case studies and a term project based on real‑world datasets
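As a flavour of the kind of end‑to‑end pipeline covered in the course, the sketch below (an illustrative example, not official course material; the synthetic data and all variable names are assumptions) combines PCA for dimensionality reduction with a simple nearest‑class‑mean classifier, using only NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data in 5 dimensions (well separated for illustration)
X0 = rng.normal(loc=0.0, scale=1.0, size=(100, 5))
X1 = rng.normal(loc=3.0, scale=1.0, size=(100, 5))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# PCA: centre the data and project onto the top-2 principal components
X_centred = X - X.mean(axis=0)
cov = np.cov(X_centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
W = eigvecs[:, ::-1][:, :2]              # columns = top-2 principal directions
Z = X_centred @ W

# Nearest-class-mean classifier in the reduced 2-D space
means = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(Z[:, None, :] - means[None, :, :], axis=2)
y_pred = dists.argmin(axis=1)

accuracy = (y_pred == y).mean()
print(f"Training accuracy: {accuracy:.2f}")
```

The term project follows the same pattern on real‑world data: preprocessing, representation learning or feature extraction, a classifier or regressor, and a quantitative evaluation.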

Assessment and Evaluation

  • Midterm: Project Proposal

  • Final: Project Presentation

Announcements
