IB031 Introduction to Machine Learning

Faculty of Informatics
Spring 2016
Extent and Intensity
2/0/1. 3 credit(s) (plus extra credits for completion). Type of Completion: zk (examination).
doc. RNDr. Tomáš Brázdil, Ph.D. (lecturer)
doc. RNDr. Lubomír Popelínský, Ph.D. (lecturer)
RNDr. Karel Vaculík, Ph.D. (seminar tutor)
Guaranteed by
prof. RNDr. Mojmír Křetínský, CSc.
Department of Computer Science – Faculty of Informatics
Supplier department: Department of Computer Science – Faculty of Informatics
Wed 10:00–11:50 A217
  • Timetable of Seminar Groups:
IB031/01: each even Friday 8:00–9:50 A219, K. Vaculík
IB031/02: each odd Friday 8:00–9:50 A219, K. Vaculík
Recommended courses are MB102 and MB103.
Course Enrolment Limitations
The course is offered to students of any study field.
Course objectives
By the end of the course, students should know the basic methods of machine learning and understand their theoretical properties, implementation details, and key practical applications. Students should also understand the relationship between machine learning and related areas of mathematics and computer science, such as statistics, logic, artificial intelligence, and optimization.
Syllabus
  • Basic machine learning: classification and regression, clustering, supervised and unsupervised learning, simple examples
  • Decision trees: learning of decision trees and rules
  • Logic and machine learning: specialization and generalization, logical entailment
  • Evaluation: training and test sets, overfitting, cross-validation, confusion matrix, learning curve, ROC curve; sampling, normalisation
  • Probabilistic models: Bayes rule, MAP, MLE, naive Bayes; introduction to Bayes networks
  • Linear regression (classification): least squares, relationship with MLE, regression trees
  • Kernel methods: SVM, kernel transformation, kernel trick, kernel SVM
  • Neural networks: multilayer perceptron, backpropagation, non-linear regression, bias vs variance, regularization
  • Lazy learning: nearest neighbor method; Clustering: k-means, hierarchical clustering, EM
  • Practical machine learning: data pre-processing (attribute selection and construction, sampling); ensemble methods (bagging, boosting); tools for machine learning (Weka)
  • Advanced methods: Inductive logic programming, deep learning.
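To make one syllabus topic concrete, the clustering item above can be sketched as a minimal pure-Python implementation of k-means (Lloyd's algorithm). The toy data and parameters here are invented for illustration; course exercises would typically use a dedicated tool such as Weka instead.

```python
# Minimal k-means sketch (Lloyd's algorithm): alternate between assigning
# points to their nearest centroid and moving each centroid to the mean
# of its assigned points. Data and parameters are illustrative only.
import math
import random

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k initial centroids from the data
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        # (an empty cluster keeps its previous centroid).
        new_centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged: assignments stopped changing
            break
        centroids = new_centroids
    return centroids, clusters

# Two well-separated blobs; k-means should place one centroid per blob.
data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
        (5.0, 5.0), (5.1, 5.2), (5.2, 5.1)]
centroids, clusters = kmeans(data, k=2)
```

With this data the algorithm converges in a few iterations to centroids near (0.1, 0.1) and (5.1, 5.1), one per blob, regardless of which points are sampled as the initial centroids.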
Literature
    Recommended literature
  • Simon Rogers, Mark Girolami. A First Course in Machine Learning. Chapman and Hall, 2011.
  • BERKA, Petr. Dobývání znalostí z databází (Knowledge Discovery in Databases). 1st ed. Praha: Academia, 2003. 366 pp. ISBN 8020010629.
    Not specified
  • Pattern recognition and machine learning. Edited by Christopher M. Bishop. New York: Springer, 2006. xx, 738 pp. ISBN 0387310738.
  • MITCHELL, Tom M. Machine learning. Boston: McGraw-Hill, 1997. xv, 414 pp. ISBN 0070428077.
Teaching methods
Lectures + practical exercises + project
Assessment methods
Intrasemestral exam, project, final exam.
Further Comments
The course is taught annually.
The course is also listed under the following terms: Spring 2015, Spring 2017, Spring 2018, Spring 2019, Spring 2020, Spring 2021, Spring 2022, Spring 2023, Spring 2024, Spring 2025.
  • Permalink: https://is.muni.cz/course/fi/spring2016/IB031