IB031 Introduction to Machine Learning

Faculty of Informatics
Spring 2017
Extent and Intensity
2/2/0. 4 credit(s) (plus extra credits for completion). Recommended Type of Completion: zk (examination). Other types of completion: z (credit).
Teacher(s)
doc. RNDr. Tomáš Brázdil, Ph.D. (lecturer)
doc. RNDr. Lubomír Popelínský, Ph.D. (lecturer)
RNDr. Karel Vaculík, Ph.D. (seminar tutor)
Guaranteed by
prof. RNDr. Mojmír Křetínský, CSc.
Department of Computer Science – Faculty of Informatics
Supplier department: Department of Computer Science – Faculty of Informatics
Timetable
Thu 8:00–9:50 A217
  • Timetable of Seminar Groups:
IB031/01: Wed 14:00–15:50 A219, K. Vaculík
IB031/02: Tue 8:00–9:50 B130, K. Vaculík
Prerequisites
Recommended courses are MB102 and MB103.
Course Enrolment Limitations
The course is offered to students of any study field.
Course objectives
By the end of the course, students should know the basic methods of machine learning and understand their theoretical properties, implementation details, and key practical applications. Students should also understand the relationship between machine learning and other sub-areas of mathematics and computer science, such as statistics, logic, artificial intelligence, and optimization.
Syllabus
  • Basic machine learning: classification and regression, clustering, (un)supervised learning, simple examples
  • Decision trees: learning of decision trees and rules
  • Logic and machine learning: specialization and generalization, logical entailment
  • Evaluation: training and test sets, overfitting, cross-validation, confusion matrix, learning curve, ROC curve; sampling, normalisation
  • Probabilistic models: Bayes rule, MAP, MLE, naive Bayes; introduction to Bayes networks
  • Linear regression (classification): least squares, relationship with MLE, regression trees
  • Kernel methods: SVM, kernel transformation, kernel trick, kernel SVM
  • Neural networks: multilayer perceptron, backpropagation, non-linear regression, bias vs variance, regularization
  • Lazy learning: nearest neighbor method; Clustering: k-means, hierarchical clustering, EM
  • Practical machine learning: data pre-processing (attribute selection and construction, sampling); ensemble methods (bagging, boosting); tools for machine learning (Weka)
  • Advanced methods: Inductive logic programming, deep learning.
Literature
    recommended literature
  • ROGERS, Simon and GIROLAMI, Mark. A First Course in Machine Learning. Chapman and Hall, 2011.
  • BERKA, Petr. Dobývání znalostí z databází. 1st ed. Praha: Academia, 2003. 366 pp. ISBN 8020010629.
    not specified
  • BISHOP, Christopher M. Pattern Recognition and Machine Learning. New York: Springer, 2006. xx, 738 pp. ISBN 0387310738.
  • MITCHELL, Tom M. Machine Learning. Boston: McGraw-Hill, 1997. xv, 414 pp. ISBN 0070428077.
Bookmarks
https://is.muni.cz/ln/tag/FI:IB031!
Teaching methods
Lectures + practical exercises + project
Assessment methods
Mid-semester exam, project, and final exam.
Language of instruction
Czech
Further Comments
The course is taught annually.
The course is also listed under the following terms: Spring 2015, Spring 2016, Spring 2018, Spring 2019, Spring 2020, Spring 2021, Spring 2022, Spring 2023, Spring 2024.
  • Enrolment Statistics (Spring 2017, recent)
  • Permalink: https://is.muni.cz/course/fi/spring2017/IB031