# FI:IB031 Intro to Machine Learning - Course Information

## IB031 Introduction to Machine Learning

**Faculty of Informatics**

Spring 2018

**Extent and Intensity:** 2/2/0, 4 credit(s) (plus extra credits for completion). Recommended type of completion: zk (examination). Other types of completion: z (credit).
**Teacher(s):**

- doc. RNDr. Tomáš Brázdil, Ph.D. (lecturer)
- doc. RNDr. Lubomír Popelínský, Ph.D. (lecturer)
- RNDr. Karel Vaculík, Ph.D. (seminar tutor)
- Mgr. Jaroslav Čechák (assistant)
- Mgr. Veronika Krejčířová (assistant)

**Guaranteed by:** prof. RNDr. Mojmír Křetínský, CSc.

Department of Computer Science - Faculty of Informatics

Supplier department: Department of Computer Science - Faculty of Informatics

**Timetable:** Mon 8:00–9:50 A318

**Timetable of Seminar Groups:**

- *K. Vaculík*
- IB031/02: Mon 18:00–19:50 B130, *K. Vaculík*

**Prerequisites:** Recommended courses are MB102 and MB103.
**Course Enrolment Limitations:** The course is offered to students of any study field.
**Course objectives:** By the end of the course, students should know the basic methods of machine learning and understand their basic theoretical properties, implementation details, and key practical applications. Students should also understand the relationship between machine learning and other sub-areas of mathematics and computer science, such as statistics, logic, artificial intelligence, and optimization.
**Learning outcomes:** By the end of the course, students

- will know the basic methods of machine learning;
- will understand their basic theoretical properties, implementation details, and key practical applications;
- will understand the relationship between machine learning and other sub-areas of mathematics and computer science, such as statistics, logic, artificial intelligence, and optimization;
- will be able to implement and validate a simple machine learning method.

**Syllabus:**

- Basic machine learning: classification and regression, clustering, (un)supervised learning, simple examples
- Decision trees: learning of decision trees and rules
- Logic and machine learning: specialization and generalization, logical entailment
- Evaluation: training and test sets, overfitting, cross-validation, confusion matrix, learning curve, ROC curve; sampling, normalisation
- Probabilistic models: Bayes rule, MAP, MLE, naive Bayes; introduction to Bayes networks
- Linear regression (classification): least squares, relationship with MLE, regression trees
- Kernel methods: SVM, kernel transformation, kernel trick, kernel SVM
- Neural networks: multilayer perceptron, backpropagation, non-linear regression, bias vs variance, regularization
- Lazy learning: nearest neighbor method; Clustering: k-means, hierarchical clustering, EM
- Practical machine learning: data pre-processing (attribute selection and construction, sampling); ensemble methods (bagging, boosting); tools for machine learning (Weka)
- Advanced methods: Inductive logic programming, deep learning.
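As a flavour of the kind of exercise the syllabus topics lead to, the following is a minimal sketch of training a decision-tree classifier and evaluating it with cross-validation and a confusion matrix. Note this is illustrative only and not course material: the course itself demonstrates Weka, while this sketch uses Python with scikit-learn and its bundled Iris dataset.

```python
# Illustrative sketch only (the course uses Weka; scikit-learn is an
# assumption made here for a compact, self-contained example).
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

X, y = load_iris(return_X_y=True)

# Hold out a test set to check for overfitting on unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)

# 5-fold cross-validation on the training set.
cv_scores = cross_val_score(clf, X_train, y_train, cv=5)
print("CV accuracy: %.2f +/- %.2f" % (cv_scores.mean(), cv_scores.std()))

# Fit on the full training set and inspect the confusion matrix,
# whose rows are true classes and columns predicted classes.
clf.fit(X_train, y_train)
print(confusion_matrix(y_test, clf.predict(X_test)))
```

The same workflow (split, cross-validate, inspect a confusion matrix) applies unchanged to the other classifiers in the syllabus, such as naive Bayes or SVM.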

**Literature:**

- Simon Rogers, Mark Girolami. *A First Course in Machine Learning*. Chapman and Hall, 2011.
- BERKA, Petr. *Dobývání znalostí z databází* (Knowledge Discovery in Databases). 1st ed. Praha: Academia, 2003. 366 pp. ISBN 8020010629. (recommended literature)

**Bookmarks:** https://is.muni.cz/ln/tag/FI:IB031!
**Teaching methods:** Lectures, practical exercises, and a project.
**Assessment methods:** Intrasemestral exam, project, final exam.
**Language of instruction:** Czech
**Follow-Up Courses**

**Further Comments:**

The course is taught annually.

- Permalink: https://is.muni.cz/course/fi/spring2018/IB031