Carnegie Mellon University

10-601 (Machine Learning)

Fall and Spring: 12 units

Machine learning studies the question "how can we build computer programs that automatically improve their performance through experience?" This includes learning to perform many types of tasks based on many types of experience: for example, robots learning to navigate better based on experience gained by roaming their environments, medical decision aids that learn to predict which therapies work best for which diseases by mining historical health records, and speech recognition systems that learn to understand your speech better through experience listening to you.

This course is designed to give students an overview of the methods, theory, and algorithms of machine learning. Topics include decision trees, naive Bayes, logistic regression, kernel regression, support vector machines, the EM algorithm, clustering, hidden Markov models, Bayesian networks, the bias-variance tradeoff, overfitting, PAC learning, and some advanced topics, time permitting. Students entering the class with a pre-existing working knowledge of probability, statistics, and algorithms will be at an advantage, but the class has been designed so that anyone with a strong quantitative background can catch up and fully participate. By the end of the class, students will have a working knowledge of the state of the art in machine learning, experience applying these algorithms to real-world data sets, and the ability to read and understand papers in the current research literature.

For graduate students, the prerequisites are a good basic understanding of computational complexity and data structures, and proficiency at writing programs at least several hundred lines long in a general-purpose programming language such as Java, C++, or C. If you are interested in this topic, are a PhD student, and would like a more formal treatment of the material, you might consider 10-701.