CS-401 | Applied data analysis (Fall)
This course teaches the basic techniques, methodologies, and practical skills required to draw meaningful insights from a variety of data. We start with an introduction to the key stages of the data science pipeline, including data wrangling, interpretation, and visualization. The curriculum then delves into applied statistics, machine learning techniques (supervised and unsupervised learning), text mining, network data analysis, and strategies for managing large-scale datasets. Through hands-on projects, students gain practical experience with popular data science tools, preparing them for real-world analytical challenges.
Course webpage: https://epfl-ada.github.io/teaching/fall2024/cs401/
CS-502 | Deep learning in biomedicine (Spring)
This course covers recent deep learning methods and demonstrates how they can be applied to biomedical data. It centers on a comprehensive exploration of advanced deep learning architectures and paradigms for different input data types (e.g., convolutional neural networks for images, graph convolutional neural networks for graph-structured data, transformers for sequence data). We cover practical applications of state-of-the-art deep learning methodologies in biomedicine, including medical image analysis, single-cell genomics, and DNA sequence analysis, among others. The course starts with the standard supervised learning setting and then covers ongoing methodological developments that enable learning from scarcely labelled datasets by transferring knowledge across tasks.
Course webpage: https://edu.epfl.ch/coursebook/en/deep-learning-in-biomedicine-CS-502
CS-625 | Transfer learning and meta-learning (Spring)
While machine learning methods excel on tasks with large labeled datasets that can support learning of highly parameterized deep learning models, solving real-world problems requires methods that generalize to unseen domains and tasks. The goal of this course is to cover the principles and recent advances in machine learning methods that can solve multiple tasks and generalize to new domains where training and test distributions differ. We cover topics such as few-shot learning, meta-learning, domain adaptation and domain generalization, zero-shot learning, and self-supervised pretraining. Students are expected to read, review, present, and discuss relevant research papers in this area.
Course webpage: https://edu.epfl.ch/coursebook/en/transfer-learning-and-meta-learning-CS-625