We are constantly looking for exceptional and highly motivated new team members, so don’t hesitate to apply following the guidelines below.

Postdoctoral fellows

We are looking for highly motivated postdoctoral candidates to join us on a scientific journey! We invite candidates who are excited about conducting breakthrough research in machine learning and/or at the intersection of machine learning and biomedicine. Candidates should have a strong computational background and a Ph.D. or equivalent degree in computer science, statistics, applied math, computational biology, or a closely related field. Machine learning topics we are especially focused on include domain adaptation/generalization, few-shot learning, unsupervised and semi-supervised learning, graph representation learning, and explainable AI, but candidates from other areas are also welcome to apply. Prior experience with biomedical data is a plus but not required.

Application:
To apply, please send the following documents to Maria, using the email subject line “Postdoctoral application”:

    1. CV with a list of your publications
    2. Research statement (1–2 pages) describing your prior research and future research plans
    3. At least two letters of reference from your previous research supervisors or collaborators, arranged to be sent to Maria directly

Additionally, you can check out these opportunities for fully funded postdoctoral positions. Feel free to reach out if you are interested in doing research in the lab under one of these fellowships.

PhD students

We are always looking for highly motivated and talented PhD students. PhD students can join the lab through EPFL’s PhD program in computer science (EDIC) or computational biology (EDCB). The admission process for both programs is handled centrally at EPFL, and the application system allows you to specify the labs you are interested in, so you do not need to contact Maria directly.

Summer Research Interns

If you are a Bachelor’s or Master’s student, consider applying to the Summer@EPFL internship program. If you are looking for a PhD internship, send Maria an email directly.

Bachelor & Master students


How to apply for Master’s Thesis with Our Lab

If you are an EPFL master’s student interested in conducting your thesis with our lab, send Maria an email including your CV and transcripts, using the subject line “Master’s Thesis Application”.

How to apply for Bachelor & Master Research Projects

Please apply using the following link. If you are interested in multiple projects, please send only one application and indicate in the motivation letter which projects you are interested in.

Below you can find our offerings for Bachelor & Master Research Projects for the 2024 fall semester (last update: 17.05.2024). Please note that the list of topics below is not exhaustive; if you have other ideas or proposals, you are welcome to contact a senior member of the lab to discuss possibilities for a tailor-made topic.

Students interested in doing a project are encouraged to have a look at the Thesis & Project Guidelines, which explain what you can expect from us and what we expect from students.


  • Project 1: Multi-target Open-World Semi-Supervised Learning

Keywords: open-world semi-supervised learning

In many real-world applications, unlabeled test data contains new classes that have not been encountered in the labeled training data. In such scenarios, machine learning algorithms must not only annotate samples from seen classes but also discover novel classes that do not appear in the labeled dataset, thus overcoming the categorical shift between labeled and unlabeled data. However, current work assumes that there is only a single unlabeled dataset. The goal of this project is to extend the problem setting to multiple unlabeled datasets and to propose a method that can perform open-world SSL across different unlabeled datasets.
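As a rough illustration of the single-dataset setting this project generalizes, the sketch below separates an unlabeled set into seen-class and novel-class samples on synthetic 2-D data. The nearest-centroid classifier, the distance threshold, and the use of k-means for novel-class discovery are all illustrative choices, not the lab’s actual method:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(0)

# Labeled data covers the "seen" classes 0 and 1 (well-separated blobs).
X_lab = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in [(0, 0), (4, 0)]])
y_lab = np.repeat([0, 1], 50)

# Unlabeled data additionally contains two novel classes at (0, 4) and (4, 4).
X_unl = np.vstack([rng.normal(c, 0.3, size=(50, 2))
                   for c in [(0, 0), (4, 0), (0, 4), (4, 4)]])

# Step 1: fit a nearest-centroid model on the seen classes.
clf = NearestCentroid().fit(X_lab, y_lab)
dists = np.min(
    np.linalg.norm(X_unl[:, None, :] - clf.centroids_[None, :, :], axis=-1),
    axis=1)

# Step 2: points far from every seen-class centroid are flagged as novel
# (the 1.5 threshold is a hypothetical hyperparameter) ...
is_novel = dists > 1.5
# ... and grouped into novel classes by clustering.
novel_labels = KMeans(n_clusters=2, n_init=10,
                      random_state=0).fit_predict(X_unl[is_novel])

# Remaining points are assigned to seen classes.
pred = clf.predict(X_unl)
```

In the multi-target extension this project targets, the unlabeled pool would instead come from several datasets whose novel classes may differ, which a single threshold-and-cluster step like the one above cannot handle directly.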

Requirement: We are looking for students with experience in deep learning and with high interest and self-motivation to develop novel machine learning methods. Applicants should be experienced with Python and the PyTorch deep learning framework.

Level: Master

Contact: Shuo Wen (shuo.wen@epfl.ch)

  • Project 2: Multi-modal single-cell foundation model

Keywords: foundation model, multi-modal, transformer, single-cell biology

Generative pre-trained models have recently achieved unprecedented success in the language and vision domains. These foundation models follow a similar paradigm: pretraining transformers on large-scale datasets to achieve state-of-the-art performance on a multitude of downstream tasks. In the field of single-cell biology, deep learning has been extensively studied and applied to tasks including cell type annotation, batch effect removal, cell trajectory inference, and perturbation response modeling. Pre-training a foundation model that achieves a general understanding of single-cell biology, however, has only recently begun to be explored. Existing cellular foundation models struggle to generalize in the zero-shot setting and rarely outperform specialized models trained on smaller datasets. Furthermore, despite the multi-modal nature of cellular biology, existing models are trained only on a single modality of transcriptomics data. Our goal is to train a multi-modal foundation model for cellular biology on a large-scale dataset that spans multiple species, physiological states, anatomical systems, and developmental stages. Such a model would serve as a robust foundation that drives the discovery of new cell types and states, aids prediction of perturbation responses to unseen compounds or genetic interventions, and contributes to a stronger understanding of cell biology.

Requirement: We are looking for highly motivated students with experience in deep learning to develop novel machine learning methods, curate large-scale training corpora, and build foundation models for the advancement of biomedicine. A background in biology is preferred but not required.

Level: Master/Master thesis

Contact: Shawn Fan (shuyang.fan@epfl.ch)

  • Project 3: Vision-Language Model based Open-World Semi-Supervised Learning

Keywords: open-world semi-supervised learning, vision-language model

In many real-world applications, unlabeled test data contains new classes that have not been encountered in the labeled training data. Given an unlabeled dataset, the goal of open-world semi-supervised learning is to annotate samples from seen classes and discover novel classes that do not appear in the labeled dataset (reference set). However, most current methods rely only on vision models. The goal of this project is to propose a novel method based on a vision-language model, which should perform open-world semi-supervised learning better by utilizing the language component.
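For intuition on the vision-language starting point: CLIP-style models classify images zero-shot by comparing an image embedding against embeddings of class-name prompts in a shared space. The sketch below shows that mechanism with synthetic vectors standing in for real encoder outputs; the temperature value is illustrative:

```python
import numpy as np

def zero_shot_classify(image_embs, text_embs, temperature=0.07):
    """CLIP-style zero-shot classification: cosine similarity between
    L2-normalized image and class-prompt embeddings, softmaxed over classes."""
    img = image_embs / np.linalg.norm(image_embs, axis=-1, keepdims=True)
    txt = text_embs / np.linalg.norm(text_embs, axis=-1, keepdims=True)
    logits = img @ txt.T / temperature
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    return probs.argmax(axis=-1), probs

# Synthetic stand-ins: 3 class-prompt embeddings, and 2 image embeddings
# constructed to lie close to classes 2 and 0 respectively.
rng = np.random.default_rng(0)
text_embs = rng.normal(size=(3, 8))
image_embs = np.vstack([text_embs[2] + 0.1 * rng.normal(size=8),
                        text_embs[0] + 0.1 * rng.normal(size=8)])
labels, probs = zero_shot_classify(image_embs, text_embs)
```

In the open-world setting of this project, such prompt-based scores are only available for the seen classes; how to exploit the language side when discovering classes that have no known name is exactly the open question.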

Requirement: We are looking for students with experience in deep learning and with high interest and self-motivation to develop novel machine learning methods. Basic knowledge of vision-language models (e.g., CLIP) is a plus but not required. Applicants should be experienced with Python and the PyTorch deep learning framework.

Level: Master

Contact: Shuo Wen (shuo.wen@epfl.ch)

  • Project 4: Multi-task zero-shot transfer via diffusion models

Keywords: zero-shot transfer, multi-task, diffusion models, visual in-context learning

Zero-shot transfer refers to the ability of a model to perform a previously unseen task given only task-specific instructions, without any task-specific training. For example, Large Language Models (LLMs) have enabled this ability in the NLP domain via so-called in-context learning, i.e., providing a natural language description of the task to perform, e.g., “Please translate the following sentence from English to French: {SENTENCE}”. Recently, in the vision domain, large text-to-image diffusion models were shown to perform effective zero-shot classification, even though they were not trained for such discriminative tasks. However, repurposing diffusion models for other visual tasks such as segmentation and object detection still requires task-specific fine-tuning of the pre-trained diffusion model. It therefore remains unclear whether a single vision model can perform a broad set of visual tasks without any additional task-specific fine-tuning, mirroring the capabilities of LLMs in the NLP domain. Our goal is to explore recent advances in diffusion models and propose a method capable of such multi-task zero-shot transfer to a variety of visual tasks.

Requirement: We are looking for students with experience in deep learning, a strong math background, and high self-motivation to develop novel machine learning methods.

Level: Master/Master thesis

Contact: Artem Gadetskii (artem.gadetskii@epfl.ch)

  • Project 5: Unsupervised Transfer from Foundation Models

Keywords: unsupervised transfer, foundation models, clustering, transfer clustering

Transfer learning is a fundamental machine learning paradigm that leverages large-scale pre-training of deep neural networks to improve the performance of a model on a downstream task with limited resources. However, the prevalent approaches, including (i) fully supervised transfer, (ii) few-shot transfer, and (iii) zero-shot transfer, still require at least minimal supervision to solve a new task. Recently, our lab has developed an approach that leverages the representation spaces produced by modern foundation models to perform fully unsupervised downstream transfer. The goal of this project is to develop a Python library that makes this approach practical and convenient for the broader community. Furthermore, the project includes a deeper study of the approach in biomedical applications, including but not limited to large-scale benchmarking, comparison against alternative methods, and extensions better suited to the bio domain (e.g., tackling highly imbalanced data).
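To see the general idea behind unsupervised downstream transfer (independently of the lab’s specific method), one can embed data with a pre-trained encoder, cluster the embedding space without any labels, and use label matching only for evaluation. The sketch below simulates the encoder output with synthetic well-separated embeddings; the dimensions, cluster count, and Hungarian-matching evaluation are illustrative:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans

# Synthetic stand-in for a foundation model's representation space:
# 3 well-separated groups in 16-d, as pre-trained encoders often produce
# for semantically distinct inputs.
rng = np.random.default_rng(0)
centers = rng.normal(scale=5.0, size=(3, 16))
embeddings = np.vstack([c + rng.normal(size=(40, 16)) for c in centers])
true_labels = np.repeat(np.arange(3), 40)

# Fully unsupervised transfer step: cluster the embedding space directly.
cluster_ids = KMeans(n_clusters=3, n_init=10,
                     random_state=0).fit_predict(embeddings)

# Evaluation only: align arbitrary cluster ids with ground-truth classes
# via Hungarian matching on the confusion matrix, then compute accuracy.
cm = np.zeros((3, 3), dtype=int)
for t, c in zip(true_labels, cluster_ids):
    cm[t, c] += 1
rows, cols = linear_sum_assignment(-cm)  # maximize matched counts
accuracy = cm[rows, cols].sum() / len(true_labels)
```

A library for this workflow would wrap the embedding and clustering steps behind a small API; the biomedical extensions mentioned above would replace the plain k-means step with methods robust to highly imbalanced class sizes.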

Requirement: We are looking for students with a strong software engineering background, experience in implementing deep learning algorithms, and high self-motivation.

Level: Master/Master thesis

Contact: Artem Gadetskii (artem.gadetskii@epfl.ch)