Nikolaos Dimitriadis

PhD student

EPFL

Hello! I am Nikos

I am currently a student researcher at Google DeepMind, working on continual learning and model merging with foundation models. I completed my PhD in Machine Learning at École Polytechnique Fédérale de Lausanne (EPFL), where I was fortunate to be advised by François Fleuret and Pascal Frossard. My research interests revolve around model merging, multi-task learning, and continual learning. From December 2023 to December 2024, I interned at Google DeepMind in Zurich, where I worked on text-to-image generation.

Before coming to Switzerland, I completed my undergraduate studies in Electrical and Computer Engineering at the National Technical University of Athens in Greece. I wrote my thesis under the supervision of Petros Maragos, focusing on the intersection of tropical geometry and machine learning.

Interests
  • Model Merging
  • Multi-Task Learning
  • Continual Learning
Education
  • PhD in Computer Science, 2025
    École Polytechnique Fédérale de Lausanne
  • MEng in Electrical Engineering and Computer Science, 2020
    National Technical University of Athens

Publications

(2025). MEMOIR: Lifelong Model Editing with Minimal Overwrite and Informed Retention for LLMs. Advances in Neural Information Processing Systems (NeurIPS).

(2025). LiNeS: Post-training layer scaling prevents forgetting and enhances model merging. International Conference on Learning Representations (ICLR).

(2025). Pareto Low-Rank Adapters: Efficient Multi-Task Learning with Preferences. International Conference on Learning Representations (ICLR).

(2025). Single-Input Multi-Output Model Merging: Leveraging Foundation Models for Dense Multi-Task Learning. arXiv.

(2024). Localizing Task Information for Improved Model Merging and Compression. International Conference on Machine Learning (ICML).

(2023). Benefits of Max Pooling in Neural Networks: Theoretical and Experimental Evidence. Transactions on Machine Learning Research.

(2023). Pareto Manifold Learning: Tackling Multiple Tasks via Ensembles of Single-Task Models. International Conference on Machine Learning (ICML).

(2023). SequeL: A Continual Learning Library in PyTorch and JAX. CVPR Workshop on Continual Learning 2023 (non-archival track).

(2023). Flexible Channel Dimensions for Differentiable Architecture Search. arXiv (Preprint).

(2022). U-Boost NAS: Utilization-boosted Differentiable Neural Architecture Search. European Conference on Computer Vision (ECCV).

(2021). Advances in Morphological Neural Networks: Training, Pruning and Enforcing Shape Constraints. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

Contact

dimitriadisnikolaos0[at]gmail.com