Nikolaos Dimitriadis

PhD student

EPFL

Hello! I am Nikos

I am a PhD student in Computer Science at École Polytechnique Fédérale de Lausanne (EPFL), where I am advised by François Fleuret and Pascal Frossard. My research interests revolve around model merging, multi-task learning, and continual learning. From December 2023 to December 2024, I interned at Google DeepMind in Zurich (full-time for three months and part-time for the rest of the year), where I worked on text-to-image generation.

Before coming to Switzerland, I completed my undergraduate studies in Electrical and Computer Engineering at the National Technical University of Athens in Greece. I conducted my thesis (available here in Greek) at the CVSP lab under the supervision of Petros Maragos. The thesis focused on using tropical geometry to analyze Morphological Neural Networks: studying the sparsity of their representations compared to their linear counterparts, their ability to enforce shape constraints such as monotonicity, and extending a training algorithm based on Difference-of-Convex Programming to multiclass problems.
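For context, a morphological layer replaces the sum-of-products of a linear layer with tropical (max-plus) algebra: each output takes the maximum of inputs plus weights rather than a weighted sum. Below is a minimal PyTorch sketch of such a dilation layer; the class name and shapes are illustrative assumptions, not the thesis implementation.

    import torch
    import torch.nn as nn

    class DilationLayer(nn.Module):
        # Max-plus analogue of nn.Linear: y_i = max_j (x_j + w_ij) + b_i.
        # Illustrative sketch only, not the thesis code.
        def __init__(self, in_features: int, out_features: int):
            super().__init__()
            self.weight = nn.Parameter(torch.randn(out_features, in_features))
            self.bias = nn.Parameter(torch.zeros(out_features))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Broadcast x (batch, in) against weight (out, in) to get
            # (batch, out, in), then take the max over inputs instead
            # of a weighted sum.
            s = x.unsqueeze(1) + self.weight.unsqueeze(0)
            return s.max(dim=-1).values + self.bias

Because each output is determined by a single (argmax) input connection, trained morphological layers tend to use far fewer effective weights than their linear counterparts, which is the sparsity property mentioned above.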

I am also an avid classical guitar player! I love playing Baroque and Romantic pieces, such as compositions by Agustín Barrios Mangoré. Check out this beautiful performance!

Download my résumé.

Interests
  • Model Merging
  • Multi-Task Learning
  • Continual Learning
Education
  • PhD in Computer Science

    École Polytechnique Fédérale de Lausanne

  • MEng in Electrical and Computer Engineering, 2020

    National Technical University of Athens

Publications

(2025). LiNeS: Post-training layer scaling prevents forgetting and enhances model merging. International Conference on Learning Representations (ICLR).

(2025). Pareto Low-Rank Adapters: Efficient Multi-Task Learning with Preferences. International Conference on Learning Representations (ICLR).

(2024). Localizing Task Information for Improved Model Merging and Compression. International Conference on Machine Learning (ICML).

(2023). Benefits of Max Pooling in Neural Networks: Theoretical and Experimental Evidence. Transactions on Machine Learning Research.

(2023). Pareto Manifold Learning: Tackling Multiple Tasks via Ensembles of Single-Task Models. International Conference on Machine Learning (ICML).

(2023). SequeL: A Continual Learning Library in PyTorch and JAX. CVPR Workshop on Continual Learning 2023 (non-archival track).

(2023). Flexible Channel Dimensions for Differentiable Architecture Search. arXiv (Preprint).

(2022). U-Boost NAS: Utilization-boosted Differentiable Neural Architecture Search. European Conference on Computer Vision (ECCV).

(2021). Advances in Morphological Neural Networks: Training, Pruning and Enforcing Shape Constraints. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

Contact

nikolaos.dimitriadis[at]epfl.ch