I am an IVADO postdoctoral fellow at Mila - Quebec Artificial Intelligence Institute and DIRO, Université de Montréal. I work closely with Dr. Simon Lacoste-Julien and Dr. Ioannis Mitliagkas. My current research focuses on the theory and applications of convex and non-convex optimization in large-scale machine learning and data science.
I obtained my PhD from the School of Mathematics at The University of Edinburgh. More specifically, I was a member of the Operational Research and Optimization Group (ERGO) under the supervision of Dr. Peter Richtárik. Prior to that, I spent 4 beautiful years in Athens as an undergraduate student in the Department of Mathematics at the National and Kapodistrian University of Athens, and 1 year as a postgraduate student at Imperial College London, where I obtained an MSc in Computing (Computational Management Science).
During the fall of 2018, I was a research intern at Facebook AI Research in Montreal, Canada, where I worked mainly with Dr. Mike Rabbat on topics related to Distributed Non-Convex Optimization Algorithms and Deep Learning.
My research interests include (but are not limited to):
Large-Scale Optimization, Machine Learning, Randomized Numerical Linear Algebra, Convex Analysis, Randomized and Distributed Algorithms.
For more details, please feel free to look at my CV (updated September 2020).
Selected Recent News
(For the full list of news, please check the News tab.)
19 Aug 2020: Our paper "Convergence Analysis of Inexact Randomized Iterative Methods", joint work with Peter Richtárik, was accepted to SIAM Journal on Scientific Computing (SISC).
19 Aug 2020: Our paper "Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods", joint work with Peter Richtárik, was accepted to Computational Optimization and Applications (COAP).
20 Jun 2020: New Paper out: Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization, joint work with Ahmed Khaled, Othmane Sebbouh, Robert M. Gower and Peter Richtárik.
18 Jun 2020: New Paper out: SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation, joint work with Robert M. Gower and Othmane Sebbouh.
01 Jun 2020: Two papers accepted to ICML 2020 (37th International Conference on Machine Learning):
A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
joint work with Anastasia Koloskova, Sadra Boreiri, Martin Jaggi and Sebastian U. Stich.
Stochastic Hamiltonian Gradient Methods for Smooth Games
joint work with Hugo Berard, Alexia Jolicoeur-Martineau, Pascal Vincent, Simon Lacoste-Julien and Ioannis Mitliagkas.
23 Mar 2020: New Paper out: A Unified Theory of Decentralized SGD with Changing Topology and Local Updates, joint work with Anastasia Koloskova, Sadra Boreiri, Martin Jaggi and Sebastian U. Stich.
24 Feb 2020: New Paper out: Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence, joint work with Sharan Vaswani, Issam Laradji and Simon Lacoste-Julien.
19 Dec 2019: I am delighted to be awarded the IVADO Postdoctoral Fellowship.
For more details on the Institute for Data Valorisation (IVADO) and its mission, check out IVADO's website.