I am a research scientist (Ambizione fellow) at the LTS2 lab at EPFL, Switzerland. Previously, I spent time at TU Berlin, Ben-Gurion University, and TU Delft.

My research focuses on the foundations and applications of graph methods in machine learning and data science. I aim to find elegant explanations for phenomena associated with learning and to exploit them in order to design specialized learning machines.

I am also interested in graph problems in signal processing and theoretical computer science, as well as in the analysis of neural networks.

Selected recent papers

N. Karalias, A. Loukas. Erdős Goes Neural: an Unsupervised Learning Framework for Combinatorial Optimization on Graphs. Oral at NeurIPS 2020. (preprint)

A. Loukas. How hard is it to distinguish graphs with graph neural networks? NeurIPS 2020. (preprint)

C. Vignac, A. Loukas, P. Frossard. Building powerful and equivariant graph neural networks with structural message-passing. NeurIPS 2020. (preprint)

A. Loukas. What graph neural networks cannot learn: depth vs width. ICLR 2020. (paper, bibtex, blogpost, 5min-presentation)

J.-B. Cordonnier, A. Loukas, M. Jaggi. On the relationship between self-attention and convolutional layers. ICLR 2020. (paper, bibtex, blogpost, code, interactive website)

J.-B. Cordonnier, A. Loukas. Extrapolating paths with graph neural networks. IJCAI 2019. (preprint, bibtex, blogpost, code)

A. Loukas. Graph reduction with spectral and cut guarantees. JMLR 2019. (paper, bibtex, code, blogpost)

A. Loukas. How close are the eigenvectors and eigenvalues of the sample and actual covariance matrices? ICML 2017. (paper, bibtex, blogpost)

Additional information

List of publications: Please consult Google Scholar.

Contact details: Drop me an email at “firstname.lastname@epfl.ch”.

Social media: Catch me on Twitter, ResearchGate, or LinkedIn.