I am a research scientist on the Core Data Science team at Meta.

My interests include Machine Learning, Algorithms and Optimization. Recently, I have become interested in questions around neural networks: specifically, why do modern over-parametrized models generalize, and what kind of theory do we need to develop to explain this phenomenon?

Before that, I obtained my PhD from Harvard, where I was advised by Yaron Singer. At Harvard, I was part of the EconCS group and a co-founder of the Harvard Machine Learning Theory group.

Prior to that, I received my Bachelor's from the University of Athens, where I was advised by Dimitris Fotakis and worked on Algorithmic Game Theory and Selfish Routing.

I spent the summers of 2019 and 2020 on the Core Data Science team at Facebook, working with Henry C. Lin, Smriti Bhagat and Udi Weinsberg. During the summer of 2018 I interned at Microsoft Research Asia (MSRA) in Beijing, hosted by Wei Chen.

My CV can be found here (last updated: May 2023).

Email: dim.kalimer [at] gmail [dot] com

Professional Experience
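
Research Scientist, Core Data Science team, Meta

Research Intern, Core Data Science team, Facebook (summers of 2019 and 2020)

Research Intern, Microsoft Research Asia (MSRA), Beijing (summer of 2018)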


Publications


Degree Distribution Identifiability of Stochastic Kronecker Multiplication (under submission)

with Daniel Alabi


Preference Amplification in Recommender Systems

with Smriti Bhagat, Shankar Kalyanaraman and Udi Weinsberg

27th International Conference on Knowledge Discovery and Data Mining (KDD) 2021


Robustness from Simple Classifiers

with Sharon Qian, Gal Kaplun and Yaron Singer


CLARA: Confidence of Labels and Raters

with Viet-An Nguyen, Peibei Shi, Jagdish Ramakrishnan, Udi Weinsberg, Henry C. Lin, Steve Metz, Neil Chandra and Jane Jing

26th International Conference on Knowledge Discovery and Data Mining (KDD) 2020


SGD on Neural Networks Learns Functions of Increasing Complexity 

with Preetum Nakkiran, Gal Kaplun, Tristan Yang, Benjamin L. Edelman, Fred Zhang and Boaz Barak

Annual Conference on Neural Information Processing Systems (NeurIPS) 2019

(spotlight presentation)

Preliminary version appeared in the 1st Workshop on Understanding and Improving Generalization in Deep Learning, ICML 2019


Robust Neural Networks are More Interpretable for Genomics

with Peter K. Koo, Sharon Qian, Gal Kaplun and Verena Volf

Preliminary version appeared in the ICML 2019 Workshop on Computational Biology


Robust Influence Maximization for Hyperparametric Models

with Gal Kaplun and Yaron Singer

International Conference on Machine Learning (ICML) 2019


Learning Diffusion using Hyperparameters 

with Yaron Singer, Karthik Subbian and Udi Weinsberg

International Conference on Machine Learning (ICML) 2018


Improving Selfish Routing for Risk-Averse Players

with Dimitris Fotakis and Thanasis Lianeas

Conference on Web and Internet Economics (WINE) 2015


Committees 

ICML 2020, 2021; NeurIPS 2020, 2021

Teaching Experience


Spring '19: AM 221: Advanced Optimization

Institution: Harvard

Instructor: Yaron Singer


Fall '17: CS 134: Networks (teaching excellence award)

Institution: Harvard

Instructor: Yaron Singer


Fall '16: Mathematics for Computer Science, Theory of Computation

Institution: DIT

Instructor: Stavros Kolliopoulos


Spring '16: Algorithmic Game Theory

Institution: DIT

Instructor: Dimitris Fotakis