I am a research scientist on the Core Data Science team at Meta.
My interests include Machine Learning, Algorithms and Optimization. Recently, I became interested in questions around neural networks. Specifically, why do our modern over-parametrized models generalize and what kind of theory do we need to develop to explain this phenomenon?
Before that, I obtained my PhD from Harvard, where I was advised by Yaron Singer. At Harvard, I was part of the EconCS group and a co-founder of the Harvard Machine Learning Theory group.
Prior to that, I received my Bachelor's from the University of Athens, where I was advised by Dimitris Fotakis and worked on Algorithmic Game Theory and Selfish Routing.
I spent the summers of 2019 and 2020 on the Core Data Science team at Facebook, working with Henry C. Lin, Smriti Bhagat and Udi Weinsberg. During the summer of 2018, I interned at Microsoft Research Asia (MSRA) in Beijing, hosted by Wei Chen.
My CV can be found here (last updated: May 2023).
Email: dim.kalimer [at] gmail [dot] com
2021 - present: Research Scientist at Meta, NYC
Summer 2020: Facebook Internship in Core Data Science group, NYC, hosted by Smriti Bhagat and Udi Weinsberg.
Summer 2019: Facebook Internship in Core Data Science group, Menlo Park, hosted by Henry C. Lin and Udi Weinsberg.
Summer 2018: MSR Asia Research Internship, Beijing, hosted by Wei Chen.
Summer 2014: Software Engineering internship at CERN, through CERN's summer student program.
Degree Distribution Identifiability of Stochastic Kronecker Multiplication (under submission)
with Daniel Alabi
Preference Amplification in Recommender Systems
with Smriti Bhagat, Shankar and Udi Weinsberg
27th International Conference on Knowledge Discovery and Data Mining (KDD) 2021
Robustness from Simple Classifiers
with Sharon Qian, Gal Kaplun and Yaron Singer
CLARA: Confidence of Labels and Raters
with Viet-An Nguyen, Peibei Shi, Jagdish Ramakrishnan, Udi Weinsberg, Henry C. Lin, Steve Metz, Neil Chandra, Jane Jing
26th International Conference on Knowledge Discovery and Data Mining (KDD) 2020
SGD on Neural Networks Learns Functions of Increasing Complexity
with Preetum Nakkiran, Gal Kaplun, Tristan Yang, Benjamin L. Edelman, Fred Zhang and Boaz Barak
Annual Conference on Neural Information Processing Systems (NeurIPS) 2019
Preliminary version appeared in the 1st Workshop on Understanding and Improving Generalization in Deep Learning, ICML 2019
Robust Neural Networks are More Interpretable for Genomics
with Peter K. Koo, Sharon Qian, Gal Kaplun and Verena Volf
Preliminary version appeared in the ICML 2019 Workshop on Computational Biology
Robust Influence Maximization for Hyperparametric Models
with Gal Kaplun and Yaron Singer
International Conference on Machine Learning (ICML) 2019
Learning Diffusion using Hyperparameters
with Yaron Singer, Karthik Subbian and Udi Weinsberg
International Conference on Machine Learning (ICML) 2018
Improving Selfish Routing for Risk-Averse Players
with Dimitris Fotakis and Thanasis Lianeas
Web and Internet Economics (WINE) 2015
ICML 2020, 2021; NeurIPS 2020, 2021
Spring '19: AM 221: Advanced Optimization
Instructor: Yaron Singer
Fall '17: CS 134: Networks (teaching excellence award)
Instructor: Yaron Singer
Spring '16: Algorithmic Game Theory
Instructor: Dimitris Fotakis
Fall '16: Mathematics for Computer Science, Theory of Computation
Instructor: Stavros Kolliopoulos