About me

I am a postdoctoral fellow in the Department of Computer Science at the University of Toronto, hosted by Aleksandar Nikolov and Nicolas Papernot. I am also a postdoctoral affiliate of the Vector Institute. Previously, I obtained my PhD from The Ohio State University, where I was fortunate to be advised by Raef Bassily. My research focuses on characterizing the limits of differentially private optimization and machine learning. My other research interests include the role of stability in optimization and learning, gradient oracle complexity lower bounds for first-order optimization methods, and bridging the gap between machine learning theory and practice.

Research

Private Algorithms for Stochastic Saddle Points and Variational Inequalities: Beyond Euclidean Geometry

Raef Bassily, Cristóbal Guzmán, Michael Menart
NeurIPS 2024

Public-data Assisted Private Stochastic Optimization: Power and Limitations

Enayat Ullah, Michael Menart, Raef Bassily, Cristóbal Guzmán, Raman Arora
NeurIPS 2024

Differentially Private Non-Convex Optimization under the KL Condition with Optimal Rates

Michael Menart, Enayat Ullah, Raman Arora, Raef Bassily, Cristóbal Guzmán
ALT 2024

Differentially Private Algorithms for the Stochastic Saddle Point Problem with Optimal Rates for the Strong Gap

Raef Bassily, Cristóbal Guzmán, Michael Menart
COLT 2023

Faster Rates of Convergence to Stationary Points in Differentially Private Optimization

Raman Arora, Raef Bassily, Tomás González, Cristóbal Guzmán, Michael Menart, Enayat Ullah
ICML 2023

Differentially Private Generalized Linear Models Revisited

Raman Arora, Raef Bassily, Cristóbal Guzmán, Michael Menart, Enayat Ullah
NeurIPS 2022

Differentially Private Stochastic Optimization: New Results in Convex and Non-Convex Settings

Raef Bassily, Cristóbal Guzmán, Michael Menart
NeurIPS 2021