Machine Learning/AI Research

See also: Applied Mathematics Research

Publications

2025

2024

2023

2022

  1. On the Generalization of Representations in Reinforcement Learning. Charline Le Lan, Stephen Tu, Adam Oberman, Rishabh Agarwal, Marc G. Bellemare: AISTATS 2022: 4132-4157 arxiv
  2. FairCal: Fairness Calibration for Face Verification. Tiago Salvador, Stephanie Cairns, Vikram Voleti, Noah Marshall, Adam M. Oberman. ICLR 2022 arxiv
  3. EuclidNets: Combining Hardware and Architecture Design for Efficient Training and Inference. Mariana Oliveira Prazeres, Xinlin Li, Adam M. Oberman, Vahid Partovi Nia. ICPRAM 2022: 141-151 https://arxiv.org/abs/2212.11803
  4. A Reproducible and Realistic Evaluation of Partial Domain Adaptation Methods. Tiago Salvador, Kilian Fatras, Ioannis Mitliagkas, and Adam Oberman. NeurIPS Distribution Shifts Workshop, 2022 https://arxiv.org/abs/2210.01210
  5. ImageNet-Cartoon and ImageNet-Drawing: two domain shift datasets for ImageNet. Tiago Salvador and Adam M. Oberman. ICML Shift Happens Workshop, 2022
  6. Score-based Denoising Diffusion with Non-Isotropic Gaussian Noise Models. Vikram Voleti, Christopher Pal, Adam Oberman. NeurIPS 2022 Workshop on Score-Based Methods https://arxiv.org/abs/2210.12254
  7. Improving Continuous Normalizing Flows using a Multi-Resolution Framework. Vikram Voleti, Chris Finlay, Adam M. Oberman, Christopher Pal. ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models

2021

  1. Stochastic Gradient Descent with Polyak’s Learning Rate. Mariana Prazeres, Adam Oberman. Journal of Scientific Computing 89.1 (2021): 1-16. arxiv
  2. Learning normalizing flows from Entropy-Kantorovich potentials. Chris Finlay, Augusto Gerolin, Adam M. Oberman, Aram-Alexandre Pooladian. INNF+ 2021 workshop. arxiv
  3. (Journal) Scaleable input gradient regularization for adversarial robustness Chris Finlay, Adam Oberman, Machine Learning with Applications (MLWA), 3 2021 arxiv github
  4. “Methods and systems for computing an output of a neural network layer”. US Patent application 17/317,833 (filed May 11, 2021)

2020

  1. (Journal and Conference) Nesterov’s method with decreasing learning rate leads to accelerated stochastic gradient descent. Maxime Laborde, Adam Oberman. arxiv (Older version: A Lyapunov analysis for accelerated gradient method: from deterministic to stochastic case, Maxime Laborde, Adam Oberman, AISTATS 2020, Proceedings of Machine Learning Research)
  2. (Conference) Calibrated Top-1 Uncertainty Estimates for classification by score based models. Adam Oberman, Chris Finlay, Alexander Iannantuono, Tiago Salvador. arxiv. Spotlight talk at the UDL ICML Workshop (140 accepted papers, 6 contributed talks, 8 spotlight talks)
  3. (Conference) How to train your neural ODE: the world of Jacobian and kinetic regularization. Chris Finlay, Jörn-Henrik Jacobsen, Levon Nurbekyan, Adam Oberman. ICML 2020 arxiv
  4. (Conference) A principled approach for generating adversarial images under non-smooth dissimilarity metrics Aram-Alexandre Pooladian, Chris Finlay, Tim Hoheisel, Adam Oberman AISTATS 2020 (Proceedings of Machine Learning Research) arxiv
  5. (Journal and Conference) No collision transportation maps. Levon Nurbekyan, Alexander Iannantuono, Adam M. Oberman. NeurIPS 2019 OTML Workshop; Journal of Scientific Computing 2020 arxiv
  6. (Journal) A regularization interpretation of the proximal point method for weakly convex functions Tim Hoheisel, Maxime Laborde, Adam Oberman Journal of Dynamics and Games 2020
  7. (Conference) Partial differential equation regularization for supervised machine learning. Adam Oberman. 75 Years of Mathematics of Computation Symposium (2020), AMS Contemporary Math arxiv

2019, 2018

  1. (Conference) The LogBarrier adversarial attack: making effective use of decision boundary information. Chris Finlay, Aram-Alexandre Pooladian, Adam Oberman. ICCV 2019 arxiv
  2. Parle: parallelizing stochastic gradient descent Pratik Chaudhari, Carlo Baldassi, Riccardo Zecchina, Stefano Soatto, Ameet Talwalkar, Adam Oberman SysML 2018 arxiv
  3. Stochastic Backward Euler: An Implicit Gradient Descent Algorithm for k-means Clustering Penghang Yin, Minh Pham, Adam Oberman, Stanley Osher; Journal of Scientific Computing 2018 arxiv
  4. Deep Relaxation: partial differential equations for optimizing deep neural networks. Pratik Chaudhari, Adam M. Oberman, Stanley Osher, Stefano Soatto, Guillaume Carlier; Research in the Mathematical Sciences 2018 arxiv
  5. Approximate Homogenization of fully nonlinear elliptic PDEs Chris Finlay and Adam M. Oberman; Journal of Scientific Computing 2018, arxiv
  6. Approximate Homogenization of convex nonlinear elliptic PDEs Chris Finlay and Adam M. Oberman; Comm Math Sci 2018, arxiv

Presentations