About me

I am a research engineer at Meta AI (FAIR), working on AI-guided design, federated learning, and privacy in machine learning. My research interests include federated learning, training efficiency of large-scale generative AI systems, and privacy-preserving and robust machine learning. I have served on the organizing committee for FL-ICML, on the program committee for FL-NeurIPS, and as a reviewer for ICLR, ICML, NeurIPS, AISTATS, and MLSys.

Before Meta, I graduated cum laude from UC Davis with a double major in Statistics and Computer Science (2018), followed by an M.S. in Computer Science (2019). At UC Davis, I worked with Prem Devanbu and Vincent Hellendoorn on empirical software engineering.

Publications (see all)

2024

Now It Sounds Like You: Learning Personalized Vocabulary On Device

  • Sid Wang, Ashish Shenoy, Pierce Chuang, John Nguyen
  • AAAI 2024 Spring Symposium

2023

READ: Recurrent Adaptation of Large Transformers

  • John Nguyen*, Sid Wang*, Ke Li, Carole-Jean Wu. *Equal contribution.
  • R0-FoMo: Robustness of Few-shot and Zero-shot Learning in Foundation Models Workshop, NeurIPS 2023

On Noisy Evaluation in Federated Hyperparameter Tuning

  • Kevin Kuo, Pratiksha Thaker, Mikhail Khodak, John Nguyen, Daniel Jiang, Ameet Talwalkar, Virginia Smith
  • Conference on Machine Learning and Systems (MLSys), 2023.

Where to Begin? Exploring the Impact of Pre-Training and Initialization in Federated Learning

  • John Nguyen, Jianyu Wang, Kshitiz Malik, Maziar Sanjabi, Michael Rabbat
  • Spotlight at the International Conference on Learning Representations (ICLR), 2023
  • Presentation

2022

Toward Fair Federated Recommendation Learning: Characterizing the Inter-Dependence of System and Data Heterogeneity

  • Kiwan Maeng, Haiyu Lu, Luca Melis, John Nguyen, Mike Rabbat, Carole-Jean Wu
  • Best Paper Finalist Award at the ACM Conference on Recommender Systems (RecSys), 2022.

Papaya: Practical, Private, and Scalable Federated Learning

  • Dzmitry Huba, John Nguyen, Kshitiz Malik, Ruiyu Zhu, Mike Rabbat, Ashkan Yousefpour, Carole-Jean Wu, Hongyuan Zhan, Pavel Ustinov, Harish Srinivas, Kaikai Wang, Anthony Shoumikhin, Jesik Min, Mani Malek
  • Conference on Machine Learning and Systems (MLSys), 2022.

Federated Learning with Buffered Asynchronous Aggregation

  • John Nguyen, Kshitiz Malik, Hongyuan Zhan, Ashkan Yousefpour, Mike Rabbat, Mani Malek, Dzmitry Huba
  • International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
  • Presentation

2021

Opacus: User-Friendly Differential Privacy Library in PyTorch

  • Ashkan Yousefpour*, Igor Shilov*, Alexandre Sablayrolles*, Davide Testuggine, Karthik Prasad, Mani Malek, John Nguyen, Sayan Ghosh, Akash Bharadwaj, Jessica Zhao, Graham Cormode, Ilya Mironov. *Equal contribution.
  • Privacy in Machine Learning (PriML) workshop, NeurIPS 2021.