Lachlan Ewen MacDonald

Academic Portfolio

Postdoctoral researcher, University of Pennsylvania, Innovation in Data Engineering and Science (IDEAS).

About Me

I am an Australian mathematician, currently working as a postdoctoral researcher with René Vidal at IDEAS, University of Pennsylvania, developing mathematical foundations for deep learning.

My journey through academia to this point has been a winding affair. I did my undergraduate and graduate studies at my local institution, the University of Wollongong. My doctorate, advised by Adam Rennie, was in the noncommutative geometry of foliated manifolds. I held postdoctoral positions in pure mathematics at the Australian National University, with Alan Carey, and the University of Adelaide, with Mathai Varghese, before moving over to machine learning in 2021. I have since held postdoctoral positions focusing on deep learning theory with Simon Lucey at the University of Adelaide, and, since 2023, with René Vidal at Johns Hopkins University and the University of Pennsylvania. My journey would not have been possible without these mentors, and I am grateful to all of them.

My research in the mathematical foundations of deep learning has recently led me back to where I started: differential geometry and dynamical systems; foliations have come back into my life! I am also developing a new theory of disintegrations of probability measures, with applications to statistics in Wasserstein space and to the generalisation problem in machine learning. One application seems particularly promising: bounding the Wasserstein p-distance between a random empirical measure and its parent. Unlike existing approaches, our theory yields bounds that are uniform in p, making them the tightest we know of for large p.
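For readers outside the area, the quantity in question is standard (the definitions below are background only, not a statement of our results): the empirical measure of n i.i.d. samples, and its Wasserstein p-distance to the parent measure.

```latex
% Empirical measure of n i.i.d. samples X_1, \dots, X_n \sim \mu:
\mu_n \;=\; \frac{1}{n} \sum_{i=1}^{n} \delta_{X_i}.
% Wasserstein p-distance between \mu_n and its parent \mu,
% where \Pi(\mu_n, \mu) denotes the set of couplings of the two measures:
W_p(\mu_n, \mu) \;=\; \left( \inf_{\pi \in \Pi(\mu_n, \mu)}
  \int d(x, y)^p \, \mathrm{d}\pi(x, y) \right)^{1/p}.
```

A bound on this distance that holds uniformly in p is stronger than one that degrades as p grows, since W_p is nondecreasing in p.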

I spend my spare time philosophising.

Publications

Published / Accepted
Peer-reviewed journal and conference articles.
Convergence Rates for Gradient Descent at the Edge of Stability in Overparametrised Least Squares first author
with Z. Xu, H. Min, S. Tarmoun, L. Palma, R. Vidal. NeurIPS 2025.
Understanding the Learning Dynamics of LoRA: A Gradient Flow Perspective on Low-Rank Adaptation in Matrix Factorization
with Z. Xu, H. Min, J. Luo, S. Tarmoun, E. Mallada, R. Vidal. AISTATS 2024.
On skip connections and normalisation layers in deep optimisation first author
with J. Valmadre, H. Saratchandran, S. Lucey. NeurIPS 2023.
Curvature-Aware Training for Coordinate Networks
with H. Saratchandran, S. F. Ch'ng, S. Ramasinghe, S. Lucey. ICCV 2023.
How much does Initialization Affect Generalization?
with S. Ramasinghe, M. Farazi, H. Saratchandran, S. Lucey. ICML 2023.
Flow Supervision for Deformable NeRF
with C. Wang, L. A. Jeni, S. Lucey. CVPR 2023.
On Quantizing Implicit Neural Representations
with C. Gordon, S. F. Chng, S. Lucey. WACV 2023.
On the Frequency Bias of Coordinate-MLPs second author
with S. Ramasinghe, S. Lucey. NeurIPS 2022.
Enabling Equivariance for Arbitrary Lie Groups first author
with S. Ramasinghe, S. Lucey. CVPR (oral) 2022.
A characteristic map for the holonomy groupoid of a foliation
Mathematische Zeitschrift 2021.
Hierarchies of holonomy groupoids for foliated bundles
Annals of Global Analysis and Geometry 2021.
The holonomy groupoids of singularly foliated bundles
SIGMA 2021, 17.
On the Chern character in Higher Twisted K-theory and spherical T-duality
with V. Mathai, H. Saratchandran. Communications in Mathematical Physics 2021, 385.
The Godbillon-Vey invariant and equivariant KK-theory
with A. Rennie. Annals of K-theory 2020, 5.
Equivariant KK-theory for non-Hausdorff groupoids
Journal of Geometry and Physics 2020, 154.
Theoretical Consideration of Superconducting Coils for Compact Superconducting Magnetic Energy Storage Systems
with A. Pan, H. Baiej, P. Cooper. IEEE Transactions on Applied Superconductivity 2016, 26.
Submitted / Under review
Disintegration theorem for multifunctions, with applications to empirical Wasserstein distances and average-case statistical bounds
with J. Fill. 2025. arXiv:2507.01236
On the Convergence, Implicit Bias and Edge of Stability of Gradient Descent in Deep Learning
with H. Min, R. Vidal. 2025.
Chern-Weil theory for Haefliger-singular foliations
with B. McMillan. 2021. arXiv:2106.10078

Awards & Honours

Awards
Best Thesis Award
The University of Wollongong, 2019
Awarded for the best PhD thesis in the Faculty of Engineering and Information Sciences.
University Medal
The University of Wollongong, 2015
First in the Faculty of Engineering and Information Sciences.
Undergraduate Awards
The University of Wollongong, 2010–2014
Numerous research scholarships and awards for academic excellence.

Talks & Presentations

Full list
Convergence Rates for Gradient Descent at the Edge of Stability in Overparametrised Least Squares
NeurIPS 2025 — Poster presentation.
On skip connections and normalisation layers in deep optimisation
NeurIPS 2023 — Poster presentation.
Towards a formal theory of deep optimisation
Johns Hopkins University — MINDS Seminar (2022).
On the Frequency Bias of Coordinate MLPs
NeurIPS 2022 — Poster presentation.
Towards a formal theory of deep optimisation
Carnegie Mellon University — VASC Seminar (2022).
General equivariance in deep learning
Lockheed-Martin USA (2022).
Enabling Equivariance for Arbitrary Lie Groups
CVPR 2022 — Oral presentation.
Equivariance in deep learning
Lockheed-Martin Australia — STELarLab presentation (2022).
Chern-Weil theory for singular foliations
University of New South Wales — Pure Math Seminar (2021).
Chern-Weil theory for singular foliations
Global Noncommutative Geometry Seminar (Virtual, 2021).
Chern-Weil theory for singular foliations
University of Adelaide — Differential Geometry Seminar (2021).
Holonomy groupoids via conservation laws
Singular Foliations and Related Structures Workshop (Virtual, 2020).
Conservation laws and the holonomy of foliated manifolds
University of Wollongong — NCG & Operator Algebras Seminar (2020).
Dynamical invariants of foliated manifolds
University of Adelaide — Analysis on Manifolds (2019).
Dynamical invariants of foliated manifolds
University of Wollongong — Geometric Analysis Seminar (2019).
Dynamical invariants of foliated manifolds
Chalmers University of Technology — Analysis & Probability Seminar (2019).
The Godbillon-Vey invariant in equivariant KK-theory
Université Clermont Auvergne — Séminaire et Groupe de travail GAAO (2018).
The Godbillon-Vey invariant in equivariant KK-theory
Université Paris Diderot — Séminaire d'Algèbres d'Opérateurs (2018).
The Godbillon-Vey invariant in equivariant KK-theory
Université Toulouse — ANR SINGSTAR workshop "Index Theory: Interactions and Applications" (2018).

Contact

Email: lemacdonald@protonmail.com