Research projects and open-source contributions
A novel attention mechanism grounded in Riemannian geometry, achieving 10x faster inference on edge devices.
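The specific Riemannian formulation of the project isn't reproduced here; as context, this is a minimal NumPy sketch of the standard scaled dot-product attention that such geometric variants modify (all names and shapes are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Baseline attention: softmax(Q K^T / sqrt(d_k)) V.

    A Riemannian variant would replace the Euclidean dot product
    Q K^T with a metric-aware similarity; this sketch shows only
    the standard baseline.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```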
Continuous-depth networks that solve differential equations, cutting memory usage by 40%.
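A continuous-depth forward pass can be sketched as integrating a learned vector field with a fixed-step Runge-Kutta solver; the memory savings in practice come from adjoint-style backpropagation, which this illustrative NumPy sketch (weights and step count are placeholders) does not include:

```python
import numpy as np

def odeint_rk4(f, h0, t0, t1, steps=20):
    """Integrate dh/dt = f(h, t) from t0 to t1 with classic RK4."""
    h, t = h0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(h, t)
        k2 = f(h + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = f(h + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = f(h + dt * k3, t + dt)
        h = h + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t = t + dt
    return h

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1  # stand-in for trained weights

def dynamics(h, t):
    # A tiny learned vector field defining the continuous "depth".
    return np.tanh(W @ h)

h1 = odeint_rk4(dynamics, np.ones(4), 0.0, 1.0)
```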
Mathematical foundations for understanding neural architectures through functorial semantics.
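The core idea of a functorial semantics can be stated compactly: interpret architectures via a functor $F$ from a category of architecture specifications to a category of (say) smooth maps, so that identity and composition of layers are preserved:

```latex
F(\mathrm{id}_A) = \mathrm{id}_{F(A)},
\qquad
F(g \circ f) = F(g) \circ F(f)
```

These functor laws guarantee that reasoning about a composite architecture reduces to reasoning about its layers; the choice of target category here is illustrative, not taken from the project.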
Deep expertise across mathematical and computational domains.
Open to collaborations on fundamental ML research and ambitious projects.