Selected Works

Research projects and open-source contributions

RESEARCH

Geometric Transformer

Novel attention mechanism using Riemannian geometry for 10x faster inference on edge devices.

PyTorch • CUDA • C++

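As a hedged illustration of what attention under a Riemannian metric might look like, here is a minimal NumPy sketch that scores queries against keys by geodesic distance on the unit hypersphere instead of the usual dot product. The function name, the choice of the sphere as the manifold, and the temperature parameter are all illustrative assumptions, not the project's actual mechanism.

```python
import numpy as np

def sphere_attention(Q, K, V, temp=1.0):
    """Toy attention: score by negative geodesic (arc) distance on the sphere.

    Illustrative sketch only -- queries and keys are projected onto the unit
    hypersphere, then scored by how short the great-circle arc between them is.
    """
    Qn = Q / np.linalg.norm(Q, axis=-1, keepdims=True)   # project onto sphere
    Kn = K / np.linalg.norm(K, axis=-1, keepdims=True)
    cos = np.clip(Qn @ Kn.T, -1.0, 1.0)                  # cosine of angle q-k
    scores = -np.arccos(cos) / temp                      # negative arc length
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                   # softmax over keys
    return w @ V
```

The softmax rows sum to one as in standard attention; only the similarity measure changes.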
FRAMEWORK

Neural ODEs

Continuous-depth networks solving differential equations with 40% less memory usage.

JAX • NumPy • SciPy

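The idea behind continuous-depth networks can be sketched in a few lines: instead of stacking discrete layers, integrate hidden-state dynamics dh/dt = f(h) over time. This NumPy sketch uses a fixed-step Euler solver and a simple tanh vector field for clarity; the function name and dynamics are illustrative assumptions (real implementations use adaptive solvers and the adjoint method, which is where the memory savings come from).

```python
import numpy as np

def neural_ode_forward(h0, W, b, t0=0.0, t1=1.0, steps=100):
    """Continuous-depth forward pass: integrate dh/dt = tanh(h W^T + b).

    Fixed-step Euler for clarity; production code would use an adaptive
    ODE solver and adjoint-based backprop for near-constant memory.
    """
    h = h0.copy()
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * np.tanh(h @ W.T + b)   # one Euler step of the dynamics
    return h
```

Here `steps` plays the role that depth plays in a discrete network, but it is a solver setting, not a parameter count.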
THEORY

Category Theory × DL

Mathematical foundations for understanding neural architectures through functorial semantics.

Haskell • Agda • Type Theory
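"Functorial semantics" can be illustrated with a toy interpreter: an architecture is a symbolic composite of layer names (morphisms in a free category), and a functor F maps it to concrete functions while preserving composition, F(g ∘ f) = F(g) ∘ F(f). The layer table and names below are illustrative assumptions in Python rather than the project's Haskell/Agda formalization.

```python
import numpy as np

# F on morphisms: each layer symbol is interpreted as a concrete map.
LAYERS = {
    "double": lambda x: 2 * x,
    "relu":   lambda x: np.maximum(x, 0),
}

def F(arch):
    """Interpret a list of layer symbols as their left-to-right composite."""
    def run(x):
        for name in arch:
            x = LAYERS[name](x)
        return x
    return run
```

Concatenating architectures corresponds to composing their interpretations, which is exactly the functor law: `F(a + b)(x) == F(b)(F(a)(x))`.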

Core Competencies

Deep expertise across mathematical and computational domains

Mathematics

Differential Geometry 95%
Algebraic Topology 90%
Measure Theory 88%
Optimization 92%

Machine Learning

Transformers 96%
Graph Neural Networks 93%
Reinforcement Learning 87%
Probabilistic Models 91%

Engineering

PyTorch / JAX 94%
CUDA / Triton 89%
Rust 85%
Distributed Systems 88%

Research Impact

12 Publications • 847 Citations • 3 Best Papers

Topological Deep Learning: A New Frontier
NeurIPS 2024 • Outstanding Paper Award

Riemannian Optimization for Large Language Models
ICML 2024 • Oral Presentation

Category-Theoretic Foundations of NAS
ICLR 2024 • Spotlight

Let's Connect

Open to collaborations on fundamental ML research and ambitious projects

email: hello@mathml.ai
arxiv: /author/math_ml
github: @mathxcs