I'm building neural networks that generate provably correct code, and the software infrastructure for training them.
I work on three fronts:
- New architectures: Understanding existing neural networks, and designing new ones that natively consume and produce structured data
- Mathematical foundations: Building the mathematics needed to state precisely what it means to generalise, especially over recursive data structures
- Infrastructure: Building the stack required to train such networks in dependently-typed languages: tensor processing, automatic differentiation, and elaborator integration
Across all three, I use category theory, the mathematics of structure and composition, as the central glue. To read more about my research programme, check out this link. To find my academic work, click here.
And to check out my recent project TensorType -- a framework for type-safe, purely functional, and non-cubical tensor processing -- follow this link.
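
To give a flavour of what type-safe tensor processing means, here is a minimal, illustrative TypeScript sketch (my own example, not TensorType's actual API): tensor shapes are tracked in the type system, so composing tensors along mismatched dimensions fails at compile time rather than at runtime.

```typescript
// Illustrative sketch only: shape-indexed tensors. The names below
// (Shape, Tensor, tensor, matmul) are hypothetical, not a real library.
type Shape = readonly number[];

interface Tensor<S extends Shape> {
  shape: S;
  data: number[];
}

// Smart constructor: checks at runtime that the data fits the shape.
function tensor<S extends Shape>(shape: S, data: number[]): Tensor<S> {
  const size = shape.reduce((acc, d) => acc * d, 1);
  if (data.length !== size) throw new Error("data does not match shape");
  return { shape, data };
}

// Matrix multiply: the shared dimension K must agree *in the types*,
// so a [M, K] tensor only composes with a [K, N] tensor.
function matmul<M extends number, K extends number, N extends number>(
  a: Tensor<readonly [M, K]>,
  b: Tensor<readonly [K, N]>
): Tensor<readonly [M, N]> {
  const [m, k] = a.shape;
  const [, n] = b.shape;
  const out = new Array(m * n).fill(0);
  for (let i = 0; i < m; i++)
    for (let j = 0; j < n; j++)
      for (let p = 0; p < k; p++)
        out[i * n + j] += a.data[i * k + p] * b.data[p * n + j];
  return { shape: [m, n] as const, data: out };
}

const a = tensor([2, 3] as const, [1, 2, 3, 4, 5, 6]);
const b = tensor([3, 2] as const, [1, 0, 0, 1, 1, 1]);
const c = matmul(a, b); // c : Tensor<readonly [2, 2]>
// matmul(a, a) would not typecheck: [2, 3] does not compose with [2, 3].
```

A dependently-typed language can go further than this sketch, computing output shapes from arbitrary value-level arithmetic, but the basic idea, pushing shape errors from runtime to typechecking, is the same.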
