Abstract: Although machine learning researchers have introduced a plethora of useful constructions for learning over Euclidean space, numerous types of data in various applications benefit from, if not necessitate, a non-Euclidean treatment. In this talk I will cover the need for Riemannian geometric constructs to (1) build more principled generalizations of common Euclidean operations used in geometric machine learning models and (2) enable general manifold density learning in contexts that require it. Such contexts include theoretical physics, robotics, and computational biology. I will cover one of my papers that fits into (1) above, namely the ICML 2020 paper “Differentiating through the Fréchet Mean.” I will also cover two of my papers that fit into (2) above, namely the NeurIPS 2020 paper “Neural Manifold Ordinary Differential Equations” and the NeurIPS 2021 paper “Equivariant Manifold Flows.” Finally, I will briefly discuss directions of relevant ongoing work.
Bio: Isay Katsman is a first-year PhD student in applied mathematics at Yale, advised by Prof. Anna Gilbert. His research interests include Riemannian geometry arising in the context of machine learning, with particular emphasis on applications in mathematical physics. Before Yale, Isay obtained his master’s degree in computer science from Cornell University, where he also conducted his undergraduate studies. Throughout his master’s and undergraduate careers, Isay was advised by Prof. Christopher De Sa. His work is supported by an NSF Graduate Research Fellowship.