I’m a PhD candidate in statistics advised by Daniel Roy at the University of Toronto, supported by the Vector Institute and an NSERC Doctoral Canada Graduate Scholarship. I received my BSc in Financial Mathematics from Western University in 2018.

My research area is broadly statistical machine learning, with a focus on theoretical performance guarantees for various sequential prediction settings. Some problems I’m currently thinking about are:

**Adaptivity**: How can we characterize the difficulty of learning from data whose dependence structure goes beyond classical stationarity, and design algorithms that adapt to these notions of difficulty?

**Statistical Complexities**: Existing notions of predictor and algorithm complexity may be vacuous or suboptimal for modern machine learning systems. What notions of complexity yield theoretical guarantees that match empirical performance?

**Domain Adaptation**: Empirical performance has far outpaced theory for the broad problem of learning from data with few or no labels. What are the correct theoretical formalizations of these tasks that would yield sample complexity guarantees comparable to what is observed in practice?

To view my curriculum vitae, click here.
