About Me
I am a postdoctoral researcher at Università degli Studi di Milano and Politecnico di Milano, working with Nicolò Cesa-Bianchi. My primary focus is online learning theory, particularly algorithms that require no hyperparameter tuning and adapt to time-varying objectives on the fly. I am also broadly interested in machine learning theory, reinforcement learning, and bandits.
I earned a PhD in Computing Science in 2024 from the University of Alberta, advised by Ashok Cutkosky (Boston University) and Martha White (University of Alberta). Before that, I earned an MSc in Computing Science (advised by Martha White and Adam White) and a BSc in Computing Science, both from the University of Alberta. I also completed a music program at Grant MacEwan University, where I majored in composition and minored in music technology.
In my free time I enjoy playing guitar, skateboarding, climbing, and wrestling/grappling.
Feel free to get in touch! If I do not get back to you, your message was probably caught by a spam filter; try again from an institutional email address.
Publications
2024
Andrew Jacobsen. “Adapting to Non-stationarity in Online Learning”. PhD thesis. University of Alberta. 2024.
Andrew Jacobsen, Francesco Orabona. “An Equivalence Between Static and Dynamic Regret Minimization”. Neural Information Processing Systems (NeurIPS). 2024.
Andrew Jacobsen, Ashok Cutkosky. “Online Linear Regression in Dynamic Environments via Discounting”. International Conference on Machine Learning (ICML). 2024.
2023
Andrew Jacobsen, Ashok Cutkosky. “Unconstrained Online Learning with Unbounded Losses”. International Conference on Machine Learning (ICML). 2023.
2022
Andrew Jacobsen, Ashok Cutkosky. “Parameter-free Mirror Descent”. Conference on Learning Theory (COLT). 2022.
2021
Matthew McLeod, Chunlok Lo, Matthew Schlegel, Andrew Jacobsen, Raksha Kumaraswamy, Martha White, Adam White. “Continual Auxiliary Task Learning”. Neural Information Processing Systems (NeurIPS). 2021.
Matthew Schlegel, Andrew Jacobsen, Muhammad Zaheer, Andrew Patterson, Adam White, Martha White. “General Value Function Networks”. Journal of Artificial Intelligence Research (JAIR). 2021.
2019
Andrew Jacobsen. “Vector Step-size Adaptation for Continual, Online Prediction”. MSc thesis. University of Alberta. 2019.
Andrew Jacobsen, Matthew Schlegel, Cameron Linke, Thomas Degris, Adam White, Martha White. “Meta-descent for Online, Continual Prediction”. AAAI Conference on Artificial Intelligence (AAAI). 2019.