Online Hyperparameter Optimization for Streaming Neural Networks

Abstract

Neural networks have enjoyed tremendous success in many areas over the last decade. They are also receiving more and more attention in learning from data streams, which is inherently incremental. An incremental setting poses challenges for hyperparameter optimization, which is essential to obtain satisfactory network performance. To overcome this challenge, we introduce Continuously Adaptive Neural networks for Data streams (CAND). For every prediction, CAND chooses the current best network from a pool of candidates by continuously monitoring the performance of all candidate networks. The candidates are trained using different optimizers and hyperparameters. An experimental comparison against three state-of-the-art stream learning methods, over 17 benchmark streaming datasets, confirms the competitive performance of CAND, especially on high-dimensional data. We also investigate two orthogonal heuristics for accelerating CAND, which trade off small amounts of accuracy for significant run-time gains. We observe that training on small mini-batches yields similar accuracy to single-instance fully incremental training, even on evolving data streams.
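The candidate-pool idea in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): several incremental learners, each configured with different hyperparameters, are all trained on every instance; a sliding window of per-candidate correctness supplies the recent-accuracy score used to pick which candidate answers each prediction. The `MajorityClass` toy learner stands in for the neural networks used in CAND.

```python
from collections import deque


class MajorityClass:
    """Toy incremental learner standing in for a candidate network:
    predicts the most frequent label seen so far."""

    def __init__(self):
        self.counts = {}

    def predict(self, x):
        return max(self.counts, key=self.counts.get) if self.counts else 0

    def learn(self, x, y):
        self.counts[y] = self.counts.get(y, 0) + 1


class CandidatePool:
    """Sketch of the pool-selection idea: every prediction is served by
    the candidate with the best accuracy over a recent sliding window,
    while all candidates keep training on the stream."""

    def __init__(self, models, window=100):
        self.models = models
        # one window of correctness flags (True/False) per candidate
        self.hits = [deque(maxlen=window) for _ in models]

    def _score(self, i):
        h = self.hits[i]
        return sum(h) / len(h) if h else 0.0  # recent accuracy

    def predict(self, x):
        best = max(range(len(self.models)), key=self._score)
        return self.models[best].predict(x)

    def learn(self, x, y):
        # prequential (test-then-train): score each candidate on the
        # incoming instance, then let every candidate learn from it
        for i, m in enumerate(self.models):
            self.hits[i].append(m.predict(x) == y)
            m.learn(x, y)
```

In CAND the candidates would differ in optimizer and hyperparameter settings rather than being identical toys, but the selection loop has the same shape.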

Date
Sep 11, 2022 11:00 AM — 12:00 PM
Event
Cardiff University - Machine Learning Seminar, 2022
Nuwan Gunasekara
AI Researcher

My research interests include Stream Learning and Online Continual Learning.