Most machine learning problems change over time, but most machine learning methods do not. The real world is a constantly evolving data source, and any ML model deployed for more than a week or so will face dataset drift. The predominant approach, however, is simply to retrain the model on a fixed schedule. In contrast, even the simplest biological organisms learn continuously. This talk will describe some recent progress in continual learning, including the use of generative models to pre-empt the kinds of changes the real world might inflict.
This event is part of the AI at Melbourne Colloquium Series, a program of talks on the future of Artificial Intelligence at The University of Melbourne.