"Non negative matrix factorization for transfer learning" by Ievgen Redko
Thursday, December 17, 2015
at 2:00 pm
room F021a,
Building F,
Laboratoire Hubert Curien,
18 Rue Professeur Benoît Lauras,
42000 Saint-Etienne
Seminar by Ievgen Redko, from LIPN
The ability of human beings to extrapolate previously acquired knowledge to new domains has inspired a family of machine learning methods called transfer learning. Transfer learning often rests on the assumption that objects in the source and target domains share a common feature and/or data space; when this assumption does not hold, most transfer learning algorithms are likely to fail. In this work, we investigate the problem of transfer learning from both theoretical and practical points of view.
First, we introduce a theoretical framework based on Hilbert-Schmidt embeddings that improves on the current state-of-the-art theoretical results on transfer learning by introducing a natural and intuitive distance measure between distributions, together with strong computational guarantees for its estimation.
The proposed results combine the tightness of data-dependent bounds derived from Rademacher learning theory with an efficient estimation of their key factors.
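The abstract does not name the distance measure, but Hilbert-Schmidt embeddings of probability distributions are commonly compared through the Maximum Mean Discrepancy (MMD), whose empirical estimate can be computed directly from samples. The sketch below illustrates that kind of quantity only; the Gaussian kernel, the bandwidth, and all function names are illustrative assumptions, not the talk's exact construction.

# Illustrative sketch: biased empirical estimate of the squared MMD
# between a source sample and a target sample (assumed setup, not the
# talk's exact measure).
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

def mmd2(X_source, X_target, gamma=1.0):
    """Biased empirical estimate of the squared MMD between two samples."""
    k_ss = rbf_kernel(X_source, X_source, gamma).mean()
    k_tt = rbf_kernel(X_target, X_target, gamma).mean()
    k_st = rbf_kernel(X_source, X_target, gamma).mean()
    return k_ss + k_tt - 2.0 * k_st

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(100, 5))   # source sample
Xt = rng.normal(0.5, 1.0, size=(100, 5))   # shifted target sample
print(mmd2(Xs, Xt))  # larger values indicate more dissimilar distributions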
We also present two methods for solving the problem of unsupervised transfer learning based on non-negative matrix factorization techniques.
The first is a linear approach that seeks an embedding of the two tasks that decreases the distance between the corresponding probability distributions while preserving non-negativity. The second uses an iterative optimization procedure that aligns the kernel matrices computed from the data of the two tasks.
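As a rough illustration of the first, linear approach, the sketch below factorizes the stacked source and target matrices with plain NMF (the classic Lee-Seung multiplicative updates for the Frobenius loss), so that both tasks are forced to share one non-negative basis. The joint stacking, the rank, and the distance proxy at the end are illustrative assumptions, not the algorithm presented in the talk.

# Illustrative sketch: a shared non-negative embedding via joint NMF
# (assumed setup, not the talk's exact method).
import numpy as np

def nmf(X, rank, n_iter=200, eps=1e-9):
    """Plain NMF X ~= W @ H via multiplicative updates (Frobenius loss)."""
    rng = np.random.default_rng(0)
    n, d = X.shape
    W = rng.random((n, rank))
    H = rng.random((rank, d))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Stack the (non-negative) source and target matrices so they share the
# same basis H; the rows of W then form a common embedding of both tasks.
Xs = np.abs(np.random.default_rng(1).normal(1.0, 0.3, size=(80, 20)))
Xt = np.abs(np.random.default_rng(2).normal(1.2, 0.3, size=(60, 20)))
W, H = nmf(np.vstack([Xs, Xt]), rank=5)
Ws, Wt = W[:80], W[80:]
# A crude proxy for the distance between the two distributions in the
# shared embedding: the gap between their empirical means.
print(np.linalg.norm(Ws.mean(axis=0) - Wt.mean(axis=0)))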
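For the second approach, the quantity being iterated on is an alignment between kernel (Gram) matrices. A standard way to measure such alignment is centered kernel alignment (Cristianini et al.; Cortes et al.); the sketch below computes that score for two linear kernels, assuming equal-size samples purely for illustration. The iterative procedure of the talk would adjust the factorization to increase a score of this kind.

# Illustrative sketch: centered kernel alignment between two Gram
# matrices (a standard score; the talk's optimization is not shown here).
import numpy as np

def center(K):
    """Double-center a kernel matrix: H @ K @ H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_alignment(K1, K2):
    """Centered kernel alignment in [-1, 1]; 1 means perfectly aligned."""
    K1c, K2c = center(K1), center(K2)
    return np.sum(K1c * K2c) / (np.linalg.norm(K1c) * np.linalg.norm(K2c))

rng = np.random.default_rng(3)
Xs = rng.normal(size=(50, 10))                   # source sample
Xt = Xs + rng.normal(scale=0.1, size=(50, 10))   # mildly perturbed target
Ks, Kt = Xs @ Xs.T, Xt @ Xt.T                    # linear kernel matrices
print(kernel_alignment(Ks, Kt))                  # close to 1 for similar tasks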