Thus far, kernel methods have mainly been applied in settings where the observations, or instances, are vectors. We lift kernel methods to the matrix domain, where the instances are outer products of two vectors. Matrix parameters can model all pairwise interactions between components and therefore take second-order information into account. We discovered that in the matrix setting a much larger class of algorithms, based on any spectrally invariant regularization, can be kernelized. We therefore believe that the impact of kernelization will be even greater in the matrix setting. In particular, we will show how to kernelize the matrix versions of the multiplicative updates, a family motivated by using the quantum relative entropy as the regularization. Most importantly, we will use methods from on-line learning to prove generalization bounds for the multiplicative updates that grow only logarithmically in the feature dimension. This is important because it lets us use high-dimensional feature spaces.
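For concreteness, the multiplicative update with the quantum relative entropy as regularizer is commonly written as a matrix exponentiated gradient step. The sketch below shows one such step in Python, assuming the parameter is maintained as a density matrix (symmetric positive definite with unit trace); the function name and learning rate are illustrative, not part of the proposal.

```python
import numpy as np
from scipy.linalg import expm, logm

def matrix_eg_step(W, grad, eta=0.1):
    """One matrix exponentiated gradient step (a minimal sketch).

    W    : current parameter, symmetric positive definite, trace 1.
    grad : symmetric gradient of the loss at W.
    eta  : learning rate (illustrative value).

    W <- exp(log W - eta * grad) / Z minimizes
    eta * <grad, V> + quantum_relative_entropy(V, W)
    over density matrices V, which is the sense in which the update
    is "motivated by the quantum relative entropy".
    """
    M = logm(W) - eta * grad
    M = (M + M.T) / 2               # symmetrize against numerical drift
    W_new = expm(M)
    return W_new / np.trace(W_new)  # renormalize to trace 1
```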

We will apply our methods to collaborative filtering. Here an instance is defined by two vectors, one describing a user and the other describing an object. The outer product of each such pair of vectors becomes the input instance to the learning algorithm. The multiplicative updates are ideally suited to learn well when a low-rank matrix can accurately explain the preference labels of the instances. The kernel method greatly enhances the applicability of this approach because we can now expand the user and object vectors into high-dimensional feature vectors and still obtain efficient algorithms.
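A minimal sketch of how such outer-product instances enter a linear matrix model (names and dimensions are illustrative): the prediction tr(W^T (u x^T)) = u^T W x weighs every user-component/object-component interaction, and a low-rank W suffices when a few latent factors explain the labels.

```python
import numpy as np

def predict(W, u, x):
    """Predict a preference label from a user vector u and an object
    vector x (a minimal sketch; all names are illustrative).

    The instance is the outer product X = u x^T; a linear matrix model
    predicts tr(W^T X) = u^T W x, so W captures every pairwise
    interaction between user and object components.
    """
    X = np.outer(u, x)            # the matrix instance
    return np.trace(W.T @ X)      # equals u @ W @ x

# Usage: a rank-1 W explains labels of the form (u . a)(b . x).
u, x = np.random.randn(5), np.random.randn(4)
W = np.outer(np.random.randn(5), np.random.randn(4))  # low-rank model
print(np.isclose(predict(W, u, x), u @ W @ x))        # True
```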

Budget Start: 2009-09-01
Budget End: 2014-08-31
Fiscal Year: 2009
Total Cost: $455,000
Name: University of California Santa Cruz
City: Santa Cruz
State: CA
Country: United States
Zip Code: 95064