Date: November 24, 2010
Speaker: Dr. Masashi Sugiyama, Tokyo Institute of Technology, Japan
Title: Density Ratio Estimation: A New Versatile Tool for Machine Learning
Abstract:
Recently, we have developed a new machine learning framework that allows us to
systematically avoid density estimation. The key idea is to directly
estimate the ratio of density functions rather than the densities themselves
(see the sketch following this abstract). Our framework covers various
machine learning tasks such as importance sampling (e.g., covariate shift
adaptation, transfer learning, multitask learning), divergence estimation
(e.g., two-sample tests, outlier detection, change detection in time series),
mutual information estimation (e.g., independence tests, independent component
analysis, feature selection, sufficient dimension reduction, causal inference),
and conditional probability estimation (e.g., probabilistic classification,
conditional density estimation).
In this talk, we introduce the density ratio framework, review methods
of density ratio estimation, and present various real-world applications
including brain-computer interfaces, speech recognition, image
recognition, and robot control.
http://sugiyama-www.cs.titech.ac.jp/~sugi/publications.html
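For illustration, here is a minimal Python sketch of the direct-estimation idea in the spirit of the least-squares approach cited in the references: the ratio p_nu(x)/p_de(x) is modeled as a linear combination of Gaussian kernels and the coefficients are obtained in closed form, without estimating either density. The function names, the fixed kernel width and regularization parameter (no cross-validation), and the synthetic Gaussian data are illustrative assumptions, not the speaker's implementation.

import numpy as np

def gaussian_kernel(X, C, sigma):
    # Kernel matrix K[i, l] = exp(-||X[i] - C[l]||^2 / (2 * sigma^2)).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(C**2, axis=1)[None, :] - 2.0 * X @ C.T
    return np.exp(-sq / (2.0 * sigma**2))

def fit_density_ratio(x_nu, x_de, sigma=0.7, lam=0.05, n_centers=100, seed=0):
    # Model r(x) = sum_l alpha_l * K(x, c_l), with centers taken from numerator samples.
    # Minimizing the squared error to the true ratio under p_de reduces to
    # solving the linear system (H + lam * I) alpha = h.
    rng = np.random.default_rng(seed)
    centers = x_nu[rng.choice(len(x_nu), size=min(n_centers, len(x_nu)), replace=False)]
    K_de = gaussian_kernel(x_de, centers, sigma)   # basis evaluated on denominator samples
    K_nu = gaussian_kernel(x_nu, centers, sigma)   # basis evaluated on numerator samples
    H = K_de.T @ K_de / len(x_de)                  # empirical second-moment matrix under p_de
    h = K_nu.mean(axis=0)                          # empirical mean vector under p_nu
    alpha = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: np.maximum(gaussian_kernel(x, centers, sigma) @ alpha, 0.0)

# Toy check (assumed data): numerator p_nu = N(0, 1), denominator p_de = N(0.5, 1.2^2).
x_nu = np.random.default_rng(1).normal(0.0, 1.0, size=(500, 1))
x_de = np.random.default_rng(2).normal(0.5, 1.2, size=(500, 1))
r_hat = fit_density_ratio(x_nu, x_de)
print(r_hat(np.array([[0.0], [1.0]])))             # estimated p_nu(x)/p_de(x) at x = 0 and x = 1

The estimated ratio can then serve, for example, as importance weights for covariate shift adaptation or as a building block for divergence and mutual information estimators, as discussed in the talk.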
References:
[Review of the density ratio framework of machine learning] Sugiyama, M., Kanamori, T., Suzuki, T., Hido, S., Sese, J., Takeuchi, I., & Wang, L. A density-ratio framework for statistical data processing. IPSJ Transactions on Computer Vision and Applications, vol.1, pp.183-208, 2009.
[Review of density ratio estimation methods] Sugiyama, M., Suzuki, T., & Kanamori, T. Density ratio estimation: A comprehensive review. In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, 2010.
[Recent papers] Kanamori, T., Hido, S., & Sugiyama, M. A least-squares approach to direct importance estimation. JMLR, vol.10, pp.1391-1445, 2009.
Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D. Conditional density estimation via least-squares density ratio estimation. AISTATS 2010.
Suzuki, T. & Sugiyama, M. Sufficient dimension reduction via squared-loss mutual information estimation. AISTATS 2010.
Yamada, M. & Sugiyama, M. Dependence minimizing regression with model selection for non-linear causal inference under non-Gaussian noise. AAAI 2010.
Sugiyama, M. & Simm, J. A computationally-efficient alternative to kernel logistic regression. MLSP 2010.
Hachiya, H. & Sugiyama, M. Feature selection for reinforcement learning: Evaluating implicit state-reward dependency via conditional mutual information. ECML PKDD 2010.