Scientific Research
Talk Title:

Convergence for Kernel Minimum Error Entropy Principle

Speaker:

Prof. Ting Hu (Xi'an Jiaotong University)

Time:

Venue:

Lecture Hall 103, First Floor, Northwest Building

Abstract:

Information theoretic learning is a learning paradigm built on concepts of entropy and divergence from information theory; a variety of signal processing and machine learning methods fall into this framework. The minimum error entropy principle is a typical example. In this talk, we study a kernel version of the minimum error entropy method that can be used to find nonlinear structures in data. We show that kernel minimum error entropy can be implemented by kernel-based gradient descent algorithms, with or without regularization, and we deduce convergence rates for both algorithms.
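For readers unfamiliar with the setting, the sketch below illustrates one standard way such a kernel-based gradient descent for minimum error entropy can look, assuming Renyi's quadratic entropy estimated with a Gaussian Parzen window and a Gaussian-kernel RKHS hypothesis space. The function and parameter names (kernel_mee, sigma, h, eta, lam) are illustrative assumptions, not the speaker's exact construction or results.

import numpy as np

# A minimal sketch, assuming Renyi's quadratic entropy with a Gaussian Parzen
# window: minimizing the empirical error entropy amounts to maximizing the
# "information potential" V(f) = (1/m^2) * sum_{i,j} G_h(e_i - e_j), with
# residuals e_i = y_i - f(x_i), while f is searched in the RKHS of a kernel K,
# i.e. f(x) = sum_k alpha_k K(x, x_k).  This is not the speaker's algorithm,
# only an illustrative instance of kernel gradient descent for MEE.

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_mee(X, y, sigma=1.0, h=0.5, eta=0.1, lam=0.0, n_iter=500):
    """Functional (RKHS) gradient ascent on V(f) - lam * ||f||_K^2.
    lam = 0 gives the unregularized variant, lam > 0 the regularized one."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)              # kernel Gram matrix
    alpha = np.zeros(m)                           # coefficients of f
    for _ in range(n_iter):
        e = y - K @ alpha                         # residuals e_i = y_i - f(x_i)
        D = e[:, None] - e[None, :]               # pairwise differences e_i - e_j
        W = np.exp(-D ** 2 / (2.0 * h ** 2))      # Parzen window G_h(e_i - e_j)
        # dV/de_i = -(2 / (m^2 h^2)) * sum_j G_h(e_i - e_j) * (e_i - e_j)
        dV_de = -2.0 * (W * D).sum(axis=1) / (m ** 2 * h ** 2)
        # The RKHS gradient of V at f is -sum_i (dV/de_i) K(., x_i), so each
        # coefficient moves by -dV/de_i; lam adds ridge-type shrinkage.
        alpha += eta * (-dV_de - 2.0 * lam * alpha)
    return alpha

if __name__ == "__main__":
    # Toy usage.  The MEE objective is invariant to adding a constant to f,
    # so the learned function is re-centred on the mean residual afterwards.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(80, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(80)
    alpha = kernel_mee(X, y, lam=1e-3)
    f_hat = gaussian_kernel(X, X) @ alpha
    bias = float(np.mean(y - f_hat))
    print("training RMSE:", np.sqrt(np.mean((y - f_hat - bias) ** 2)))

A constant step size eta is used above for simplicity; convergence analyses of such kernel gradient descent schemes typically work with suitably decaying step sizes, which is the regime the talk's convergence rates concern.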