Clemson University

School of Computing Seminar with Kai Liu, Colorado School of Mines

"Alternating Minimization in Machine Learning with Provable Convergence"



Optimization is critical to many machine learning methods, such as Nonnegative Matrix Factorization, Dictionary Learning, and Principal Component Analysis. However, most existing algorithms can only guarantee that the objective function is monotonically non-increasing, while convergence analysis of the generated sequences is usually ignored. In this talk, Kai Liu will introduce a new optimization framework, together with its applications in data recovery, denoising, and mining, with very promising results. He has proved theoretically that the new framework ensures both the objective function and the generated sequences converge, with at least a sub-linear convergence rate.
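For context on the setting the abstract describes, below is a minimal sketch (not the speaker's framework) of alternating minimization applied to Nonnegative Matrix Factorization, using the classic Lee–Seung multiplicative updates. These updates illustrate exactly the guarantee the abstract contrasts with sequence convergence: each step is guaranteed not to increase the objective ‖X − WH‖²_F, but convergence of the iterates (W, H) themselves requires a separate analysis.

```python
import numpy as np

def nmf_alternating(X, k, n_iter=500, seed=0):
    """Factor X ~ W @ H with W, H >= 0 by alternating minimization.

    Each multiplicative update (Lee & Seung, 2000) keeps the
    Frobenius objective ||X - WH||_F^2 monotonically non-increasing,
    but says nothing by itself about convergence of (W, H).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    eps = 1e-10  # guard against division by zero
    for _ in range(n_iter):
        # Fix W, update H (nonnegativity preserved by construction)
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # Fix H, update W
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: recover an exactly low-rank nonnegative matrix
W0 = np.abs(np.random.default_rng(1).normal(size=(20, 3)))
H0 = np.abs(np.random.default_rng(2).normal(size=(3, 15)))
X = W0 @ H0
W, H = nmf_alternating(X, k=3)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```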

Kai Liu is a Ph.D. candidate at Colorado School of Mines working with Dr. Hua Wang. Before pursuing his Ph.D., he completed his master's and bachelor's studies in Control Sciences & Automation at Tsinghua University and Beijing Jiaotong University, respectively. His research interests lie in Machine Learning and its applications in Data Mining, Computer Vision, Natural Language Processing, and Bioinformatics. He aims to provide computationally efficient algorithms with provable theoretical guarantees. His work has been published in top machine learning venues such as NIPS, ACL, CVPR, SDM, AAAI, IJCAI, and RECOMB. During his internship, he worked on speech recognition and image/audio denoising with deep learning approaches.

Wednesday, March 6, 2:30pm to 3:30pm

McAdams Hall, 107
821 McMillan Rd., Clemson, SC 29634, USA

Event Type

Lectures / Seminars / Speakers


College of Engineering, Computing and Applied Sciences, School of Computing, Research Seminars

Target Audience

Students, Faculty


Contact Name:

Dida Weeks
