Liyuan Mathematics Forum: On Linear Convergence of ADMM for Decentralized Quantile Regression
Published: 2023-09-04 15:50:10


Speaker: Prof. Heng Lian, City University of Hong Kong

Time: September 11, 2023, 16:00–17:30

Venue: H-304

 

Abstract:

The alternating direction method of multipliers (ADMM) is a natural choice for distributed parameter learning. For smooth and strongly convex consensus optimization problems, it has been shown that ADMM and some of its variants enjoy linear convergence in the distributed setting, much like in the traditional non-distributed setting. The optimization problem associated with parameter estimation in quantile regression is neither smooth nor strongly convex (although it is convex), and thus it can only have sublinear convergence at best. Although this suggests slow convergence, we show that, if the local sample size is sufficiently large compared to the parameter dimension and the network size, distributed estimation in quantile regression actually exhibits linear convergence up to the statistical precision.
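For readers less familiar with the setting, the following is a minimal sketch of the kind of objective the abstract refers to; the specific notation (m nodes, n local samples per node, parameter dimension p) is assumed here for illustration and is not taken from the talk. Quantile regression at level \(\tau \in (0,1)\) minimizes the check loss
\[
\rho_\tau(u) \;=\; u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\]
and a decentralized (consensus) formulation over a communication network is
\[
\min_{\beta_1,\dots,\beta_m \in \mathbb{R}^p} \;\sum_{i=1}^{m}\sum_{j=1}^{n} \rho_\tau\!\bigl(y_{ij} - x_{ij}^\top \beta_i\bigr)
\quad \text{subject to } \beta_i = \beta_k \text{ for every edge } (i,k) \text{ of the network.}
\]
The check loss is piecewise linear, hence convex but neither differentiable nor strongly convex, which is why generic ADMM guarantees for this problem are only sublinear; the point of the talk is that when n is large relative to p and m, linear convergence nevertheless holds up to the statistical precision.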

About the Speaker:

Heng Lian is a Professor in the Department of Mathematics at City University of Hong Kong. He received his B.S. in Mathematics and Computer Science from the University of Science and Technology of China in 2000, and an M.S. in Computer Science, an M.A. in Economics, and a Ph.D. in Applied Mathematics from Brown University in 2007. He has held positions at Nanyang Technological University (Singapore), the University of New South Wales (Australia), and City University of Hong Kong. He has published more than 30 papers in leading international journals, including the Annals of Statistics, Journal of the Royal Statistical Society, Series B, Journal of the American Statistical Association, Journal of Machine Learning Research, and IEEE Transactions on Pattern Analysis and Machine Intelligence. His research interests include high-dimensional data analysis, functional data analysis, and machine learning.