Speaker: Yuanshan Wu (Zhongnan University of Economics and Law)
Time: May 16, 2024, 10:00-
Venue: Room LD402, School of Mathematics and Statistics
Abstract: Transfer learning has become an essential technique for exploiting information from source domains to boost the performance of a target task. Despite their prevalence in high-dimensional data, heterogeneity and heavy tails are insufficiently accounted for by current transfer learning approaches and may thus undermine downstream performance. We propose a transfer learning procedure within the framework of high-dimensional quantile regression models to accommodate heterogeneity and heavy tails in the source and target domains. We establish error bounds for the transfer learning estimator based on delicately selected transferable source domains, showing that lower error bounds can be achieved under a suitable selection criterion and larger source sample sizes. We further propose valid confidence intervals and hypothesis testing procedures for individual components of the high-dimensional quantile regression coefficients by advocating a double transfer learning estimator, a one-step debiased version of the transfer learning estimator in which the transfer learning technique is applied a second time. By adopting a data-splitting technique, we develop a transferability detection approach that circumvents negative transfer and identifies transferable sources with high probability. Simulation results demonstrate that the proposed method exhibits favorable and compelling performance, and its practical utility is further illustrated through the analysis of a real example.
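To make the pooling-then-correction idea concrete, the Python sketch below is a minimal illustration only: the function transqr, the penalty levels lam1/lam2, and the use of scikit-learn's QuantileRegressor are assumptions for exposition, not the speaker's implementation, and the debiasing and transferability-detection steps of the talk are not shown. It fits an L1-penalized quantile regression on the pooled source and target samples and then corrects the resulting coefficient on the target data alone.

import numpy as np
from sklearn.linear_model import QuantileRegressor

def transqr(X_tgt, y_tgt, X_src, y_src, tau=0.5, lam1=0.1, lam2=0.1):
    # Step 1: borrow source information via an L1-penalized quantile
    # regression at level tau on the pooled source + target samples.
    X_pool = np.vstack([X_src, X_tgt])
    y_pool = np.concatenate([y_src, y_tgt])
    pooled = QuantileRegressor(quantile=tau, alpha=lam1,
                               fit_intercept=False, solver="highs")
    w_hat = pooled.fit(X_pool, y_pool).coef_
    # Step 2: correct the pooled coefficient using the target data only,
    # so the estimator adapts when source and target coefficients differ.
    correct = QuantileRegressor(quantile=tau, alpha=lam2,
                                fit_intercept=False, solver="highs")
    delta_hat = correct.fit(X_tgt, y_tgt - X_tgt @ w_hat).coef_
    return w_hat + delta_hat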
Biography: Yuanshan Wu is a Professor and doctoral supervisor at the School of Statistics and Mathematics, Zhongnan University of Economics and Law. His research focuses on the statistical foundations of big data, quantile regression, and survival analysis. He has published more than 30 papers in statistics journals such as Journal of the American Statistical Association, Biometrika, Bernoulli, Biometrics, and Biostatistics, machine learning journals such as Journal of Machine Learning Research and Neurocomputing, and general mathematics journals such as SCIENCE CHINA Mathematics. He is currently the principal investigator of a General Program grant from the National Natural Science Foundation of China (NSFC); he has completed, as principal investigator, an NSFC General Program grant, an NSFC Young Scientists Fund grant, a Specialized Research Fund for the Doctoral Program of Higher Education (Ministry of Education) grant, and a China Postdoctoral Science Foundation General Program grant, and has participated in a National Key R&D Program project of the Ministry of Science and Technology. He serves as an Editorial Board Member of ACM Transactions on Probabilistic Machine Learning, and has made multiple visits to the Department of Statistics and Actuarial Science at the University of Hong Kong and the Department of Applied Mathematics at The Hong Kong Polytechnic University.
Host: Zhimin Zhang
All faculty members and students are welcome to attend!