Speaker: Prof. Mingqiu Wang (Qufu Normal University)
Time: November 19, 2025, 14:30–
Venue: Room LD402, School of Mathematics and Statistics
Abstract: The Minimum Distance Kernel Estimation (MDKE) method is robust to both outliers and high-leverage points and does not rely on the density function of the response variable. However, when the data scale is massive and computing resources are limited, the computational cost of estimation with this method becomes unacceptable. Subsampling techniques can significantly reduce memory usage and computation time in the analysis of massive data, yet existing subsampling methods do not account for such data-anomaly scenarios. To address this issue, this paper combines the MDKE method with random perturbation subsampling, develops subsampling algorithms with a product weight and an additive weight, respectively, and establishes the consistency and asymptotic normality of their estimators. Introducing the two weighting strategies into the objective function eliminates the need to explicitly compute subsampling probabilities for all samples, which substantially reduces the computational burden. Theoretical analysis further shows that the estimator derived from the additive-weight algorithm achieves higher efficiency. The effectiveness and robustness of the proposed methods are verified through simulation studies in multiple settings and a real-data analysis.
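For context, a minimal sketch of the general random perturbation (product-weight) idea is given below, assuming a generic loss function \ell; the actual MDKE objective and the exact product/additive weighting schemes presented in the talk are not reproduced here, and the notation is illustrative only.

\hat{\beta}_w = \arg\min_{\beta} \sum_{i=1}^{n} w_i \, \ell(y_i, x_i; \beta), \qquad w_1, \dots, w_n \ \text{i.i.d. nonnegative weights with } E[w_i] = 1 \ (\text{e.g., } \mathrm{Exp}(1)).

Because the random weights w_i play the role of the subsample, no subsampling probabilities need to be computed explicitly for the full data set.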
Host: Chaolin Liu
All faculty and students are warmly welcome to attend!