Shi Yueyong et al.: Generalized Newton-Raphson algorithm for high dimensional LASSO regression

Posted by: Hu Songqin    Date: 2022-02-22


Dr. Shi Yueyong of our university has published a paper titled "Generalized Newton-Raphson algorithm for high dimensional LASSO regression" in the T2-ranked journal Statistics and Its Interface. The first author, Shi Yueyong, is an associate professor at our university.

Abstract:

The least absolute shrinkage and selection operator (LASSO) penalized regression is a state-of-the-art statistical method in high dimensional data analysis, when the number of predictors exceeds the number of observations. The commonly used Newton–Raphson algorithm is not very successful in solving the non-smooth optimization in LASSO. In this paper, we propose a fast generalized Newton–Raphson (GNR) algorithm for LASSO-type problems. The proposed algorithm, derived from suitable Karush–Kuhn–Tucker (KKT) conditions based on generalized Newton derivatives, is a non-smooth Newton-type method. We first establish the local one-step convergence of GNR and then show that it is very efficient and accurate when coupled with a continuation strategy. We also develop a novel parameter selection method. Numerical studies on simulated and real data suggest that the GNR algorithm, with better (or comparable) accuracy, is faster than the algorithm implemented in the popular glmnet package.
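To illustrate the general idea of a non-smooth Newton-type method on the LASSO KKT conditions, the sketch below implements a primal-dual active-set style iteration for min_beta (1/2n)||y - X*beta||^2 + lambda*||beta||_1. It is a minimal illustration under standard assumptions (least-squares loss, soft-thresholding form of the KKT system), not the authors' GNR implementation; the function name lasso_semismooth_newton and its defaults are ours.

import numpy as np

def lasso_semismooth_newton(X, y, lam, max_iter=50, tol=1e-10):
    """Primal-dual active-set (semismooth Newton style) iteration for the LASSO
    problem min_beta (1/2n)||y - X beta||^2 + lam * ||beta||_1.
    Illustrative sketch only; not the paper's GNR code."""
    n, p = X.shape
    beta = np.zeros(p)
    d = X.T @ y / n                      # dual variable d = X'(y - X beta) / n at beta = 0
    for _ in range(max_iter):
        z = beta + d
        active = np.abs(z) > lam         # estimated support from the KKT system
        beta_new = np.zeros(p)
        d_new = np.empty(p)
        if active.any():
            XA = X[:, active]
            dA = lam * np.sign(z[active])
            # Newton step on the active block: (XA'XA / n) beta_A = XA'y / n - d_A
            beta_new[active] = np.linalg.solve(XA.T @ XA / n, XA.T @ y / n - dA)
            d_new[active] = dA
            resid = y - XA @ beta_new[active]
        else:
            resid = y
        d_new[~active] = X[:, ~active].T @ resid / n
        if max(np.max(np.abs(beta_new - beta)), np.max(np.abs(d_new - d))) < tol:
            return beta_new
        beta, d = beta_new, d_new
    return beta

In practice, as the abstract notes, such Newton steps are coupled with a continuation (warm-start) strategy: the problem is solved along a decreasing sequence of lambda values, with each solution initializing the next.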

Paper Information:

Title:

Generalized Newton-Raphson algorithm for high dimensional LASSO regression

Authors:

Shi Yueyong; Huang Jian; Jiao Yuling; Kang Yicheng; Zhang Hu

Keywords:

LASSO; generalized Newton–Raphson; continuation; local one-step convergence; voting

Indexed by:

WAJCI; SCI; Scopus

DOI: 10.4310/20-SII643

Full text: https://dx.doi.org/10.4310/20-SII643