Abstract
High-dimensional data analysis requires variable selection to identify the truly relevant variables. More often than not, this is done implicitly via regularization, such as penalized regression. Among the many penalties proposed, SCAD has shown good properties and has been widely adopted in medical research and many other areas. This paper reviews the various optimization techniques for solving SCAD-penalized regression.

High-dimensional data analysis has been a common and important topic in
biomedical/genomic/clinical studies. For example, the identification of
genetic factors for complex diseases such as lung cancer implicates a
variety of genetic variants. For high-dimensional data, modeling runs into the well-known curse of dimensionality.
Therefore, variable selection is a fundamental task for high-dimensional
statistical modeling. The "old school" way of doing variable selection
is to follow a subset selection procedure prior to building the model of
interest. The procedure commonly adopts AIC/BIC as the evaluation metric and often iterates in a stepwise fashion. Yet this selection step is carried out independently of the subsequent modeling task, so its effectiveness may be less than desirable. A more natural way is to integrate variable selection into the modeling itself, i.e., penalized regression, which simultaneously performs variable selection and coefficient estimation.
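For concreteness, the objective being discussed can be sketched as follows; the notation is assumed here rather than taken from the excerpt (y is the response vector, X the n-by-p design matrix, beta the coefficient vector, lambda the tuning parameter, and a > 2 the SCAD shape parameter; the 1/(2n) scaling is one common convention):

\[
\hat{\beta} \;=\; \arg\min_{\beta}\; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; \sum_{j=1}^{p} p_{\lambda}\bigl(\lvert \beta_j \rvert\bigr),
\]
where the SCAD penalty of Fan and Li is
\[
p_{\lambda}(t) \;=\;
\begin{cases}
\lambda t, & 0 \le t \le \lambda,\\[4pt]
\dfrac{2a\lambda t - t^{2} - \lambda^{2}}{2(a-1)}, & \lambda < t \le a\lambda,\\[4pt]
\dfrac{\lambda^{2}(a+1)}{2}, & t > a\lambda,
\end{cases}
\qquad a > 2 \;(\text{commonly } a = 3.7).
\]
The penalty behaves like the lasso near zero but levels off for large coefficients; this is what makes the problem nonconvex and motivates the specialized optimization techniques the paper reviews.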
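As a small illustration of the kind of building block such optimization techniques rely on, here is a minimal NumPy sketch (not from the paper) of the SCAD penalty and the closed-form univariate SCAD thresholding rule that coordinate-descent-style solvers apply one coefficient at a time; the function names and the default a = 3.7 are assumptions made for this sketch.

import numpy as np

def scad_penalty(t, lam, a=3.7):
    # SCAD penalty p_lambda(|t|), evaluated elementwise.
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,
        np.where(
            t <= a * lam,
            (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
            lam**2 * (a + 1) / 2,
        ),
    )

def scad_threshold(z, lam, a=3.7):
    # Minimizer of 0.5*(z - theta)^2 + p_lambda(|theta|):
    # soft-thresholding for small |z|, reduced shrinkage in the middle zone,
    # and no shrinkage at all once |z| > a*lam.
    z = np.asarray(z, dtype=float)
    az = np.abs(z)
    soft = np.sign(z) * np.maximum(az - lam, 0.0)
    mid = ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)
    return np.where(az <= 2 * lam, soft, np.where(az <= a * lam, mid, z))

# Example: large inputs pass through unshrunk, reflecting SCAD's near-unbiasedness.
print(scad_threshold(np.array([0.5, 1.5, 2.5, 5.0]), lam=1.0))

In a coordinate-descent-type solver, an update of this form would be applied to each coefficient in turn against its partial residual (assuming standardized predictors).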