|
In this paper, we propose an $L_1$-penalized LAD estimator subject to linear constraints. Unlike the constrained lasso, our estimator performs well when the response contains heavy-tailed errors or outliers. When the dimension $p$ of the coefficient vector is fixed, the estimator enjoys the oracle property with an adjusted asymptotic normal variance. When $p$ exceeds $n$, the estimation error bound is sharper than $\sqrt{k\log(p)/n}$. Notably, this result holds for a wide range of noise distributions, including the Cauchy distribution. Simulations and an application to real data confirm these findings.
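As a sketch of the estimator described above (the constraint matrix $A$ and vector $b$, the design rows $x_i$, and the tuning parameter $\lambda$ are illustrative notation, not taken from the paper), an equality-constrained formulation reads
\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p,\; A\beta = b} \; \frac{1}{n}\sum_{i=1}^{n} \bigl| y_i - x_i^{\top}\beta \bigr| \;+\; \lambda \lVert \beta \rVert_1 ,
\]
where the absolute-deviation loss replaces the squared loss of the constrained lasso, which is what yields robustness to heavy-tailed errors and outliers in the response.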
|
Keywords: High-dimensional regression, Linear constraints, Variable selection, LAD-Lasso, Oracle property
|