Exercise 13.5 - Reducing elastic net to lasso

Answers

Expanding the l.h.s. of (13.196) yields:

$$
\begin{aligned}
J_1(c\mathbf{w}) &= (\mathbf{y} - c\mathbf{X}\mathbf{w})^T(\mathbf{y} - c\mathbf{X}\mathbf{w}) + c^2\lambda_2\,\mathbf{w}^T\mathbf{w} + c\lambda_1\|\mathbf{w}\|_1 \\
&= \mathbf{y}^T\mathbf{y} + c^2\,\mathbf{w}^T\mathbf{X}^T\mathbf{X}\mathbf{w} - 2c\,\mathbf{y}^T\mathbf{X}\mathbf{w} + c^2\lambda_2\,\mathbf{w}^T\mathbf{w} + c\lambda_1\|\mathbf{w}\|_1.
\end{aligned}
$$

While expanding its r.h.s. (using the stacked forms of $\tilde{\mathbf{y}}$ and $\tilde{\mathbf{X}}$) gives:

$$
\begin{aligned}
J_2(\mathbf{w}) &= (\mathbf{y} - c\mathbf{X}\mathbf{w})^T(\mathbf{y} - c\mathbf{X}\mathbf{w}) + (c\sqrt{\lambda_2}\,\mathbf{w})^T(c\sqrt{\lambda_2}\,\mathbf{w}) + c\lambda_1\|\mathbf{w}\|_1 \\
&= (\mathbf{y} - c\mathbf{X}\mathbf{w})^T(\mathbf{y} - c\mathbf{X}\mathbf{w}) + c^2\lambda_2\,\mathbf{w}^T\mathbf{w} + c\lambda_1\|\mathbf{w}\|_1 \\
&= \mathbf{y}^T\mathbf{y} + c^2\,\mathbf{w}^T\mathbf{X}^T\mathbf{X}\mathbf{w} - 2c\,\mathbf{y}^T\mathbf{X}\mathbf{w} + c^2\lambda_2\,\mathbf{w}^T\mathbf{w} + c\lambda_1\|\mathbf{w}\|_1.
\end{aligned}
$$

Hence (13.192) and (13.193) are equal for every $\mathbf{w}$, so minimizing $J_2(\mathbf{w})$ is equivalent to minimizing $J_1$ over the rescaled variable $c\mathbf{w}$; this proves (13.195).

This shows that elastic net regularization, which uses a penalty that is a linear combination of $\ell_1$ and $\ell_2$ terms, reduces to a lasso problem on a modified design matrix. (The design matrix used in this exercise collects data points as rows.)
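The identity above can be checked numerically. The sketch below assumes the standard elastic-net-to-lasso construction (the precise definitions live in the book's equations (13.192)-(13.196)): $c = (1+\lambda_2)^{-1/2}$, $\tilde{\mathbf{X}} = c\,[\mathbf{X};\ \sqrt{\lambda_2}\,\mathbf{I}]$, and $\tilde{\mathbf{y}} = [\mathbf{y};\ \mathbf{0}]$; the variable names here are my own.

```python
import numpy as np

# Numerical sanity check of the identity J1(c*w) = J2(w).
# Assumed definitions (standard elastic-net-to-lasso reduction):
#   J1(w) = ||y - X w||^2 + lam2 ||w||^2 + lam1 ||w||_1
#   J2(w) = ||ytil - Xtil w||^2 + c * lam1 * ||w||_1
# with c = (1 + lam2)^(-1/2), Xtil = c * [X; sqrt(lam2) I], ytil = [y; 0].
rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.standard_normal((n, d))   # data points as rows, as in the exercise
y = rng.standard_normal(n)
w = rng.standard_normal(d)
lam1, lam2 = 0.3, 0.7
c = (1.0 + lam2) ** -0.5

def J1(w):
    return (np.sum((y - X @ w) ** 2)
            + lam2 * np.sum(w ** 2)
            + lam1 * np.sum(np.abs(w)))

# Augmented design matrix and response for the equivalent lasso problem.
Xtil = c * np.vstack([X, np.sqrt(lam2) * np.eye(d)])
ytil = np.concatenate([y, np.zeros(d)])

def J2(w):
    return np.sum((ytil - Xtil @ w) ** 2) + c * lam1 * np.sum(np.abs(w))

print(np.isclose(J1(c * w), J2(w)))  # → True
```

Because the equality holds for every $\mathbf{w}$ (not just a minimizer), any random draw of `X`, `y`, and `w` should pass this check.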

2021-03-24 13:42