Exercise 4.3
Answers
(a) Consider a given $\mathcal{H}$ that is fixed while we increase the complexity of the target function $f$.
- If the best approximation from $\mathcal{H}$ is less complex than the initial target function, then when we increase the complexity of $f$, the deterministic noise in general should increase, since it will be harder for functions in $\mathcal{H}$ to fit the target function. There will be a higher tendency to overfit.
- If the best approximation from $\mathcal{H}$ is more complex than the initial target function, then when we increase the complexity of $f$, the deterministic noise in general may decrease at first, and there will be a lower tendency to overfit. But once the complexity of $f$ exceeds that of the best approximation from $\mathcal{H}$, continuing to increase the complexity of $f$ will increase the deterministic noise and thus the tendency to overfit. (A small numerical sketch follows this list.)
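The following is a minimal numerical sketch of part (a), not part of the book's solution: it fixes a polynomial hypothesis set $\mathcal{H}$ of degree 3 and estimates the deterministic noise as the squared error of the best in-class fit to targets of growing complexity. The degree-3 choice, the Legendre-mixture targets, and the dense-grid least-squares fit standing in for the best approximation are all illustrative assumptions.

```python
# Illustrative sketch only: estimate deterministic noise for a fixed H
# (polynomials of degree <= 3) against targets of increasing complexity.
import numpy as np
from numpy.polynomial import legendre

H_DEGREE = 3                    # fixed hypothesis set: degree <= 3 (assumed)
x = np.linspace(-1, 1, 2001)    # dense grid standing in for the input distribution

for k in range(1, 9):
    # Target of "complexity k": mixture of Legendre polynomials L_0..L_k (assumed form)
    f = legendre.legval(x, np.ones(k + 1))
    # Best approximation of f within H, via least squares on the grid
    h_star = np.polyval(np.polyfit(x, f, H_DEGREE), x)
    det_noise = np.mean((f - h_star) ** 2)   # deterministic noise ~ squared bias of H w.r.t. f
    print(f"target complexity {k}: deterministic noise = {det_noise:.4f}")
```

With this setup the reported noise stays essentially zero while $k \le 3$ (the target lies inside $\mathcal{H}$) and grows once $k$ exceeds the degree of $\mathcal{H}$, in line with the trend described above.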
(b) Given a fixed target function $f$ while we decrease the complexity of $\mathcal{H}$.
- If the best approximation from $\mathcal{H}$ is less complex than the target function, then when we decrease the complexity of $\mathcal{H}$, we increase the deterministic noise, thus increasing the tendency to overfit.
- If the best approximation from $\mathcal{H}$ is more complex than the target function, then when we decrease the complexity of $\mathcal{H}$, we will decrease the deterministic noise, thus decreasing the tendency to overfit. However, if we continue to decrease the complexity of $\mathcal{H}$ past the point where its complexity equals that of $f$, we start to increase the deterministic noise again and thus the tendency to overfit. (See the sketch after this list.)
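A companion sketch for part (b), again illustrative rather than from the book: the target is held fixed (here, an assumed Legendre mixture up to degree 5) while the degree of the polynomial hypothesis set shrinks, and the deterministic noise is re-estimated at each step.

```python
# Illustrative sketch only: fixed target, shrinking hypothesis set H.
import numpy as np
from numpy.polynomial import legendre

x = np.linspace(-1, 1, 2001)
f = legendre.legval(x, np.ones(6))   # fixed target: Legendre mixture up to degree 5 (assumed)

for h_degree in range(8, -1, -1):    # decrease the complexity of H step by step
    h_star = np.polyval(np.polyfit(x, f, h_degree), x)   # best in-class approximation
    det_noise = np.mean((f - h_star) ** 2)
    print(f"H degree {h_degree}: deterministic noise = {det_noise:.4f}")
```

In this setup the reported noise stays near zero while the degree of $\mathcal{H}$ is at least that of the target and rises once $\mathcal{H}$ becomes simpler than $f$, which is the turning point the second bullet refers to.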