Exercise 6.10
Answers
- (a)
- When $k = 1$, the nearest neighbor of each data point $x_n$ is $x_n$ itself, so at each data point we have $g(x_n) = y_n$, and therefore $E_{\text{in}}(g) = \frac{1}{N}\sum_{n=1}^{N}\big(g(x_n) - y_n\big)^2 = 0$. (A small numerical check is sketched below.)
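As a sanity check on (a), here is a minimal sketch in Python. The 1-D data set, the target $\sin(2\pi x)$, and the `knn_predict` helper are all made up for illustration (they are not from the book); the point is only that with $k = 1$ every training point is its own nearest neighbor, so $E_{\text{in}} = 0$.

```python
import numpy as np

def knn_predict(x, X, y, k):
    """k-NN regression estimate: average y over the k training points nearest to x."""
    idx = np.argsort(np.abs(X - x))[:k]   # indices of the k closest training inputs
    return y[idx].mean()

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 20))                      # 1-D training inputs
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(20)   # noisy targets

# With k = 1, g(x_n) = y_n at every training point, so E_in is exactly zero.
E_in = np.mean([(knn_predict(x_n, X, y, k=1) - y_n) ** 2 for x_n, y_n in zip(X, y)])
print(E_in)   # 0.0
```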
- (b)
- The final hypothesis is not smooth because, as we increase $x$, its set of nearest neighbors changes only at discrete points. Within an interval between two such points, every $x$ in that interval has the same fixed nearest neighbors, so $g(x)$ is the constant average of their $y$ values. Once $x$ moves out of the interval, a new neighbor joins and an old one drops out, and the average changes immediately. This is why the final hypothesis shows a step-wise pattern (see the sketch below).
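The step-wise pattern can be seen numerically. The sketch below reuses the hypothetical `knn_predict`, `X`, and `y` from the sketch under (a): evaluated on a dense grid, $g$ takes only a handful of distinct values, constant between the points where the neighbor set changes.

```python
import numpy as np

grid = np.linspace(0.0, 1.0, 2001)
g = np.array([knn_predict(x, X, y, k=3) for x in grid])   # 3-NN hypothesis on a dense grid

# Count the constant pieces: a jump happens only where the neighbor set changes.
jumps = np.count_nonzero(np.abs(np.diff(g)) > 1e-9)
print(jumps + 1, "constant pieces over", grid.size, "grid points")
```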
- (c)
- If $k$ is very small, say $1$ or $2$, the final hypothesis $g$ is very jagged. It can match the data points closely (or exactly, if $k = 1$), but its approximation of $f$ can be very bad because it tries too hard to fit the noise. If $k$ is very large (close to $N$), the final hypothesis approaches the average of the whole data set, which is a constant; this is too simplistic to approximate the target function $f$. (A rough comparison is sketched below.)
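A rough numerical illustration of this trade-off, again reusing the hypothetical `knn_predict`, `X`, `y`, and the assumed target $f(x) = \sin(2\pi x)$ from the sketch under (a): $k = 1$ fits the training data exactly but tracks the noise, while $k = N$ collapses to the overall mean of $y$.

```python
import numpy as np

def f(x):
    return np.sin(2 * np.pi * x)        # the assumed (noise-free) target

test_x = np.linspace(0.0, 1.0, 500)
for k in (1, 3, len(X)):
    g_test = np.array([knn_predict(x, X, y, k) for x in test_x])
    print(f"k = {k:2d}: mean squared error vs f = {np.mean((g_test - f(test_x)) ** 2):.4f}")

# With k = N, the hypothesis is just the average of all the y values, everywhere.
print(np.isclose(knn_predict(0.5, X, y, k=len(X)), y.mean()))   # True
```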
- (d)
- If $x \to +\infty$ or $x \to -\infty$, the nearest neighbors of $x$ will be a fixed set of points, namely the $k$ data points at the right end or at the left end of the data, respectively. So the hypothesis $g$ takes a constant value on each tail as $x$ grows large in magnitude. (A quick check is sketched below.)
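A quick check of the constant tails, once more reusing the hypothetical `knn_predict`, `X`, and `y` from the sketch under (a): far outside the data range, the neighbor set no longer changes, so $g$ is flat on both sides.

```python
import numpy as np

k = 3
left_value = y[:k].mean()     # average of the k leftmost data points (X is sorted)
right_value = y[-k:].mean()   # average of the k rightmost data points

for x in (-10.0, -1e3, -1e6):
    assert np.isclose(knn_predict(x, X, y, k), left_value)
for x in (10.0, 1e3, 1e6):
    assert np.isclose(knn_predict(x, X, y, k), right_value)
print("g(x) is constant on each tail")
```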