Exercise 3.9
Answers
- 1.
  - See the plot produced by the code at the end of this solution.
- 2.
  - When $y = +1$:
    - If $s \geq 0$, then $\mathrm{sign}(s) = y$, so $e_{\mathrm{class}}(s, y) = 0 \leq (y - s)^2 = e_{\mathrm{sq}}(s, y)$.
    - If $s < 0$, then $e_{\mathrm{class}}(s, y) = 1$, and $(y - s)^2 = (1 - s)^2 > 1$, so $e_{\mathrm{class}}(s, y) \leq e_{\mathrm{sq}}(s, y)$.

  In both cases, we have $e_{\mathrm{class}}(s, y) \leq e_{\mathrm{sq}}(s, y)$. Similarly, we can prove that when $y = -1$, we have $e_{\mathrm{class}}(s, y) \leq e_{\mathrm{sq}}(s, y)$ as well. So $e_{\mathrm{class}}(s, y) \leq e_{\mathrm{sq}}(s, y)$ for all cases of $s$ and $y$: the classification error is upper bounded by the squared error.
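As a quick numerical sanity check of part 2 (an addition, not part of the original solution), a short Python sketch can evaluate both pointwise errors on a grid of signals for $y = \pm 1$ and assert the inequality:

```python
import numpy as np

# Numerical check of part 2: e_class(s, y) <= e_sq(s, y) = (y - s)^2.
s = np.linspace(-10, 10, 2001)
for y in (+1, -1):
    e_class = (np.sign(s) != y).astype(float)  # 0/1 classification error
    e_sq = (y - s) ** 2                        # squared error
    assert np.all(e_class <= e_sq), "bound violated"
print("e_class <= e_sq holds on the tested grid")
```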
- 3.
  - When $y = +1$:
    - (a) If $s \geq 0$, then $\mathrm{sign}(s) = y$ and $e_{\mathrm{class}}(s, y) = 0$; also $\frac{1}{\ln 2}\ln(1 + e^{-ys}) \geq 0$, so we have $e_{\mathrm{class}}(s, y) \leq \frac{1}{\ln 2}\ln(1 + e^{-ys})$.
    - (b) If $s < 0$, then $e_{\mathrm{class}}(s, y) = 1$ and $e^{-ys} = e^{-s} > 1$, so $\frac{1}{\ln 2}\ln(1 + e^{-ys}) > \frac{1}{\ln 2}\ln 2 = 1 = e_{\mathrm{class}}(s, y)$.

  Similarly, we can prove that when $y = -1$, we have $e_{\mathrm{class}}(s, y) \leq \frac{1}{\ln 2}\ln(1 + e^{-ys})$ as well. The logistic regression error (scaled by $\frac{1}{\ln 2}$) is an upper bound to the classification error.
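The same kind of quick numerical check (again an addition, not part of the original solution) applies to part 3, with the logistic error scaled by $1/\ln 2$:

```python
import numpy as np

# Numerical check of part 3: e_class(s, y) <= (1 / ln 2) * ln(1 + exp(-y * s)).
s = np.linspace(-10, 10, 2001)
for y in (+1, -1):
    e_class = (np.sign(s) != y).astype(float)         # 0/1 classification error
    e_log = np.log(1.0 + np.exp(-y * s)) / np.log(2)  # scaled logistic error
    assert np.all(e_class <= e_log), "bound violated"
print("e_class <= e_log / ln(2) holds on the tested grid")
```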
```python
import numpy as np
import matplotlib.pyplot as plt


def tanh(s):
    # Hyperbolic tangent (not needed for this plot; kept from the original code)
    a = np.exp(s) - np.exp(-s)
    b = np.exp(s) + np.exp(-s)
    return a / b


def classification_error(s, y):
    # e_class(s, y): 1 if sign(s) != y, else 0
    signs = np.sign(s)
    return (signs != y).astype(float)


def square_error(s, y):
    # e_sq(s, y) = (y - s)^2
    return (y - s) ** 2


def log_error(s, y):
    # e_log(s, y) = ln(1 + exp(-y * s))
    return np.log(1.0 + np.exp(-y * s))


ss = np.arange(-10, 10, 0.1)
cls_err = classification_error(ss, 1)
sq_err = square_error(ss, 1)
log_err = log_error(ss, 1) / np.log(2)  # scaled by 1/ln(2), as in part 3

plt.plot(ss, cls_err, label='Classification Error')
plt.plot(ss, sq_err, label='Square Error')
plt.plot(ss, log_err, label='Log Error')
plt.ylim(-0.1, 2)
plt.legend()
plt.show()
```
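Running the script, the classification error appears as a 0/1 step around $s = 0$, and both the squared error and the scaled log error lie on or above it for every $s$, matching the bounds shown in parts 2 and 3.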