Exercise 3.9

Answers

1.
See the plot below, generated by the code at the end of this answer.
2.
When $y = 1$:
  • If $s \ge 0$: $e_{\text{class}}(s,y) = 0$ and $e_{\text{sq}}(s,y) = (1-s)^2 \ge 0$.
  • If $s < 0$: $e_{\text{class}}(s,y) = 1$ and $e_{\text{sq}}(s,y) = (1-s)^2 = 1 - 2s + s^2 \ge 1$, since $-2s > 0$ and $s^2 > 0$.

In both cases, $e_{\text{sq}}(s,y) \ge e_{\text{class}}(s,y)$. Similarly, when $y = -1$ we have $e_{\text{sq}}(s,y) = (1+s)^2$, and the same case analysis gives $e_{\text{sq}}(s,y) \ge e_{\text{class}}(s,y)$. So $e_{\text{sq}}(s,y) \ge e_{\text{class}}(s,y)$ for all $s$ and $y$: the classification error is upper bounded by the squared error.
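This bound can also be checked numerically. Below is a minimal sketch (the grid and error definitions mirror the plotting script at the end of this answer) that verifies $e_{\text{sq}}(s,y) \ge e_{\text{class}}(s,y)$ on a grid of score values for both labels.

import numpy as np

# Numerical sanity check: the squared error upper bounds the 0/1 error
# for y in {-1, +1} on a grid of score values s.
s = np.linspace(-10, 10, 2001)
for y in (-1, 1):
    e_class = (np.sign(s) != y).astype(float)  # 0/1 classification error
    e_sq = (y - s) ** 2                        # squared error
    assert np.all(e_sq >= e_class)
print("e_sq >= e_class verified on the grid for y = -1 and y = +1")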

3.
When $y = 1$:
(a)
If $s \ge 0$: $e_{\text{class}}(s,y) = 0$, and $e_{\log}(s,y) = \ln(1 + \exp(-ys)) = \ln(1 + \exp(-s)) \ge \ln 1 = 0$. Since $\ln 2 > 0$, we have $\frac{1}{\ln 2}\, e_{\log}(s,y) \ge 0 = e_{\text{class}}(s,y)$.
(b)
If $s < 0$: $e_{\text{class}}(s,y) = 1$, and $e_{\log}(s,y) = \ln(1 + \exp(-s)) \ge \ln 2$ because $\exp(-s) > 1$, so $\frac{1}{\ln 2}\, e_{\log}(s,y) \ge 1 = e_{\text{class}}(s,y)$.

Similarly, we can show that when $y = -1$, $\frac{1}{\ln 2}\, e_{\log}(s,y) \ge e_{\text{class}}(s,y)$ as well. The scaled logistic error is an upper bound on the classification error.
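The same style of numerical check works here; a minimal sketch over the same grid:

import numpy as np

# Numerical sanity check: (1/ln 2) * logistic error upper bounds the
# 0/1 error for y in {-1, +1} on a grid of score values s.
s = np.linspace(-10, 10, 2001)
for y in (-1, 1):
    e_class = (np.sign(s) != y).astype(float)
    e_log_scaled = np.log(1.0 + np.exp(-y * s)) / np.log(2)
    assert np.all(e_log_scaled >= e_class)
print("e_log / ln(2) >= e_class verified on the grid for y = -1 and y = +1")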

import numpy as np
import matplotlib.pyplot as plt

def tanh(s):
    # hyperbolic tangent (not used below; equivalent to np.tanh)
    a = np.exp(s) - np.exp(-s)
    b = np.exp(s) + np.exp(-s)
    return a / b

def classification_error(s, y):
    # 0/1 error: 1 where sign(s) disagrees with the label y
    return (np.sign(s) != y).astype(float)

def square_error(s, y):
    # squared error (y - s)^2
    return (y - s) ** 2

def log_error(s, y):
    # logistic (cross-entropy) error ln(1 + exp(-y*s))
    return np.log(1.0 + np.exp(-y * s))

ss = np.arange(-10, 10, 0.1)
cls_err = classification_error(ss, 1)
sq_err = square_error(ss, 1)
log_err = log_error(ss, 1) / np.log(2)  # scaled by 1/ln(2)

plt.plot(ss, cls_err, label='Classification Error')
plt.plot(ss, sq_err, label='Square Error')
plt.plot(ss, log_err, label='Log Error')
plt.ylim(bottom=-0.12)
plt.legend()
plt.show()

[Plot: classification error, squared error, and log error scaled by $1/\ln 2$, versus $s$ for $y = 1$.]
