Question 1 Corrected (For Assessment)
Show that, for the maximum margin classifier, the correct value of β_0 is
\[
\beta_0 = -\,\frac{\max_{i:\,y_i=-1} (\beta^*)^T x_i \;+\; \min_{i:\,y_i=1} (\beta^*)^T x_i}{2},
\]
where \(\beta^* = \sum_{i=1}^n \lambda_i^* y_i x_i\) is the optimal value for β.
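As an informal numerical check (not part of the assignment), the formula can be verified on a small linearly separable toy dataset: fit a near-hard-margin linear SVM with scikit-learn, recover β* from the dual coefficients, and compare the fitted intercept with the expression above. The dataset and the large value of C are illustrative choices of mine.

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data (illustrative choice, not from the assignment).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [2.0, 2.0], [3.0, 2.0], [2.0, 3.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates the hard-margin (maximum margin) classifier.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

# beta* recovered from the dual solution: dual_coef_ holds lambda_i * y_i
# for the support vectors.
beta = (clf.dual_coef_ @ clf.support_vectors_).ravel()

# The claimed formula for beta_0.
scores = X @ beta
beta0 = -(scores[y == -1].max() + scores[y == 1].min()) / 2

print(beta0, clf.intercept_[0])  # the two values should agree closely
```

This only demonstrates agreement on one toy problem, up to solver tolerance; it is not a proof.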
Question 2
Work through labs 9.6.2 and 9.6.5 in the textbook. This should give you a feel for how support vector classifiers work in R.
Question 3
Consider the support vector classifier with the Lagrangian
\[
L(\beta, \beta_0, \xi, \lambda, \mu) = \frac{1}{2}\beta^T\beta + C\sum_{i=1}^n \xi_i - \sum_{i=1}^n \lambda_i\left[\,y_i(x_i^T \beta + \beta_0) - 1 + \xi_i\,\right] - \sum_{i=1}^n \mu_i \xi_i.
\]
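As a reminder (a sketch of the starting point, not the full solution), the stationarity part of the KKT conditions comes from setting the partial derivatives of L with respect to the primal variables to zero:

```latex
\[
\frac{\partial L}{\partial \beta} = \beta - \sum_{i=1}^n \lambda_i y_i x_i = 0,
\qquad
\frac{\partial L}{\partial \beta_0} = -\sum_{i=1}^n \lambda_i y_i = 0,
\qquad
\frac{\partial L}{\partial \xi_i} = C - \lambda_i - \mu_i = 0,
\]
```

together with dual feasibility (λ_i ≥ 0, µ_i ≥ 0) and complementary slackness.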
Using the KKT equations, show that the optimal β can be written as
\[
\beta = \sum_{i=1}^n \lambda_i y_i x_i.
\]
Show that λ solves
\[
\max_{\lambda} \;\; \sum_{i=1}^n \lambda_i \;-\; \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n \lambda_i \lambda_j y_i y_j x_i^T x_j,
\]
subject to
\[
0 \le \lambda_i \le C, \quad i = 1, \dots, n,
\qquad
\sum_{i=1}^n \lambda_i y_i = 0.
\]
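As an illustrative sketch (the toy data and the value of C below are my own choices), this dual can be solved directly as a small quadratic program with scipy, after which β is recovered from the stationarity condition β = Σ_i λ_i y_i x_i and the constraints can be checked numerically:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (illustrative choice, not from the assignment).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [2.0, 2.0], [3.0, 2.0], [2.0, 3.0]])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
n, C = len(y), 1.0

# Dual objective: maximize sum(lam) - 0.5 * lam^T Q lam,
# where Q_ij = y_i y_j x_i^T x_j. We minimize its negative.
Q = (y[:, None] * X) @ (y[:, None] * X).T

def neg_dual(lam):
    return 0.5 * lam @ Q @ lam - lam.sum()

res = minimize(
    neg_dual,
    x0=np.zeros(n),
    bounds=[(0.0, C)] * n,                                 # 0 <= lambda_i <= C
    constraints=[{"type": "eq", "fun": lambda l: l @ y}],  # sum_i lambda_i y_i = 0
    method="SLSQP",
)
lam = res.x

# Recover beta via the stationarity condition beta = sum_i lambda_i y_i x_i.
beta = (lam * y) @ X
print(beta, lam.round(3))
```

This is only a sanity check on a tiny problem; a dedicated QP solver such as libsvm handles the same dual at scale.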
Argue that
\[
\lambda_i = 0 \;\Rightarrow\; y_i(\beta^T x_i + \beta_0) \ge 1,
\]
\[
\lambda_i = C \;\Rightarrow\; y_i(\beta^T x_i + \beta_0) \le 1,
\]
\[
0 < \lambda_i < C \;\Rightarrow\; y_i(\beta^T x_i + \beta_0) = 1.
\]
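These implications can be spot-checked numerically on a fitted soft-margin SVM (the toy data and C below are my own illustrative choices): points the solver does not report as support vectors have λ_i = 0, so by the first implication their margin y_i(β^T x_i + β_0) should be at least 1, and every |λ_i y_i| is bounded by C.

```python
import numpy as np
from sklearn.svm import SVC

# Two Gaussian blobs with mild overlap, so slack is actually used (illustrative).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

clf = SVC(kernel="linear", C=1.0).fit(X, y)
margins = y * clf.decision_function(X)  # y_i (beta^T x_i + beta_0)

# Points that are NOT support vectors have lambda_i = 0, hence margin >= 1.
non_sv = np.setdiff1d(np.arange(len(y)), clf.support_)
print(margins[non_sv].min())  # should be >= 1 (up to solver tolerance)
```

The checks hold only up to the solver's numerical tolerance; they illustrate, rather than prove, the complementary-slackness argument.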