08 - SVM - lec. 11, 12, 13
ucla | CS M146 | 2023-05-03T13:57
Table of Contents
Supplemental
Perceptron Review
Constrained Optimization
Lecture
Linear Separators
Perceptron Review
Choosing a separator
- the last separator (the max-margin one) is robust to noise in the dataset
Margin of a Linear Separator
- given a binary classifier with labels in $\{+1, -1\}$ and a hyperplane (linear separator) as the decision boundary
- the margin of a point w.r.t. the hyperplane is the perpendicular distance between the point and the hyperplane
Computing Margin
proof
- assuming $\mathbf{w}$ defines the hyperplane $\mathbf{w}^\top \mathbf{x} = 0$ that perfectly separates the data, with no bias term
- we assume the point $\mathbf{x}$ we are trying to calculate the margin $\gamma$ for has a positive label, thus the hypothesis $\mathbf{w}^\top \mathbf{x} > 0$
- $\frac{\mathbf{w}}{\|\mathbf{w}\|}$ is the unit normal vector to the plane
- thus the point $B = \mathbf{x} - \gamma \frac{\mathbf{w}}{\|\mathbf{w}\|}$ lies on the hyperplane
- this means, because the point $B$ lies on the hyperplane, its corresponding feature vector's hypothesis $= 0$, so

$$\mathbf{w}^\top \Big(\mathbf{x} - \gamma \frac{\mathbf{w}}{\|\mathbf{w}\|}\Big) = 0 \;\Rightarrow\; \mathbf{w}^\top \mathbf{x} = \gamma\,\|\mathbf{w}\| \;\Rightarrow\; \gamma = \frac{\mathbf{w}^\top \mathbf{x}}{\|\mathbf{w}\|}$$
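a quick numpy check of this formula (toy numbers of my own, not from lecture):

```python
import numpy as np

w = np.array([3.0, 4.0])                 # normal vector defining the hyperplane w^T x = 0
x = np.array([2.0, 1.0])                 # a positively-labeled point, so w @ x > 0

gamma = w @ x / np.linalg.norm(w)        # margin: gamma = w^T x / ||w|| = 10 / 5 = 2
B = x - gamma * w / np.linalg.norm(w)    # drop a perpendicular from x onto the plane
assert np.isclose(w @ B, 0.0)            # B lies on the hyperplane, as the proof claims
print(gamma)                             # 2.0
```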
Maximizing Margin Classification
issue is that max margin is NON-CONVEX → hard to optimize
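to make this concrete (standard notation, not transcribed from the slides), the direct problem is

$$\max_{\mathbf{w}}\; \min_i\; \frac{y_i\,\mathbf{w}^\top \mathbf{x}_i}{\|\mathbf{w}\|}$$

the ratio of $\mathbf{w}$ terms makes this non-convex; the standard fix is to pin the scale with $\min_i y_i\,\mathbf{w}^\top \mathbf{x}_i = 1$, which leads to the convex program in the next section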
Support Vector Machines
- the machines are made from the data points that lie on the boundary of the max margin (the support vectors)
Hard Margin SVMs
- we can use the perceptron loss to constrain the minimization
Classic SVMs (max-margin classification)
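a sketch of the standard hard-margin primal (assumed standard notation, not copied from the slides):

$$\min_{\mathbf{w},\,b}\; \frac{1}{2}\|\mathbf{w}\|^2 \quad \text{s.t.} \quad y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1 \quad \forall i$$

- minimizing $\|\mathbf{w}\|$ maximizes the margin $1/\|\mathbf{w}\|$, and the result is a convex quadratic program, which fixes the non-convexity issue above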
Soft Margin SVMs
Slack Constraints
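a sketch of the soft-margin primal with slack variables $\xi_i$ (same assumed notation):

$$\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\; \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_i \xi_i \quad \text{s.t.} \quad y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1 - \xi_i,\;\; \xi_i \ge 0$$

- $\xi_i > 0$ lets point $i$ violate the margin; the hyperparameter $C$ trades margin width against violations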
Defining Support Vectors
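- standard definition (not transcribed from the slides): the support vectors are the training points on or inside the margin, i.e. those with $y_i(\mathbf{w}^\top \mathbf{x}_i + b) \le 1$; removing any other point leaves the solution unchanged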
Hinge Loss (Unconstrained Optimization)
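at the optimum the slack takes the value $\xi_i = \max(0,\, 1 - y_i(\mathbf{w}^\top \mathbf{x}_i + b))$, so the constrained problem collapses into an unconstrained hinge-loss objective (assumed standard form):

$$\min_{\mathbf{w},\,b}\; \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_i \max\big(0,\; 1 - y_i(\mathbf{w}^\top \mathbf{x}_i + b)\big)$$

a minimal subgradient-descent sketch of this objective (toy code and hyperparameters of my own, not the lecture's pseudocode):

```python
import numpy as np

def svm_hinge_subgradient(X, y, C=1.0, lr=0.1, epochs=200):
    """Minimize 0.5*||w||^2 + C * sum_i max(0, 1 - y_i (w^T x_i + b))
    by batch subgradient descent. Labels y must be in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                       # hinge is active only for margin violators
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy linearly separable data
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = svm_hinge_subgradient(X, y)
print(np.sign(X @ w + b))                        # should match y
```

the $\max(0,\cdot)$ term is convex but not differentiable at the hinge, so plain gradient descent becomes subgradient descent: each violator contributes $-y_i\,\mathbf{x}_i$ to the gradient, every other point contributes nothing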
Discussion
Resources
📌 **SUMMARY**