The Support Vector Machine (SVM) is a powerful supervised learning algorithm for binary classification. Here's a brief overview of how SVMs work for binary classification:
Data Representation: SVMs classify data by finding the best hyperplane that separates data points of one class from those of another class.
Margin Maximization: The best hyperplane is the one with the largest margin, which is the maximal width of the slab parallel to the hyperplane that has no interior data points.
Support Vectors: The data points that are closest to the separating hyperplane are called support vectors. These points define the margin.
Mathematical Formulation: The problem is formulated as a quadratic programming problem, where the goal is to minimize the norm of the weights subject to constraints that ensure correct classification of the data points.
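For a linearly separable dataset with feature vectors x_i and labels y_i in {-1, +1}, the hard-margin version of this quadratic program can be written as:

```latex
\min_{\mathbf{w},\, b} \; \frac{1}{2}\|\mathbf{w}\|^2
\quad \text{subject to} \quad
y_i\,(\mathbf{w}^{\top}\mathbf{x}_i + b) \ge 1, \qquad i = 1, \dots, n
```

The constraints require every point to lie on the correct side of the hyperplane w·x + b = 0 with at least unit functional margin; since the geometric margin of that slab is 2/||w||, minimizing ||w|| is exactly what maximizes the margin.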
Here's a simple example of how to use SVM for binary classification in MATLAB:
% Generate sample data
X = [1 2; 5 4; 3 6; 7 8; 2 3; 6 7; 4 5];
Y = [-1; -1; -1; -1; 1; 1; 1]; % Labels: -1 for class 1, 1 for class 2
% Train a linear SVM model (fitcsvm uses a linear kernel by default)
SVMModel = fitcsvm(X, Y);
% Plot the data points, support vectors, and decision boundary
figure;
plot(X(Y == -1, 1), X(Y == -1, 2), 'ko', 'MarkerSize', 10);
hold on;
plot(X(Y == 1, 1), X(Y == 1, 2), 'ro', 'MarkerSize', 10);
plot(SVMModel.SupportVectors(:, 1), SVMModel.SupportVectors(:, 2), 'bs', 'MarkerSize', 14);
% For a linear kernel, the boundary is w(1)*x1 + w(2)*x2 + b = 0
w = SVMModel.Beta;
b = SVMModel.Bias;
x1 = linspace(min(X(:, 1)), max(X(:, 1)), 100);
plot(x1, -(w(1) * x1 + b) / w(2), 'g-');
legend('Class 1', 'Class 2', 'Support Vectors', 'Decision Boundary');
xlabel('Feature 1');
ylabel('Feature 2');
title('SVM for Binary Classification');
grid on;
hold off;
Generate Sample Data: X holds one feature vector per row, and Y holds the corresponding class labels.
Create SVM Model: The fitcsvm function trains the SVM classifier; with no additional name-value arguments it fits a linear kernel with a soft margin, so it also handles data that is not perfectly separable.
Plot Data Points and Decision Boundary: The data points are plotted, and the decision boundary implied by the trained model is drawn on the same axes.
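Once trained, the same model can classify unseen observations with MATLAB's predict function. A minimal sketch (the query points in Xnew are made up for illustration):

```matlab
% Classify two hypothetical new points with the trained model
Xnew = [2 2.5; 6.5 7.5];
[labels, scores] = predict(SVMModel, Xnew);
% labels holds the predicted class (-1 or 1) for each row of Xnew;
% scores(:, 2) is the classification score for the positive class,
% whose sign indicates which side of the hyperplane the point falls on
disp(labels);
```

The sign of the score, not just the label, is often useful: points with scores near zero lie close to the decision boundary and are the least confident predictions.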