MATLAB SVM tutorial (fitcsvm)

I'm sorry, everyone, that I didn't originally include the code in the description. Here it is:

clear; close all; clc;

%% preparing dataset

load fisheriris

species_num = grp2idx(species);
%%

% keep only the first 100 samples (setosa and versicolor) to make this a binary classification problem
X = randn(100,10);
X(:,[1,3,5,7]) = meas(1:100,:); % features 1, 3, 5, and 7 should be the ones useful for classification
y = species_num(1:100);

rand_num = randperm(size(X,1));
X_train = X(rand_num(1:round(0.8*length(rand_num))),:);
y_train = y(rand_num(1:round(0.8*length(rand_num))),:);

X_test = X(rand_num(round(0.8*length(rand_num))+1:end),:);
y_test = y(rand_num(round(0.8*length(rand_num))+1:end),:);
%% CV partition

c = cvpartition(y_train,'KFold',5);
%% feature selection

opts = statset('Display','iter');
classf = @(train_data, train_labels, test_data, test_labels) ...
    sum(predict(fitcsvm(train_data, train_labels, 'KernelFunction', 'rbf'), test_data) ~= test_labels);

[fs, history] = sequentialfs(classf, X_train, y_train, 'cv', c, 'options', opts, 'nfeatures', 2);
%% Best hyperparameter

X_train_w_best_feature = X_train(:,fs);

Md1 = fitcsvm(X_train_w_best_feature, y_train, 'KernelFunction', 'rbf', 'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('AcquisitionFunctionName', ...
    'expected-improvement-plus', 'ShowPlots', true)); % uses Bayesian optimization

%% Final test with test set
X_test_w_best_feature = X_test(:,fs);
test_accuracy_for_iter = sum((predict(Md1,X_test_w_best_feature) == y_test))/length(y_test)*100

%% Check the hyperplane

figure;
hgscatter = gscatter(X_train_w_best_feature(:,1),X_train_w_best_feature(:,2),y_train);
hold on;
h_sv = plot(Md1.SupportVectors(:,1), Md1.SupportVectors(:,2), 'ko', 'MarkerSize', 8);

% overlay the test-set points one by one

gscatter(X_test_w_best_feature(:,1), X_test_w_best_feature(:,2), y_test, 'rb', 'xx')

% decision plane
XLIMs = get(gca,'xlim');
YLIMs = get(gca,'ylim');
[xi,yi] = meshgrid(XLIMs(1):0.01:XLIMs(2), YLIMs(1):0.01:YLIMs(2));
dd = [xi(:), yi(:)];
pred_mesh = predict(Md1, dd);
redcolor = [1, 0.8, 0.8];
bluecolor = [0.8, 0.8, 1];
pos = find(pred_mesh == 1);
h1 = plot(dd(pos,1), dd(pos,2), 's', 'MarkerSize', 5, 'MarkerEdgeColor', redcolor, 'MarkerFaceColor', redcolor);
pos = find(pred_mesh == 2);
h2 = plot(dd(pos,1), dd(pos,2), 's', 'MarkerSize', 5, 'MarkerEdgeColor', bluecolor, 'MarkerFaceColor', bluecolor);
uistack(h1,'bottom');
uistack(h2,'bottom');
legend([hgscatter; h_sv], {'setosa', 'versicolor', 'support vectors'})

Comments

Francisco Carrascoza says:

Hey! Where is the code?

Emmanuel Mutabazi says:

Very nice video. Could you please tell me how I can solve this error? In the Feature Selection part, I'm getting this error: [fs,history] = sequentialfs(fun,x_train,y_train,'cv',c,'options',opts,'nfeatures',2);
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Error using crossval>evalFun (line 480)
The function
'@(train_data,train_labels,test_data,test_labels)sum(predict(fitcsvm(train_data,train_labels,'kernelFunction','rbf',test_data)))==test_label'
generated the following error:
Wrong number of arguments.

Error in crossval>getFuncVal (line 497)
funResult = evalFun(funorStr,arg(:));

Error in crossval (line 343)
funResult = getFuncVal(1, nData, cvp, data, funorStr, []);

Error in sequentialfs>callfun (line 485)
funResult = crossval(fun,x,other_data{:},...

Error in sequentialfs (line 353)
crit(k) = callfun(fun,x,other_data,cv,mcreps,ParOptions);
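[Editor's note] The error quoted above comes from misplaced parentheses in the anonymous function: `test_data` ended up inside the `fitcsvm` call instead of `predict`, and the `~= test_labels` comparison fell outside the `sum`. A corrected version, matching the function used in the tutorial, would be:

```matlab
% Misclassification-count function expected by sequentialfs/crossval:
% fitcsvm trains on the training fold, predict scores the held-out fold,
% and sum(... ~= test_labels) counts the prediction errors.
classf = @(train_data, train_labels, test_data, test_labels) ...
    sum(predict(fitcsvm(train_data, train_labels, ...
        'KernelFunction', 'rbf'), test_data) ~= test_labels);
```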

Duverneuil Audrey says:

Hi ! I would like to know if you have an idea or a method for finding an equivalence between svmtrain and fitcsvm outputs. To be clear I have a matlab code written with matlab 2013 version and I am using matlab2018a version. In the oldest code I need the ‘rho’ (decision function wx+b), ‘sv_coef’ (coefficients for SVs for each class) and ‘SVs’ (support vectors) output parameters from the using of svmtrain function but I don’t know how have the same informations with fitcsvm. I hope you can help me. See you soon 😉
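[Editor's note] For anyone with the same question: a rough mapping from the old LIBSVM-style `svmtrain` outputs to the `ClassificationSVM` model returned by `fitcsvm` might look like the sketch below. Sign conventions differ between the two implementations, so treat this as an assumption to verify on your own data, not an exact equivalence.

```matlab
Mdl = fitcsvm(X_train, y_train, 'KernelFunction', 'linear');

SVs     = Mdl.SupportVectors;                    % corresponds to 'SVs'
sv_coef = Mdl.Alpha .* Mdl.SupportVectorLabels;  % signed coefficients, like 'sv_coef'
rho     = -Mdl.Bias;                             % 'rho' (sign convention may differ)

% For a linear kernel, the decision function w*x + b can then be recovered as
% f(x) = x * (SVs' * sv_coef) + Mdl.Bias
```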

Yuvraj Thakur says:

Thank you

Ahmed Ali says:

Nice! What about multi-class classification, including the code?
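[Editor's note] On the multi-class question: `fitcsvm` is binary-only, but `fitcecoc` wraps binary SVM learners in an error-correcting output codes scheme. A minimal sketch using all three fisheriris species (resubstitution accuracy only, for brevity) might be:

```matlab
load fisheriris

% One binary RBF-kernel SVM template, reused for every pairwise problem
t   = templateSVM('KernelFunction', 'rbf', 'Standardize', true);
Mdl = fitcecoc(meas, species, 'Learners', t);  % one-vs-one coding by default

pred = predict(Mdl, meas);
resub_accuracy = mean(strcmp(pred, species)) * 100
```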

MrKillerai says:

Cool video man, thanks

Thoufiq Shuvo says:

Code download link please

Star Mi says:

Hi, very cool video. Do you know how I can compute the distance from every datapoint to the hyperplane?
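[Editor's note] On the distance question: the second output of `predict` is the signed classification score. For a linear kernel, dividing the score by the norm of `Mdl.Beta` gives the geometric distance to the hyperplane; for the RBF model used in the tutorial there is no hyperplane in input space, so the score itself is the usual proxy. A sketch, assuming a linear-kernel model trained on the same selected features:

```matlab
% Hypothetical linear-kernel model for illustration
MdlLin = fitcsvm(X_train_w_best_feature, y_train, 'KernelFunction', 'linear');

% score(:,2) is the signed score for the positive class, in the
% classifier's unnormalized units; dividing by ||Beta|| converts it
% to geometric distance from the separating hyperplane.
[~, score] = predict(MdlLin, X_test_w_best_feature);
dist = score(:,2) / norm(MdlLin.Beta);
```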

Md Mamunur Rahaman says:

Thanks. A great help

Amanda Rodrigue says:

Was the code ever provided?

Pietro Cicalese says:

Can you post a video solving the nonbinary problem with the same dataset (fisheriris)

EDIT: There is an error in this video. There is no need to separate test/training sets prior to k-fold. cvpartition does that for you.
