MATLAB SVM tutorial (fitcsvm)

Sorry to everyone that I did not originally post the code in the description; here it is.

clear; close all; clc;

%% preparing dataset

load fisheriris

species_num = grp2idx(species);

% keep only the first 100 samples so this becomes a binary classification problem
X = randn(100,10);
X(:,[1,3,5,7]) = meas(1:100,:); % features 1, 3, 5, and 7 should be the ones useful for classification
y = species_num(1:100);

rand_num = randperm(size(X,1));
X_train = X(rand_num(1:round(0.8*length(rand_num))),:);
y_train = y(rand_num(1:round(0.8*length(rand_num))),:);

X_test = X(rand_num(round(0.8*length(rand_num))+1:end),:);
y_test = y(rand_num(round(0.8*length(rand_num))+1:end),:);
%% CV partition

c = cvpartition(y_train,'KFold',5);
%% feature selection

opts = statset('Display','iter');
classf = @(train_data, train_labels, test_data, test_labels) ...
    sum(predict(fitcsvm(train_data, train_labels,'KernelFunction','rbf'), test_data) ~= test_labels);

[fs, history] = sequentialfs(classf, X_train, y_train, 'cv', c, 'options', opts, 'nfeatures', 2);
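As a small aside (not in the original video), the sequentialfs outputs can be inspected directly: fs is a logical mask over the 10 candidate columns, and history records the search:

```matlab
% fs is a 1-by-10 logical mask over the candidate features;
% history.In logs which columns were included at each step,
% and history.Crit the criterion value (here, the misclassification count).
selected = find(fs)   % ideally a subset of the informative columns 1, 3, 5, 7
disp(history.Crit)    % criterion after each added feature
```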
%% Best hyperparameter

X_train_w_best_feature = X_train(:,fs);

Md1 = fitccsvm = fitcsvm(X_train_w_best_feature, y_train, 'KernelFunction', 'rbf', ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('AcquisitionFunctionName', ...
    'expected-improvement-plus', 'ShowPlots', true)); % uses Bayesian optimization

%% Final test with test set
X_test_w_best_feature = X_test(:,fs);
test_accuracy_for_iter = sum((predict(Md1,X_test_w_best_feature) == y_test))/length(y_test)*100
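Beyond the single accuracy number, a confusion matrix gives a per-class view of the test results. This is a small extension not in the original video; confusionmat is a standard Statistics and Machine Learning Toolbox function:

```matlab
% rows = true class, columns = predicted class
y_pred = predict(Md1, X_test_w_best_feature);
C = confusionmat(y_test, y_pred)
```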

%% visualize the decision boundary

hgscatter = gscatter(X_train_w_best_feature(:,1),X_train_w_best_feature(:,2),y_train);
hold on;

% feed the test-set data points in one by one.


% decision plane
XLIMs = get(gca,'xlim');
YLIMs = get(gca,'ylim');
[xi,yi] = meshgrid(XLIMs(1):0.01:XLIMs(2), YLIMs(1):0.01:YLIMs(2));
dd = [xi(:), yi(:)];
pred_mesh = predict(Md1, dd);
redcolor = [1, 0.8, 0.8];
bluecolor = [0.8, 0.8, 1];
pos = find(pred_mesh == 1);
h1 = plot(dd(pos,1), dd(pos,2),'s','color',redcolor,'Markersize',5,'MarkerEdgeColor',redcolor,'MarkerFaceColor',redcolor);
pos = find(pred_mesh == 2);
h2 = plot(dd(pos,1), dd(pos,2),'s','color',bluecolor,'Markersize',5,'MarkerEdgeColor',bluecolor,'MarkerFaceColor',bluecolor);

% plot the support vectors so the legend handle h_sv exists
% (note: if hyperparameter optimization chose to standardize the data,
% Md1.SupportVectors are in standardized units)
h_sv = plot(Md1.SupportVectors(:,1), Md1.SupportVectors(:,2), 'ko', 'MarkerSize', 8);

legend([hgscatter; h_sv], {'setosa','versicolor','support vectors'})


Francisco Carrascoza says:

Hey! Where is the code?

Emmanuel Mutabazi says:

Very nice video. Could you please tell me how I can solve this error? In the feature selection part, I'm getting this error: [fs,history] = sequentialfs(fun,x_train,y_train,'cv',c,'options',opts,'nfeatures',2);
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Error using crossval>evalFun (line 480)
The function
generated the following error:
Wrong number of arguments.

Error in crossval>getFuncVal (line 497)
funResult = evalFun(funorStr,arg(:));

Error in crossval (line 343)
funResult = getFuncVal(1, nData, cvp, data, funorStr, []);

Error in sequentialfs>callfun (line 485)
funResult = crossval(fun,x,other_data{:},…

Error in sequentialfs (line 353)
crit(k) = callfun(fun,x,other_data,cv,mcreps,ParOptions);

Duverneuil Audrey says:

Hi! I would like to know if you have an idea or a method for finding an equivalence between the svmtrain and fitcsvm outputs. To be clear, I have MATLAB code written for the 2013 version and I am using the 2018a version. In the old code I need the 'rho' (decision function wx+b), 'sv_coef' (coefficients of the SVs for each class) and 'SVs' (support vectors) outputs of svmtrain, but I don't know how to get the same information from fitcsvm. I hope you can help me. See you soon 😉

Yuvraj Thakur says:

Thank you

Ahmed Ali says:

Nice. What about multiclass classification, including the code?

MrKillerai says:

Cool video man, thanks

Thoufiq Shuvo says:

Code download link please

Star Mi says:

Hi, very cool video. Do you know how I can compute the distance from every datapoint to the hyperplane?

Md Mamunur Rahaman says:

Thanks. A great help

Amanda Rodrigue says:

Was the code ever provided?

Pietro Cicalese says:

Can you post a video solving the nonbinary problem with the same dataset (fisheriris)?

EDIT: There is an error in this video. There is no need to separate test/training sets prior to k-fold. cvpartition does that for you.
