Can gradient descent be used with SVM?

In the PPT template mentioned above, there are two statements:
a) The 1st line tells us about Support Vector Machines.
b) The 2nd line tells us about Gradient Descent.

I didn’t understand these two lines. Are the two statements complementing or contradicting each other? Kindly clarify.


SVM has two parts to it: the SVM loss and the SVM algorithm.

In the case of classification, the SVM loss essentially says: "Come up with the widest possible lane (margin) that separates the classes well."

Now, how do you come up with such a lane? You can either use gradient descent or you can use geometric optimization. This geometric optimization is the SVM algorithm.

So yes, you can use gradient descent with the SVM loss function.
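To make this concrete, here is a minimal sketch (on made-up toy data, with illustrative names like `lam` and `lr` that are my own) of training a linear SVM by plain subgradient descent on the regularized hinge loss, rather than by the geometric/QP route:

```python
import numpy as np

# Toy two-class data, labels in {-1, +1} (assumed for illustration)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(+2.0, 1.0, size=(50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# Objective: lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w + b)))
w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1

for _ in range(200):
    margins = y * (X @ w + b)
    active = margins < 1  # points inside or on the wrong side of the lane
    # Subgradient of the hinge term is nonzero only for "active" points
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(X)
    grad_b = -y[active].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(np.sign(X @ w + b) == y)  # training accuracy
```

The same idea is what off-the-shelf tools like scikit-learn's `SGDClassifier(loss="hinge")` implement: gradient descent applied directly to the SVM loss.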


Please note that I have moved the second part of the question here: Which algorithms can be used for online learning?

Have noted your comments. Thanks for the segregation of the two queries/doubts.

Thanks a ton Sandeep… you have explained the concepts very clearly using simple English… It improves my understanding of the SVM concepts.

However, one doubt still remains: is this geometric optimization (the SVM algorithm) the same as quadratic programming? Or are these two concepts different from each other? Kindly let me know.

Yes. Quadratic programming problems are a class of optimization problems with a quadratic objective and multiple (linear) constraints, and the SVM training problem is one of them.
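For reference, the standard soft-margin SVM training problem can be written as a quadratic program — a quadratic objective subject to linear constraints:

```latex
\min_{w,\, b,\, \xi} \;\; \frac{1}{2}\lVert w\rVert^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i \left( w^\top x_i + b \right) \ge 1 - \xi_i, \qquad \xi_i \ge 0, \quad i = 1, \dots, n.
```

QP solvers (the route taken by classic SVM implementations such as libsvm) solve this, or its dual, exactly; gradient descent instead minimizes the equivalent unconstrained hinge-loss objective.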

The SVM problem has multiple geometric and algebraic constraints. It is not really needed, but if you are interested in learning more about SVM, please check here:


[quote=“sgiri, post:6, topic:5551”]
SVM problems have multiple geometric and algebraic constraints.
[/quote] :writing_hand: :+1: :grinning:
Thanks for your prompt reply… and for sharing the link regarding SVM.
