Support Vector Machine lectures

There is a big gap in explanation between Session 16 and Session 17; most of the slides are not covered.


I have rechecked all the videos of SVM and Decision Trees and their slides.
All the slides are covered and explained in the videos.
For the Decision Trees slides, kindly go to Session 18, where they are covered in detail.

I cannot find any gap between the sessions; kindly mention the timestamps for the same.

The Session 16 video ends at slide 117, while the Session 17 video starts from computational complexity, i.e. slide 143.

The slides on "Non-Linear SVM using RBF kernel" were for self-reading; they were included in the slides so that you could explore more of the non-linear kernel types.

RBF is an exponential (non-linear) kernel.
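For illustration, the RBF kernel value K(x, y) = exp(-gamma * ||x - y||^2) can be computed directly; here is a minimal NumPy sketch (the gamma value is just an assumption for illustration):

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian / RBF kernel: exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

x = np.array([1.0, 2.0])
y = np.array([1.0, 2.0])

# Identical points have similarity 1; the similarity decays
# exponentially toward 0 as the points move apart.
print(rbf_kernel(x, y))                     # 1.0
print(rbf_kernel(x, np.array([5.0, 6.0])))  # much smaller than 1
```

Note how the output behaves like a similarity measure: it is largest (1.0) when the two points coincide and shrinks toward zero with squared distance, which is why RBF is called an exponential kernel.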

You can read more about RBF here:-

All the best!

Hi Soumyadeep & Satyajit,

I need to add a few points based on what has been observed:

Session 16 ends abruptly, with Polynomial Features + StandardScaler + LinearSVC being the last point of discussion, followed by the code in the Jupyter notebook.
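For reference, the pipeline that Session 16 ends on can be sketched roughly as follows (a minimal sketch assuming scikit-learn and a moons-style toy dataset; the hyperparameters are illustrative, not necessarily those used in the course):

```python
from sklearn.datasets import make_moons
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import LinearSVC

# Toy non-linearly-separable dataset (illustrative stand-in for the course data).
X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

# Expand features to degree-3 polynomials, scale them, then fit a linear SVM.
polynomial_svm_clf = Pipeline([
    ("poly_features", PolynomialFeatures(degree=3)),
    ("scaler", StandardScaler()),
    ("svm_clf", LinearSVC(C=10, loss="hinge", max_iter=10000, random_state=42)),
])
polynomial_svm_clf.fit(X, y)
print(polynomial_svm_clf.score(X, y))
```

The point of the pipeline is that explicit polynomial features let a *linear* SVM separate data that is not linearly separable in the original feature space.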

Session 17 skips the two concepts, viz.:

b) SVC Polynomial Kernel + StandardScaler
c) SVC RBF Kernel + StandardScaler

(Both form a part of non-linear classification techniques.)
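The two skipped pipelines themselves can be sketched as follows (again a minimal sketch assuming scikit-learn and a moons-style toy dataset; the hyperparameters are illustrative assumptions):

```python
from sklearn.datasets import make_moons
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

# (b) Polynomial kernel: the kernel trick computes the effect of
# high-degree polynomial features without materialising them.
poly_kernel_svm = Pipeline([
    ("scaler", StandardScaler()),
    ("svm_clf", SVC(kernel="poly", degree=3, coef0=1, C=5)),
])

# (c) RBF (Gaussian) kernel: similarity decays with distance,
# with the decay rate controlled by gamma.
rbf_kernel_svm = Pipeline([
    ("scaler", StandardScaler()),
    ("svm_clf", SVC(kernel="rbf", gamma=2, C=1)),
])

for name, model in [("poly", poly_kernel_svm), ("rbf", rbf_kernel_svm)]:
    model.fit(X, y)
    print(name, model.score(X, y))
```

The contrast with the Session 16 pipeline is that here no explicit feature expansion step is needed; the non-linearity lives entirely inside the kernel.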

To be frank, Session 17 skips the aforesaid concepts and starts directly with SVM Regression.
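For what it is worth, the SVM Regression idea that Session 17 opens with can be sketched like this (a minimal scikit-learn sketch on synthetic linear data; `epsilon` sets the width of the no-penalty tube):

```python
import numpy as np
from sklearn.svm import LinearSVR

# Synthetic linear data: y ≈ 4 + 3x plus noise (illustrative only).
rng = np.random.RandomState(42)
X = 2 * rng.rand(100, 1)
y = (4 + 3 * X + rng.randn(100, 1)).ravel()

# LinearSVR fits a "tube" of half-width epsilon around the prediction line;
# training points inside the tube contribute no loss at all.
svm_reg = LinearSVR(epsilon=1.5, max_iter=10000, random_state=42)
svm_reg.fit(X, y)
print(svm_reg.predict([[1.0]]))
```

The prediction at x = 1.0 should come out near 4 + 3·1 = 7, since the regressor recovers the underlying line while ignoring small deviations inside the tube.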

I can pinpoint these gaps because I have jotted down the explanations and finer aspects/points that Sandeep Giri explains so well. (Note: I am writing this down so as to understand how to explain such concepts when questioned during interviews; moreover, Faculty Sandeep Giri's lucid explanations usually fill those gaps)…but in this case, the explanation of the aforesaid concepts is missing.

In Session 16, some errors generated in the code took time to resolve, and this may have resulted in the oversight regarding the continuity between the two sessions, i.e. Session 16 and Session 17.

Also, the important concepts of kernel and gamma are not explained in the lecture discussions; they appear in neither Session 16 nor Session 17. Kindly correct me if I am wrong.
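On the gamma concept specifically: in the RBF kernel, gamma controls how far each training example's influence reaches. A small gamma gives a wide, smooth bell and a smoother decision boundary, while a large gamma gives narrow bells and a boundary that hugs the training points tightly, risking overfitting. A minimal sketch of that effect (assuming scikit-learn; values are illustrative):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=42)

# Larger gamma -> each support vector's influence is more local,
# so the decision boundary becomes more flexible and fits the
# training set more closely (watch for overfitting).
scores = {}
for gamma in (0.1, 1, 10):
    clf = SVC(kernel="rbf", gamma=gamma, C=1).fit(X, y)
    scores[gamma] = clf.score(X, y)
    print(f"gamma={gamma}: train accuracy = {scores[gamma]:.3f}")
```

Training accuracy typically rises with gamma here, which is exactly the flexibility-versus-overfitting trade-off that gamma tunes.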

Can someone explain the aforesaid concepts to me, or provide a practical verbal explanation, as part of short notes or an addendum?

Somehow the code can be managed by referring to the Jupyter notebooks shared by our training academy; there is no need for spoon-feeding on every little aspect that may be found missing. However, in my view, the explanation of the aforesaid concepts is important in order to grasp the lecture contents of Session 17.

Hoping that the Cloud X Lab team will look into this glaring gap and bridge the shortcomings with a favourable response…

Hi, Sameer.

As I mentioned, some of the concepts are meant for self-exploration and discussion.

The main purpose of the training is to provide inspiration about the concepts and cover them breadth-wise; it then depends on your perspiration how deep you go into the topics and concepts.

If you still have any doubts about the mathematical concepts or the coding, you can post them here and I will try to resolve them to the best of my knowledge.

All the best!

I appreciate Satyajit for the inspiration given.

After reading and understanding the concepts, I have shared my viewpoints/thoughts on the SVM topic at this link: ML - SVM (Support Vector Machine) Technique in a Nutshell

Thanks once again!!!

Thanks, Sameer, for the great compilation and contributions!


I appreciate your feedback, Satyajit!

Rather, I would be glad if you could pinpoint any errors in that article, in case there are any. That would help me improve my understanding of the concepts…

Okay, I will let you know!

:grinning: :ok_hand:t6: :pray:t6: