[Day 50 - My ML journey does not end today!] KAIST's AI503 Mathematics for AI (PCA, GMM, SVM)

 Hello :)
Today is Day 50!

And it is not the last day. I will continue every day, studying ML and DL and writing about it on this blog. After this post, I will go change the title to 50+ days of ML haha

When I decided to do this (which was around Christmas) I did not know what I was getting into, or whether I would be motivated to not only study every day but also write this blog about it. Tbh writing the blog is sometimes harder - mainly because I end up writing it around 2-3am, when I am sleepy and just want to go to bed haha.

But Machine Learning and Deep Learning have completely taken over me. I am always attracted to the difficult. In ML I see something extremely difficult and challenging, a topic so deep and full of potential that it makes me excited to study it every day. ML involves so many things and topics that the sheer amount excites me and motivates me to continue. Because I know that even if I learn X, there is an X+1 which is harder, but also, many times, there is an X-1 (some basics) which is harder than X itself.

This vacation I found something... something that challenges me, and because it challenges me - it excites me. 


Now, a quick summary of today:
  • Lecture 5: Principal Component Analysis
  • Lecture 6: Gaussian Mixture Models
  • Lecture 7: Support Vector Machines

These are the last 3 lectures and the last chapters of the Mathematics for Machine Learning book.

Today I remembered a piece of advice from lecture one of KAIST's AI503 Math for AI course:
'Do not ponder where this mathematics is useful'

When I started the course I kept this in mind: "I will focus on understanding how the math works and how we derive the formulas." Seeing how and where a formula is useful is very hard, because if I learn a formula in chapter 1, maybe only after some time will I learn that it is used in chapter 10 (of the same, or maybe even a second, book haha).

Like yesterday, I uploaded my notes to my Google Drive, but they are also below as pictures. SVMs are a topic I am a bit familiar with, but PCA and GMM I was excited to learn in depth.


PCA


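Since the notes are pictures, here is also a tiny numpy sketch of the core PCA recipe the way I remember it (toy data and a top-2 projection, just my own illustration, not the book's code):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))           # toy data: 200 samples, 5 features

# 1. Center the data
X_centered = X - X.mean(axis=0)

# 2. Covariance matrix and its eigendecomposition (eigh, since it is symmetric)
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# 3. Sort eigenvectors by decreasing eigenvalue and keep the top k
k = 2
order = np.argsort(eigvals)[::-1]
W = eigvecs[:, order[:k]]               # the principal components

# 4. Project onto the principal subspace
X_proj = X_centered @ W
print(X_proj.shape)                     # (200, 2)

In practice I would just call sklearn.decomposition.PCA, but doing the eigendecomposition by hand is closer to how the book derives it.
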
GMM


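And a small EM sketch for a 1-D mixture of two Gaussians in plain numpy (again just my own toy illustration of the E-step / M-step loop, not the book's notation):

import numpy as np

rng = np.random.default_rng(1)
# toy 1-D data drawn from two Gaussians
x = np.concatenate([rng.normal(-2, 1.0, 150), rng.normal(3, 0.5, 100)])

K = 2
pi = np.full(K, 1 / K)                  # mixing weights
mu = rng.choice(x, K, replace=False)    # initial means (random data points)
var = np.full(K, x.var())               # initial variances

def gauss(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility r[n, k] of component k for point n
    r = pi * gauss(x[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate weights, means and variances from the responsibilities
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print(pi, mu, var)

In higher dimensions the variances become full covariance matrices; sklearn.mixture.GaussianMixture handles all of that for you.
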
SVM

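For SVM, instead of re-deriving the dual here, a minimal scikit-learn example with a linear kernel on made-up blobs (the data and parameters are just for illustration):

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# two linearly separable blobs in 2-D
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

print(clf.support_vectors_.shape)   # only the support vectors define the boundary
print(clf.coef_, clf.intercept_)    # w and b of the separating hyperplane

The nice part is that only the support vectors (the points on or inside the margin) matter for the decision boundary - that sparsity is exactly what the dual formulation gives you.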

Tomorrow I hope to cover the next 3, which I have heard mentioned in other lectures:

  • Lecture 8: High-dimensional Space 
  • Lecture 9: Random Walks and Markov Chains
  • Lecture 10: VC-Dimension

That is all for today, day 50! Not the end!

See you tomorrow :) 
