Friday, 20 December 2013

Murphy's ML book

For the next few months, I'll be working on making kernel methods faster.
I was in the process of preparing a FAQ-style article on SVMs and the kernel trick, with emphasis on computational cost, when I came across Murphy's book on ML. I was amazed to find an entirely new perspective on SVMs, along with some severe criticism of them, in this book. To understand the criticism, I had to delve into the Bayesian approach to ML, which is the approach the book is built around. I did a course on Machine Learning earlier (CS771: Machine Learning: Tools and Techniques), but that course did not place much emphasis on the Bayesian approach. I have also watched some of Yaser Abu-Mostafa's ML lectures, and the Bayesian approach does not figure much there either. This outlook is new to me, and reading into it has led me to several fascinating concepts. My guide tells me that the community has also been taking a liking to Bayesian approaches of late.

Now I am reading about several kinds of vector machines: sparse vector machines, l1-regularized vector machines, l2-regularized vector machines, and relevance vector machines. There is a lot of talk about sparsity-promoting priors in Murphy's book. This whole prior business comes from the Bayesian outlook. I don't fully understand it yet, and I'll have to go through the first 14 chapters of Murphy's book before I can continue with my article on SVMs and accelerating kernel methods.
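To make the sparsity point concrete for myself: in the Bayesian view, l1 regularization corresponds to a MAP estimate under a Laplace prior on the weights (a sparsity-promoting prior), while l2 corresponds to a Gaussian prior, which only shrinks weights without zeroing them. Here is a small sketch of that difference using scikit-learn (this is my own illustration, not from Murphy's book; the data and the alpha value are arbitrary choices):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic regression data: y depends on only 2 of 10 features,
# so the "true" weight vector is sparse.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_w = np.zeros(10)
true_w[0], true_w[1] = 3.0, -2.0
y = X @ true_w + 0.1 * rng.normal(size=200)

# l1 penalty (MAP under a Laplace prior) drives irrelevant
# weights exactly to zero ...
lasso = Lasso(alpha=0.1).fit(X, y)
# ... while the l2 penalty (MAP under a Gaussian prior) merely
# shrinks them toward zero.
ridge = Ridge(alpha=0.1).fit(X, y)

print("lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
print("ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
```

On data like this, the lasso fit recovers a genuinely sparse weight vector (most coefficients exactly zero), whereas the ridge fit keeps every coefficient nonzero, just small. The relevance vector machine takes this idea further with a hierarchical prior per weight.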
