This resource is totally free: it is a course based on a book which is itself freely available. I came across it while browsing the forum discussions on Coursera Data Science Discussions.

Somebody in a discussion compared this course to the Machine Learning course by Andrew Ng, adding that this one was somewhat more formal.

Since I had enjoyed Dr. Ng's course more than any other before, I decided to take this one, hoping it would be similar.

The course runs on Stanford University's Lagunita platform, which hosts a series of freely available courses. You need to register, and then you have free access to all the courses hosted there.

It consists of ten weeks of video material, and each lesson is split into several videos. Each video has its own review questions, and at the end of each lesson there is a module review. The course examples are based on R; all files used in the R sessions are available together with the datasets.

The course is taught by Trevor Hastie and Rob Tibshirani, who are also co-authors of the freely available books Introduction to Statistical Learning (on which the course is based) and The Elements of Statistical Learning (which goes into more depth).

The course follows an incremental and logical approach: after an introduction, it starts with Linear Regression, then covers Classification, Resampling Methods, and Model Selection and Regularization, then moves beyond linearity, exploring tree-based methods, SVMs, and, at the end, unsupervised learning.

In my opinion it is not fair to compare this course with the Machine Learning one. The ground covered here is more useful for building knowledge from a statistical point of view, while the Machine Learning course is more effective from a programmer's perspective. Both have their reasons for being, and I like this course (and the fact that it is totally free), but I still prefer Professor Ng's approach.