J**T
Actually does something (huge) with the math
I have been using The Elements of Statistical Learning for years, so it is finally time to try to review it. The Elements of Statistical Learning is a comprehensive mathematical treatment of machine learning from a statistical perspective. This means you get good derivations of popular methods such as support vector machines, random forests, and graphical models; but each is developed only after the appropriate (and wrongly considered less sexy) statistical framework has already been derived (linear models, kernel smoothing, ensembles, and so on).

In addition to having excellent and correct mathematical derivations of important algorithms, The Elements of Statistical Learning is fairly unique in that it actually uses the math to accomplish big things. My favorite examples come from Chapter 3, "Linear Methods for Regression." The standard treatments of these methods depend heavily on rote memorization and regurgitation of the original iterative procedure definitions of the various regression methods. In such a formulation, two regression methods are different if they have superficially different steps or different citation/priority histories. The Elements of Statistical Learning instead derives the stopping conditions of each method, considers methods the same if they generate the same solution (regardless of how they claim they do it), and compares the consequences and results of different methods. This hard use of isomorphism allows amazing results such as Figure 3.15 (which shows how Least Angle Regression differs from Lasso regression, not just in algorithm description or history, but by picking different models from the same data) and Section 3.5.2 (which separates Partial Least Squares' design claim of fixing the x-dominance found in principal components analysis from how effective it actually is at fixing such problems).

The biggest issue is who the book is for. This is a mathy book emphasizing deep understanding over mere implementation. Unlike some lesser machine learning books, the math is not there for appearances or intimidating typesetting: it is there to allow the authors to organize many methods into a smaller number of consistent themes. So I would say the book is for researchers and machine learning algorithm developers. If you have a specific issue that is making inference difficult, you may find the solution in this book. This is good for researchers but probably off-putting for tinkerers (as this book likely has methods superior to their current favorite new idea). The interested student will also benefit from this book; the derivations are done well, so you learn a lot by working through them.

Finally, don't buy the Kindle version; buy the print book. This book is satisfying deep reading and you will want the advantages of the printed page (and Amazon's issues in conversion are certainly not the authors' fault).
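To make the reviewer's Least Angle Regression versus Lasso point concrete, here is a minimal sketch in Python with scikit-learn. It is my own illustration, not code from the book: it fits both coefficient paths to the same data so you can see that the two procedures do not always select the same models. The diabetes dataset and the lars_path function are assumptions made purely for illustration; any regression dataset would do.

# Illustrative only: compare Least Angle Regression and Lasso coefficient
# paths on the same data, in the spirit of the book's Figure 3.15 discussion.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)

# method='lar' gives the pure Least Angle Regression path;
# method='lasso' gives the Lasso path computed via the LARS modification.
alphas_lar, _, coefs_lar = lars_path(X, y, method='lar')
alphas_lasso, _, coefs_lasso = lars_path(X, y, method='lasso')

# The two paths agree early on but can diverge once a Lasso coefficient
# would cross zero, so the fitted models along the paths are not identical.
print("LAR path shape:  ", coefs_lar.shape)    # (n_features, n_steps)
print("Lasso path shape:", coefs_lasso.shape)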
M**O
excellent overview, especially for outsiders, ties the field together conceptually
This review is written from the perspective of a programmer who has sometimes had the chance to choose, hire, and work with algorithms and the mathematicians/statisticians that love them in order to get things done for startup companies. I don't know if this review will be as helpful to professional mathematicians, statisticians, or computer scientists.

The good news is, this is pretty much the most important book you are going to read in the space. It will tie everything together for you in a way that I haven't seen any other book attempt. The bad news is you're going to have to work for it. If you just need to use a tool for a single task, this book won't be worth it; think of it as a way to train yourself in the fundamentals of the space, but don't expect a recipe book. Get something in the "using R" series for that.

When it came out in 2001, my sense of machine learning was of a jumbled set of recipes that tended to work in some cases. This book showed me how the statistical concepts of bias, variance, smoothing, and complexity cut across both traditional statistics and inference and the machine learning algorithms made possible by cheaper CPUs. Chapters 2-5 are worth the price of the book by themselves for their overview of learning, linear methods, and how those methods can be adapted for non-linear basis functions.

The hard parts:

First, don't bother reading this book if you aren't willing to learn at least the basics of linear algebra first. Skim the second and third chapters to get a sense of how rusty your linear algebra is, and then come back when you're ready.

Second, you really, really want to use the SQRRR technique with this book. Having that glimpse of where you are going really helps guide your understanding when you dig in for real.

Third, I wish I had known of R when I first read this; I recommend using it along with some sample data sets to follow along with the text so the concepts become skills, not just abstract relationships to forget. It would probably be worth the extra time, and I wish I had known to do that then.

Fourth, if you are reading this on your own time while making a living, don't expect to finish the book in a month or two.
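In the spirit of the reviewer's third point (follow the text with real data so bias, variance, and complexity become skills rather than abstractions), here is a minimal follow-along sketch. It is not from the book, and it uses Python with scikit-learn rather than the R the reviewer suggests; the synthetic data and the choice of k-nearest-neighbour regression are assumptions made only to show test error trading off against model complexity.

# Illustrative follow-along exercise (not from the book): vary model
# complexity and watch training vs. test error, i.e. variance vs. bias.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(400, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=400)  # noisy sine curve

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small k = flexible, high-variance fit; large k = smooth, high-bias fit.
for k in (1, 5, 25, 100):
    model = KNeighborsRegressor(n_neighbors=k).fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"k={k:3d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")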
A**S
Understand the Rapidly Advancing Avalanche of Data Mining Techniques
Math books, at least data science texts, can usually be divided into those which are easy to read but contain little technical rigor and those which are written with a scientific approach to methodology but are so equation-dense that it's hard to imagine them being read outside an advanced academic setting.

Fortunately, The Elements of Statistical Learning proves the exception. The text is full of the equations necessary to ground the methodology without burdening the reader with long proofs that would tax those of us employing these techniques in the business world.

The visual aspects of the text seem to have been designed with John Tukey or Edward Tufte in mind. Though the frequent use of figures makes the book some seven hundred pages long, reading and comprehension are made much easier.

And, though it's been almost ten years since the book was published, the techniques described remain, for the most part, at the cutting edge of data science.

I was told by some other analysts I know that this was their bible for data science. I was somewhat skeptical of this kind of hyperbole but was pleasantly surprised that the book matched these high expectations. If you have an undergraduate degree in a mathematically related discipline, The Elements of Statistical Learning will prove to be an invaluable reference for understanding the rapidly advancing avalanche of data mining techniques.
L**S
Excellent reference
A necessary book for the library of statisticians and data scientists. It presents the theory associated with the main statistical methods used in machine learning, giving a name to a new area that repurposes traditional statistical methods for data science: statistical learning. A fundamental book for anyone who wants to go deeper into the field.
C**R
Bad quality rip-off, barely readable
My negative review is not about the content of the book, but about the physical item itself. This is a smelly, bad-quality rip-off, not the usual Springer edition! Thin pages, bad ink, very hard to read. Be careful when ordering.
A**R
Got a black and white print
I have read chapters 5 through 9 (about 200 pages) so far. Excellent read, although the print is black and white on ordinary paper.
A**R
A Deniz
THANK YOU
C**N
Excellent book, a.k.a. The ML Bible
Excellent purchase experience. The book arrived in very good condition, printed in color and with good paper quality, considering the price. The content of the book is excellent, and anyone not yet convinced about buying it can look for the free digital version available on the Internet.