Modeling with Linear Regression

"In more than three centuries of science everything has changed except perhaps one thing: the love for the simple."
– Jorge Wagensberg

Music, from classical compositions to "Sheena Is a Punk Rocker" by The Ramones, by way of an unrecognized garage-band hit and Piazzolla's Libertango, is made of recurring patterns. The same scales, chord combinations, riffs, motifs, and so on appear over and over again, giving rise to a wonderful sonic landscape capable of eliciting and modulating the entire range of emotions humans can experience. In a similar fashion, the universe of statistics and machine learning (ML) is built upon recurring patterns, small motifs that appear again and again. In this chapter, we are going to look at one of the most popular and useful of them, the linear model (or motif, if you prefer). It is a useful model in its own right and also the building block of many other models. If you have ever taken a statistics course (even a non-Bayesian one), you may have heard of simple and multiple linear regression, logistic regression, ANOVA, ANCOVA, and so on. All these methods are variations on the same underlying motif, the linear regression model; a minimal sketch of that shared motif follows the topic list below. In this chapter, we will cover the following topics:

  • Simple linear regression
  • Robust linear regression
  • Hierarchical linear regression
  • Polynomial regression
  • Multiple linear regression
  • Interactions
  • Variable variance
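
All of these topics build on the same core structure: an observed variable modeled as a (possibly transformed) linear function of one or more predictors plus noise. As a preview, here is a minimal sketch of a simple Bayesian linear regression. The library (PyMC3), the synthetic data, and the particular priors are illustrative assumptions, not the chapter's exact code:

import numpy as np
import pymc3 as pm

# Illustrative synthetic data: a noisy straight line
np.random.seed(0)
x = np.random.normal(size=100)
true_alpha, true_beta = 1.0, 2.5
y = true_alpha + true_beta * x + np.random.normal(scale=0.5, size=100)

with pm.Model() as linear_model:
    # Weakly informative priors for the intercept, slope, and noise scale
    alpha = pm.Normal('alpha', mu=0, sigma=10)
    beta = pm.Normal('beta', mu=0, sigma=10)
    epsilon = pm.HalfCauchy('epsilon', beta=5)
    # The linear motif: the mean of y is a straight line in x
    mu = alpha + beta * x
    y_obs = pm.Normal('y_obs', mu=mu, sigma=epsilon, observed=y)
    trace = pm.sample(1000)

Swapping the Gaussian likelihood for a Student's t-distribution gives a robust regression, adding group-level priors gives a hierarchical regression, and adding powers of x or more predictors gives polynomial and multiple regression, which is why all the topics above can be seen as variations on this one sketch.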