Multiple regression

Most real-world analyses involve more than one independent variable. Multiple regression extends simple linear regression: the key difference is that there is an additional beta coefficient for each additional predictor variable. When training a model, the goal is to find the beta coefficients that minimize the errors of the linear equation. Let's formulate the relationship between the dependent variable and the set of independent variables (features) mathematically.

Similar to a simple linear equation, the dependent variable, y, is quantified as the sum of an intercept term plus the product of the β coefficients multiplied by the x value for each of the i features:

y = α + β₁x₁ + β₂x₂ + ... + βᵢxᵢ + ε

The error term, ε, captures the fact that the predictions are not perfect.

The β coefficients allow each feature to have a separate estimated effect on the value of y: y changes by βᵢ for each unit increase in xᵢ. Moreover, the intercept (α) indicates the expected value of y when the independent variables are all 0.
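This interpretation of the coefficients can be sketched in a few lines of Python. The values of α and β below are purely illustrative, not fitted from any data; the point is that increasing one feature by one unit changes the prediction by exactly that feature's coefficient.

```python
import numpy as np

# Illustrative (made-up) coefficients for a model with two features
alpha = 2.0                    # intercept: expected y when all features are 0
beta = np.array([0.5, -1.5])   # separate effect of each feature on y

def predict(x):
    """y = alpha + beta_1*x_1 + beta_2*x_2 (error term omitted)."""
    return alpha + beta @ x

x = np.array([3.0, 1.0])
y0 = predict(x)

# Raising x_1 by one unit shifts the prediction by exactly beta[0]
y1 = predict(x + np.array([1.0, 0.0]))
print(y1 - y0)  # 0.5
```

Setting both features to zero returns the intercept alone, matching the interpretation of α above.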

Note that all the variables in the preceding equation can be represented in vector form. The target, y, and the errors, ε, become vectors with one entry per observation, each observation's predictor values form a row of a matrix, and the regression coefficients, β, form a vector as well.
