Robust regression

In the example datasets used in this chapter, we have seen that some observations might threaten the reliability of our results because their residuals deviate from a normal distribution. The Shapiro-Wilk test performed on the residuals of model1 (the nurses dataset) showed that the distribution of the residuals was not significantly different from a normal distribution. Nevertheless, let's be particularly cautious and analyze the same data using robust regression.
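As a reminder, assuming model1 is still in the workspace from earlier in the chapter, that test can be reproduced as follows (a p-value above .05 indicates no significant departure from normality):

# Shapiro-Wilk test on the residuals of model1, as performed earlier
shapiro.test(residuals(model1))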

As we mentioned earlier, robust regression does not require the residuals to be normally distributed and therefore fits our purpose. We will not explore the underlying algorithm here; for details, the reader can consult Robust Regression in R by Fox and Weisberg (2012). We simply perform robust regression using the rlm() function from the MASS package. Let's first install and load it:

install.packages("MASS"); library(MASS)
model1.rr = rlm(Commit ~ Exhaus + Depers + Accompl, data = nurses)
summary(model1.rr)

The output is as follows:

(Output of summary(model1.rr): coefficient estimates with their standard errors and t values)

You might notice that the output of rlm() is laconic compared to the output of lm(): there are no p-values, no R-squared value, and no F test. This makes rlm() somewhat impractical, as the user has to compute these by hand, and there is enough controversy about how to do so that the computations reported by other software packages are themselves questioned. The reader interested in computing a robust R-squared can consult the paper A robust coefficient of determination for regression by Renaud and Victoria-Feser (2010).
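To give an idea of what computing these by hand can look like, here is a minimal sketch of one common Wald-type approximation: two-sided p-values derived from the t values reported by summary(), using a t distribution with n - p residual degrees of freedom. This is only one of several conventions, not the definitive way to do robust inference:

# One possible approximation: two-sided p-values from the reported t values,
# using n - p residual degrees of freedom (only one of several conventions)
coefs = summary(model1.rr)$coefficients    # columns: Value, Std. Error, t value
df.res = nrow(nurses) - nrow(coefs)        # residual degrees of freedom
p.values = 2 * pt(abs(coefs[, "t value"]), df = df.res, lower.tail = FALSE)
round(p.values, 4)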

For our example, the results obtained with lm() and rlm() are quite similar (compare with the output of the preceding summary of model1), so relying on lm() is advised here. However, if you want to be really sure, why not try bootstrapping?
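As a taste of what this involves, here is a minimal sketch using the boot package: the robust coefficients are re-estimated on resampled rows of the nurses data frame, and percentile confidence intervals are read off the bootstrap distribution. The function name boot.coefs and the choice of 1,000 replicates are illustrative assumptions, not a prescribed procedure:

# A minimal sketch of bootstrapping the robust coefficients (illustrative only)
library(boot)
boot.coefs = function(data, indices) {
  coef(rlm(Commit ~ Exhaus + Depers + Accompl, data = data[indices, ]))
}
set.seed(123)                                    # reproducible resampling
model1.boot = boot(nurses, boot.coefs, R = 1000)
boot.ci(model1.boot, type = "perc", index = 2)   # interval for the Exhaus coefficient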
