Here y is the prediction result, b0 is the intercept (the bias), and b1 represents the coefficient of the input x.
This regression models the probability of the first class, or the default class. For instance, if we are trying to predict the gender of people from given height values, then we may take male as the default class. In such an instance, we can write the probability formally as follows, where s = sex, m = male, and h = height.
P(s = m|h)
Observe closely that what we are modelling is the probability that y is the first class, given our input x.
P(X) = P(Y = 1|X)
It is important to note that while the method of logistic regression is linear, the estimations are processed with the help of the logistic function. As a result, it differs from linear regression: the output is not simply a linear combination of the inputs, but that linear combination passed through the logistic function.
The estimations for the values of the coefficients must be performed via the training data. For this purpose, maximum likelihood estimation is used. Predictions are easy with logistic regression: you just have to put the right values from the data into the model. For instance, suppose we have a model that predicts whether a student is good at their studies or not, based on their marks. Consider that in the given data, a student has 40 marks. Provided we have the coefficient values b0 = −18 and b1 = 0.4, we can generate an estimate of the probability of the student being deficient in academics, P(bad | marks = 40).
y = e^(b0 + b1*x) / (1 + e^(b0 + b1*x))
y = e^(−18 + 0.4*40) / (1 + e^(−18 + 0.4*40))
y = 0.1192029220221176
Now, we did get a result, but how do we turn this output into an assessment? For that we must have a benchmark. For instance, suppose our benchmark is 0.20: a student is classified as bad at studies if this probability is more than 0.20, and as good if it is less than 0.20. Since 0.1192 is below the benchmark, this student is classified as good.
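The worked example above can be reproduced in a few lines of Python. This is a minimal sketch; the function name and the thresholding step are ours, written only to mirror the calculation in the text.

```python
import math

def predict_logistic(b0, b1, x):
    """Logistic regression prediction: e^(b0 + b1*x) / (1 + e^(b0 + b1*x))."""
    z = b0 + b1 * x
    return math.exp(z) / (1 + math.exp(z))

# Coefficients from the worked example: b0 = -18, b1 = 0.4, marks = 40.
p_bad = predict_logistic(-18, 0.4, 40)
print(round(p_bad, 4))  # probability the student is bad at studies

# Apply the 0.20 benchmark: below the threshold, classify the student as good.
label = "bad" if p_bad >= 0.20 else "good"
print(label)
```

Note that the same expression can be written with the sigmoid form 1 / (1 + e^(−z)); the two are algebraically identical.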
Linear Regression
Similar to logistic regression, linear regression also comes from statistics, where it is used for determining the relationship between numerical input and output variables.
Linear regression follows a linear model; that is, there is a linear relationship between its input and output variables. The input is represented by x and the output is represented by y. To delve further, the value of y is determined as a linear combination of the x values.
Whenever there is only a single input variable, the method is called simple linear regression. Otherwise, with multiple input variables, it is called multiple linear regression.
The equation of linear regression assigns a scale factor to each input value. This factor is referred to as a coefficient and is represented by B (Beta). There is one more coefficient, known as the bias coefficient (the intercept).
For instance, a regression problem with a single input variable x (simple linear regression),
takes the following equation.
y = B0 + B1*x
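The coefficients B0 and B1 can be estimated from training data by least squares. The following is a minimal sketch; the function name and the toy data points are ours, chosen so the data lies exactly on the line y = 2 + 3*x.

```python
def fit_simple_linear(xs, ys):
    """Least-squares estimates of B0 (bias) and B1 (slope) for y = B0 + B1*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # B1 = covariance(x, y) / variance(x); B0 puts the line through the means.
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    b0 = mean_y - b1 * mean_x
    return b0, b1

# Toy data on the line y = 2 + 3*x.
xs = [1, 2, 3, 4]
ys = [5, 8, 11, 14]
b0, b1 = fit_simple_linear(xs, ys)
print(b0, b1)  # recovers B0 = 2, B1 = 3
```

With the fitted coefficients in hand, a prediction for a new x is simply b0 + b1 * x, exactly as in the equation above.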