202405181945
Status: #idea
Tags: Multiple Linear Regression
The Method Of Least Squares in Multiple Linear Regression

The logic is identical to the Simple Linear Regression case, except that we now need to represent everything in matrix form.
This changes the formulas, since we now need to find our estimators using matrix calculus, but if you understand the logic in the single-variable case, handling multiple variables is just a matter of re-deriving the formulas in matrix form.
I will not bother going through the derivation, mainly because I have a textbook that does it already, so I will just paste the formulas.
The Importance of the Sum of Squared Errors
Pretty much all the formulas below are derived from some representation of the sum of squared errors, which is the quantity we're trying to minimize. It is good to familiarize yourself with the different ways of writing it.
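As a minimal sketch of what "the quantity we're trying to minimize" means, the snippet below (with made-up toy data and a hypothetical `sse` helper) computes the sum of squared errors for one candidate coefficient vector; least squares is just the search for the vector that makes this number smallest:

```python
import numpy as np

# Toy data (made up for illustration): n = 5 observations, k = 2 predictors.
X = np.array([
    [1.0, 2.0, 1.0],   # first column of ones -> intercept
    [1.0, 1.0, 3.0],
    [1.0, 4.0, 2.0],
    [1.0, 3.0, 5.0],
    [1.0, 5.0, 4.0],
])
y = np.array([6.0, 8.0, 9.0, 13.0, 14.0])

def sse(beta, X, y):
    """Sum of squared errors for a candidate coefficient vector."""
    residuals = y - X @ beta
    return residuals @ residuals

# Any candidate beta gives some SSE; least squares picks the beta minimizing it.
print(sse(np.array([1.0, 1.0, 2.0]), X, y))  # → 2.0
```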
Formulas

From the equation for the fitted values, $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \dots + \hat{\beta}_k x_{ik}$, the sum of squared errors is
$$SSE = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$

The degrees of freedom of $SSE$ is $n - (k + 1)$: one degree of freedom is lost for each of the $k$ slope estimates, plus one for the intercept.
Therefore we can compute the estimator of the error variance $\sigma^2$:
$$MSE = \frac{SSE}{n - (k + 1)}$$
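A quick numerical sketch of the variance estimate, using made-up residuals (the numbers are illustrative only):

```python
import numpy as np

# Toy fit (made-up numbers): n = 5 observations, k = 2 predictors.
n, k = 5, 2
residuals = np.array([1.0, 0.0, 0.0, -1.0, 0.0])  # y_i - yhat_i

sse = residuals @ residuals          # sum of squared errors
df = n - (k + 1)                     # degrees of freedom of SSE
mse = sse / df                       # estimator of sigma^2
print(sse, df, mse)                  # → 2.0 2 1.0
```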

First, the model for each observation is $y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_k x_{ik} + \epsilon_i$.
In matrix form this is written as:
$$y = X\beta + \epsilon$$
Where $y$ is the $n \times 1$ response vector, $X$ is the $n \times (k+1)$ design matrix whose first column is all ones (for the intercept), $\beta$ is the $(k+1) \times 1$ coefficient vector, and $\epsilon$ is the $n \times 1$ error vector.
Since we have a formula for $SSE$ in terms of $\beta$,
$$SSE = (y - X\beta)^\top (y - X\beta)$$
we can minimize it by differentiating with respect to $\beta$ and setting the result to zero.
Recall that $(y - X\beta)^\top (y - X\beta) = y^\top y - 2\beta^\top X^\top y + \beta^\top X^\top X \beta$, so $\frac{\partial SSE}{\partial \beta} = -2X^\top y + 2X^\top X \beta$.
Therefore setting the derivative to zero gives the normal equations $X^\top X \hat{\beta} = X^\top y$, and the least squares estimator is:
$$\hat{\beta} = (X^\top X)^{-1} X^\top y$$
Relevant Links
The Method Of Least Squares In Simple Linear Regression
Video on Multiple Linear Regression
