202405271901
Status: #reference
Tags: Regression
MAT3375(X) ~ Regression Analysis
Modeling relationships between outcome (dependent) variables and covariates (independent variables) using simple and multiple linear regression models. Estimation and hypothesis testing using least squares and likelihood methods. Performing model diagnostics and assessing goodness-of-fit properties. Variable selection and finding the best fit. Non-linear regression and transformations. Weighted regression and generalized least squares. Analysis of data using statistical software packages.
Lecture 1

The first lesson starts slowly with the basics: we are introduced to the most fundamental regression technique, Simple Linear Regression, and we set the tempo. We will not be dawdling; by the end of the lesson we have seen how to derive the formulas for the estimators in the simple case, how to derive the different inference techniques, and the distributions of the parameters involved.
The course will be fast-paced.
Lesson Points:
Simple Linear Regression
The Method Of Least Squares In Simple Linear Regression
Properties of Least Square Estimators in SLR
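The closed-form least-squares estimators from this lesson can be sketched in a few lines. A minimal illustration with invented, noise-free data (the variable names are mine, not the course's):

```python
import numpy as np

# Invented data: y lies exactly on the line 2 + 3x for clarity
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 3.0 * x

# Least-squares estimators for the model y = b0 + b1*x + e:
#   b1 = Sxy / Sxx,   b0 = ybar - b1 * xbar
x_bar, y_bar = x.mean(), y.mean()
Sxy = np.sum((x - x_bar) * (y - y_bar))
Sxx = np.sum((x - x_bar) ** 2)
b1 = Sxy / Sxx
b0 = y_bar - b1 * x_bar
```

With noise-free data the estimators recover the true intercept and slope exactly.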
Lecture 2

We start by reviewing simple linear regression.
We then take the rest of the period to explore Analysis of Variance (ANOVA) after briefly introducing multiple linear regression.
Lesson Points
Multiple Linear Regression
Analysis of Variance (ANOVA)
Hypothesis Testing about Rho
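The ANOVA identity at the heart of this lecture, SST = SSR + SSE, can be checked numerically. A sketch with invented data (my own variable names):

```python
import numpy as np

# Invented noisy data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# Fit simple linear regression by least squares
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# ANOVA decomposition of the total variability
SST = np.sum((y - y.mean()) ** 2)      # total sum of squares
SSR = np.sum((y_hat - y.mean()) ** 2)  # regression (model) sum of squares
SSE = np.sum((y - y_hat) ** 2)         # error (residual) sum of squares
```

For a least-squares fit with an intercept, the decomposition holds up to floating-point error.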
Lecture 3
We start directly with Multiple Linear Regression, which can be seen as a generalization of Simple Linear Regression.
Since it would quickly become unwieldy to redo the kind of algebra we did for simple linear regression on arbitrarily many parameters, it is a pressing issue to find a way to do the calculations once so that they apply to any number of parameters.
And what better way to do that than to resort to Linear Algebra and matrices? Indeed, Multiple Linear Regression makes full use of matrices and matrix calculus to derive its formulas. In this lesson we cover that, and go over the formulas to remember for each part.
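The matrix form does all the algebra at once: stacking a column of ones with the predictors into a design matrix X, the least-squares estimate is β̂ = (XᵀX)⁻¹Xᵀy. A quick numeric sketch with invented data:

```python
import numpy as np

# Invented data with two predictors
X_raw = np.array([[1.0, 2.0],
                  [2.0, 1.0],
                  [3.0, 4.0],
                  [4.0, 3.0],
                  [5.0, 6.0]])
y = 1.0 + 2.0 * X_raw[:, 0] + 0.5 * X_raw[:, 1]  # exact linear response

# Design matrix: prepend a column of ones for the intercept
X = np.column_stack([np.ones(len(y)), X_raw])

# Solve the normal equations (X'X) beta = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

Since the response is exactly linear here, the estimate recovers (1, 2, 0.5).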
Lesson Points:
Lecture 4 ~ Start of Model Adequacy Checking

We start seeing generalizations of Multiple Linear Regression; more specifically, how we deal with Categorical Data.
Lesson Points
Multiple Linear Regression#Special Cases
Multiple Linear Regression#Hypothesis Testing
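Categorical predictors enter the linear model through indicator (dummy) variables: a factor with k levels contributes k − 1 columns, with one level kept as the baseline. A minimal sketch of the encoding (data and names are my own):

```python
import numpy as np

# A categorical predictor with three levels
group = np.array(["A", "B", "C", "A", "B", "C"])

# Encode with k-1 = 2 indicator columns; level "A" is the baseline
levels = ["B", "C"]
dummies = np.column_stack([(group == lvl).astype(float) for lvl in levels])

# Full design matrix: intercept plus the two indicators
X = np.column_stack([np.ones(len(group)), dummies])
```

An observation in the baseline group is represented by the intercept column alone; the other coefficients measure each level's shift from that baseline.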
Lecture 5 ~ A Masterclass on Confidence Intervals
As expected, here we focus on how to compute confidence intervals.
We also touch on the extra sum of squares method, which can be seen as the big brother of the Test On Individual Regression Coefficient (Assuming We know our Model Is Significant).
Lesson Points
Multiple Linear Regression#Confidence Intervals
Extra Sum of Squares Method
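The extra sum of squares method compares a full model against a reduced one that drops r coefficients, via the partial F statistic F = ((SSE_red − SSE_full)/r) / (SSE_full/(n − p)). A numeric sketch with made-up data (my own variable names):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)  # x2 is irrelevant here

def sse(X, y):
    # residual sum of squares for the least-squares fit on design X
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta
    return resid @ resid

ones = np.ones(n)
X_full = np.column_stack([ones, x1, x2])  # p = 3 parameters
X_red = np.column_stack([ones, x1])       # drop x2: r = 1 restriction

SSE_full = sse(X_full, y)
SSE_red = sse(X_red, y)
F = ((SSE_red - SSE_full) / 1) / (SSE_full / (n - 3))
```

Because the models are nested, SSE can only go down when a predictor is added, so the numerator (the "extra" sum of squares) is always nonnegative.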
Lecture 6 ~ Model Adequacy

After spending so much time building linear models and constructing intervals for them, we want to be able to check whether our model even does a good job. More specifically, does it do a good job purely by chance, or are the assumptions implied by the use of that model respected?
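One basic diagnostic tool is the internally studentized residual, built from the hat matrix H = X(XᵀX)⁻¹Xᵀ. A sketch with invented data (not the course's notation):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
x = rng.uniform(0, 10, size=n)
y = 3.0 + 1.5 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
H = X @ np.linalg.solve(X.T @ X, X.T)  # hat matrix: y_hat = H y
y_hat = H @ y
e = y - y_hat                          # raw residuals

p = X.shape[1]
MSE = e @ e / (n - p)
# internally studentized residuals: e_i / sqrt(MSE * (1 - h_ii))
r_student = e / np.sqrt(MSE * (1 - np.diag(H)))
```

The hat matrix is the projection onto the column space of X, so HX = X, its trace equals the number of parameters, and (with an intercept) the residuals sum to zero — all handy sanity checks on a fit.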
Lesson Points
Lecture 7

Last lecture we saw how to check whether the model assumptions held.
This lecture we learn how to correct a model when it seems that the assumptions are not satisfied.
Depending on which assumptions were broken, we will need different methods; they will almost always involve transforming either the response variable or the predictor variables.
Lesson Points
Transformations
Box-Cox Transformation for Non-Normality
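The Box-Cox family itself is simple to write down: y^(λ) = (y^λ − 1)/λ for λ ≠ 0, and log y for λ = 0, defined for positive responses. This sketch only applies the transform for a given λ; it does not carry out the likelihood-based selection of λ:

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform of a positive response vector for a fixed lambda."""
    y = np.asarray(y, dtype=float)
    if lam == 0:
        return np.log(y)           # limiting case as lambda -> 0
    return (y ** lam - 1.0) / lam

y = np.array([1.0, 2.0, 4.0, 8.0])
```

At λ = 1 the transform is just a shift (y − 1), so it leaves the shape of the data unchanged, while λ = 0 gives the log transform — the two endpoints worth remembering.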
Lecture 8 ~ Weighted Least Squares
This lecture is fully about weighted least squares, a method that is used when the variance is not constant. In other words, we use it when the homoscedasticity assumption is broken.
Lesson Points
Weighted Least Squares (Generalized Least Squares)
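Weighted least squares replaces the ordinary normal equations with β̂ = (XᵀWX)⁻¹XᵀWy, where W is a diagonal matrix of weights, typically the inverse variances of the observations. A minimal numeric sketch with invented heteroscedastic data (my own names):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
x = np.linspace(1, 10, n)
# classic heteroscedastic setup: the noise scale grows with x
y = 2.0 + 0.5 * x + rng.normal(scale=0.2 * x, size=n)

X = np.column_stack([np.ones(n), x])
w = 1.0 / (0.2 * x) ** 2            # weights = inverse variances
W = np.diag(w)

# WLS estimate: solve (X'WX) beta = X'Wy
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

Equivalently, WLS is just ordinary least squares on the rescaled data (√w·X, √w·y), which is the usual way to reduce it back to the machinery we already have.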