Multicollinearity is what happens when, in a regression model with multiple predictor variables, the predictors are strongly linearly related to each other, which makes the model hard to interpret because the predictors interact with the response variable and with each other in non-trivial ways.
It is something that we want to reduce because, by assumption, each coefficient represents the change in the response when the corresponding regressor changes by one unit while all other regressors are held constant; when multicollinearity is present, this interpretation is functionally impossible, since changing one regressor implicitly changes the values of all the regressors to which it is collinear. You can check for collinearity by plotting one regressor against another: if there seems to be a linear relationship between the two, you would leave one of them out of the model.
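As a sketch of this check with simulated data (all variable names here are illustrative, not from the original example), we can build one regressor as a near-linear function of another and confirm that their sample correlation is close to 1, while an independent regressor shows correlation near 0:

```python
import random
import statistics

random.seed(0)
n = 200
# x1 is a base regressor; x2 is almost a linear function of x1, so the
# pair is collinear by construction; x3 is drawn independently.
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [2 * v + random.gauss(0, 0.1) for v in x1]   # strongly collinear with x1
x3 = [random.gauss(0, 1) for _ in range(n)]       # unrelated regressor

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length samples."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

print(f"corr(x1, x2) = {pearson(x1, x2):.3f}")  # close to 1: collinear pair
print(f"corr(x1, x3) = {pearson(x1, x3):.3f}")  # close to 0: no problem
```

Plotting x2 against x1 would show the same thing visually as a near-straight line of points, which is the pattern the text says to look for.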
In this image, two of the regressors are strongly correlated, which is problematic, but there does not seem to be any correlation among the other regressors. Therefore the only problematic pair is the first one, and we would leave one of its two regressors out of the model.
You can get the same information from a correlation table as well:
In this case, since the correlation within the first pair of regressors is stronger than the correlation within the second pair, we would pick the first pair as the one to address.
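A minimal sketch of building such a correlation table and flagging the most collinear pair, again with simulated data and illustrative names (x1..x4 are assumptions, not the regressors from the original figure):

```python
import random
import statistics

random.seed(1)
n = 300
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [v + random.gauss(0, 0.2) for v in x1]   # strongly collinear with x1
x3 = [random.gauss(0, 1) for _ in range(n)]
x4 = [v + random.gauss(0, 1.5) for v in x3]   # only weakly related to x3
cols = {"x1": x1, "x2": x2, "x3": x3, "x4": x4}

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length samples."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

# Print the correlation table.
names = list(cols)
print("      " + " ".join(f"{c:>6}" for c in names))
for a in names:
    row = " ".join(f"{pearson(cols[a], cols[b]):6.2f}" for b in names)
    print(f"{a:>6} {row}")

# Flag the pair with the strongest off-diagonal correlation.
pairs = [(abs(pearson(cols[a], cols[b])), a, b)
         for i, a in enumerate(names) for b in names[i + 1:]]
strength, a, b = max(pairs)
print(f"most collinear pair: ({a}, {b}), |r| = {strength:.2f}")
```

Here both pairs are correlated, but the (x1, x2) correlation is the stronger one, so it is the pair we would act on first, dropping one of its two regressors.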