Is it possible to have a (multiple) regression equation with two or more dependent variables? Sure, you could run two separate regression equations, one for each DV, but that doesn't seem like it would capture any relationship between the two DVs.
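As a quick aside, fitting both DVs jointly by least squares gives exactly the same coefficient estimates as two separate regressions; what a genuinely multivariate model adds is the joint covariance of the errors across the DVs. A minimal numpy sketch with simulated data (all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])        # intercept + one predictor
Y = np.column_stack([1 + 2 * X[:, 1] + rng.normal(size=n),   # DV 1
                     3 - 1 * X[:, 1] + rng.normal(size=n)])  # DV 2

# One joint least-squares fit with a matrix of responses...
B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)

# ...matches two separate single-response fits, column by column.
b1, *_ = np.linalg.lstsq(X, Y[:, [0]], rcond=None)
b2, *_ = np.linalg.lstsq(X, Y[:, [1]], rcond=None)
print(np.allclose(B_joint, np.column_stack([b1, b2])))  # True
```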
Those words connote causality, but regression can work the other way round too (use Y to predict X). The independent/dependent-variable language merely specifies which quantity is being modelled as depending on the other. Generally speaking, it makes more sense to use correlation rather than regression when there is no causal relationship.
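To see the asymmetry concretely: the correlation is the same either way, but the regression of y on x and the regression of x on y are different lines. A small sketch with simulated data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(size=200)

r = np.corrcoef(x, y)[0, 1]           # symmetric: corr(x, y) == corr(y, x)
slope_y_on_x = r * y.std() / x.std()  # regression of y on x
slope_x_on_y = r * x.std() / y.std()  # regression of x on y: a different line

print(r, slope_y_on_x, slope_x_on_y)
```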
What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multiple linear regression?
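As a starting point, the diagnostics most texts suggest are externally studentized residuals and Cook's distance; a sketch using statsmodels, where the flagged thresholds are common conventions rather than formal tests:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 2 * x + rng.normal(size=50)
y[0] += 8  # plant one outlier

model = sm.OLS(y, sm.add_constant(x)).fit()
infl = model.get_influence()

# Common rules of thumb:
student = infl.resid_studentized_external  # flag if |t_i| > 2 or 3
cooks_d = infl.cooks_distance[0]           # flag if D_i > 4/n
print(np.where(np.abs(student) > 3)[0])
print(np.where(cooks_d > 4 / len(y))[0])
```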
I have a problem where I need to standardize the variables, run ridge regression to calculate the ridge estimates of the betas, and then convert these estimates back to the original variable scale.
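Assuming the usual setup (predictors standardized, response centered, ridge fit without an intercept), the back-conversion is just dividing each standardized coefficient by its predictor's standard deviation and recovering the intercept from the means. A sketch of that round trip, with simulated data and an illustrative penalty value:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(loc=5, scale=[1.0, 10.0], size=(100, 2))
y = 3 + 2 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(size=100)

x_mean, x_sd = X.mean(axis=0), X.std(axis=0)
y_mean = y.mean()
Z = (X - x_mean) / x_sd  # standardized predictors
yc = y - y_mean          # centered response

lam = 1.0                # ridge penalty (illustrative)
p = Z.shape[1]
beta_std = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ yc)

# Back to the original scale:
beta_orig = beta_std / x_sd
intercept = y_mean - beta_orig @ x_mean
print(intercept, beta_orig)
```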
Well, under the hypothetical scenario that the true regression coefficient is equal to 0, statisticians have worked out how likely a given Z-score is (using the normal distribution curve). Z-scores greater than 2 in absolute value occur only about 5% of the time when the true regression coefficient is equal to 0.
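For reference, that roughly-5% figure is just the two-sided tail probability of the standard normal; assuming scipy is available:

```python
from scipy.stats import norm

z = 2.0
p_two_sided = 2 * (1 - norm.cdf(z))  # P(|Z| > 2) under H0: beta = 0
print(p_two_sided)                   # ~0.0455, i.e. about 5%
```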
A good residual vs. fitted plot has three characteristics: the residuals "bounce randomly" around the 0 line, suggesting that the assumption of a linear relationship is reasonable; the residuals form a roughly horizontal band around the 0 line, suggesting that the variances of the error terms are equal; and no single residual stands out from the basic pattern, suggesting there are no outliers.
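A minimal sketch of producing such a plot for a correctly specified model (simulated data), where the residuals should show exactly this patternless horizontal band:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=100)
y = 1 + 2 * x + rng.normal(size=100)  # a well-specified linear model

beta = np.polyfit(x, y, deg=1)
fitted = np.polyval(beta, x)
resid = y - fitted

plt.scatter(fitted, resid, s=15)
plt.axhline(0, color="grey", linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()  # expect a patternless horizontal band around 0
```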
In a log-level regression, the independent variables have an additive effect on the log-transformed response and a multiplicative effect on the original untransformed response:
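A tiny numeric illustration of that multiplicative effect: in a model log(y) = b0 + b1*x, each unit increase in x multiplies y by exp(b1) (the coefficient values below are illustrative):

```python
import numpy as np

b0, b1 = 1.0, 0.10           # log-level model: log(y) = b0 + b1*x
x = np.array([0.0, 1.0, 2.0])
y = np.exp(b0 + b1 * x)      # back on the original scale

# Each unit increase in x multiplies y by exp(b1):
print(y[1] / y[0], y[2] / y[1], np.exp(b1))  # all equal, ~1.105
```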
It appears that isotonic regression is a popular method to calibrate models. I understand that isotonic regression guarantees a monotonically increasing or decreasing fit. However, if you can get a smoother fit ...
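For concreteness, a sketch of isotonic calibration with scikit-learn's IsotonicRegression, showing that the fitted calibrator is a monotone step function rather than a smooth curve (simulated scores and labels):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(5)
scores = rng.uniform(size=300)  # uncalibrated model scores
labels = (rng.uniform(size=300) < scores**2).astype(float)  # true P(y=1) = score^2

iso = IsotonicRegression(out_of_bounds="clip")
iso.fit(scores, labels)

# The fit is a monotone step function, so nearby distinct scores
# can map to the same calibrated probability:
print(iso.predict([0.2, 0.5, 0.8]))
```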
None of the three plots shows correlation (at least not linear correlation, which is the relevant sense of "correlation" as it is used in "the residuals and the fitted values are uncorrelated").
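Note that this zero linear correlation is not an empirical finding about the data: with an intercept in the model, OLS residuals are orthogonal to the fitted values by construction. A quick numerical check on simulated data:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=200)
y = 1 + 3 * x + rng.normal(size=200)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

# Residuals are orthogonal to fitted values, so their sample
# (linear) correlation is zero up to floating-point rounding:
print(np.corrcoef(fitted, resid)[0, 1])  # ~1e-16
```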