  1. What is the lasso in regression analysis? - Cross Validated

    Oct 19, 2011 · LASSO regression is a type of regression analysis in which both variable selection and regularization occur simultaneously. This method uses a penalty which affects the value of …
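The snippet above describes how the lasso penalty performs selection and regularization at once. A minimal sketch of that behaviour, using the soft-thresholding operator that appears inside coordinate-descent lasso solvers (coefficient values here are purely illustrative):

```python
def soft_threshold(z, lam):
    """S(z, lam) = sign(z) * max(|z| - lam, 0), the lasso shrinkage step."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# Hypothetical unpenalized coefficient estimates:
ols_coefs = [2.5, -0.3, 0.1, -1.8]
lam = 0.5  # penalty strength

# Shrinkage sets small coefficients exactly to zero -> variable selection.
lasso_coefs = [soft_threshold(b, lam) for b in ols_coefs]
print(lasso_coefs)  # -> [2.0, 0.0, 0.0, -1.3]
```

Coefficients whose magnitude falls below the penalty are dropped entirely, which is the "variable selection" half of the snippet's claim.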

  2. In linear regression, when is it appropriate to use the log of an ...

    Aug 24, 2021 · This is because any regression coefficients involving the original variable - whether it is the dependent or the independent variable - will have a percentage change interpretation.
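The percentage-change interpretation can be checked numerically. A small sketch with hypothetical coefficients for a log-log model, where the slope acts as an elasticity:

```python
import math

# Hypothetical log-log model: log(y) = b0 + b1 * log(x).
b0, b1 = 1.0, 0.8

def predict_y(x):
    return math.exp(b0 + b1 * math.log(x))

# A 1% increase in x changes the predicted y by roughly b1 percent.
x = 50.0
pct_change = (predict_y(x * 1.01) / predict_y(x) - 1) * 100
print(round(pct_change, 3))  # close to b1 = 0.8 (percent)
```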

  3. Why Isotonic Regression for Model Calibration?

    Jan 27, 2025 · I think an additional reason why it is so common is the simplicity (and thus reproducibility) of the isotonic regression. If we give the same classification model and data to two …
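That simplicity is visible in the algorithm itself. A minimal pool-adjacent-violators (PAV) sketch of isotonic regression, assuming equal weights and inputs already sorted by model score:

```python
def isotonic_fit(y):
    """Fit a non-decreasing step function to y via pool-adjacent-violators."""
    blocks = []  # each block: [mean, count]
    for v in y:
        blocks.append([float(v), 1])
        # Merge adjacent blocks while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, n2 = blocks.pop()
            m1, n1 = blocks.pop()
            n = n1 + n2
            blocks.append([(m1 * n1 + m2 * n2) / n, n])
    fitted = []
    for mean, count in blocks:
        fitted.extend([mean] * count)
    return fitted

print(isotonic_fit([1, 3, 2, 4]))  # -> [1.0, 2.5, 2.5, 4.0]
```

Given the same sorted scores and labels, this procedure has no tuning knobs, which is why two people running it get the same calibration map.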

  4. regression - When is R squared negative? - Cross Validated

    Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to …
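The non-negativity only holds in-sample for OLS with an intercept; computed directly from its definition, R^2 can go negative whenever the predictions do worse than the mean (e.g. out-of-sample, or for a model fit elsewhere). A small illustrative sketch:

```python
def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot, computed from its definition."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss_tot = sum((yt - mean_y) ** 2 for yt in y_true)
    return 1 - ss_res / ss_tot

y = [1.0, 2.0, 3.0]
good = [1.1, 2.0, 2.9]   # close to the truth
bad = [3.0, 3.0, 3.0]    # e.g. a model fit on different data

print(r_squared(y, good))  # -> 0.99
print(r_squared(y, bad))   # -> -1.5: worse than predicting the mean
```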

  5. Explain the difference between multiple regression and multivariate ...

    There is no difference between multiple regression and multivariate regression in that they both constitute a system with 2 or more independent variables and 1 or more dependent variables.

  6. Support Vector Regression vs. Linear Regression - Cross Validated

    Dec 5, 2023 · Linear regression can use the same kernels used in SVR, and SVR can also use the linear kernel. Given only the coefficients from such models, it would be impossible to distinguish …

  7. Poisson regression to estimate relative risk for binary outcomes

    Brief Summary Why is it more common for logistic regression (with odds ratios) to be used in cohort studies with binary outcomes, as opposed to Poisson regression (with relative risks)? Backgrou...
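The odds-ratio vs relative-risk distinction behind that question can be shown with arithmetic on a hypothetical cohort 2x2 table; when the outcome is common, the OR overstates the RR:

```python
# Hypothetical cohort counts:
#             outcome  no outcome
# exposed        40        60
# unexposed      20        80
a, b, c, d = 40, 60, 20, 80

risk_exposed = a / (a + b)      # 0.4
risk_unexposed = c / (c + d)    # 0.2
rr = risk_exposed / risk_unexposed     # relative risk (Poisson/log-binomial target)
odds_ratio = (a / b) / (c / d)         # what logistic regression estimates

print(rr)                       # -> 2.0
print(round(odds_ratio, 2))     # -> 2.67, larger than the RR
```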

  8. What is the effect of having correlated predictors in a multiple ...

    The VIF is how much the variance of your regression coefficient is larger than it would otherwise have been if the variable had been completely uncorrelated with all the other variables in the model. Note …
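A minimal sketch of that definition: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor x_j on the others. With only two predictors, R_j^2 reduces to their squared correlation (data below is made up):

```python
def correlation(x, y):
    """Pearson correlation, computed from its definition."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 1.9, 3.2, 3.8, 5.1]   # nearly collinear with x1

r2 = correlation(x1, x2) ** 2
vif = 1 / (1 - r2)
print(round(vif, 1))  # far above the common rule-of-thumb cutoff of 10
```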

  9. Interpreting interaction terms in logit regression with categorical ...

    My own preference, when trying to interpret interactions in logistic regression, is to look at the predicted probabilities for each combination of categorical variables.
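That approach amounts to evaluating the inverse-logit at each cell of the factor combination. A sketch with hypothetical coefficients for two binary factors A and B plus their interaction:

```python
import math

# Hypothetical logit coefficients: intercept, A, B, and A:B interaction.
b0, bA, bB, bAB = -1.0, 0.5, 0.8, -0.6

def predicted_prob(a, b):
    """Inverse-logit of the linear predictor for cell (A=a, B=b)."""
    logit = b0 + bA * a + bB * b + bAB * a * b
    return 1 / (1 + math.exp(-logit))

# Predicted probability for each combination of the categorical variables:
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a}, B={b}: p={predicted_prob(a, b):.3f}")
```

Reading the table of probabilities is usually easier than interpreting the interaction coefficient on the log-odds scale directly.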

  10. How to calculate pseudo-$R^2$ from R's logistic regression?

    A somewhat related question was asked here, Logistic Regression: Which pseudo R-squared measure is the one to report (Cox & Snell or Nagelkerke)?.
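One common choice not named in that snippet, McFadden's pseudo-R^2, is simple enough to compute by hand: 1 - logL_model / logL_null, where the null model is intercept-only. A sketch with made-up outcomes and fitted probabilities:

```python
import math

# Hypothetical binary outcomes y and fitted probabilities p from a logistic fit.
y = [1, 0, 1, 1, 0, 1, 0, 1]
p = [0.9, 0.2, 0.8, 0.7, 0.3, 0.85, 0.25, 0.6]

def log_lik(y, p):
    """Bernoulli log-likelihood."""
    return sum(yi * math.log(pi) + (1 - yi) * math.log(1 - pi)
               for yi, pi in zip(y, p))

p_null = sum(y) / len(y)                 # intercept-only fit: the mean of y
ll_model = log_lik(y, p)
ll_null = log_lik(y, [p_null] * len(y))

mcfadden = 1 - ll_model / ll_null        # McFadden's pseudo-R^2
print(round(mcfadden, 3))
```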