  1. Regression with multiple dependent variables? - Cross Validated

    Nov 14, 2010 · Is it possible to have a (multiple) regression equation with two or more dependent variables? Sure, you could run two separate regression equations, one for each DV, but that …
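
    A minimal NumPy sketch of the point in this snippet, using synthetic data (the variable names and numbers are illustrative, not from the thread): fitting a two-column response with ordinary least squares gives exactly the same point estimates as running one regression per dependent variable.

```python
import numpy as np

# Two dependent variables regressed on the same design matrix.
# np.linalg.lstsq accepts a 2-D response; its column-wise solution
# coincides with two separate OLS fits, one per DV.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
Y = np.column_stack([
    X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n),   # DV 1
    X @ np.array([0.5, -0.3, 0.8]) + rng.normal(size=n),   # DV 2
])

B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)            # 3x2 coefficient matrix
B_sep = np.column_stack(
    [np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(2)]
)
print(np.allclose(B_joint, B_sep))                          # True: same point estimates
```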

  2. regression - When is R squared negative? - Cross Validated

    Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is …
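
    A quick numerical check of the claim in the snippet, on synthetic data: for OLS with an intercept, R² computed from the usual definition equals the squared correlation between fitted and observed values, so it cannot be negative.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 0.5 * x + rng.normal(size=200)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta

ss_res = np.sum((y - fitted) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2_def = 1 - ss_res / ss_tot                   # usual definition of R^2
r2_corr = np.corrcoef(fitted, y)[0, 1] ** 2    # squared correlation, fitted vs observed
print(np.isclose(r2_def, r2_corr))             # True
```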

  3. How to describe or visualize a multiple linear regression model

    Then this simplified version can be visually shown as a simple regression, like this: I'm confused about this in spite of going through the appropriate material on the topic. Can someone please explain to …
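
    The thread's exact construction is not visible in the snippet, but one common way to show a multiple regression "as a simple regression" is to plot the fitted response against a single predictor while holding the other predictors at their means. A hedged sketch of that idea, with made-up data and names:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n = 150
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

grid = np.linspace(x1.min(), x1.max(), 100)
y_hat = b[0] + b[1] * grid + b[2] * x2.mean()      # x2 fixed at its mean

plt.scatter(x1, y, alpha=0.4, label="observed")
plt.plot(grid, y_hat, color="red", label="fit with x2 held at its mean")
plt.xlabel("x1"); plt.ylabel("y"); plt.legend()
plt.show()
```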

  4. regression - Trying to understand the fitted vs residual plot?

    Dec 23, 2016 · A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is …
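
    A short Matplotlib sketch of the plot the answer describes, on synthetic data from a correctly specified linear model: the residuals should bounce randomly around the horizontal 0 line.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=200)
y = 3.0 + 1.2 * x + rng.normal(scale=2.0, size=200)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
residuals = y - fitted

plt.scatter(fitted, residuals, alpha=0.5)
plt.axhline(0, color="red", linestyle="--")
plt.xlabel("Fitted values"); plt.ylabel("Residuals")
plt.title("Residuals vs fitted")
plt.show()
```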

  5. regression - Difference between forecast and prediction ... - Cross ...

    I was wondering what the difference and relation are between a forecast and a prediction, especially in time series and regression. For example, am I correct that: in time series, forecasting seems …

  6. What's the difference between correlation and simple linear …

    Aug 1, 2013 · In particular one piece of information a linear regression gives you that a correlation does not is the intercept, the value on the predicted variable when the predictor is 0. In short - …
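
    A small numerical illustration of the relationship the answer points to, on synthetic data: the regression slope is the correlation rescaled by the ratio of standard deviations, and the intercept is the extra piece of information that correlation alone does not give.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(loc=5, scale=2, size=300)
y = 10 + 3 * x + rng.normal(scale=4, size=300)

r = np.corrcoef(x, y)[0, 1]
slope = r * y.std(ddof=1) / x.std(ddof=1)
intercept = y.mean() - slope * x.mean()     # what correlation alone doesn't tell you

# Same numbers from least squares (np.polyfit returns [slope, intercept]):
b, a = np.polyfit(x, y, 1)
print(np.isclose(slope, b), np.isclose(intercept, a))   # True True
```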

  7. How are regression, the t-test, and the ANOVA all versions of the ...

    May 15, 2013 · I add another ANOVA version w/ 3 groups lower down to clarify that a 2-group situation isn't the only ANOVA case that can be understood as a regression; but the reg …
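
    A sketch of the two-group case mentioned in the snippet, with synthetic data: an equal-variance two-sample t-test gives the same t statistic as regressing the outcome on a 0/1 group dummy. (The thread goes further, to ANOVA with 3 groups.)

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
g0 = rng.normal(loc=10, scale=3, size=40)
g1 = rng.normal(loc=12, scale=3, size=40)

y = np.concatenate([g0, g1])
d = np.concatenate([np.zeros(40), np.ones(40)])   # group dummy

# Regression on the dummy: the slope's t statistic ...
X = np.column_stack([np.ones_like(d), d])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (len(y) - 2)
se_slope = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_reg = beta[1] / se_slope

# ... equals the pooled-variance two-sample t statistic.
t_test = stats.ttest_ind(g1, g0, equal_var=True).statistic
print(np.isclose(t_reg, t_test))                  # True
```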

  8. When conducting multiple regression, when should you center …

    Jun 5, 2012 · In some literature, I have read that a regression with multiple explanatory variables, if in different units, needed to be standardized. (Standardizing consists in subtracting the mean …
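
    A minimal sketch of standardizing predictors measured in different units, with invented variable names: the fit is unchanged, and the standardized slopes are simply the raw slopes multiplied by each predictor's standard deviation, which makes them comparable across predictors.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200
income = rng.normal(50_000, 15_000, size=n)     # dollars
age = rng.normal(40, 10, size=n)                # years
y = 0.0001 * income + 0.5 * age + rng.normal(size=n)

def zscore(v):
    # standardize: subtract the mean, divide by the standard deviation
    return (v - v.mean()) / v.std(ddof=1)

X_raw = np.column_stack([np.ones(n), income, age])
X_std = np.column_stack([np.ones(n), zscore(income), zscore(age)])

b_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)
b_std, *_ = np.linalg.lstsq(X_std, y, rcond=None)

# Standardized slopes = raw slopes times each predictor's SD:
print(np.allclose(b_std[1:], b_raw[1:] * [income.std(ddof=1), age.std(ddof=1)]))
```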

  9. regression - Converting standardized betas back to original …

    I have a problem where I need to standardize the variables and run the (ridge) regression to calculate the ridge estimates of the betas. I then need to convert these back to the scale of the original variables.
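
    A hedged sketch of the back-transformation, assuming the common ridge setup in which the predictors are z-scored and the response is centered before estimation (the thread's exact setup isn't visible in the snippet): divide each standardized coefficient by its predictor's standard deviation and recover the intercept from the means. The data and penalty value below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n, lam = 200, 1.0
X = rng.normal(size=(n, 3)) * [1.0, 10.0, 100.0]        # predictors on different scales
y = 2.0 + X @ np.array([1.5, 0.2, -0.03]) + rng.normal(size=n)

x_mean, x_sd, y_mean = X.mean(axis=0), X.std(axis=0, ddof=1), y.mean()
Z = (X - x_mean) / x_sd                                 # standardized predictors
yc = y - y_mean                                         # centered response

# Ridge estimates on the standardized scale:
b_std = np.linalg.solve(Z.T @ Z + lam * np.eye(3), Z.T @ yc)

# Convert back to the original scale:
b_orig = b_std / x_sd
intercept = y_mean - x_mean @ b_orig
print(intercept, b_orig)
```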

  10. How to derive the standard error of linear regression coefficient

    Another way of thinking about the n-2 df is that we use 2 means to estimate the slope coefficient (the mean of Y and the mean of X). On df, from Wikipedia: "...In general, the degrees of freedom of …
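
    A short NumPy sketch of the standard error in question, on synthetic data: the residual variance is estimated on n - 2 degrees of freedom (two estimated parameters), and the slope's standard error divides it by the spread of x.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50
x = rng.uniform(0, 10, size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=3.0, size=n)

# Simple linear regression by the closed-form formulas:
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()
resid = y - (intercept + slope * x)

sigma2_hat = np.sum(resid ** 2) / (n - 2)               # residual variance, n - 2 df
se_slope = np.sqrt(sigma2_hat / np.sum((x - x.mean()) ** 2))
print(se_slope)
```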