**For all interpretations, “predicts” really means “predicts on average”.**

# Linear Regression

\[\mathbb{E}(Y|X) = \beta_0 + \beta_1X\]

- \(\beta_0\) is the average value of \(Y\) when \(X = 0\).
- \(\beta_1\): A 1 increase in \(X\) predicts a change of \(\beta_1\) in \(Y\).
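As a sanity check, here is a minimal NumPy sketch (with arbitrary made-up coefficients) that fits a line to noise-free data and recovers both interpretations:

```python
import numpy as np

# Noise-free data with assumed coefficients beta0 = 2.0, beta1 = 3.0
# (arbitrary illustrative values).
X = np.arange(10, dtype=float)
Y = 2.0 + 3.0 * X

# np.polyfit returns coefficients highest-degree first: (slope, intercept).
b1_hat, b0_hat = np.polyfit(X, Y, 1)

print(b0_hat)  # average value of Y when X = 0
print(b1_hat)  # predicted change in Y per 1-unit increase in X
```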

For log-transforms, see Log Transform Interpretation. The tldr is

- If \(Y\) is log-transformed, a 1 increase in \(X\) predicts that \(Y\) changes by a factor of \(\exp(\beta_1)\), i.e. a \(100(\exp(\beta_1) - 1)\)% change.
- If \(X\) is log-transformed, an \(r\)% increase in \(X\) predicts a change of \(\beta_1\log(1 + r/100)\) in \(Y\).
- If both \(X\) and \(Y\) are log-transformed, an \(r\)% increase in \(X\) predicts that \(Y\) changes by a factor of \((1 + r/100)^{\beta_1}\).
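A quick numerical check of the log-\(Y\) case (NumPy, arbitrary assumed coefficients): fit \(\log Y\) on \(X\) and confirm that a 1-unit increase in \(X\) multiplies \(Y\) by \(\exp(\beta_1)\).

```python
import numpy as np

# Noise-free model log(Y) = beta0 + beta1 * X, with assumed values.
beta0, beta1 = 1.0, 0.2
X = np.linspace(1.0, 10.0, 50)
Y = np.exp(beta0 + beta1 * X)

# Fit log(Y) ~ X by least squares.
b1_hat, b0_hat = np.polyfit(X, np.log(Y), 1)

# A 1-unit increase in X multiplies Y by exp(beta1) ...
factor = np.exp(b1_hat)
# ... which is a 100 * (exp(beta1) - 1) percent change,
# NOT an exp(beta1) percent change.
pct_change = 100.0 * (factor - 1.0)
```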

# Logistic Regression

Let \(p = \mathbb{P}(Y = 1)\).

\[\textrm{logit}(p) = \log\left(\frac{p}{1 - p}\right) = \beta_0 + \beta_1X\]

- \(\beta_0\) is the log odds when \(X = 0\).
- \(\beta_1\): A 1 increase in \(X\) predicts a change of \(\beta_1\) in the log odds.
- \(\exp(\beta_0)\) is the odds when \(X = 0\).
- \(\exp(\beta_1)\): A 1 increase in \(X\) predicts that the odds multiply by \(\exp(\beta_1)\). (This multiplicative factor is the odds ratio.)
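A sketch of the odds interpretation (plain NumPy, simulated data with assumed true coefficients, fit by Newton's method rather than any particular library):

```python
import numpy as np

# Simulate logistic data with assumed true coefficients.
rng = np.random.default_rng(0)
beta0, beta1 = -1.0, 0.5
X = rng.normal(size=5000)
p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * X)))
Y = rng.binomial(1, p)

# Fit logit(p) = b0 + b1 * X by Newton's method on the log-likelihood.
Xd = np.column_stack([np.ones_like(X), X])
b = np.zeros(2)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-(Xd @ b)))            # fitted probabilities
    grad = Xd.T @ (Y - mu)                          # score
    H = (Xd * (mu * (1.0 - mu))[:, None]).T @ Xd    # observed information
    b += np.linalg.solve(H, grad)

# exp(b[1]): a 1-unit increase in X multiplies the odds p/(1-p) by this factor.
odds_multiplier = np.exp(b[1])
```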

## Odds and log odds

The odds are

\[\frac{p}{1-p}\]

- The ratio of successes to failures.
- The number of successes we expect per failure. (E.g. if \(p = .6\), then the odds are 1.5, so we expect 1.5 successes for every failure.)
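Odds and probability carry the same information, and converting between them is a one-liner each way:

```python
# odds = p / (1 - p); invert with p = odds / (1 + odds).
p = 0.6
odds = p / (1 - p)           # 1.5: expect 1.5 successes per failure
p_back = odds / (1 + odds)   # recovers the probability, 0.6
```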

The log odds is

\[\log\left(\frac{p}{1-p}\right)\]

and has no direct interpretation of its own; its value is that it is the scale on which the model is linear in the coefficients.

# Poisson Regression

\[\log(\mathbb{E}(Y|X)) = \beta_0 + \beta_1X\]

where \(Y\) is a count.

- Interpretations are similar to a log-transformed \(Y\) in linear regression: a 1 increase in \(X\) predicts that the expected count changes by a factor of \(\exp(\beta_1)\), i.e. a \(100(\exp(\beta_1) - 1)\)% change.
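The same check for Poisson regression (NumPy sketch, assumed true coefficients, Newton's method):

```python
import numpy as np

# Simulate counts with assumed true coefficients.
rng = np.random.default_rng(1)
beta0, beta1 = 0.5, 0.3
X = rng.normal(size=5000)
Y = rng.poisson(np.exp(beta0 + beta1 * X))

# Fit log E(Y|X) = b0 + b1 * X by Newton's method.
Xd = np.column_stack([np.ones_like(X), X])
b = np.zeros(2)
for _ in range(25):
    mu = np.exp(Xd @ b)              # fitted expected counts
    grad = Xd.T @ (Y - mu)           # score
    H = (Xd * mu[:, None]).T @ Xd    # observed information
    b += np.linalg.solve(H, grad)

# exp(b[1]) is the factor by which the expected count changes per
# 1-unit increase in X, i.e. a 100 * (exp(b1) - 1) percent change.
rate_ratio = np.exp(b[1])
```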

## Negative Binomial Regression

Same interpretation as Poisson; the negative binomial model just adds a dispersion parameter to handle overdispersed counts.