Bayesian Linear Regression
In statistics, Bayesian linear regression is an approach to linear regression in which the statistical analysis is undertaken within the context of Bayesian inference. When the regression model has normally distributed errors, and a conjugate form of prior distribution is assumed, explicit closed-form results are available for the posterior probability distributions of the model's parameters.
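
As a minimal illustrative sketch (not drawn from the article itself), the closed-form posterior can be computed directly when the weights are given a zero-mean Gaussian prior and the noise variance is treated as known; the data, prior scale, and noise variance below are hypothetical values chosen only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: y = 0.5 + 1.5*x + Gaussian noise (hypothetical example)
    n = 50
    X = np.column_stack([np.ones(n), rng.uniform(-3, 3, n)])  # design matrix with intercept
    true_w = np.array([0.5, 1.5])
    sigma2 = 0.25                                              # noise variance, assumed known
    y = X @ true_w + rng.normal(0.0, np.sqrt(sigma2), n)

    # Prior on weights: w ~ N(0, tau2 * I)
    tau2 = 10.0
    prior_precision = np.eye(X.shape[1]) / tau2

    # Conjugacy gives a Gaussian posterior: w | y ~ N(m_N, S_N)
    S_N = np.linalg.inv(X.T @ X / sigma2 + prior_precision)   # posterior covariance
    m_N = S_N @ (X.T @ y) / sigma2                             # posterior mean

    print("posterior mean:", m_N)
    print("posterior std :", np.sqrt(np.diag(S_N)))

Because the Gaussian prior is conjugate to the Gaussian likelihood here, no sampling or numerical approximation is needed; the posterior mean and covariance follow from a single matrix inversion.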