The generalized linear model (GLM) is a flexible generalization of ordinary linear regression that allows for response variables that have error distributions other than a normal distribution. It generalizes linear regression by allowing the linear model to be related to the response variable through a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value.
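The role of the link function can be sketched numerically. In the illustrative snippet below (the design matrix and coefficients are made up for the example), the same linear predictor is mapped onto the response mean through three common inverse link functions, each constraining the mean to the range appropriate for its distribution.

```python
import numpy as np

# Hypothetical illustration: one linear predictor eta = X @ beta, mapped to
# the mean of the response through different inverse link functions.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=5)])  # design matrix with intercept
beta = np.array([0.5, 1.0])                            # assumed coefficients
eta = X @ beta                                         # linear predictor

# Identity link (ordinary linear regression): mean = eta, any real value.
mu_identity = eta

# Log link (Poisson regression): mean = exp(eta), always positive.
mu_log = np.exp(eta)

# Logit link (logistic regression): mean = 1/(1+exp(-eta)), always in (0, 1).
mu_logit = 1.0 / (1.0 + np.exp(-eta))

print(mu_identity, mu_log, mu_logit)
```

The choice of link thus determines how an unbounded linear predictor is reconciled with a mean that may be restricted to positive values or to the unit interval.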
Generalized linear models were formulated by John Nelder and Robert Wedderburn as a way of unifying various other statistical models, including linear regression, logistic regression and Poisson regression. They proposed an iteratively reweighted least squares method for maximum likelihood estimation of the model parameters. Maximum-likelihood estimation remains popular and is the default method in many statistical computing packages. Other approaches have also been developed, including Bayesian approaches and least squares fits to variance-stabilized responses.
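Iteratively reweighted least squares can be sketched in a few lines. The following minimal implementation (the data and coefficients are simulated for illustration) fits a Poisson regression with a log link: at each step a working response and a set of weights are rebuilt from the current fitted means, and the coefficients are updated by solving a weighted least-squares problem.

```python
import numpy as np

def irls_poisson(X, y, n_iter=25):
    """Minimal sketch of IRLS for a Poisson GLM with log link.

    Each iteration forms the working response z and weights W from the
    current mean mu = exp(X @ beta), then solves the weighted
    least-squares normal equations for an updated beta.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)             # inverse of the log link
        z = eta + (y - mu) / mu      # working response
        W = mu                       # weights: Var(Y) = mu for Poisson
        XtW = X.T * W
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# Simulated data with known (assumed) coefficients.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_beta = np.array([0.3, 0.7])
y = rng.poisson(np.exp(X @ true_beta))

beta_hat = irls_poisson(X, y)
print(beta_hat)  # roughly recovers [0.3, 0.7]
```

Each update is an ordinary weighted least-squares solve, which is why the method could be built on existing least-squares machinery.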
In a generalized linear model, each outcome Y of the dependent variables is assumed to be generated from a particular distribution in the exponential family, a large class of probability distributions that includes the normal, binomial, Poisson and gamma distributions, among others. The mean of the distribution depends on the independent variables.
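This dependence of the mean on the covariates can be illustrated by simulation. In the sketch below (coefficients and sample size are invented for the example), outcomes are drawn from several exponential-family distributions, each with its mean tied to the same function of the independent variable.

```python
import numpy as np

# Illustrative sketch: outcomes from exponential-family distributions whose
# mean mu is a function of the covariate x (here via a log link).
rng = np.random.default_rng(3)
x = rng.normal(size=1000)
mu = np.exp(0.2 + 0.5 * x)          # assumed mean function of x

y_poisson = rng.poisson(mu)                      # Poisson: Var(Y) = mu
y_gamma = rng.gamma(shape=2.0, scale=mu / 2.0)   # gamma: mean = shape*scale = mu
y_normal = rng.normal(loc=mu, scale=1.0)         # normal: mean = mu
p = mu / (1.0 + mu)                              # = logistic(0.2 + 0.5*x)
y_binom = rng.binomial(n=1, p=p)                 # binomial: mean = p

print(y_poisson.mean(), y_gamma.mean(), y_normal.mean(), y_binom.mean())
```

In every case the distribution of each outcome differs, but the mean is the same function of the independent variable.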
A possible point of confusion has to do with the distinction between generalized linear models and general linear models, two broad statistical models. The general linear model may be viewed as a special case of the generalized linear model with identity link and responses normally distributed. As most exact results of interest are obtained only for the general linear model, the general linear model has undergone a somewhat longer historical development.
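The special-case relationship can be checked numerically. In the sketch below (data and coefficients are simulated for the example), maximum-likelihood estimation under normally distributed responses and an identity link is compared with the ordinary least-squares solution; the two coincide, since minimizing the Gaussian negative log-likelihood in the coefficients is the same as minimizing the residual sum of squares.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated normal-response data with assumed true coefficients.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
beta_true = np.array([1.0, -2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=200)

# Ordinary least-squares solution.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximum likelihood with identity link and normal errors: the negative
# log-likelihood depends on beta only through the residual sum of squares.
rss = lambda b: np.sum((y - X @ b) ** 2)
beta_mle = minimize(rss, x0=np.zeros(2)).x

print(np.allclose(beta_ols, beta_mle, atol=1e-4))  # the two estimates agree
```

This agreement is exactly what makes the general linear model the identity-link, normal-response special case of the generalized linear model.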