The variance is a parameter in the linear model, but typically it is a nuisance parameter: it must be estimated from the data, yet we are actually interested only in the parameters that determine the expected value.
Simple linear regression is a linear model. Like all linear models, it assumes that the response variable is normally distributed. The normal distribution has two parameters: mu (the expected value) and sigma² (the variance). The expected value is a linear function of the predictor variable(s), while the variance is assumed constant (independent of the predictors). The linear function of the predictors is a parametric function that is linear in the parameters (hence the name linear model), so the parameters in this function are all simple coefficients. In simple linear regression, this function has a coefficient for the constant term, which is the intercept, and a coefficient for a (continuous) predictor, which is the slope. These two coefficients are the parameters of interest, because they determine the expected value. The third parameter in the model is the variance.
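A minimal sketch of this, using simulated data and statsmodels (the variable names and true values below are purely illustrative): the fit returns the two coefficients of interest, plus the estimated variance as a third, nuisance, parameter.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data (illustrative): true intercept 2.0, slope 0.5, sigma = 1.5
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.5, size=200)

X = sm.add_constant(x)   # design matrix: a column of 1s (intercept) plus x
fit = sm.OLS(y, X).fit()

print(fit.params)        # the two parameters of interest: intercept and slope
print(fit.scale)         # the third parameter: the estimated variance sigma^2
```

Here `fit.scale` is the residual mean square, the usual unbiased estimate of sigma²: it has to be computed to do inference on the coefficients, but it is rarely the quantity we care about.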
In contrast, consider for example the log-linear model, which is based on the Poisson distribution. This distribution is defined by the expected value alone and has no extra parameter for the variance (the variance of a Poisson distribution always equals its expected value). Similar to the linear model, the expectation is tied to a linear function of some predictor(s), here via the log link (the log of the expected value is linear in the predictors), and the model then has only the parameters defined in that linear function, as there is no extra variance parameter.
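For comparison, a sketch of a Poisson (log-linear) fit on simulated counts, again with statsmodels under illustrative parameter values: the fitted model contains only the intercept and slope, and the scale is fixed at 1, reflecting that the Poisson family has no separate variance parameter.

```python
import numpy as np
import statsmodels.api as sm

# Simulated counts (illustrative): the log of the mean is linear in x
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=200)
y = rng.poisson(np.exp(0.3 + 0.2 * x))

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

print(fit.params)   # intercept and slope on the log scale: the only parameters
print(fit.scale)    # fixed at 1.0: no separate variance parameter is estimated
```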