€22.99
incl. VAT

Ready to ship in 6-10 days
  • Paperback

Product description
High Quality Content by WIKIPEDIA articles! Total least squares, also known as errors-in-variables, rigorous least squares, or orthogonal regression, is a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account. It is a generalization of Deming regression and can be applied to both linear and non-linear models.

In the least squares method of data modeling, the objective function $S = \mathbf{r}^\mathsf{T}\mathbf{W}\mathbf{r}$ is minimized. In linear least squares the model is defined as a linear combination of the parameters $\boldsymbol{\beta}$, so the residuals are given by $\mathbf{r} = \mathbf{y} - \mathbf{X}\boldsymbol{\beta}$. There are $m$ observations in $\mathbf{y}$ and $n$ parameters in $\boldsymbol{\beta}$, with $m > n$. $\mathbf{X}$ is an $m \times n$ matrix whose elements are either constants or functions of the independent variables $x$. The weight matrix $\mathbf{W}$ is, ideally, the inverse of the variance-covariance matrix $\mathbf{M}_y$ of the observations $\mathbf{y}$. The independent variables are assumed to be error-free. The parameter estimates are found by setting the gradient equations to zero, which results in the normal equations $\mathbf{X}^\mathsf{T}\mathbf{W}\mathbf{X}\boldsymbol{\beta} = \mathbf{X}^\mathsf{T}\mathbf{W}\mathbf{y}$.
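The normal equations above can be solved directly once the design matrix and weights are assembled. The following is a minimal sketch in Python (assuming NumPy; the data, noise level, and variable names are illustrative, not from the source): it first solves the weighted normal equations for a straight-line model with error-free $x$, then, for contrast, computes an orthogonal (total least squares) line fit via the SVD, one common way to handle the errors-in-both-variables case the description opens with.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: y = 2 + 3x plus Gaussian noise.
x = np.linspace(0.0, 10.0, 20)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=x.size)

# Design matrix X: a column of ones (intercept) and a column of x values.
X = np.column_stack([np.ones_like(x), x])

# Weight matrix W: ideally the inverse of the covariance matrix M_y of the
# observations; here the observations are assumed independent with equal
# variance sigma^2 = 0.5^2, so W = I / sigma^2.
W = np.eye(x.size) / 0.5**2

# Solve the normal equations X^T W X beta = X^T W y for beta.
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("weighted LS intercept and slope:", beta)

# Orthogonal (total least squares) straight-line fit: with errors in both
# variables, the best-fit line passes through the centroid of the data
# along the leading right singular vector of the centered data matrix.
pts = np.column_stack([x, y])
centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid)
direction = vt[0]                      # direction of the fitted line
slope_tls = direction[1] / direction[0]
intercept_tls = centroid[1] - slope_tls * centroid[0]
print("TLS intercept and slope:", intercept_tls, slope_tls)
```

Note that the first fit treats $x$ as error-free, exactly as the description states, while the SVD-based fit minimizes perpendicular distances to the line; with noise only in $y$, the two estimates will be close but not identical.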