Some discrete data is poorly represented by a straight line, so fitting it to a curve gives a better model; in those cases polynomial regression is the better option.
To fit a polynomial, the least-squares method extends naturally: the model is written as a polynomial of degree m, its error is expressed as a function of the coefficients, and the partial derivatives of that error with respect to each coefficient a0, a1, ..., am are taken.
Setting these derivative equations equal to zero turns finding the coefficients of an m-th degree polynomial into the problem of solving a system of m+1 linear equations.
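As a sketch of this step, the system of m+1 normal equations can be built and solved numerically. The snippet below (a minimal illustration using NumPy; the function name `polyfit_normal_equations` is ours, not from any library) forms the design matrix whose columns are the powers of x and solves the resulting linear system:

```python
import numpy as np

def polyfit_normal_equations(x, y, m):
    """Fit an m-th degree polynomial by solving the normal equations.

    Returns coefficients a0..am such that
    y ≈ a0 + a1*x + a2*x**2 + ... + am*x**m.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: column j holds x**j (Vandermonde matrix)
    A = np.vander(x, m + 1, increasing=True)
    # Normal equations (A^T A) a = A^T y: a system of m+1 linear equations
    return np.linalg.solve(A.T @ A, A.T @ y)

# Example: points taken exactly from y = 1 + 2x + 3x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2
print(polyfit_normal_equations(x, y, 2))  # ≈ [1. 2. 3.]
```

In practice `numpy.polyfit` does the same job more robustly (it uses a least-squares solver instead of forming A^T A explicitly), but solving the normal equations directly mirrors the derivation above.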
Error calculation
The best fit is the one that minimizes the sum of the squared residuals, Sr: the error between the model and the experimental data, measured as the vertical distance between each point and the fitted curve:
$$S_r = \sum_{i=1}^{n}\left(y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m\right)^2$$
To quantify the total spread of the data, we also need St, the sum of the squared differences between each point and the mean of y:
$$S_t = \sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2$$
The coefficient of determination (a measure of how good the regression is) is calculated as:

$$r^2 = \frac{S_t - S_r}{S_t}$$

The coefficient of correlation r is its square root; a value of r² close to 1 means the model accounts for almost all of the variation in the data.
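The error quantities above can be computed directly from the fitted coefficients. The sketch below (the helper name `regression_error_metrics` is ours; coefficients are assumed in the order a0..am) evaluates the polynomial at each data point and returns Sr, St, and r²:

```python
import numpy as np

def regression_error_metrics(x, y, coeffs):
    """Compute Sr, St and the coefficient of determination r**2.

    coeffs holds a0..am, i.e. the model a0 + a1*x + ... + am*x**m.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # np.polyval expects the highest-degree coefficient first, so reverse
    y_model = np.polyval(np.asarray(coeffs)[::-1], x)
    Sr = np.sum((y - y_model) ** 2)    # squared residuals vs the model
    St = np.sum((y - y.mean()) ** 2)   # squared deviations from the mean
    r2 = (St - Sr) / St
    return Sr, St, r2

# A model that passes exactly through the data gives Sr = 0 and r^2 = 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2
Sr, St, r2 = regression_error_metrics(x, y, [1.0, 2.0, 3.0])
print(Sr, St, r2)
```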