Another useful case of regression, especially for experimental data, arises when the dependent variable y depends on two or more independent variables (x1, x2, …, xn) as:

$$y = a_0 + a_1 x_1 + a_2 x_2 + \cdots + a_n x_n$$
The best fit is the one that minimizes the sum of the squared residuals $S_r$, that is, the error between the model and the experimental data, given by the vertical distance between the points and the fitted model:
$$S_r = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i}\right)^2$$
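As a minimal sketch of this idea, the coefficients that minimize $S_r$ can be found with NumPy's least-squares solver; the data values here are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical experimental data: y depends on two variables x1 and x2
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
x2 = np.array([1.0, 0.5, 2.0, 2.5, 4.0])
y  = np.array([2.1, 3.9, 7.2, 8.8, 12.1])

# Design matrix with columns [1, x1, x2]; lstsq minimizes Sr directly
A = np.column_stack([np.ones_like(x1), x1, x2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a0, a1, a2 = coeffs

# Sum of squared residuals Sr for the fitted coefficients
Sr = np.sum((y - a0 - a1 * x1 - a2 * x2) ** 2)
```

Any other choice of coefficients yields an equal or larger $S_r$, which is exactly the least-squares criterion.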
To quantify the total spread of the data, we sum the squared difference between each point and the mean of the y values:
$$S_t = \sum_{i=1}^{n} \left(y_i - \bar{y}\right)^2$$
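The total sum of squares is straightforward to compute; this short sketch reuses the same hypothetical y values as above:

```python
import numpy as np

# Hypothetical y values (same illustrative data as before)
y = np.array([2.1, 3.9, 7.2, 8.8, 12.1])

# Total sum of squares St: squared deviations from the mean of y
St = np.sum((y - y.mean()) ** 2)
```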
The coefficient of correlation (a measure of how well the regression fits the data) is calculated as: