t-value = estimate / std_error

R^2 = 1 - SS_{res}/SS_{tot}
SS_{res} = sum((y_i - y_pred)^2)   # residual sum of squares
SS_{tot} = sum((y_i - y_mean)^2)   # total sum of squares

F = ((SS_{tot} - SS_{res})/p) / (SS_{res}/(n-p-1))
n - number of data points, p - number of predictors
# https://www.rose-hulman.edu/class/ma/inlow/Math485/ftests.pdf
p-value -> 0 means high significance.

adjusted R^2 = 1 - (1 - R^2)(n-1)/(n-p-1)
Adjusted R^2 penalizes the inclusion of irrelevant predictors (e.g. highly collinear ones): it adjusts R^2 by taking into account both the number of predictors and the sample size.
https://mjt.cs.illinois.edu/ml/lec2.pdf
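A minimal sketch of these quantities for an ordinary least-squares fit, assuming numpy and scipy are available; the synthetic data, variable names, and the second (irrelevant) predictor are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 2                                   # n data points, p predictors
X = rng.normal(size=(n, p))
y = 3.0 + 1.5 * X[:, 0] + rng.normal(scale=0.5, size=n)   # X[:, 1] is irrelevant

# Design matrix with an intercept column; beta_hat via least squares.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
y_pred = A @ beta_hat

SS_res = np.sum((y - y_pred) ** 2)              # residual sum of squares
SS_tot = np.sum((y - y.mean()) ** 2)            # total sum of squares
R2 = 1 - SS_res / SS_tot
adj_R2 = 1 - (1 - R2) * (n - 1) / (n - p - 1)

# F-test of the overall regression: explained vs. residual variance.
F = ((SS_tot - SS_res) / p) / (SS_res / (n - p - 1))
F_pvalue = stats.f.sf(F, p, n - p - 1)

# t-value per coefficient: estimate / std_error.
sigma2 = SS_res / (n - p - 1)                   # residual variance estimate
cov = sigma2 * np.linalg.inv(A.T @ A)           # covariance of beta_hat
std_err = np.sqrt(np.diag(cov))
t_values = beta_hat / std_err
t_pvalues = 2 * stats.t.sf(np.abs(t_values), n - p - 1)

print(f"R^2={R2:.3f}  adj R^2={adj_R2:.3f}  F={F:.2f}  p={F_pvalue:.2e}")
print("t-values:", np.round(t_values, 2), " p-values:", np.round(t_pvalues, 4))
```

Running it, the t-value p-value for the irrelevant predictor stays large while the one for the true predictor goes toward 0, and adj R^2 sits slightly below R^2, which is the penalty for the extra predictor.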