Sum of Squares Manipulations

Intro

Simple Linear Regression (SLR) has been analyzed to death. One interesting tidbit about SLR is the different Sum of Squares formulations that exist and how they tie into just about everything. This post deconstructs the Sum of Squares formulations into alternative, equivalent expressions.

Definitions

In the least technical terms possible…

Sum of Squares provides a measurement of the total variability of a data set by squaring each point and then summing them.

$$ \sum\limits_{i = 1}^n {x_i^2} $$

More often, we use the Corrected Sum of Squares, which compares each data point to the mean of the data set to obtain a deviation and then squares it.

$$ \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } $$

where the mean is defined as: $$ \bar x = \frac{1}{n}\sum\limits_{i = 1}^n { {x_i} } $$

When we talk about Sum of Squares, it will always be the latter definition. Why? Squaring raw values directly is prone to numerical trouble with large numbers: a value like \(10^{12}\) squares to about \(10^{24}\), which is large enough to lose floating-point precision (and, at more extreme magnitudes, to overflow), whereas squaring deviations from the mean keeps the terms small.
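To make the precision point concrete, here is a minimal sketch (the specific values are illustrative, not from the post): with a large mean but a small spread, the raw squares are around \(10^{24}\), well past the range where doubles can represent integers exactly, while the corrected form stays exact.

```r
# Data with a large mean but a small spread (illustrative values)
x <- 1e12 + c(1, 2, 3)

# Uncorrected: squares are ~1e24, far beyond exact double precision (~2^53)
sum(x^2)

# Corrected: deviations are -1, 0, 1, so the result is exactly 2
sum((x - mean(x))^2)
```

No literal overflow happens at these magnitudes, but the uncorrected sum has already silently dropped low-order digits; the corrected sum has not.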

Arrangements

There are three key equations:

  1. Sum of Squares over \(x\): $$ {S_{xx} } = \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } $$
  2. Sum of Squares over \(y\): $$ {S_{yy} } = \sum\limits_{i = 1}^n { { {\left( { {y_i} - \bar y} \right)}^2} } $$
  3. Sum of \(x\) times \(y\): $$ {S_{xy} } = \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} $$

Psst… The last one isn’t a square! In fact, it’s part of what’s called covariance. It’s listed here because of the similarities in manipulations that you will see later on.
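To make the covariance connection concrete: R's built-in `cov()` divides this quantity by \(n - 1\), so \(S_{xy} = (n-1)\,\mathrm{cov}(x, y)\). A quick sketch (the simulated data here is purely illustrative):

```r
set.seed(42)
x <- rnorm(100)
y <- rnorm(100)
n <- length(x)

# Sxy straight from the definition
sxy <- sum((x - mean(x)) * (y - mean(y)))

# R's sample covariance is Sxy / (n - 1)
all.equal(sxy, (n - 1) * cov(x, y))  # TRUE
```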

These initial arrangements can be modified to take on different forms such as:

$$ \begin{align*} {S_{xx} } &= \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n{ {\bar x}^2} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){x_i} } \notag \\ \end{align*} $$

and

$$ \begin{align*} {S_{xy} } &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){y_i} } \notag \\ &= \sum\limits_{i = 1}^n {\left( { {y_i} - \bar y} \right){x_i} } \notag \\ \end{align*} $$

The next two sections go into depth on how to manipulate these equations. The key tools behind the manipulations are the definition of the mean and some basic summation properties.

Providing different forms of the Sum of Squares for \(S_{xx}\) and \(S_{yy}\)

These arrangements can be modified rather nicely to alternative expressions.

For instance, both 1 and 2 can be modified to be:

$$ \begin{align} {S_{xx} } &= \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } & \text{Definition} \notag \\ &= \sum\limits_{i = 1}^n {\left( {x_i^2 - 2{x_i}\bar x + { {\bar x}^2} } \right)} & \text{Expand the square} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2\bar x\sum\limits_{i = 1}^n { {x_i} } + { {\bar x}^2}\sum\limits_{i = 1}^n 1 & \text{Split Summation} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2\bar x\sum\limits_{i = 1}^n { {x_i} } + \underbrace {n{ {\bar x}^2} }_{\sum\limits_{i = 1}^n c = n \cdot c} & \text{Evaluate the constant sum} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2\bar x\left[ {n \cdot \frac{1}{n} } \right]\sum\limits_{i = 1}^n { {x_i} } + n{ {\bar x}^2} & \text{Multiply by 1} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2\bar xn \cdot \underbrace {\left[ {\frac{1}{n}\sum\limits_{i = 1}^n { {x_i} } } \right]}_{ = \bar x} + n{ {\bar x}^2}& \text{Group terms for mean} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2\bar xn\bar x + n{ {\bar x}^2} & \text{Substitute the mean} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - 2n{ {\bar x}^2} + n{ {\bar x}^2} & \text{Rearrange terms} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n{ {\bar x}^2} & \text{Simplify} \\ \end{align} $$

We’ll call this result the alternative definition.

We can further manipulate this expression…

$$ \begin{align} {S_{xx} } &= \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } & \text{Definition} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n{ {\bar x}^2} & \text{Previous result} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n{\left( {\frac{1}{n}\sum\limits_{i = 1}^n { {x_i} } } \right)^2} & \text{Substitute mean} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n \cdot \frac{1}{ { {n^2} } } \cdot \sum\limits_{i = 1}^n { {x_i} } \cdot \sum\limits_{i = 1}^n { {x_i} } & \text{Square terms} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - \frac{1}{n}\sum\limits_{i = 1}^n { {x_i} } \cdot \sum\limits_{i = 1}^n { {x_i} } & \text{Simplify} \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - \bar x\sum\limits_{i = 1}^n { {x_i} } & \text{Substitute mean} \notag \\ &= \sum\limits_{i = 1}^n {\left( {x_i^2 - \bar x{x_i} } \right)} & \text{One summation} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} \cdot {x_i} - \bar x \cdot {x_i} } \right)} & \text{Expand} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){x_i} } & \text{Factor} \\ \end{align} $$

The last result we’ll refer to as the exterior definition.

Therefore, as stated previously, we have:

$$ \begin{align*} {S_{xx} } &= \sum\limits_{i = 1}^n { { {\left( { {x_i} - \bar x} \right)}^2} } \notag \\ &= \sum\limits_{i = 1}^n {x_i^2} - n{ {\bar x}^2} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){x_i} } \notag \\ \end{align*} $$

Psst… For \(S_{yy}\), simply replace every \(x\) you see above with a \(y\).

e.g.

$$ \begin{align*} {S_{yy} } &= \sum\limits_{i = 1}^n { { {\left( { {y_i} - \bar y} \right)}^2} } \notag \\ &= \sum\limits_{i = 1}^n {y_i^2} - n{ {\bar y}^2} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {y_i} - \bar y} \right){y_i} } \notag \\ \end{align*} $$
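As an aside, these quantities are exactly what R's built-in `var()` computes up to scaling: the sample variance is \(S_{xx}/(n-1)\). A quick sketch to confirm (again with illustrative simulated data):

```r
set.seed(1)
x <- rnorm(50)
n <- length(x)

# Sxx straight from the definition
sxx <- sum((x - mean(x))^2)

# R's sample variance is Sxx / (n - 1)
all.equal(sxx, (n - 1) * var(x))  # TRUE
```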

Exploring the different forms of \(S_{xy}\)

Based on the previous section, what comes next should not be very surprising. The only real differences between these two sections are the inclusion of a second variable AND the assumption that the number of observations in \(x\) and \(y\) is the same (i.e. \(n_{x} = n_{y} = n\)).

$$ \begin{align} {S_{xy} } &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} & \text{Definition} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i}{y_i} - {x_i}\bar y - \bar x{y_i} + \bar x\bar y} \right)} & \text{Expand the product} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar y\sum\limits_{i = 1}^n { {x_i} } - \bar x\sum\limits_{i = 1}^n { {y_i} } + \bar x\bar y\sum\limits_{i = 1}^n 1 & \text{Split Summation} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar y\sum\limits_{i = 1}^n { {x_i} } - \bar x\sum\limits_{i = 1}^n { {y_i} } + n\bar x\bar y & \text{Simplify} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar y \cdot \left[ {n \cdot \frac{1}{n} } \right] \cdot \sum\limits_{i = 1}^n { {x_i} } - \bar x \cdot \left[ {n \cdot \frac{1}{n} } \right] \cdot \sum\limits_{i = 1}^n { {y_i} } + n\bar x\bar y & \text{Multiply by 1} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar yn\left[ {\frac{1}{n} \cdot \sum\limits_{i = 1}^n { {x_i} } } \right] - \bar xn\left[ {\frac{1}{n} \cdot \sum\limits_{i = 1}^n { {y_i} } } \right] + n\bar x\bar y & \text{Identify Means} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar yn\bar x - \bar xn\bar y + n\bar x\bar y & \text{Substitute Means} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y - n\bar x\bar y + n\bar x\bar y & \text{Rearrange Terms} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y & \text{Simplify} \\ \end{align} $$

The result is a modified version of the alternative definition.

We can obtain a similar form as in the previous section, except this time we must choose to have either \(y_i\) or \(x_i\) on the exterior… Let's start by opting for \(y_i\) on the exterior:

$$ \begin{align} {S_{xy} } &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} & \text{Definition} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y & \text{Previous Result} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\left[ {\frac{1}{n}\sum\limits_{i = 1}^n { {y_i} } } \right] & \text{Substitute Mean} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar x\sum\limits_{i = 1}^n { {y_i} } & \text{Simplify} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} \cdot {y_i} - \bar x \cdot {y_i} } \right)} & \text{One summation} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){y_i} } & \text{Factor} \notag \\ \end{align} $$

Alternatively, we can go the opposite route and have \(x_i\) on the exterior:

$$ \begin{align} {S_{xy} } &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} & \text{Definition} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y & \text{Previous Result} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\left[ {\frac{1}{n}\sum\limits_{i = 1}^n { {x_i} } } \right]\bar y & \text{Substitute Mean} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - \bar y\sum\limits_{i = 1}^n { {x_i} } & \text{Simplify} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} \cdot {y_i} - {x_i} \cdot \bar y} \right)} & \text{One Summation} \notag \\ &= \sum\limits_{i = 1}^n {\left( { {y_i} - \bar y} \right){x_i} } & \text{Factor} \notag \\ \end{align} $$

Both are results from the exterior definition.

Therefore, we have the following equations:

$$ \begin{align*} {S_{xy} } &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right)\left( { {y_i} - \bar y} \right)} \notag \\ &= \sum\limits_{i = 1}^n { {x_i}{y_i} } - n\bar x\bar y \notag \\ &= \sum\limits_{i = 1}^n {\left( { {x_i} - \bar x} \right){y_i} } \notag \\ &= \sum\limits_{i = 1}^n {\left( { {y_i} - \bar y} \right){x_i} } \notag \\ \end{align*} $$
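These quantities are precisely what shows up in SLR: the least-squares slope is \(\hat\beta_1 = S_{xy}/S_{xx}\) and the intercept is \(\hat\beta_0 = \bar y - \hat\beta_1 \bar x\). A quick sketch verifying this against R's `lm()` (the simulated data and coefficients below are illustrative):

```r
set.seed(2024)
x <- rnorm(200)
y <- 2 + 3 * x + rnorm(200)

# Sxx and Sxy straight from the definitions
sxx <- sum((x - mean(x))^2)
sxy <- sum((x - mean(x)) * (y - mean(y)))

# Closed-form SLR estimates
beta1 <- sxy / sxx
beta0 <- mean(y) - beta1 * mean(x)

# Should match lm()'s coefficients (intercept, slope)
fit <- lm(y ~ x)
all.equal(unname(coef(fit)), c(beta0, beta1))  # TRUE
```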

A simple test

The above manipulations can be scrutinized by checking them numerically. To do so, let's quickly write a few R functions and compare their output.

# For reproducibility
set.seed(1337)

# Generate some random data
x = rnorm(10000,3,2)
y = rnorm(10000,1,4)

Let’s formulize the definitions.

# Sxx and Syy definition
s.xx = function(x){
  sum((x-mean(x))^2)
}

# Sxx and Syy Alternative definition
s.xx.alt = function(x){
  n = length(x)
  sum(x^2) - n*mean(x)^2
}

# Sxx and Syy Exterior definition
s.xx.ext = function(x){
  sum((x-mean(x))*x)
}

# Sxy Definition
s.xy = function(x,y){
 sum((x-mean(x))*(y-mean(y))) 
}

# Sxy Alternative Definition
s.xy.alt = function(x,y){
  n = length(x)
  sum(x*y) - n*mean(x)*mean(y)
}

# Sxy Exterior Definition
s.xy.ext = function(x,y){
  sum((x-mean(x))*y)
}

Now, let’s see the results of each function:

### Sxx and Syy

# Do all three forms give the same value for Sxx?
# (Note: all.equal() only compares two values; a third argument is the tolerance.)
isTRUE(all.equal(s.xx(x), s.xx.alt(x))) && isTRUE(all.equal(s.xx(x), s.xx.ext(x)))
## [1] TRUE
# What is the value?
s.xx.ext(x)
## [1] 40066.65
### Sxy

# Do all three forms give the same value for Sxy?
isTRUE(all.equal(s.xy(x,y), s.xy.alt(x,y))) && isTRUE(all.equal(s.xy(x,y), s.xy.ext(x,y)))
## [1] TRUE
# What is the value?
s.xy.ext(x,y)
## [1] 330.3306

Timing

Aside from the derivations and the simple tests, there is one other item to consider: the amount of time it takes to compute each form.

# install.packages("microbenchmark")

# Load microbenchmark
library(microbenchmark)

# Benchmark Sxx definition against x data
microbenchmark(s.xx(x), s.xx.alt(x), s.xx.ext(x))
## Unit: microseconds
##         expr    min      lq      mean  median      uq      max neval
##      s.xx(x) 44.068 61.1995 108.13158 84.3010 92.3055 1894.702   100
##  s.xx.alt(x) 40.544 77.5640 113.26308 81.9625 87.5530 2436.758   100
##  s.xx.ext(x) 40.435 65.1715  77.74217 80.6340 90.6615  139.399   100
# Benchmark Syy definition against y data
microbenchmark(s.xx(y), s.xx.alt(y), s.xx.ext(y))
## Unit: microseconds
##         expr    min      lq     mean median      uq      max neval
##      s.xx(y) 45.495 46.7215 51.48408 47.822 51.3955   83.942   100
##  s.xx.alt(y) 41.086 41.8400 65.30329 43.225 56.5315 1294.073   100
##  s.xx.ext(y) 41.768 43.3175 59.98990 44.368 52.8850  889.703   100
# Benchmark Sxy Definition
microbenchmark(s.xy(x,y), s.xy.alt(x,y), s.xy.ext(x,y))
## Unit: microseconds
##            expr    min      lq      mean  median       uq      max neval
##      s.xy(x, y) 67.921 76.8185 136.18511 86.6165 145.4465 2916.835   100
##  s.xy.alt(x, y) 59.420 65.5575 144.93525 70.4935 109.7685 2751.354   100
##  s.xy.ext(x, y) 42.314 48.9660  87.40234 57.3085  89.5770 1883.336   100

In these runs, the timings are close: the exterior definition had the fastest median for both \(S_{xx}\) and \(S_{xy}\), while the alternative definition edged out the others for \(S_{yy}\). As with any microbenchmark, the exact ordering can vary from run to run, so the differences here should not be over-interpreted.