INSERTION_SORT(A)
    FOR j ← 2 TO length[A]              // c1 * n
        key ← A[j]                      // c2 * (n−1)
        i ← j − 1                       // c3 * (n−1)
        WHILE i > 0 and A[i] > key      // c4 * Σ(j=2 to n) t_j
            A[i+1] ← A[i]               // c5 * Σ(j=2 to n) (t_j − 1)
            i ← i − 1                   // c6 * Σ(j=2 to n) (t_j − 1)
        A[i+1] ← key                    // c7 * (n−1)
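To make the t_j bookkeeping concrete, here is a minimal runnable sketch in Python (the function name insertion_sort_with_counts and the counter dict are my own additions, not from the book) that performs the same algorithm and records how many times the while-loop test executes for each value of j:

    def insertion_sort_with_counts(A):
        """Insertion sort on a 0-indexed list, reporting t_j per book-indexed j.

        t[j] is the number of times the while-loop test is evaluated for
        that iteration of the outer loop -- the t_j in the running-time table.
        """
        t = {}
        for j in range(1, len(A)):           # book: FOR j = 2 TO length[A]
            key = A[j]
            i = j - 1
            tests = 1                        # the test is evaluated at least once
            while i >= 0 and A[i] > key:     # book: WHILE i > 0 and A[i] > key
                A[i + 1] = A[i]              # shift the larger element right
                i -= 1
                tests += 1                   # the test runs again after each shift
            A[i + 1] = key
            t[j + 1] = tests                 # store under the book's 1-based j
        return A, t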
The costs c1, c2, c3, and c7 make total sense. What doesn't make sense to me is why, in the worst-case analysis,
c4 * Σ(j=2 to n) t_j becomes c4 * Σ(j=2 to n) j.
Let's say we are calculating the worst case for an array of size 5. What that substitution says is that the time spent on the while-loop test (the c4 line) is:
c4*(2+3+4+5)
When in reality it is:
c4*((2)+(2+3)+(2+3+4)+(2+3+4+5))
What am I missing? I know the book has to be right.
EDIT: Screwed up. Each t_j only counts the while-test executions for that single value of j: j−1 comparisons that succeed plus the one test that finally fails, so t_j = j in the worst case. It should be:
c4*((1+1)+(1+1+1)+(1+1+1+1)+(1+1+1+1+1))
=c4*(2+3+4+5)
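A quick sanity check with the sketch above (a reverse-sorted input is the worst case):

    sorted_A, t = insertion_sort_with_counts([5, 4, 3, 2, 1])
    print(sorted_A)  # [1, 2, 3, 4, 5]
    print(t)         # {2: 2, 3: 3, 4: 4, 5: 5} -> t_j = j, sum = 2+3+4+5 = 14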