Big-O and related notations are intended to capture the aspects of algorithm performance that are most inherent to the algorithm, independent of how it is being run and measured.
Constant multipliers depend on the unit of measurement: seconds vs. microseconds vs. instructions vs. loop iterations. Even in the same units, they differ across systems. The same algorithm may take 20n instructions on one instruction set and 30n on another; it may take 0.5n microseconds on one system and 10n microseconds on another.
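As a rough illustration (a hypothetical sketch, not a real benchmark), the snippet below runs the same linear scan and reports two different units: loop iterations, which come out the same on every machine, and wall-clock time, which does not.

```python
import time

def linear_scan(xs):
    """Touch every element once: exactly n iterations, on any hardware."""
    total = 0
    count = 0
    for x in xs:          # n loop iterations regardless of machine
        total += x
        count += 1
    return total, count

data = list(range(1_000_000))
start = time.perf_counter()
_, iterations = linear_scan(data)
elapsed = time.perf_counter() - start

# Iterations are a machine-independent unit: always n.
# Seconds are machine-dependent: the constant multiplier changes with
# the CPU and the interpreter, but the growth is linear either way.
print(f"n = {len(data)}: {iterations} iterations, {elapsed:.4f} s")
```

Run on two different machines, the iteration count stays fixed while the elapsed time changes by some constant factor; the linear growth is what both measurements share.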
Many of the basic algorithm complexities you will see in the literature were calculated decades ago, but remain meaningful across significant changes in processor architecture and even more significant changes in performance.
Similar considerations apply to start-up costs and other fixed overheads.
A function f(n) is O(n) if there exist constants N and c such that, for all n >= N, f(n) <= cn. For f(n) = 2n the constants are N = 0 and c = 2. The first constant, N, is about ignoring overhead; the second, c, is about ignoring constant multipliers.
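As a sketch of how those witness constants work (the helper name is hypothetical, and a finite check can refute a bad pair (N, c) but cannot prove the bound for all n), the code below tests whether given N and c satisfy f(n) <= cn over a sample range:

```python
def satisfies_big_o_n(f, c, N, upper=10_000):
    """Check f(n) <= c*n for all N <= n <= upper.

    This is only a sanity check on the witnesses: passing a finite
    range does not prove the bound holds for every n.
    """
    return all(f(n) <= c * n for n in range(max(N, 0), upper + 1))

# f(n) = 2n with the witnesses from the text, N = 0 and c = 2:
print(satisfies_big_o_n(lambda n: 2 * n, c=2, N=0))          # True

# f(n) = 2n + 100: c = 2 alone fails for small n ...
print(satisfies_big_o_n(lambda n: 2 * n + 100, c=2, N=0))    # False
# ... but a larger N absorbs the fixed start-up cost:
print(satisfies_big_o_n(lambda n: 2 * n + 100, c=3, N=100))  # True
```

The second example mirrors the point about overhead: raising N lets the definition ignore a fixed start-up cost, while c absorbs the constant multiplier.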