Big-O notation

Big-O notation is the main way people describe the running time of algorithms. (Rightly or wrongly.) It is a worst-case analysis and has a formal mathematical definition: $f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \le c \, g(n)$ for all $n \ge n_0$. This is just saying that eventually $f$ is at most a scalar multiple of $g$.
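
To make the definition concrete, here is a minimal Python sketch that exhibits witnesses $c$ and $n_0$ for a particular pair of functions and spot-checks the inequality over a range of inputs. The functions `f` and `g` and the witnesses below are illustrative choices, not part of any library:

```python
# Spot-check the definition of f(n) = O(g(n)): exhibit witnesses
# c and n0, then verify f(n) <= c * g(n) for every sampled n >= n0.
# f, g, c, and n0 are illustrative assumptions for this sketch.

def f(n: int) -> int:
    return 3 * n + 10  # e.g. a step count for some algorithm

def g(n: int) -> int:
    return n           # the claimed growth rate: f(n) = O(n)

c, n0 = 4, 10          # witnesses: 3n + 10 <= 4n whenever n >= 10

assert all(f(n) <= c * g(n) for n in range(n0, 10_000)), \
    "definition violated for some sampled n >= n0"
print(f"f(n) <= {c} * g(n) holds for all sampled n >= {n0}")
```

A finite check like this cannot prove the asymptotic claim, but it is a useful sanity test of candidate witnesses.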

This is only an upper bound

If $f(n) = O(n)$ then we also have that $f(n) = O(n^2)$, as $n \le n^2$ eventually.
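
For completeness, a sketch of the standard one-line argument behind this containment:

```latex
% If f(n) <= c * n for all n >= n0, then since n <= n^2 for n >= 1:
\[
  f(n) \le c\,n \le c\,n^{2}
  \quad\text{for all } n \ge \max(n_0, 1),
\]
% so the same constant c witnesses f(n) = O(n^2).
```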