Big-O notation
Big-O notation is the main way people describe the runtime of algorithms. (Rightly or wrongly.) It is a worst-case analysis and has a formal mathematical definition: $f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \le c\,g(n)$ for all $n \ge n_0$.
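For example, take $f(n) = 3n^2 + 5n$ and $g(n) = n^2$ (an illustrative pair, chosen here just to show the constants at work):

$$3n^2 + 5n \;\le\; 3n^2 + n^2 \;=\; 4n^2 \quad \text{for all } n \ge 5,$$

so $3n^2 + 5n = O(n^2)$ with $c = 4$ and $n_0 = 5$.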
This is only an upper bound. If $f(n) = O(n)$, then we also have $f(n) = O(n^2)$, as $n \le n^2$ eventually.
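To spell out why, suppose $f(n) \le c\,n$ for all $n \ge n_0$. Since $n \le n^2$ whenever $n \ge 1$, the same constant also works against $n^2$:

$$f(n) \;\le\; c\,n \;\le\; c\,n^2 \quad \text{for all } n \ge \max(n_0, 1),$$

so $f(n) = O(n^2)$, and by the same argument $f(n) = O(n^k)$ for any $k \ge 1$. The definition only promises that $g$ eventually dominates $f$ up to a constant; it says nothing about how tight that bound is.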