Run time complexity

There are three main notations people use to analyse the run time complexity of algorithms.

Big-O notation

Big-O notation is the main way people determine the run time of algorithms. (Rightly or wrongly.) It is the worst case analysis and has a formal mathematical definition: $f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \le c \cdot g(n)$ for all $n \ge n_0$. This is just saying that eventually $f(n)$ is at most a scalar multiple of $g(n)$.
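For instance (a minimal sketch, assuming the example functions $f(n) = 3n + 5$ and $g(n) = n$, which are not from the text), we can spot-check a pair of witnesses $c$ and $n_0$ numerically. This is not a proof, only a finite sanity check up to some limit:

```python
def is_bounded_above(f, g, c, n0, limit=10_000):
    """Spot-check the Big-O witnesses: f(n) <= c * g(n) for n0 <= n < limit."""
    return all(f(n) <= c * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 5   # example function; f(n) = O(n)
g = lambda n: n

print(is_bounded_above(f, g, c=4, n0=5))  # True: 3n + 5 <= 4n once n >= 5
```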

This is only an upper bound

If $f(n) = O(n)$ then we also have that $f(n) = O(n^2)$, as $n \le n^2$ eventually.
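Reusing the hypothetical is_bounded_above helper from the sketch above, the same $f$ also satisfies the definition against $n^2$:

```python
# 3n + 5 <= 1 * n^2 once n >= 5, so f(n) = O(n^2) as well
print(is_bounded_above(f, lambda n: n * n, c=1, n0=5))  # True
```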

Big-Omega notation

Big-Omega notation is said to be the best case running time. However, formally it is simply a lower bound on the run time - just as Big-O notation is simply an upper bound. It has a similar formal definition: $f(n) = \Omega(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \ge c \cdot g(n)$ for all $n \ge n_0$. This is just saying that eventually $f(n)$ is at least a scalar multiple of $g(n)$.
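The same kind of finite spot-check works for the lower bound (again a sketch using the assumed $f(n) = 3n + 5$, not a function from the text):

```python
def is_bounded_below(f, g, c, n0, limit=10_000):
    """Spot-check the Big-Omega witnesses: f(n) >= c * g(n) for n0 <= n < limit."""
    return all(f(n) >= c * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 5
g = lambda n: n

print(is_bounded_below(f, g, c=3, n0=1))  # True: 3n + 5 >= 3n for every n
```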

Big-Theta notation

Big-Theta notation is the tight bound on the run time. It is only defined when a function bounds the run time from both sides: $f(n) = \Theta(g(n))$ when $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$. Big-Theta is also an equivalence relation, so if $f(n) = \Theta(g(n))$ then $g(n) = \Theta(f(n))$ as well.
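Combining the two checks gives a Theta spot-check, and swapping the arguments illustrates the symmetry (the helper and the witness constants are assumptions continuing the sketches above):

```python
def is_big_theta(f, g, c1, c2, n0, limit=10_000):
    """Spot-check f(n) = Theta(g(n)): c1 * g(n) <= f(n) <= c2 * g(n) for n0 <= n < limit."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 5
g = lambda n: n

print(is_big_theta(f, g, c1=3, c2=4, n0=5))      # True: f(n) = Theta(n)
print(is_big_theta(g, f, c1=1/8, c2=1/3, n0=1))  # True: n = Theta(3n + 5) as well
```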

Comments

In some algorithmic disciplines these notions are not sufficient, and people talk about the average run time of an algorithm. Whilst Big-O notation determines a function's worst case complexity, in practical terms that worst case may only be realised in very specific circumstances - the algorithm could have a much better average run time on most of the cases people care about and come across in real life. For example, quicksort has a worst case run time of $O(n^2)$ but an average run time of $O(n \log n)$.
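To see that gap concretely, here is a sketch that counts the comparisons made by a quicksort which always picks the last element as its pivot: roughly $n \log n$ of them on a random input, but about $n^2/2$ on an already-sorted one:

```python
import random

def quicksort_comparisons(items):
    """Count comparisons made by an in-place quicksort that uses the last
    element as pivot (explicit stack instead of recursion)."""
    a = list(items)
    comparisons = 0
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):          # Lomuto partition scheme
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]        # place pivot in its final slot
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))
    return comparisons

n = 1000
print(quicksort_comparisons(random.sample(range(n), n)))  # average case: ~ n log n
print(quicksort_comparisons(range(n)))                    # worst case here: ~ n^2 / 2
```

Randomising the pivot choice makes the sorted input above behave like the average case, which is the usual fix in practice.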