The time complexity of an algorithm refers to the computational complexity that describes the amount of time it takes to run. Running time can range from constant time, which means the algorithm is very fast, all the way up to factorial time, which means it is extremely slow. So what are the other categories between constant time and factorial time? The most common ones, in increasing order of growth, are logarithmic, linear, linearithmic (n log n), quadratic, and exponential time.
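To make the gap between these categories concrete, here is a small sketch that tabulates roughly how many "steps" each common complexity class takes as the input size n grows. The function name `growth_table` is just an illustrative helper, not a standard library API.

```python
import math

# Hypothetical helper: for each input size n, show the step count
# implied by the common complexity classes, from O(1) up to O(n!).
def growth_table(sizes):
    rows = []
    for n in sizes:
        rows.append({
            "n": n,
            "O(1)": 1,
            "O(log n)": round(math.log2(n), 1),
            "O(n)": n,
            "O(n log n)": round(n * math.log2(n), 1),
            "O(n^2)": n ** 2,
            "O(2^n)": 2 ** n,
            "O(n!)": math.factorial(n),
        })
    return rows

# Even at tiny sizes, the factorial column races ahead of the rest.
for row in growth_table([1, 2, 4, 8]):
    print(row)
```

Notice how quickly the last two columns explode: at n = 8 a quadratic algorithm does 64 steps, while a factorial one already does 40,320.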