Complexity

Mere computability investigates the algorithmic solvability of (decision or function) problems in principle. Although this is often already difficult enough to establish (and for many natural problems even fails to hold), in practice one usually wants an efficient solution: by devising data structures and algorithms and analyzing their worst-case behavior. Here, exponential asymptotic growth is generally considered inefficient -- yet a refined diagonalization argument (the time hierarchy theorem due to Hartmanis and Stearns) proves that some decidable problems do require such running times for their solution. This raises the question of the optimality of an algorithm, that is, how close it comes asymptotically to the intrinsic difficulty of the problem it solves. Put differently: is it worthwhile to try to improve it or not?
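
As a concrete illustration of such worst-case analysis, the following Python sketch (the choice of problem and all names are ours, purely for illustration) counts the dominant operation -- element comparisons -- of two algorithms that both decide whether a sequence contains a repeated element: a pairwise scan with quadratic worst case, and a sort-based variant whose worst case grows only like n log n.

    from functools import cmp_to_key
    import random

    def duplicates_pairwise(xs):
        # Compare all pairs: worst case n*(n-1)/2 comparisons, i.e. quadratic growth.
        count, n = 0, len(xs)
        for i in range(n):
            for j in range(i + 1, n):
                count += 1
                if xs[i] == xs[j]:
                    return True, count
        return False, count

    def duplicates_by_sorting(xs):
        # Sort first, then compare neighbours: O(n log n) comparisons in the worst case.
        count = 0
        def cmp(a, b):
            nonlocal count
            count += 1
            return (a > b) - (a < b)
        ys = sorted(xs, key=cmp_to_key(cmp))
        for a, b in zip(ys, ys[1:]):
            count += 1
            if a == b:
                return True, count
        return False, count

    if __name__ == "__main__":
        random.seed(0)
        worst = random.sample(range(10**6), 2000)  # 2000 distinct values: forces the full work
        print("pairwise comparisons:  ", duplicates_pairwise(worst)[1])   # about 2 * 10^6
        print("sort-based comparisons:", duplicates_by_sorting(worst)[1]) # about 2 * 10^4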

To avoid possible confusion with heuristics: we consider here as an algorithm only one that is formally correct and whose asymptotic running-time bound is provably obeyed on every possible input.

This 'intrinsic difficulty' of a problem is called its complexity; the running time of any algorithm solving it constitutes (asymptotically) an upper bound on it. If this upper bound can be complemented by a matching lower bound, then the algorithm is optimal.
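
A classical instance where such bounds meet is sorting by pairwise comparisons: mergesort performs O(n log n) comparisons, while an information-theoretic argument shows that every comparison sort needs at least log2(n!) = Omega(n log n) of them in the worst case -- so mergesort is asymptotically optimal in this model. The following Python sketch (function names are ours, for illustration) counts the comparisons of a plain mergesort on a random input and prints that lower bound next to it.

    import math
    import random

    def merge_sort(xs, counter):
        # Standard mergesort; counter[0] accumulates the number of element comparisons.
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left = merge_sort(xs[:mid], counter)
        right = merge_sort(xs[mid:], counter)
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            counter[0] += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    if __name__ == "__main__":
        random.seed(1)
        n = 1024
        data = random.sample(range(10**6), n)
        counter = [0]
        assert merge_sort(data, counter) == sorted(data)
        # Every comparison sort needs at least ceil(log2(n!)) comparisons in the worst case.
        lower_bound = math.ceil(math.log2(math.factorial(n)))
        print("mergesort comparisons: ", counter[0])   # both values are close to
        print("comparison lower bound:", lower_bound)  # n*log2(n), i.e. about 10^4 here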

Such a lower bound asserts that every conceivable algorithm solving the problem requires asymptotically a certain number of steps. Establishing a claim about all possible (even prospective) algorithms is, understandably, usually quite challenging; a rare easy case is finding the maximum of n unordered numbers, where any correct algorithm must inspect every element and hence take at least n steps. Moreover, for such a claim about all algorithms to make sense, the underlying model of computation needs to be specified rather precisely.

Some minor details of its definition often become irrelevant when focusing on asymptotic growth, though. Moreover, switching to an asymptotically superior algorithm beats, for sufficiently large inputs, any constant-factor performance improvement gained by, e.g., purchasing faster hardware.
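
As a back-of-the-envelope illustration (all constants here are invented), compare a quadratic-time algorithm run on hardware that is 100 times faster against an n log n algorithm run on ordinary hardware; the short Python sketch below locates the input size from which the asymptotically better algorithm wins despite the hardware handicap.

    import math

    def steps_fast_hardware(n):
        # Quadratic algorithm, but executed on 100x faster hardware.
        return n * n / 100

    def steps_better_algorithm(n):
        # n log n algorithm on ordinary hardware.
        return n * math.log2(n)

    n = 2
    while steps_fast_hardware(n) <= steps_better_algorithm(n):
        n *= 2
    # Prints roughly n = 1024: from there on, the better algorithm wins anyway.
    print("asymptotically better algorithm takes the lead around n =", n)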

As indicated above, time (i.e. the number of steps) is often the primary criterion for efficiency. However, 'fast' algorithms in this sense may turn out impractical because they would exceed other limited resources such as space (i.e. memory consumption), communication, or the number of processors in parallel computing.
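
To make the space aspect tangible, here is a minimal Python sketch (the choice of problem is purely illustrative): two implementations of edit distance with the same Theta(n*m) running time, where the first stores the full dynamic-programming table while the second keeps only two rows -- for long inputs it is the table, not the time, that becomes the bottleneck.

    def edit_distance_full_table(a, b):
        # Theta(n*m) time and Theta(n*m) memory: the whole table is kept.
        n, m = len(a), len(b)
        d = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            d[i][0] = i
        for j in range(m + 1):
            d[0][j] = j
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
        return d[n][m]

    def edit_distance_two_rows(a, b):
        # Same Theta(n*m) time, but only Theta(min(n, m)) memory.
        if len(b) > len(a):
            a, b = b, a
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i] + [0] * len(b)
            for j, cb in enumerate(b, 1):
                cost = 0 if ca == cb else 1
                curr[j] = min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost)
            prev = curr
        return prev[len(b)]

    if __name__ == "__main__":
        assert edit_distance_full_table("kitten", "sitting") == 3
        assert edit_distance_two_rows("kitten", "sitting") == 3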