Complexity


The complexity of a process or algorithm is a measure of how difficult it is to perform, for example in terms of the time or memory required. The study of the complexity of algorithms is known as complexity theory.

In general, complexity is a concept referring to the amount of detail required to describe a given system or result. The origin and explanation of complexity in natural, physical, and mathematical systems has been a subject of human consideration and debate for millennia and has included both theological and scientific arguments. In the modern computer age, many systems (e.g., fractals, cellular automata, and nonlinear systems involving chaos) have been discovered and devised in which a set of very simple rules leads to very complex behaviors. The investigation of such phenomena using simple programs has been undertaken by Stephen Wolfram in his ambitious work A New Kind of Science (Wolfram 2002). According to Wolfram (2002, p. 861), "Just how complexity arises was never really resolved, and in the end I believe that it is only with the ideas of this book that this can successfully be done."
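The cellular automata mentioned above can be made concrete with Rule 30, the elementary one-dimensional automaton that Wolfram uses as a central example in A New Kind of Science: each cell's new value depends only on itself and its two neighbors, yet the evolution from a single black cell appears highly complex. The following is a minimal Python sketch of this rule (the function names are illustrative, not from any particular library):

```python
# Minimal sketch of Wolfram's Rule 30 elementary cellular automaton.
# Each new cell is: left XOR (center OR right) -- a very simple rule
# that nonetheless generates complex, seemingly random patterns.

def rule30_step(cells):
    """Apply one Rule 30 update to a row of 0/1 cells, padding edges with 0."""
    padded = [0] + cells + [0]
    return [padded[i - 1] ^ (padded[i] | padded[i + 1])
            for i in range(1, len(padded) - 1)]

def evolve(width=31, steps=15):
    """Evolve from a single black cell in the center; return all rows."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = rule30_step(row)
        history.append(row)
    return history

if __name__ == "__main__":
    # Print the evolution as a triangle of '#' (1) and '.' (0) cells.
    for row in evolve():
        print("".join("#" if c else "." for c in row))
```

Running the sketch prints the familiar Rule 30 triangle, whose irregular interior is often cited as an example of complexity arising from simple rules.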


See also

Complexity Theory, Integer Complexity

References

Wolfram, S. A New Kind of Science. Champaign, IL: Wolfram Media, pp. 861-863, 2002.

Cite this as:

Weisstein, Eric W. "Complexity." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/Complexity.html
