Complexity is often described as a situation where the whole is greater than the sum of its parts. While this description is true on the surface, it misses the real point: complexity is about the whole being much less than the sum of its parts. Let me explain. Consider a television screen with 100 pixels, each of which can be either black or white. The number of possible images the screen can show is 2^100. That’s a really big number. Most of those images would look like random white noise. However, a small set of them would look like things you recognize, like dogs and trees and salmon tartare coronets. This narrowing of possibilities, or reduction in entropy to be more technical, increases information content and complexity. However, too much reduction of entropy, such as restricting the screen to be entirely black or entirely white, would also yield low complexity. Hence, what we call complexity arises when the possibilities are restricted but not completely restricted.
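To make the entropy bookkeeping concrete, here is a minimal sketch of the three regimes. The million-image "recognizable" subset is an invented illustrative number, not anything derived from the post:

```python
import math

# Every pixel free: 2^100 possible images, maximal entropy (100 bits).
n_pixels = 100
total_images = 2 ** n_pixels
full_entropy_bits = math.log2(total_images)  # = 100

# A hypothetical "recognizable" subset of, say, a million images:
# far fewer bits, but still many possibilities -- the complex regime.
recognizable = 10 ** 6
subset_entropy_bits = math.log2(recognizable)  # about 19.9 bits

# Fully restricted: only the all-black screen allowed. Zero entropy,
# and again low complexity.
single_entropy_bits = math.log2(1)  # = 0

print(full_entropy_bits, subset_entropy_bits, single_entropy_bits)
```

The interesting middle case is the one with entropy well above zero but well below the maximum.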

Another way to think about it is to consider a very high dimensional system, like a billion particles moving around. A complex system is one where the attractor of this six-billion-dimensional system (3 dimensions for the position and 3 for the velocity of each particle) is a lower dimensional surface or manifold. The flow of the particles is then constrained to this attractor. The important thing to understand about the system is then not the individual motions of the particles but the shape and structure of the attractor. In fact, if I gave you a list of the positions and velocities of each particle as a function of time, you would be hard pressed to discover that there even was a low dimensional attractor.

Suppose the particles lived in a box and moved according to Newton’s laws, interacting only through brief elastic collisions. This is an ideal gas, and what would happen is that the positions of the particles would become uniformly distributed throughout the box while the velocities would obey a normal distribution, called the Maxwell-Boltzmann distribution in physics. The variance of this distribution is proportional to the temperature. The pressure, volume, particle number and temperature are related by the ideal gas law, PV = NkT, with the Boltzmann constant k set by Nature. An ideal gas at equilibrium would not be considered complex because the attractor is a simple fixed point. However, it would be really difficult to discover the ideal gas law, or even the notion of temperature, if one only focused on the individual particles. The ideal gas law and all of thermodynamics were discovered empirically and only later justified microscopically through statistical mechanics and kinetic theory. Nevertheless, knowledge of thermodynamics is sufficient for most engineering applications, like designing a refrigerator.
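The connection between the microscopic velocities and the macroscopic pressure can be checked numerically. Here is a toy sketch: sample velocity components from a Maxwell-Boltzmann distribution (each component normal with variance kT/m) and recover the pressure the ideal gas law predicts. The particle mass, temperature, and volume are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # temperature, K
m = 6.6e-27        # roughly the mass of a helium atom, kg (illustrative)
N = 1_000_000      # number of sampled particles
V = 1e-3           # volume, m^3

# Maxwell-Boltzmann: each velocity component is normal with variance kT/m.
v = rng.normal(0.0, np.sqrt(k * T / m), size=(N, 3))

# Kinetic-theory pressure from momentum transfer: P = N m <v_x^2> / V.
P_kinetic = N * m * np.mean(v[:, 0] ** 2) / V

# Ideal gas law prediction: P = N k T / V.
P_ideal = N * k * T / V

print(P_kinetic / P_ideal)  # close to 1
```

The point of the exercise is the one made above: the macroscopic law falls out of a statistical property of the ensemble, not from tracking any individual particle.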
If you make the interactions longer range, you can turn the ideal gas into a liquid, and if you start to stir the liquid, you can end up with turbulence, which is a paradigm of complexity in applied mathematics. However, the main difference between an ideal gas and turbulent flow is the dimension of the attractor. In both cases, the attractor dimension is still much smaller than the full range of possibilities.

The crucial point is that focusing on the individual motions can make you miss the big picture. You will literally miss the forest for the trees. What is interesting and important about a complex system is not what the individual constituents are doing but how they are related to each other. The restriction to a lower dimensional attractor is manifested in subtle correlations across the entire system. The dynamics on the attractor can also often be represented by an “effective theory”. Here the word “effective” does not mean that it works, but rather that the underlying microscopic theory is superseded by a macroscopic one. Thermodynamics is an effective theory of the interaction of many particles. The recent trend in biology and economics has been to focus on the detailed microscopic interactions (there is pushback in economics in what has been dubbed the macro-wars). As I will relate in future posts, it is sometimes much more effective (in the works-better sense) to consider the effective (in the macroscopic sense) theory than a detailed microscopic theory. In other words, there is no “theory” *per se* of a given system but rather sets of effective theories that are to be selected based on the questions being asked.

this is somewhat redundant but one can also call this emergence. to me a big question is whether, as P W Anderson put it, ‘more is different’. one old paper i read by trainor (who collaborated with Lumsden, co-author with E O Wilson of ‘genes, mind, culture’—which also uses a stat mech / diffusion analogy) suggested that collective states in quantum theory may represent the idea that more is different—the whole is not the sum of the parts.

another question is whether ‘the more things change (or are different) the more they stay the same’. for example, starting from a micro theory (eg newtonian mechanics as represented by a hamiltonian/lagrangian/hamilton-jacobi), then approximating it via the ergodic theorem to get a kinetic / stat mech type theory, you then coarse grain (or even renormalize) it and end up with another representation as a ‘thermodynamic lagrangian’ which also has associated hamilton-jacobi etc equations similar to the original micro model. one also sees this in fractals.

you can then get metaphysical or fictional, and imagine ‘superorganisms’: if one were to move away from a human-centered view, we are just atoms in some larger being, which in turn is an atom in another set of beings, and all of them have their own opinions too. ‘it’s turtles all the way down’.


@ishi see https://sciencehouse.wordpress.com/2009/12/28/parallel-worlds/


yeah that’s the idea: goedel universes, shelah/kleene/harvey friedman (FOM list) on transfinite arithmetic / turing (phd) degrees (350.org mckibben) (feferman (stanford) and hammel may not agree).

i thought it interesting that von neumann put economics into a least action principle—samuelson did it for econ too in 1972 pnas, via kerner’s statistical mechanics of biodiversity (recently recuperated by hubbell’s neutral theory). (fokker-planck—ie second order truncated taylor series (weierstrass approx thm).)

