Emergence is the process through which emergent properties and phenomena appear. A property of a system is emergent if it is not a property of any fundamental element: composite entities can have properties that cannot be found in the parts of the composition. Emergence happens through connections and interactions. In other words, emergence happens if "more is different", if there are properties of a group which cannot be explained by the properties of the parts, entities and agents alone. Russell Standish says in his book 'Theory of Nothing': "Informally, emergence is the notion that something is more than the sum of the parts from which it is constructed."
An emergent behavior is not imposed from the outside by a central controller or organizer; it results solely from the interactions between the agents. Emergence is a characteristic feature of many complex and self-organizing systems. A JASSS review of an emergence book notes that "90% of papers on complexity and social simulation explicitly refer to emergence, i.e. emergent processes, properties, dynamics, and patterns". Emergence and self-organization seem to contradict the second law of thermodynamics, which says that organization and order cannot increase in isolated systems. They are possible because they usually happen in open systems, which extract information and order from the environment and produce waste (import of order/information and export of disorder/entropy). The process of self-organization refers to the boundary between system and environment; the process of emergence involves the microscopic-macroscopic boundary between the individual and the collective group.
Aristotle already recognized emergence: he said the whole is sometimes more than the sum of its parts (he considered the question of unity for aggregated things "which have several parts and in which the totality is not, as it were, a mere heap, but the whole is something besides the parts", Aristotle, Metaphysics, Book VIII, Chapter 6). And Hamlet said, "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy." Perhaps this is not what Shakespeare had in mind, but in complexity theory, the whole is typically more than the parts. For example, if we take a team or group and remove all the members, we are left with the common intentions and beliefs (for example the common vision or the shared goal), which are abstract and immaterial but have a profound effect. Do we get the essence of "emergence" if we take the whole minus the parts?
Types and Forms
The definitions here in the wiki are closer to Chalmers than to Bedau. In his paper "Weak Emergence", Bedau argues that strong emergence is irrelevant because downward causation is not possible in strong forms of emergence. Bedau is probably right here, but instead of rejecting the term "strong emergence" altogether, one can also try to define it in a different way: as the appearance of a new code, as the emergence of a new system, as supervenience without downward causation. In this case, both weak and strong emergence are irreducible and unpredictable, only the degree varies, which is in total agreement with the definition of other authors, for instance Assad and Packard.
Andrew Assad and Norman H. Packard define in their chapter "Emergence" (in "Emergence", Mark A. Bedau and Paul Humphreys (Eds.), MIT Press, 2007, also available in longer form as a technical report) four types of emergent patterns or levels which mark points on a scale of predictability:
- Non-emergent: Behavior is immediately deducible upon inspection of the specification or rules generating it
- Weakly emergent: Behavior is deducible in hindsight from the specification after observing the behavior
- Strongly emergent: Behavior is deducible in theory, but its elucidation is prohibitively difficult
- Maximally emergent: Behavior is impossible to deduce from the specification
Instead of distinguishing between strongly and maximally emergent, one can also introduce an additional type, "multiple emergent", which lies between weak and strong emergence. If we do this, we can roughly distinguish four types of emergence: (1) Simple/Nominal Emergence, (2) Weak Emergence, (3) Multiple Emergence, (4) Strong Emergence. They can be distinguished by their different degrees of predictability and their different types of roles.
|Type and Name|Roles|Frequency|Predictability|Levels of Abstraction|
|---|---|---|---|---|
|I Nominal or Intentional|fixed, constant roles|–|totally predictable|1|
|II Weak|flexible, context-dependent roles|–|deducible in hindsight|2|
|III Multiple|appearing and disappearing roles|–|not predictable, possibly chaotic|3|
|IV Strong|completely new roles|very rare|not predictable, even in principle|4|
The different types and classes of emergence describe different causal relationships between higher and lower levels of abstractions.
- Intentional: only one causally independent level of abstraction, i.e. causal influences are contained in a single level of abstraction
- Weak: two causally interdependent levels of abstraction, the level of the group and the level of the individual agent
- Multiple: three levels of abstraction; the third level indicates the appearance of a new system
- Strong: four levels of abstraction; two causally independent levels, each consisting of two causally interdependent levels of abstraction
While the higher level of abstraction emerges from the lower level, the lower level implements the details of the higher level. Russ Abbott has argued in "The reductionist blind spot" that the best way to understand emergence is through the lens of implementation: emergent properties can be described as a high-level abstraction which is implemented by low-level elements. In this sense, weak emergence means the implementation of a feature (by a small number of rules), while strong emergence means the implementation of a new system (by a large number of rules).
1. Simple/Nominal Emergence (Type I)
The weakest form of emergence is totally predictable and has the strongest form of constraints: each component and element has a fixed and constant role, which is not allowed to change in the course of time. A machine, for instance, has a function which is different from the function of its parts and components, but the overall function is well known and matches exactly the planned and designed function. There are no unpredicted or unexpected behavior patterns.
2. Weak Emergence (Type II)
A typical example of weak emergence is swarm intelligence, which can be seen in the figure on the left side: coherent global structures appear and become visible on a higher level of organization through the local interaction of several autonomous agents. Top-down feedback from the group in turn imposes constraints on the local interactions; a flock of birds, for example, limits the possible movements of the individual birds. An important element is context-dependence: agents adjust their behavior and their role in the group according to the current context and situation. This context-dependent flexibility makes feedback from the group or the environment to the agent possible.
3. Multiple Emergence (Type III)
Multiple emergence is a form of emergence with multiple positive and negative feedback loops. The behavior is not predictable and can be chaotic. Completely new roles can appear, while old roles disappear. Typical examples of multiple emergence are bubbles and droplets. M. Mitchell Waldrop says in his book "Complexity: The Emerging Science at the Edge of Order and Chaos" (Simon & Schuster, 1992) about the complex patterns of droplets:
"Imagine spilling a little water onto the surface of a highly polished tray, (..) it beads up into a complex pattern of droplets. And it does so because two countervailing forces are at work. There is gravity, which tries to spread out the water to make a very thin, flat film across the whole surface. That's negative feedback. And there is surface tension, the attraction of one water molecule to another, which tries to pull the liquid together into compact globules. That's positive feedback. It's the mix of the two forces that produces the complex patterns of beads. Moreover, that pattern is unique. Try the experiment again and you'll get a completely different arrangement of droplets. Tiny accidents of history - infinitesimal dust motes and invisible irregularities in the surface of the tray - get magnified by the positive feedback into major differences in the outcome." (page 36)
4. Strong Emergence (Type IV)
Bedau defines strong emergence through irreducible macro-level causal powers. In his paper "Downward causation and autonomy in weak emergence" he says: "Such [macro] causal powers cannot be explained in terms of the aggregation of the micro-level potentialities; they are primitive or "brute" natural powers that arise inexplicably with the existence of certain macro-level entities." He himself admits that this concept is scientifically irrelevant and that essentially no examples exist.
Contrary to Bedau and others, we define "strong emergence" here as the appearance of a new code, as the emergence of a new system, as supervenience without downward or upward causation. It occurs if high-level patterns form the building blocks for a new system. The strongest possible sense of emergence is related to supervenience, the weakest form of causal dependence: the stronger the emergence, the weaker the causal dependence. The strong form is not predictable, even in principle, because it describes the appearance of a new code or a completely new system in a multi-level or multi-scale system with many levels. Combinatorial explosion renders any attempt to explain emergent macroscopic, high-level phenomena in terms of microscopic low-level phenomena futile. An intermediate or mesoscopic level often shields the macroscopic level from the microscopic level, i.e. the microscopic level is irrelevant to the behavior of the macroscopic level. Therefore strong emergence can be considered as crossing the barrier of relevance.
Strong emergence is very rare, and is normally the result of a long evolutionary process or of deliberate and intentional design. Classic examples of strong emergence are the appearance of life and living systems through the emergence of the genetic code, and the appearance of culture and cultural systems through the emergence of a memetic code (i.e. linguistic codes and languages in general). In both cases, completely new evolutionary or complex adaptive systems appeared, which are subject to their own evolution.
The difference between weak and strong emergence is on the one hand the magnitude and strength of the process. Both weak and strong emergence are irreducible and unpredictable, only the degree varies. In all cases of emergence, the low-level realizes the high-level, and the high-level is composed of the low-level. On the other hand, the relationship between the microscopic and the macroscopic level is different. In the case of weak emergence the relationship between both levels is a mutual interaction: high-level and low-level patterns influence each other. In the case of strong emergence the relationship between both levels is characterized by supervenience and implementation: the low-level implements (and realizes) the high-level, and the high-level supervenes on the low-level.
Strong emergence can be distinguished from weak by the existence of a code which specifies a new system in a system. Strong emergence is the emergence of a whole new system, with new building blocks and interaction laws. The "strong" emergence of a system is identical or at least closely related to the simulation or representation of a system through another system - simulation is the attempt to represent certain features of the behavior of a system by the behavior of another system. The interface between the new and the old system is described by a new code or language.
Characteristic signs for strong emergence are the existence of
- adaptors between two independent worlds
- codes and languages which define a correspondence between two independent worlds
- a multi-level or multi-scale system with many levels
- mesoscopic levels between the microscopic level of the individual cells, elements and agents, and the macroscopic level of the total system
Shadow Emergence (special case)
Strong emergence without an explicit code can be considered as Shadow Emergence: it is an implementation of a new system in an old one without a code, an implementation of a system without abstraction, compiler and high-level language.
Emergence and Evolution
While self-organization is frequently seen as the cause of complexity in nature (since nobody "organizes" nature), emergence is sometimes mistaken for the origin of jumps in complexity. Yet neither self-organization nor emergence is responsible for overwhelming heights of complexity or sudden changes in complexity. Simple forms of emergence (Type I, Type II and partially Type III) can be considered as the result of self-organization. This form is often temporary or unstable: flocks dissolve, schools of fish dissociate, and social groups disintegrate after a while.
The real jumps in complexity are related to emergence in evolutionary systems. Evolution is still the main reason and driving force for the complexity and diversity found in nature, and neither the concept of self-organization nor the phenomenon of emergence can really replace evolution or natural selection. The evolution of "selfish genes" itself seems to be responsible for all forms of sudden jumps in complexity in history, i.e. the appearance of ever more complex species and properties in the course of time.
Evolution is also responsible for all forms of "strong emergence" (Type IV), in which whole new evolutionary systems appear (associated with the emergence of a new code which is used to create the new evolutionary system). A new code marks the boundary between type I-III emergence, due to forced extrinsic organization and intrinsic self-organization, and type IV emergence, due to evolution and the deliberate design of a new code. It also marks the limits of self-organization in general.
Sudden jumps in complexity due to evolution are often related to fitness barriers. There are at least three different ways to cope with fitness barriers in evolution (Type IIIb): (1) bypass it, (2) tunnel through it, or (3) overcome it:
- to bypass through exaptation: explore a different direction and make a sudden side-leap
- to tunnel right through the barrier by borrowing complexity
- to wait for a catastrophe, until the barrier is reduced through catastrophic events
Christopher Langton's Ant and John von Neumann's self-reproducing automaton are among the classic examples of "emergence", besides Conway's Game of Life from the field of ALife and Schelling's segregation model from the social sciences.
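To make one of these classics concrete, here is a minimal sketch of Conway's Game of Life; the set-based representation and the particular glider coordinates are choices made for this illustration, not taken from the text above:

```python
from itertools import product

def step(live):
    """One Game of Life generation on an unbounded grid of live cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    # A cell lives in the next generation with exactly 3 neighbours,
    # or with 2 neighbours if it is already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The "glider": a pattern that travels diagonally, an emergent moving
# object that appears nowhere in the birth/survival rules themselves.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
```

After four generations the glider reappears shifted one cell diagonally, although no rule mentions motion.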
Langton's Ant is sometimes considered a cellular automaton, but it is more like an agent-based simulation with just one agent: the ant that wanders around. The rules for "Langton's Ant" are remarkably simple:
- If the ant is on a white cell it turns left 90 degrees and moves one unit forward
- If the ant is on a black cell it turns right 90 degrees and moves one unit forward
- As the ant moves to the next cell, the one that it is on changes color from white to black, or the reverse.
They can be formulated even more simply:
- The ant reverses the color of any cell it visits.
- When the ant visits a white square it turns left; when it visits a black square it turns right.
When the ant is started on an empty grid, these simple rules first produce a very complex and chaotic pattern, but after about ten thousand moves the ant suddenly shows a repetitive pattern: it gets locked into a period or cycle of 104 steps and starts to build a broad diagonal "highway", each cycle displacing the ant two cells vertically and horizontally. It is a good example of stigmergy and Type II emergence: the ant changes the environment (the micro-macro direction), and the environment in turn changes the behavior of the ant (the macro-micro feedback).
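The two-rule formulation above translates almost line by line into code; this is a minimal sketch, assuming an unbounded, initially all-white grid stored as a set of black cells:

```python
def langtons_ant(steps):
    """Run Langton's Ant for the given number of steps.

    Returns the set of black cells; the grid starts all white and the
    ant starts at the origin facing "north" (0, -1), with y growing
    downward.
    """
    black = set()
    x, y = 0, 0
    dx, dy = 0, -1
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = -dy, dx       # black cell: turn right 90 degrees ...
            black.remove((x, y))   # ... and flip the cell back to white
        else:
            dx, dy = dy, -dx       # white cell: turn left 90 degrees ...
            black.add((x, y))      # ... and flip the cell to black
        x, y = x + dx, y + dy      # move one cell forward
    return black
```

Plotting `langtons_ant(11000)` shows the chaotic blob with the beginning of the highway emerging from it.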
There are many applets for Langton's Ant on the net. MathWorld says it is a "4-state 2-dimensional Turing machine".
If a single agent which follows very simple rules can produce such a complex behavior pattern, then the behavior of a whole Multi-Agent System with multiple agents and many different agent types is of course much more complicated.
The self-reproducing automaton from John von Neumann (and later from Edgar Codd) is more like a cellular automaton and was deliberately constructed as a kind of "self-reproducing machine".
The simplest form of emergence is probably the following. To build a single termite mound in an environment consisting of randomly scattered wood chips, each termite in a group has only to follow one simple rule:
While wandering randomly
- if you find a chip then pick it up
- unless you're already carrying a chip in which case drop it
These simple rules lead to an automatic aggregation of chips and "heap formation". Several small heaps will start to emerge, but then the largest heap will grow at the expense of the smaller ones until only the largest one is left.
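The termite rule can be simulated in a few lines. The following sketch makes some assumptions beyond the text: a small torus grid, at most one chip per cell, and a carried chip being dropped on a free cell next to the chip the termite bumps into; all sizes and counts are illustrative.

```python
import random

def run_termites(size=20, n_chips=60, n_termites=10, steps=2000, seed=1):
    """Termites wander randomly; each follows the single pick-up/drop rule."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(size) for y in range(size)]
    chips = set(rng.sample(cells, n_chips))       # cells holding a chip
    termites = [{"pos": rng.choice(cells), "carrying": False}
                for _ in range(n_termites)]
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        for t in termites:
            dx, dy = rng.choice(moves)            # wander randomly
            x = (t["pos"][0] + dx) % size
            y = (t["pos"][1] + dy) % size
            t["pos"] = (x, y)
            if (x, y) not in chips:
                continue
            if not t["carrying"]:
                chips.remove((x, y))              # found a chip: pick it up
                t["carrying"] = True
            else:
                # already carrying one: drop it on a free neighbouring
                # cell, so chips pile up next to each other
                free = [((x + a) % size, (y + b) % size) for a, b in moves
                        if ((x + a) % size, (y + b) % size) not in chips]
                if free:
                    chips.add(rng.choice(free))
                    t["carrying"] = False
    return chips, sum(t["carrying"] for t in termites)
```

Chips are conserved (those on the ground plus those being carried always add up to the initial number), while over long runs the chips drift into fewer, larger heaps.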
The following phenomena are examples for emergent properties:
- a "who eats whom" foodweb is an emergent property of a complex ecosystem. It emerges as a stable, repeated pattern in a complex ecosystem
- a supply chain network is an emergent property of a complex economic system
- PageRank is an algorithm developed by Google to determine a web page's "inbound link ranking". The rank of each page is an emergent property: a single Web page is hard to evaluate regarding its usefulness, correctness, and popularity, but taken together, all Web pages give useful information, which can be extracted with the PageRank algorithm.
- a painting emerges from the unique combination of colored points, strokes and lines created by the painter, similar to a book which arises from a unique combination of letters, or a piece of music (all Type I)
- the atmosphere offers a lot of interesting emergent phenomena, although it is only made of air, water droplets, and energy. On the physical level we have properties such as pressure, (wind) speed and temperature; on the meteorological level we have clouds, rainbows and other meteorological patterns and regularities. Daniel C. Dennett writes about these examples in his book "Brainchildren - Essays on Designing Minds", Penguin Press Science, 1998 (p. 228):
- "Think of meteorology and its relation to physics. Clouds go scudding by, rain falls, snow flakes pile up in drifts, rainbows emerge; this is the language of folk meteorology. Modern day folk meteorologists - that is, all of us - know perfectly well that somehow or other all those individual clouds and rainbows and snowflakes and gusts of wind are just emergent saliencies (saliencies relative to our perceptual apparatus) of vast distributions of physical energy, water droplets, and the like. [...] The meteorologist's instruments are barometers, hygrometers, and thermometers, not cloudometers, rainbometers, and snowflakometers. The regularities of which the science of meteorology is composed concern pressure, temperature, and relative humidity, not the folk-meteorological categories."
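The PageRank example above can be sketched as a simple power iteration. The four-page link graph and the function below are invented for illustration; the damping factor 0.85 is the value used in the original PageRank paper.

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power iteration: each page's rank emerges from the whole link graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start with uniform rank
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)    # rank flows along links
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

# A tiny hypothetical web; page C collects the most inbound links.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(web)
```

No page "knows" its own importance; the ranking emerges only from the global pattern of links.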
Everything arises from atoms. Genes shape life-forms. Brain chemicals shape behavior. Assemblies of neurons shape self-consciousness and thoughts. Just how exactly?
Articles and Papers
- Reductionism, emergence, and levels of abstractions
- Emergence Explained
- The reductionist blind spot
- David J. Chalmers
- Mark A. Bedau
- Shalizi's notebook entry on Emergent Properties
- John H. Holland, Emergence: From Chaos to Order (1998) Oxford University Press, ISBN 0738201421
- Steven Johnson, Emergence (2002) Scribner, ISBN 0684868768
- Stephen Wolfram, A New Kind of Science (2002), ISBN 1579550088.
- Jochen Fromm, The emergence of complexity (2004) Kassel University Press, ISBN 3899580699
- Thomas C. Schelling, Micromotives and Macrobehavior (1978) W. W. Norton and Company
- Harold J. Morowitz, The Emergence of Everything: How the World Became Complex (2002) Oxford University Press, ISBN 019513513X
- Armand Delsemme, Our Cosmic Origins: From the Big Bang to the Emergence of Life and Intelligence (1998) Cambridge University Press
- John Maynard Smith and Eörs Szathmáry, The Major Transitions in Evolution (1997) Oxford University Press, ISBN 019850294X
- Mark A. Bedau and Paul Humphreys (Eds.), Emergence: Contemporary Readings in Philosophy and Science, MIT Press, 2007, ISBN 0262524759
- Exploring Emergence: An introduction to emergence using Conway's Game of Life from the MIT Media Lab
- Stanford Encyclopedia of Philosophy entry on Emergent Properties
- Russ Abbott's Museum of unintended consequences in social systems and everyday life
- Questions and Answers about Emergence
- JASSS review of Bedau's emergence book
- Emergence and Evolution - Constraints on Form by Chris Lucas