What is Cybernetics?


Cybernetics is the interdisciplinary study of the structure of regulatory systems. It is closely related to control theory and systems theory. Both in its origins and in its evolution in the second half of the 20th century, cybernetics is equally applicable to physical and social (that is, language-based) systems.

Contemporary cybernetics began in the 1940s as an interdisciplinary study connecting the fields of control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, and psychology, and is often attributed to the Macy Conferences.

Other fields of study which have influenced or been influenced by cybernetics include game theory, systems theory (a mathematical counterpart to cybernetics), psychology (especially neuropsychology, behavioral psychology, and cognitive psychology), philosophy, and architecture.

Friday, January 30, 2009

Bionics

Bionics (also known as biomimetics, biognosis, biomimicry, or bionical creativity engineering) is the application of biological methods and systems found in nature to the study and design of engineering systems and modern technology. The word "bionic" was coined by Jack E. Steele in 1958, possibly originating from the Greek word "βίον", pronounced "bion", meaning "unit of life" and the suffix -ic, meaning "like" or "in the manner of", hence "like life". Some dictionaries, however, explain the word as being formed from "biology" + "electronics".

The transfer of technology between lifeforms and synthetic constructs is, according to proponents of bionic technology, desirable because evolutionary pressure typically forces living organisms, including fauna and flora, to become highly optimized and efficient. A classic example is the development of dirt- and water-repellent paint (coating) from the observation that practically nothing sticks to the surface of the lotus flower plant (the lotus effect).

Examples of bionics in engineering include the hulls of boats imitating the thick skin of dolphins; sonar, radar, and medical ultrasound imaging imitating the echolocation of bats.

In the field of computer science, the study of bionics has produced artificial neurons, artificial neural networks, and swarm intelligence. Evolutionary computation was also motivated by bionics ideas but it took the idea further by simulating evolution in silico and producing well-optimized solutions that had never appeared in nature.
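To make the evolutionary computation idea concrete, here is a minimal sketch in Python; the bit-string genome, the mutation rate, and the fitness function are illustrative assumptions, not taken from any particular published system:

    import random

    # Evolve a bit string toward an assumed optimum purely by random
    # mutation plus selection: the core loop of evolutionary computation.
    TARGET = [1] * 20                                  # hypothetical optimum

    def fitness(genome):
        return sum(g == t for g, t in zip(genome, TARGET))  # matching bits

    genome = [random.randint(0, 1) for _ in TARGET]
    for generation in range(1000):
        # mutate: flip each bit with small probability (rate is assumed)
        mutant = [g ^ (random.random() < 0.05) for g in genome]
        if fitness(mutant) >= fitness(genome):         # selection pressure
            genome = mutant
        if fitness(genome) == len(TARGET):
            break

The solution is "discovered" rather than designed, which is how such runs can produce well-optimized solutions that never appeared in nature.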

It is estimated by Julian Vincent, professor of biomimetics at the University of Bath in the UK, that "at present there is only a 10% overlap between biology and technology in terms of the mechanisms used".


History

The name biomimetics was coined by Otto Schmitt in the 1950s. The term bionics was coined by Jack E. Steele in 1958 while working at the Aeronautics Division House at Wright-Patterson Air Force Base in Dayton. However, the terms biomimicry and biomimetics are preferred in the technology world, in an effort to avoid confusion with the medical term bionics. Coincidentally, Martin Caidin used the word for his 1972 novel Cyborg, which inspired the series The Six Million Dollar Man. Caidin was a long-time aviation industry writer before turning to fiction full time.


Methods

Often, the study of bionics emphasizes implementing a function found in nature rather than just imitating biological structures. For example, in computer science, cybernetics tries to model the feedback and control mechanisms that are inherent in intelligent behavior, while artificial intelligence tries to model the intelligent function regardless of the particular way it can be achieved.
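As a minimal illustration of the feedback-and-control half of this contrast, here is a sketch of a proportional controller regulating a "temperature" toward a setpoint; the plant model, gain, and loss coefficient are all assumed for illustration:

    # Negative feedback: sense the error, act in proportion to it.
    setpoint = 20.0        # desired value
    temperature = 5.0      # current value of the regulated variable
    gain = 0.3             # controller gain (assumed)

    for step in range(100):
        error = setpoint - temperature           # sensed deviation
        heat = gain * error                      # control action
        loss = 0.05 * (temperature - 5.0)        # assumed loss to surroundings
        temperature += heat - loss
    # temperature settles near the setpoint (with the small steady-state
    # offset characteristic of purely proportional control)

The controller contains no model of "intelligence"; it simply closes a loop between measurement and action, which is precisely the regulatory structure cybernetics studies.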

The conscious copying of examples and mechanisms from natural organisms and ecologies is a form of applied case-based reasoning, treating nature itself as a database of solutions that already work. Proponents argue that the selective pressure placed on all natural life forms minimizes and removes failures.
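Read this way, biomimicry becomes a retrieval problem. A toy sketch, with an invented case base and a deliberately naive word-overlap similarity measure:

    # "Nature as a database": retrieve the natural precedent whose problem
    # statement best matches a design problem. Cases are illustrative.
    cases = {
        "self-cleaning surface": "lotus-leaf micro/nanostructure",
        "reversible fastening": "burr hooks (the origin of velcro)",
        "low-drag hull": "dolphin skin",
    }

    def retrieve(problem):
        words = set(problem.split())
        return max(cases, key=lambda c: len(words & set(c.split())))

    print(cases[retrieve("self-cleaning paint surface")])
    # -> lotus-leaf micro/nanostructure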

Although almost all engineering could be said to be a form of biomimicry, the modern origins of this field are usually attributed to Buckminster Fuller, and its later codification as a field of study to Janine Benyus.

Roughly, we can distinguish three biological levels in fauna or flora on which technology can be modeled:

  • Mimicking natural methods of manufacture
  • Imitating mechanisms found in nature (velcro)
  • Studying organizational principles from the social behaviour of organisms, such as the flocking behaviour of birds, the foraging behaviour of bees and ants, and swarm intelligence (SI) behaviour in schools of fish (a flocking sketch follows this list).
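A boids-style sketch of the flocking case, using two of the three classic steering rules (separation is omitted for brevity; all weights are assumed):

    import random

    # Each agent follows only local rules; group-level order emerges.
    N = 30
    pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(N)]
    vel = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(N)]

    def mean(vectors, d):
        return sum(v[d] for v in vectors) / len(vectors)

    for step in range(100):
        for i in range(N):
            for d in (0, 1):
                cohesion = (mean(pos, d) - pos[i][d]) * 0.01   # steer toward centre
                alignment = (mean(vel, d) - vel[i][d]) * 0.05  # match headings
                vel[i][d] += cohesion + alignment
                pos[i][d] += vel[i][d]
    # velocities align and positions cluster: flock-like motion with no
    # leader and no global plan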

Examples of biomimetics

  • Velcro is the most famous example of biomimetics. In 1948, the Swiss engineer George de Mestral was cleaning his dog of burrs picked up on a walk when he realized how the hooks of the burrs clung to the fur.
  • Cat's eye reflectors were invented by Percy Shaw in 1935 after studying the mechanism of cat eyes. He found that cats have a layer of reflecting cells, known as the tapetum lucidum, which is capable of reflecting the tiniest bit of light.
  • Leonardo da Vinci's flying machines and ships are early examples of drawing from nature in engineering.
  • Julian Vincent drew from the study of pinecones when he developed in 2004 "smart" clothing that adapts to changing temperatures. "I wanted a nonliving system which would respond to changes in moisture by changing shape", he said. "There are several such systems in plants, but most are very small — the pinecone is the largest and therefore the easiest to work on". Pinecones respond to higher humidity by opening their scales (to disperse their seeds). The "smart" fabric does the same thing, opening up when the wearer is warm and sweating, and shutting tight when cold.
  • "Morphing aircraft wings" that change shape according to the speed and duration of flight were designed in 2004 by biomimetic scientists from Penn State University. The morphing wings were inspired by different bird species that have differently shaped wings according to the speed at which they fly. In order to change the shape and underlying structure of the aircraft wings, the researchers needed to make the overlying skin also be able to change, which their design does by covering the wings with fish-inspired scales that could slide over each other. In some respects this is a refinement of the swing-wing design.
  • Some paints and roof tiles have been engineered to be self-cleaning by copying the mechanism from the Nelumbo lotus.
  • Nanostructures and physical mechanisms that produce the shining color of butterfly wings were reproduced in silico by Greg Parker, professor of Electronics and Computer Science at the University of Southampton and research student Luca Plattner in the field of photonics, which is electronics using photons as the information carrier instead of electrons.
  • The wing structure of the blue morpho butterfly was studied and the way it reflects light was mimicked to create an RFID tag that can be read through water and on metal.
  • Neuromorphic chips, silicon retinae, and silicon cochleae have wiring that is modelled after real neural networks.
  • Synthetic or "robotic" vegetation, which aids in conservation and restoration, consists of machines designed to mimic many of the functions of living vegetation.
  • Medical adhesives involving glue and tiny nano-hairs are being developed based on the physical structures found in the feet of geckos.

Specific uses of the term

In medicine

Bionics is a term which refers to the flow of concepts from biology to engineering and vice versa. Hence, there are two slightly different points of view regarding the meaning of the word.

In medicine, bionics means the replacement or enhancement of organs or other body parts by mechanical versions. Bionic implants differ from mere prostheses by mimicking the original function very closely, or even surpassing it.

Bionics' German equivalent, "Bionik", always adheres to the broader meaning, in that it tries to develop engineering solutions from biological models. This approach is motivated by the fact that biological solutions will usually be optimized by evolutionary forces.

While the technologies that make bionic implants possible are still at a very early stage, a few bionic items already exist, the best known being the cochlear implant, a device for deaf people. By 2004, fully functional artificial hearts had been developed. Significant further progress is expected with the advent of nanotechnologies. A well-known example of a proposed nanodevice is the respirocyte, an artificial red blood cell designed (though not yet built) by Robert Freitas.

Kwabena Boahen from Ghana was a professor in the Department of Bioengineering at the University of Pennsylvania. During his eight years at Penn, he developed a silicon retina that was able to process images in the same manner as a living retina. He confirmed the results by comparing the electrical signals from his silicon retina to the electrical signals produced by a salamander eye while the two retinas were looking at the same image.

Politics

A political form of biomimicry is bioregional democracy, wherein political borders conform to natural ecoregions rather than human cultures or the outcomes of prior conflicts.

Critics of these approaches often argue that ecological selection itself is a poor model for minimizing manufacturing complexity or conflict, and that the free market relies on conscious cooperation, agreement, and standards as much as on efficiency, making it more analogous to sexual selection. Charles Darwin himself contended that both were balanced in natural selection, although his contemporaries often avoided frank talk about sex, or any suggestion that free-market success was based on persuasion, not value.

Advocates, especially in the anti-globalization movement, argue that the mating-like processes of standardization, financing and marketing are already examples of runaway evolution, rendering a system that appeals to the consumer but is inefficient in its use of energy and raw materials. Biomimicry, they argue, is an effective strategy to restore basic efficiency.

Biomimicry is also the second principle of Natural Capitalism.

Other uses

In a more specific meaning, bionics is a creativity technique that tries to use biological prototypes to get ideas for engineering solutions. This approach is motivated by the fact that biological organisms and their organs have been well optimized by evolution. In chemistry, a biomimetic synthesis is a man-made chemical synthesis inspired by biochemical processes.

Another, more recent meaning of the term "bionics" refers to merging organism and machine. This approach results in a hybrid system combining biological and engineering parts, which can also be referred to as a cybernetic organism (cyborg). A practical realization of this was demonstrated in Kevin Warwick's implant experiments, which brought about ultrasound input via his own nervous system.

In 2006 Mercedes-Benz introduced its Bionic concept car.

Biocybernetics

Biocybernetics is the application of cybernetics to the biological sciences, comprising the biological disciplines that benefit from the application of cybernetics: neurology, multicellular systems and others. Biocybernetics plays a major role in systems biology, which seeks to integrate different levels of information to understand how biological systems function.

Biocybernetics as an abstract science is part of theoretical biology, and is based upon the principles of systemics.



Terminology

Biocybernetics is a conjoined word formed from bio (Greek: βίο / life) and cybernetics (Greek: κυβερνητική / controlling-governing). It is sometimes written as one word, with a space, or in full as biological cybernetics, and the same rules apply in each case. Most write it as one word, though, as Google statistics show. The same applies to neurocybernetics, which should also be looked up as neurological cybernetics when doing extensive research.

Same or familiar fields

These disciplines deal with theoretical and abstract foundations and, in keeping with the popularity of computers, much of the research appears under different names: e.g. molecular cybernetics -> molecular computational systems OR molecular systems theory OR molecular systemics OR molecular information/informational systems.

Please keep this in mind when engaging in an extensive search for information, to ensure access to a broad range of papers.


Categories

  • biocybernetics - the study of an entire living organism
  • neurocybernetics - cybernetics dealing with neurological models. (psycho-cybernetics was the title of a self-help book, and is not a scientific discipline)
  • molecular cybernetics - cybernetics dealing with molecular systems (e.g. molecular biology cybernetics)
  • cellular cybernetics - cybernetics dealing with cellular systems (e.g. biological cells, or cellular networks in information technology)
  • evolutionary cybernetics - study of the evolution of informational systems (See also evolutionary programming, evolutionary algorithm)
  • any distinct informational system within the realm of biology

Bioengineering

Bioengineering (also known as Biological Engineering) is the application of engineering principles to address challenges in the fields of biology and medicine. As a study, it encompasses biomedical engineering and it is related to biotechnology.

Bioengineering applies engineering principles to the full spectrum of living systems. This is achieved by utilising existing methodologies from such fields as molecular biology, biochemistry, microbiology, pharmacology, cytology, immunology and neuroscience, and applying them to the design of medical devices, diagnostic equipment, biocompatible materials, and other important medical needs.

Bioengineering is not limited to the medical field. Bioengineers have the ability to exploit new opportunities and solve problems within the domain of complex systems. They have a great understanding of living systems as complex systems which can be applied to many fields including entrepreneurship.

Much as other engineering disciplines also address human health (e.g., prosthetics in mechanical engineering), bioengineers can apply their expertise to other applications of engineering and biotechnology, including genetic modification of plants and microorganisms, bioprocess engineering, and biocatalysis. However, the main fields of bioengineering may be categorised as:

  • Biomedical Engineering: biomedical technology, biomedical diagnosis, biomedical therapy, biomechanics, biomaterials.
  • Genetic Engineering: cell engineering, tissue culture engineering.

The word was invented by British scientist and broadcaster Heinz Wolff in 1954.

"Bioengineering" is also the term used to describe the use of vegetation in civil engineering construction.

The term bioengineering may also be applied to environmental modifications such as surface soil protection, slope stabilisation, watercourse and shoreline protection, windbreaks, vegetation barriers including noise barriers and visual screens, and the ecological enhancement of an area.



Thursday, January 29, 2009

Conversation Theory

Conversation Theory is a cybernetic and dialectic framework that offers a scientific theory to explain how interactions lead to "construction of knowledge", or "knowing", wishing to preserve both the dynamic/kinetic quality and the necessity for there to be a "knower". The theory was proposed by Gordon Pask in the 1970s.


Overview

Conversation Theory regards social systems as symbolic, language-oriented systems where responses depend on one person's interpretation of another person's behavior, and where meanings are agreed through conversations. But since meanings are agreed, and the agreements can be illusory and transient, scientific research requires stable reference points in human transactions to allow for reproducible results. Pask found these points to be the understandings which arise in the conversations between two participating individuals, and which he defined rigorously.

Conversation Theory describes interaction between two or more cognitive systems, such as a teacher and a student or distinct perspectives within one individual, and how they engage in a dialog over a given concept and identify differences in how they understand it.

Conversation Theory came out of the work of Gordon Pask on instructional design and models of individual learning styles. In regard to learning styles, he identified conditions required for concept sharing and described the learning styles holist, serialist, and their optimal mixture versatile. He proposed a rigorous model of analogy relations.


Topics

Conversation Theory as developed by Pask originated from this cybernetics framework and attempts to explain learning in both living organisms and machines. The fundamental idea of the theory was that learning occurs through conversations about a subject matter which serves to make knowledge explicit.

Levels of conversation

Conversations can be conducted at a number of different levels:

  • Natural language (general discussion)
  • Object languages (for discussing the subject matter)
  • Metalanguages (for talking about learning/language)

Conversation

Through recursive interactions called "conversation", their differences may be reduced until agreement is reached, that is, agreement up to a point which Pask called "agreement over an understanding". A residue of the interaction may be captured as an "entailment mesh", an organized and publicly available collection of resultant knowledge. Entailment meshes are themselves a major product of the theory, as devotees argue they afford many advantages over semantic networks and other less formalized, non-experimentally based "representations of knowledge".

The accompanying figures illustrate how concepts are derived:

  • A concept is derived from at least two concurrently existing topics or concepts.
  • Alternative derivations may be shown with conjunctive (AND) and disjunctive (OR) pathways. This is logically equivalent to T1 = (T2 AND T3) OR (T4 AND T5).
  • Any two concepts can produce the third, shown as the cyclic form of three concepts; the arrows indicate that BOTH T1 and T2 are required to produce T3, and similarly for generating T1 or T2 from the others.
  • Lastly, a formal analogy is shown in which the derivations of the concept triples are indicated. The diamond shape denotes analogy, which can exist between any three topics because of the shared meanings and differences.
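The derivation structure above is easy to render as an executable check. A sketch, with an adjacency representation that is our assumption rather than Pask's notation:

    # Entailment-mesh derivation: a topic is derivable if it is already
    # known, or if every topic on some conjunctive pathway is derivable.
    derivations = {
        "T1": [("T2", "T3"), ("T4", "T5")],   # T1 = (T2 AND T3) OR (T4 AND T5)
    }

    def derivable(topic, known):
        if topic in known:
            return True
        return any(all(derivable(t, known) for t in path)
                   for path in derivations.get(topic, []))

    print(derivable("T1", {"T2", "T3"}))   # True: first conjunctive path
    print(derivable("T1", {"T2", "T5"}))   # False: neither path is complete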
Analogy

The relation of one topic to another by an analogy can also be seen as a restriction on a mapping and a distinction to produce the second topic or concept.

Cognitive Reflector

From Conversation Theory Pask developed what he called a "Cognitive Reflector". This is a virtual machine for selecting and executing concepts or topics from an entailment mesh shared by at least a pair of participants. It features an external modelling facility on which agreement between, say, a teacher and pupil may be shown by reproducing public descriptions of behaviour. We see this in essay and report writing or the "practicals" of science teaching.

Lp was Pask's protolanguage, which produced operators like Ap, the application operator that concurrently executes Con(T), the concept of a topic T, to produce a description D(T). Thus:

Ap(Con(T)) => D(T), where => stands for "produces".
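A toy reading of this operator in code, purely to fix ideas; modelling a concept as a procedure and a description as its product is our gloss, not Pask's formalism:

    # Con(T): the concept of topic T, modelled as a process that can
    # reproduce T. Ap: applying (executing) such a process.
    def Con(topic):
        return lambda: "description of " + topic

    def Ap(process):
        return process()          # execution produces the product

    D_T = Ap(Con("T"))            # Ap(Con(T)) => D(T)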

A succinct account of these operators is presented in Pask's later writing. Amongst many fascinating insights, he points out that three indexes are required for concurrent execution, two for parallel execution, and one to designate a serial process. He subsumes this complexity by designating participants A, B, etc.

In a commentary toward the end of that work, Pask states:

The form not the content of the theories (conversation theory and interactions of actors theory) return to and is congruent with the forms of physical theories; such as wave particle duality (the set theoretic unfoldment part of conversation theory is a radiation and its reception is the interpretation by the recipient of the descriptions so exchanged, and vice versa). The particle aspect is the recompilation by the listener of what a speaker is saying. Theories of many universes, one at least for each participant A and one to participant B- are bridged by analogy. As before this is the truth value of any interaction; the metaphor for which is culture itself.

Learning strategies

In order to facilitate learning, Pask argued that subject matter should be represented in the form of structures which show what is to be learned. These structures exist in a variety of different levels depending upon the extent of the relationships displayed. The critical method of learning according to Conversation Theory is "teachback" in which one person teaches another what they have learned.

Pask identified two different types of learning strategies:

  • Serialists – Progress through a structure in a sequential fashion
  • Holists – Look for higher-order relations

Gordon Pask

Andrew Gordon Speedie Pask (born June 28, 1928, in Derby; died March 28, 1996, in London) was an English cybernetician and psychologist who made significant contributions to cybernetics, instructional psychology, experimental epistemology and educational technology.


Biography

Pask was born in Derby, England in 1928. After qualifying precociously as a Mining Engineer at Liverpool Polytechnic, now Liverpool John Moores University, Pask obtained an MA in Natural Sciences from Cambridge in 1952 and a PhD in Psychology from the University of London in 1964. Whilst Visiting Professor of Educational Technology he obtained the first DSc from the Open University. From the sixties Pask directed commercial research at System Research Ltd in Richmond, Surrey and his partnership, Pask Associates, near Clapham Common during the eighties and nineties.

Pask held faculty positions at Brunel University, University of Illinois at Chicago, University of Illinois at Urbana-Champaign, National Autonomous University of Mexico, Concordia University, Georgia Institute of Technology, University of Oregon, and University of Amsterdam.

In 1968 Gordon Pask and his pupil Roy Ascott were elected Associate Members of the Institution of Computer Science, London. In 1974 he was elected president of the Society for General Systems Research, now the International Society for the Systems Sciences. Pask was chairman of the Cybernetics Society from 1976 to 1979. He advised the professional cybernetician to proceed in the manner of the consulting detective Sherlock Holmes.

In 1995 he was awarded a ScD from his alma mater, Downing College, Cambridge, and he was a recipient of the Wiener medal from the Cybernetics Society in London.

In 1956 Pask married Elizabeth Poole, with whom he had two daughters. He was also active in the theatre and wrote a collection of short stories, "Adventures with Professor Flaxman-Low" (narrated extract with notes), as a literary comment on his work. For many years he was Senior Tutor at the Architectural Association in London. He drew and painted, and was a member of the Chelsea Arts Club and the Athenaeum Club.


Work: overview

Pask's primary contribution to cybernetics and systems theory, as well as to numerous other fields, was his emphasis on the personal nature of reality, and on the process of learning as stemming from the consensual agreement of interacting actors in a given environment. Life and intelligence lie somewhere in the conflict between closed, unique construction and open, shared interaction; between a specific material fabric and a general conceptual/functional organization. His message, still largely unheard by the more hardcore computationalist ears in the Artificial Intelligence and Artificial Life communities, stresses that only systems striving out of this conflict can be considered to be alive and/or intelligent, and endowed with the potential for open-ended conceptual/functional variety.

Pask's most well-known work was the development of:

  • Conversation Theory: a cybernetic and dialectic framework that offers a scientific theory to explain how interactions lead to "construction of knowledge", or, as Pask preferred, "knowing" (wishing to preserve both the dynamic/kinetic quality and the necessity for there to be a "knower"). It came out of his work on instructional design and models of individual learning styles. In regard to learning styles, he identified conditions required for concept sharing and described the learning styles holist, serialist, and their optimal mixture, versatile. He proposed a rigorous model of analogy relations.
  • Interactions of Actors Theory: a generalized account of the eternal kinetic processes that support kinematic conversations bounded by beginnings and ends in all media. It is reminiscent of Freud's psychodynamics and Bateson's panpsychism (see "Mind and Nature: A Necessary Unity", 1979). Pask's nexus of analogy, dependence and mechanical spin produces the differences that are central to cybernetics.

Interactions of Actors Theory

While working with clients in the last years of his life, Gordon Pask produced an axiomatic scheme for his Interactions of Actors Theory, less well-known than his Conversation Theory. "Interactions of Actors (IA), Theory and Some Applications", as the manuscript is entitled, is essentially a concurrent spin calculus applied to the living environment with strict topological constraints. One of the most notable associates of Gordon Pask, Gerard de Zeeuw, was a key contributor to the development of Interactions of Actors theory.

The figure shows Pask's famous "repulsive carapace" force surrounding a concept. It is indicated by the minus sign and has a clockwise or anticlockwise spin; compare Spin (physics). The spin signature is determined by the residual parity of a braid, shown as the thick line enclosed by the cylinder. The plus sign labels a process seeking closure by "eating its own tail". Three of these toroidal structures can produce a Borromean link model of the minimal stable concept. Pask said the prismatic tensegrity could be used as a model for the interaction in a Borromean link.


Prismatic Tensegrity space filling unit cell of a minimal concept. The red, blue and green rods exert compressive repulsions, the black lines represent attractive tensions. The Borromean link shown is regarded as a resonance form (c.f. tautomerism) of Pask's minimal persisting concept triple.

Interactions of Actors Theory (IA) is a process theory. As a means to describe the interdisciplinary nature of his work, Pask would make analogies to physical theories in the classic positivist enterprises of the social sciences. Pask sought to apply the axiomatic properties of agreement or epistemological dependence to produce a "sharp-valued" social science with precision comparable to the results of the hard sciences. It was out of this inclination that he developed his Interactions of Actors Theory. Pask's concepts produce relations in all media. In his Complementarity Principle (see New Cybernetics (Gordon Pask)) he stated "Processes produce products and all products (finite, bounded, coherent objects) are produced by processes".

Most importantly Pask believed that no two concepts could be the same because of their different histories. He called this the "No Doppelgangers" clause or edict. Later he reflected "Time is incommensurable for Actors". He saw these properties as necessary to produce differentiation and innovation or new coherences in physical nature and, indeed, minds.

In 1995 Pask stated what he called his Last Theorem: "Like concepts repel and unlike concepts attract". For ease of application, Pask stated that the differences and similarities of descriptions (the products of processes) were context- and perspective-dependent. In the last three years of his life Pask presented models based on knot theory which described minimal persisting concepts. He interpreted these as acting as computing elements which exert repulsive forces in order to interact and persist in filling the space. The knots, links and braids of his entailment mesh models of concepts, which could include tangle-like processes seeking "tail-eating" closure, Pask called "tapestries".

His analysis proceeded with like-seeming concepts repelling or unfolding, but after a sufficient duration of interaction (he called this duration "faith") a pair of similar or like-seeming concepts will always produce a difference and thus an attraction. Amity (availability for interaction), respectability (observability), responsibility (the ability to respond to stimulus), and unity (not uniformity) were necessary properties to produce agreement (or dependence) and agreement-to-disagree (or relative independence) when Actors interact. Concepts could be applied imperatively or permissively when a Petri (see Petri net) condition for the synchronous transfer of meaningful information occurred. Extending his physical analogy, Pask associated the interactions of thought generation with radiation: "operations generating thoughts and penetrating conceptual boundaries within participants, excite the concepts bounded as oscillators, which, in ridding themselves of this surplus excitation, produce radiation".

In sum, IA supports the earlier kinematic Conversation Theory work where minimally two concurrent concepts were required to produce a non-trivial third. One distinction separated the similarity and difference of any pair in the minimum triple. However, his formal methods denied the competence of mathematics or digital serial and parallel processes to produce applicable descriptions because of their innate pathologies in locating the infinitesimals of dynamic equilibria (Stafford Beer's "Point of Calm"). He dismissed the digital computer as a kind of kinematic "magic lantern". He saw mechanical models as the future for the concurrent kinetic computers required to describe natural processes. He believed that this implied the need to extend quantum computing to emulate true field concurrency rather than the current von Neumann architecture.

Reviewing IA he said:

Interaction of actors has no specific beginning or end. It goes on forever. Since it does so it has very peculiar properties. Whereas a conversation is mapped (due to a possibility of obtaining a vague kinematic, perhaps picture-frame image, of it, onto Newtonian time, precisely because it has a beginning and end), an interaction, in general, cannot be treated in this manner. Kinematics are inadequate to deal with life: we need kinetics. Even so as in the minimal case of a strict conversation we cannot construct the truth value, metaphor or analogy of A and B. The A, B differences are generalizations about a coalescence of concepts on the part of A and B; their commonality and coherence is the similarity. The difference (reiterated) is the differentiation of A and B (their agreements to disagree, their incoherences). Truth value in this case meaning the coherence between all of the interacting actors.

He added:

It is essential to postulate vectorial times (where components of the vectors are incommensurate) and furthermore times which interact with each other in the manner of Louis Kaufmann's knots and tangles.

In experimental Epistemology Pask, the "philosopher mechanic", produced a tool kit to analyze the basis for knowledge and criticize the teaching and application of knowledge from all fields: the law, social and system sciences to mathematics, physics and biology. In establishing the vacuity of invariance Pask was challenged with the invariance of atomic number. "Ah", he said "the atomic hypothesis". He rejected this instead preferring the infinite nature of the productions of waves.

Pask held that concurrence is a necessary condition for modeling brain functions and he remarked IA was meant to stand AI, Artificial Intelligence, on its head. Pask believed it was the job of cybernetics to compare and contrast. His IA theory showed how to do this. Heinz von Foerster called him a genius, "Mr. Cybernetics", the "cybernetician's cybernetician".

Hewitt's Actor model

The Hewitt, Bishop and Steiger approach concerns sequential processing and inter-process communication in digital, serial, kinematic computers. It is a parallel or pseudo-concurrent theory as is the theory of concurrency. See Concurrency (computer science). In Pask's true field concurrent theory kinetic processes can interrupt (or, indeed, interact with) each other, simply reproducing or producing a new resultant force within a coherence (of concepts) but without buffering delays or priority.


No Doppelgangers

"There are no Doppelgangers" is a fundamental theorem, edict or clause of cybernetics due to Gordon Pask, in support of his theories of learning and interaction in all media: Conversation Theory and Interactions of Actors Theory. It accounts for physical differentiation and is Pask's exclusion principle. It states that no two products of concurrent interaction can be the same because of their different dynamic contexts and perspectives. No Doppelgangers is necessary to account for the production, by interaction and intermodulation (cf. beats), of different, evolving, persisting and coherent forms. Direct evidence is seen, for example, in spectral line broadening. Two proofs are presented, both due to Pask.

Duration Proof

Consider a pair of moving, dynamic participants A and B producing an interaction T. Their separation will vary during T. The duration of T observed from A will be different from the duration of T observed from B.

Let Ts and Tf be the start and finish times for the transfer of meaningful information.

Where <> stands for "is not equal to" we can write:

TsA <> TfB

TsB <> TfB

TsA <> TsB

TfA <> TsB

TfA <> TsA

TfA <> TfB

Thus

A <> B

Q.E.D.
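In more standard notation (our transcription, not Pask's own), the argument runs:

    \text{Let } \Delta T_A = T_f^A - T_s^A \text{ and } \Delta T_B = T_f^B - T_s^B
    \text{ be the durations of } T \text{ as observed by } A \text{ and } B.
    \text{Since the separation of } A \text{ and } B \text{ varies during } T,
    \Delta T_A \neq \Delta T_B, \text{ and hence } A \neq B.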

Pask remarked :

Conversation is defined as having a beginning and an end and time is vectorial. The components of the vector are commensurable (in duration). On the other hand actor interaction time is vectorial with components that are incommensurable. In the general case there is no well-defined beginning and interaction goes on indefinitely. As a result the time vector has incommensurable components. Both the quantity and quality differ.

No Doppelgangers applies in both the Conversation Theory's kinematic domain (bounded by beginnings and ends) where times are commensurable and in the eternal kinetic Interactions of Actors domain where times are incommensurable.

Reproduction Proof

The second proof is more reminiscent of R. D. Laing: your concept of your concept is not my concept of your concept; a reproduced concept is not the same as the original concept. Pask defined concepts as persisting, countably infinite, recursively packed spin processes (like many-cored cable, or the skins of an onion) in any medium (stars, liquids, gases, solids, machines and, of course, brains) that produce relations.

Here we prove A(T) <> B(T).

D means "description of", and A's concept of T produces A's description of T, evoking Dirac notation (required for the production of the quanta of thought: the transfer of "set-theoretic tokens", as Pask puts it in 1996).

TA = A(T), A's Concept of T,

TB = B(T), B's Concept of T,

or, in general,

TZ = Z(T),

also, in general,

AA = A(A), A's Concept of A,

AB = A(B), A's Concept of B,

and vice versa, or, in general terms,

ZZ = Z(Z).

Given that for all Z and all T the concepts

TA = A(T) is not equal to TB = B(T)

and that

AA = A(A) is not equal to BA = B(A), and vice versa, there are no Doppelgangers.

Q.E.D.
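The edict can be caricatured in a few lines of code; the (owner, topic, history) representation is an illustrative assumption, not Pask's formalism:

    from dataclasses import dataclass

    # A concept is inseparable from the history of interactions that
    # produced it, so two actors' concepts of the same topic never coincide.
    @dataclass(frozen=True)
    class Concept:
        owner: str
        topic: str
        history: tuple

    T_A = Concept("A", "T", history=("a1", "a2"))
    T_B = Concept("B", "T", history=("b1",))
    assert T_A != T_B    # same topic, different contexts: no doppelgangers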

A Mechanical Model

Pask attached a piece of string to a bar with three knots in it. Then he attached a piece of elastic to the bar, also with three knots in it. An observing actor, A, on the string would see the knotted intervals of the other actor, B, vary as the elastic was stretched and relaxed, corresponding to the relative motion of B as seen from A. The knots correspond to the beginning of the experiment and then the start and finish of the A/B interaction. Referring to the three intervals, where x, y, z are the separation distances of the knots from the bar and each other, he noted that x > y > z on the string for participant A does not imply x > z for participant B on the elastic. A change of separation between A and B producing Doppler shifts during interaction, recoil, or the differences in relativistic proper time for A and B would account for this, for example. On occasion a second knotted string was tied to the bar, representing coordinate time.

Further Context

To set this in further context, Pask won a prize from Old Dominion University for his Complementarity Principle: "All processes produce products and all products are produced by processes". This can be written:

Ap(Con Z(T)) => D Z(T), where => means "produces" and Ap means the "application of".

Pask distinguishes Imperative (written &Ap or IM) from Permissive Application (written Ap) where information is transferred in the Petri net manner, the token appearing as a hole in a torus producing a Klein bottle containing recursively packed concepts.

Pask's "hard" or "repulsive" carapace was a condition he required for the persistence of concepts. He endorsed Rescher's Coherence Theory of Truth approach where a set membership criterion of similarity also permitted differences amongst set or coherence members, but he insisted repulsive force was exerted at set and members' coherence boundaries. He said of Spencer Brown's Laws of Form that distinctions must exert repulsive forces. This is not accepted by Spencer Brown and others. Without a repulsion, or Newtonian reaction at the boundary, sets, their members or interacting participants would diffuse away forming a "smudge"; Hilbertian marks on paper would not be preserved. Pask, the mechanical philosopher, wanted to apply these ideas to bring a new kind of rigour to cybernetic models.

Second-order cybernetics

Second-order cybernetics, also known as the cybernetics of cybernetics, investigates the construction of models of cybernetic systems. It investigates cybernetics with awareness that the investigators are part of the system, and of the importance of self-referentiality, self-organizing, the subject-object problem, etc.



Overview

The anthropologists Gregory Bateson and Margaret Mead contrasted first- and second-order cybernetics in a 1973 interview, using a diagram that emphasizes the requirement for a possibly constructivist participant observer in the second-order case.

Heinz von Foerster attributes the origin of second-order cybernetics to the attempts of classical cyberneticians to construct a model of the mind. Researchers realized that:

. . . a brain is required to write a theory of a brain. From this follows that a theory of the brain, that has any aspirations for completeness, has to account for the writing of this theory. And even more fascinating, the writer of this theory has to account for her or himself. Translated into the domain of cybernetics; the cybernetician, by entering his own domain, has to account for his or her own activity. Cybernetics then becomes cybernetics of cybernetics, or second-order cybernetics.

The work of Heinz von Foerster, Humberto Maturana, Gordon Pask, Ranulph Glanville, and Paul Pangaro is strongly associated with second-order cybernetics. Pask recommended the term New Cybernetics in his last paper which emphasises all observers are participant observers that interact.

New Cybernetics


New Cybernetics is a study of self-organizing systems, looking beyond the issues of the "first", "old" or "original" cybernetics and their politics and sciences of control, to the autonomy and self-organization capabilities of complex systems. New cybernetics is otherwise known as the cybernetics of cybernetics, or second-order cybernetics.



Overview

The so-called "new cybernetics" is an attempt to move away from the cybernetics of Norbert Wiener. Old cybernetics is tied to the image of the machine and to physics, whereas new cybernetics more closely resembles organisms and biology. The main task of the new cybernetics is to overcome entropy by using "noise" as positive feedback.

In 1992 Gordon Pask summarized the differences between the old and the new cybernetics as a shift in emphasis:

  • ... from information to coupling
  • ... from the reproduction of "order-from-order" (Schroedinger 1944) to the generation of "order-from-noise" (von Foerster 1960)
  • ... from transmission of data to conversation
  • ... from external to participant observation; in short, from "CCC" to an approach that could be assimilated to Maturana and Varela's concept of autopoiesis.

Gertrudis van de Vijver stated in 1994 that the old cybernetics, the new cybernetics and the cognitive paradigms are not that revolutionarily different from each other, and, as is mostly the case, the so-called new paradigms are in a sense "older" than the "old" paradigms. The so-called "old" paradigms were in most cases strategically successful specializations within a general framework. Their success was based on a strong but useful simplification of the issues. The "new" paradigms are further specializations of the earlier ones, or (as is mostly the case) a strategic retreat which broadens the specialized approach and returns to the original, broader inspiration and outlook. This is what happens, we believe, with the new cybernetics, the post-cybernetics, as well as with the new cognitive approach...


History

In March 1946, the first of ten influential interdisciplinary Macy conferences was devoted to what was then also called the new cybernetics, and opened with two presentations: the first by von Neumann on the new computing machines, followed by the neurobiologist Lorente de Nó on the electric properties of the nervous system. This circuiting of analogies between the behaviour of computers and the nervous system became central to the cybernetic imagination and its founding desire to define the essential "unity of a set of problems" organized around "communication, control, and statistical mechanics, whether in the machine or living tissue". In particular, the early cyberneticists were convinced that research on computers and the organization of the human brain were one and the same field, that is, "the subject embracing both the engineering and the neurology aspect is essentially one."

Wiener defined cybernetics in 1948 as the study of "control and communication in the animal and the machine". This definition captures the original ambition of cybernetics to appear as a unified theory of the behaviour of living organisms and machines, viewed as systems governed by the same physical laws. The initial phase of cybernetics involved disciplines more or less directly related to the study of those systems, like communication and control engineering, biology, psychology, logic, and neurophysiology. Very soon, a number of attempts were made to place the concept of control at the focus of analysis in other fields as well, such as economics, sociology, and anthropology. The ambition of "classic" cybernetics thus seemed to involve several human sciences too, as it developed in a highly interdisciplinary approach, aimed at seeking common concepts and methods in rather different disciplines. In classic cybernetics this ambition did not produce the desired results, and new approaches had to be attempted in order to achieve them, at least partially.

In the 1970s, new cybernetics emerged in multiple fields, first in biology. Some biologists influenced by cybernetic concepts (Maturana and Varela, 1980; Varela, 1979; Atlan, 1979) realized that the cybernetic metaphors of the program upon which molecular biology had been based rendered a conception of the autonomy of the living being impossible. Consequently, these thinkers were led to invent a new cybernetics, one more suited to the organizations mankind discovers in nature, organizations it has not itself invented. The possibility that this new cybernetics could also account for social forms of organization remained an object of debate among theoreticians of self-organization in the 1980s.

In political science in the 1980s, unlike its predecessor, the new cybernetics concerns itself with the interaction of autonomous political actors and subgroups, and the practical and reflexive consciousness of the subjects who produce and reproduce the structure of a political community. A dominant consideration is that of recursiveness, or self-reference of political action, both with regard to the expression of political consciousness and with the ways in which systems build upon themselves.

Geyer and van der Zouwen in 1978 discussed a number of characteristics of the emerging "new cybernetics". One characteristic of new cybernetics is that it views information as constructed and reconstructed by an individual interacting with the environment. This provides an epistemological foundation for science, by viewing it as observer-dependent. Another characteristic of the new cybernetics is its contribution towards bridging the "micro-macro gap"; that is, it links the individual with society. Geyer and van der Zouwen also noted that a transition from classical cybernetics to the new cybernetics involves a transition from classical problems to new problems. These shifts in thinking involve, among others, a change from emphasis on the system being steered to the system doing the steering, and the factors which guide the steering decisions, and a new emphasis on communication between several systems which are trying to steer each other.


New Cybernetics: Topics

Just as quantum theory superseded classical physics, so the new cybernetics approach has superseded the classical theory of communication. According to F. Merrel (1988), in this new era, and speaking generally of the reigning conceptual framework, incompleteness, openness, inconsistency, statistical models, undecidability, indeterminacy, complementarity, polycity, interconnectedness, and fields and frames and references are the order of the day.

Geyer & J. van der Zouwen (1992) recognize four themes in both sociocybernetics and new cybernetics:

  • To give an epistemological foundation for science as an observer-observer system: feedback and feedforward loops are constructed not only between observers, but also between the objects that are observed, and between observers and objects.
  • The transition from classical, rather mechanistic, first-order cybernetics to modern, second-order cybernetics, characterized by the differences summarized by Gordon Pask.
  • These problem shifts in cybernetics involve an extremely thorough reconceptualization of many all-too-easily accepted and taken-for-granted concepts, which yield new notions of stability, temporality, independence, structure versus behaviour, and many other concepts.
  • The actor-oriented systems approach, promulgated in 1978, made it possible to bridge the "micro-macro" gap in social science thinking.

Other topics where new cybernetics is developed are:

  • Artificial neural network
  • Living systems
  • New robotic approaches
  • Reflexive understanding
  • Political communication
  • Social dimensions of cognitive science
  • Sustainable development
  • Symbolic Artificial Intelligence
  • Systemic group therapy

Types of new Cybernetics

Cybernetics of cybernetics

The term "Cybernetics of cybernetics" is also called "second order cybernetics".

Organisational cybernetics

Organizational cybernetics is distinguished from management cybernetics. Both use many of the same terms but interpret them according to different philosophies of systems thinking. Organizational cybernetics, by contrast, offers a significant break with the assumptions of the hard approach. The full flowering of organizational cybernetics is represented by Beer's Viable System Model.

Organizational Cybernetics (OC) studies organizational design, and the regulation and self-regulation of organizations from a systems theory perspective that also takes the social dimension into consideration. Researchers in economics, public administration and political science focus on the changes in institutions, organisation and mechanisms of social steering at various levels (sub-national, national, European, international) and in different sectors (including the private, semi-private and public sectors; the latter sector is emphasised).

Sociocybernetics

The reformulation of sociocybernetics as an "actor-oriented, observer-dependent, self-steering, time-variant" paradigm of human systems was most clearly articulated by Geyer and van der Zouwen in 1978 and 1986. They stated that sociocybernetics is more than just social cybernetics, which could be defined as the application of the general systems approach to social science. Social cybernetics is indeed more than such a one-way knowledge transfer: it implies a feedback loop from the area of application, the social sciences, to the theory being applied, namely cybernetics. Consequently, sociocybernetics can indeed be viewed as part of the new cybernetics: as a result of its application to social science problems, cybernetics itself has been changed, and has moved from its originally rather mechanistic point of departure to become more actor-oriented and observer-dependent. In summary, the new sociocybernetics is much more subjective and sociological in approach than the classical cybernetics approach, with its emphasis on control. The new approach has a distinct emphasis on steering decisions; furthermore, it can be seen as constituting a reconceptualization of many concepts which are often routinely accepted without challenge.

Second order cybernetics

The term "second-order cybernetics" is Heinz von Foerster's own, introduced in hindsight to denote the explicit preoccupation of the new cybernetics with the nature of self-reflexive systems.

Learning organization

The learning organization has its origins in companies like Shell, where Arie de Geus described learning as the only sustainable competitive advantage using the 1973 oil crisis as a framework. The Learning Organization is seen as a response to an increasingly unpredictable and dynamic business environment. Here are some definitions by key writers:

"The essence of organisational learning is the organization's ability to use the amazing mental capacity of all its members to create the kind of processes that will improve its own" (Nancy Dixon 1994)

"A Learning Company is an organisation that facilitates the learning of all its members and continually transforms itself" (M. Pedler, J. Burgoyne and Tom Boydell, 1991)

"Organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to learn together" (Peter Senge, 1990)

Learning organizations are those that have in place systems (such as "first-delivery teams" to accompany product shipments) and mechanisms and processes (such as strategic knowledge generation and distillation) that are used to continually enhance their strategic behaviours and capabilities, and those of the people who work with or for them, in order to achieve sustainable objectives for themselves and the communities in which they participate.

The important points to note about this definition are that learning organizations:

  • Are adaptive to their external environment
  • Continually enhance their capability to change/adapt
  • Develop collective as well as individual learning
  • Use the results of learning to achieve better results

Emergence

A termite "cathedral" mound produced by a termite colony: a classic example of emergence in nature.

In philosophy, systems theory and science, emergence is the way complex systems and patterns arise out of a multiplicity of relatively simple interactions. Emergence is central to the theories of integrative levels and of complex systems.
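Conway's Game of Life is the stock demonstration: from two local rules (birth on three neighbours, survival on two or three) patterns such as gliders arise that belong to no individual cell. A minimal sketch, with an arbitrary starting pattern:

    from collections import Counter

    # One Game of Life step over a sparse set of live cells.
    def step(live):
        counts = Counter((x + dx, y + dy) for (x, y) in live
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        return {c for c, n in counts.items()
                if n == 3 or (n == 2 and c in live)}

    cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # a "glider"
    for _ in range(4):
        cells = step(cells)
    # after four steps the glider reappears in the same shape, shifted one
    # cell diagonally: a pattern-level behaviour produced only by the
    # interactions, not by any single cell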


Definitions

The concept has been in use since at least the time of Aristotle. John Stuart Mill and Julian Huxley are just some of the historic luminaries who have written on the concept.

The term "emergent" was coined by the pioneer psychologist G. H. Lewes, who wrote:

"Every resultant is either a sum or a difference of the co-operant forces; their sum, when their directions are the same -- their difference, when their directions are contrary. Further, every resultant is clearly traceable in its components, because these are homogeneous and commensurable. It is otherwise with emergents, when, instead of adding measurable motion to measurable motion, or things of one kind to other individuals of their kind, there is a co-operation of things of unlike kinds. The emergent is unlike its components insofar as these are incommensurable, and it cannot be reduced to their sum or their difference." (Lewes 1875, p. 412)(Blitz 1992)

Professor Jeffrey Goldstein in the School of Business at Adelphi University provides a current definition of emergence in the journal Emergence (Goldstein 1999). For Goldstein, emergence can be defined as "the arising of novel and coherent structures, patterns and properties during the process of self-organization in complex systems" (Corning 2002).

Goldstein's definition can be further elaborated to describe the qualities of emergence in more detail:

"The common characteristics are: (1) radical novelty (features not previously observed in systems); (2) coherence or correlation (meaning integrated wholes that maintain themselves over some period of time); (3) A global or macro "level" (i.e. there is some property of "wholeness"); (4) it is the product of a dynamical process (it evolves); and (5) it is "ostensive" - it can be perceived. For good measure, Goldstein throws in supervenience -- downward causation." (Corning 2002)


Strong vs. weak emergence

Emergence may be generally divided into two perspectives, that of "weak emergence" and "strong emergence". Weak emergence describes new properties arising in systems as a result of the interactions at an elemental level. Emergence, in this case, is merely part of the language, or model that is needed to describe a system's behaviour.

But if, on the other hand, systems can have qualities not directly traceable to the system's components, but rather to how those components interact, and one is willing to accept that a system supervenes on its components, then it is difficult to account for an emergent property's cause. These new qualities are irreducible to the system's constituent parts (Laughlin 2005). The whole is greater than the sum of its parts. This view of emergence is called strong emergence. Some fields in which strong emergence is more widely used include etiology, epistemology and ontology.

Regarding strong emergence, Mark A. Bedau observes:

"Although strong emergence is logically possible, it is uncomfortably like magic. How does an irreducible but supervenient downward causal power arise, since by definition it cannot be due to the aggregation of the micro-level potentialities? Such causal powers would be quite unlike anything within our scientific ken. This not only indicates how they will discomfort reasonable forms of materialism. Their mysteriousness will only heighten the traditional worry that emergence entails illegitimately getting something from nothing."(Bedau 1997)

However, "the debate about whether or not the whole can be predicted from the properties of the parts misses the point. Wholes produce unique combined effects, but many of these effects may be co-determined by the context and the interactions between the whole and its environment(s)." (Corning 2002) Along that same thought, Arthur Koestler stated, "it is the synergistic effects produced by wholes that are the very cause of the evolution of complexity in nature" and used the metaphor of Janus to illustrate how the two perspectives (strong or holistic vs. weak or reductionistic) should be treated as perspectives, not exclusives, and should work together to address the issues of emergence.(Koestler 1969) Further,

"The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe..The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. At each level of complexity entirely new properties appear. Psychology is not applied biology, nor is biology applied chemistry. We can now see that the whole becomes not merely more, but very different from the sum of its parts."(Anderson 1972)


Objective or subjective quality

The properties of complexity and organization of any system are considered by Crutchfield to be subjective qualities determined by the observer.

"Defining structure and detecting the emergence of complexity in nature are inherently subjective, though essential, scientific activities. Despite the difficulties, these problems can be analysed in terms of how model-building observers infer from measurements the computational capabilities embedded in non-linear processes. An observer’s notion of what is ordered, what is random, and what is complex in its environment depends directly on its computational resources: the amount of raw measurement data, of memory, and of time available for estimation and inference. The discovery of structure in an environment depends more critically and subtly, though, on how those resources are organized. The descriptive power of the observer’s chosen (or implicit) computational model class, for example, can be an overwhelming determinant in finding regularity in data."(Crutchfield 1994)

On the other hand, Peter Corning argues "Must the synergies be perceived/observed in order to qualify as emergent effects, as some theorists claim? Most emphatically not. The synergies associated with emergence are real and measurable, even if nobody is there to observe them." (Corning 2002)


Emergence in philosophy

In philosophy, emergence is often understood to be a much stronger claim about the etiology of a system's properties. An emergent property of a system, in this context, is one that is not a property of any component of that system, but is still a feature of the system as a whole. Nicolai Hartmann, one of the first modern philosophers to write on emergence, termed this categorial novum (new category).


Emergent properties and processes

An emergent behaviour or emergent property can appear when a number of simple entities (agents) operate in an environment, forming more complex behaviours as a collective. If emergence happens over disparate size scales, then the reason is usually a causal relation across different scales. In other words, there is often a form of top-down feedback in systems with emergent properties. The processes from which emergent properties result may occur in either the observed or the observing system, and can commonly be identified by their patterns of accumulating change, most generally called 'growth'. The reasons why emergent behaviours occur include intricate causal relations across different scales and feedback, known as interconnectivity. The emergent property itself may be either very predictable or unpredictable and unprecedented, and may represent a new level of the system's evolution. The complex behaviours or properties are not a property of any single such entity, nor can they easily be predicted or deduced from the behaviour of the lower-level entities: they are irreducible. No physical property of an individual molecule of air would lead one to think that a large collection of them will transmit sound. The shape and behaviour of a flock of birds or a shoal of fish are also good examples.

One reason why emergent behaviour is hard to predict is that the number of interactions between the components of a system increases combinatorially with the number of components, thus potentially allowing for many new and subtle types of behaviour to emerge. For example, the number of possible interactions between groups of molecules grows enormously with the number of molecules, such that it is infeasible for a computer to even enumerate the possible arrangements for a system as small as 20 molecules.
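
To make the combinatorics concrete, here is a minimal sketch (illustrative only; the specific quantities counted here are our own choice, not the source's):

    import math

    # How the space of possible interactions explodes with component count:
    # pairwise contacts grow quadratically, interacting subsets grow
    # exponentially, and orderings (arrangements) grow factorially.
    for n in (5, 10, 20):
        pairs = n * (n - 1) // 2
        subsets = 2 ** n
        arrangements = math.factorial(n)
        print(f"n={n}: pairs={pairs}, subsets={subsets}, "
              f"arrangements={arrangements}")

For n = 20 this already gives roughly 2.4 × 10^18 arrangements, which is why brute-force enumeration fails long before systems reach realistic sizes.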

On the other hand, merely having a large number of interactions is not enough by itself to guarantee emergent behaviour; many of the interactions may be negligible or irrelevant, or may cancel each other out. In some cases, a large number of interactions can in fact work against the emergence of interesting behaviour, by creating a lot of "noise" to drown out any emerging "signal"; the emergent behaviour may need to be temporarily isolated from other interactions before it reaches enough critical mass to be self-supporting. Thus it is not just the sheer number of connections between components which encourages emergence; it is also how these connections are organised. A hierarchical organisation is one example that can generate emergent behaviour (a bureaucracy may behave in a way quite different from that of the individual humans in that bureaucracy); but perhaps more interestingly, emergent behaviour can also arise from more decentralized organisational structures, such as a marketplace. In some cases, the system has to reach a combined threshold of diversity, organisation, and connectivity before emergent behaviour appears.

Unintended consequences and side effects are closely related to emergent properties. Luc Steels writes: "A component has a particular functionality but this is not recognizable as a subfunction of the global functionality. Instead a component implements a behaviour whose side effect contributes to the global functionality [...] Each behaviour has a side effect and the sum of the side effects gives the desired functionality" (Steels 1990). In other words, the global or macroscopic functionality of a system with "emergent functionality" is the sum of all "side effects", of all emergent properties and functionalities.

Systems with emergent properties or emergent structures may appear to defy entropic principles and the second law of thermodynamics, because they form and increase order despite the lack of command and central control. This is possible because open systems can extract information and order out of the environment.

Emergence helps to explain why the fallacy of division is a fallacy. According to an emergent perspective, intelligence emerges from the connections between neurons, and from this perspective it is not necessary to propose a "soul" to account for the fact that brains can be intelligent, even though the individual neurons of which they are made are not.


Emergent structures in nature

Emergent structures are patterns not created by a single event or rule. Nothing commands the system to form a pattern. Instead, the interaction of each part with its immediate surroundings causes a complex chain of processes leading to some order. One might conclude that emergent structures are more than the sum of their parts because the emergent order will not arise if the various parts are simply coexisting; the interaction of these parts is central. Emergent structures can be found in many natural phenomena, from the physical to the biological domain. For example, the shapes of weather phenomena such as hurricanes are emergent structures.

It is useful to distinguish three forms of emergent structures. A first-order emergent structure occurs as a result of shape interactions (for example, hydrogen bonds in water molecules lead to surface tension). A second-order emergent structure involves shape interactions played out sequentially over time (for example, changing atmospheric conditions as a snowflake falls to the ground build upon and alter its form). Finally, a third-order emergent structure is a consequence of shape, time, and heritable instructions. For example, an organism's genetic code sets boundary conditions on the interaction of biological systems in space and time.

Non-living, physical systems

In physics, emergence is used to describe a property, law, or phenomenon which occurs at macroscopic scales (in space or time) but not at microscopic scales, despite the fact that a macroscopic system can be viewed as a very large ensemble of microscopic systems.

An emergent property need not be more complicated than the underlying non-emergent properties which generate it. For instance, the laws of thermodynamics are remarkably simple, even if the laws which govern the interactions between component particles are complex. The term emergence in physics is thus used not to signify complexity, but rather to distinguish which laws and concepts apply to macroscopic scales, and which ones apply to microscopic scales.

Some examples include:

  • Classical mechanics: The laws of classical mechanics can be said to emerge as a limiting case from the rules of quantum mechanics applied to large enough masses. This may be puzzling, because quantum mechanics is generally thought of as more complicated than classical mechanics.
  • Colour: Elementary particles do not absorb or emit specific wavelengths of light and thus have no colour; it is only when they are arranged in atoms that they absorb or emit specific wavelengths of light and can thus be said to have a colour.
  • Friction: Forces between elementary particles are conservative. However, friction emerges when considering more complex structures of matter, whose surfaces can convert mechanical energy into heat energy when rubbed against each other. Similar considerations apply to other emergent concepts in continuum mechanics such as viscosity, elasticity, tensile strength, etc.
  • Patterned ground: the distinct, and often symmetrical geometric shapes formed by ground material in periglacial regions.
  • Statistical mechanics was initially derived using the concept of a large enough ensemble that fluctuations about the most likely distribution can be all but ignored. However, small clusters do not exhibit sharp first order phase transitions such as melting, and at the boundary it is not possible to completely categorize the cluster as a liquid or solid, since these concepts are (without extra definitions) only applicable to macroscopic systems. Describing a system using statistical mechanics methods is much simpler than using a low-level atomistic approach.
  • Weather

Temperature is sometimes used as an example of an emergent macroscopic behaviour. In classical dynamics, a snapshot of the instantaneous momenta of a large number of particles at equilibrium is sufficient to find the average kinetic energy per degree of freedom which is proportional to the temperature. For a small number of particles the instantaneous momenta at a given time are not statistically sufficient to determine the temperature of the system. However, using the ergodic hypothesis, the temperature can still be obtained to arbitrary precision by further averaging the momenta over a long enough time.
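
As a rough sketch of the averaging argument (a toy one-dimensional model with arbitrary parameters, offered only as an illustration), temperature can be estimated from sampled momenta, and the estimate stabilizes as the number of particles grows:

    import random

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    m = 6.6e-27          # particle mass in kg (roughly a helium atom)
    sigma = 2.0e-24      # std. dev. of momentum; arbitrary for the demo

    # In 1D, the mean kinetic energy per particle satisfies
    # <p^2 / 2m> = (1/2) k_B T, so T = 2 <KE> / k_B.
    def temperature(momenta):
        mean_ke = sum(p * p / (2 * m) for p in momenta) / len(momenta)
        return 2 * mean_ke / k_B

    for N in (10, 100, 100_000):
        momenta = [random.gauss(0, sigma) for _ in range(N)]
        print(N, temperature(momenta))  # noisy for small N, stable for large N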

Convection in a fluid or gas is another example of emergent macroscopic behaviour that makes sense only when considering differentials of temperature. Convection cells, particularly Bénard cells, are an example of a self-organizing system (more specifically, a dissipative system) whose structure is determined both by the constraints of the system and by random perturbations: the possible realizations of the shape and size of the cells depends on the temperature gradient as well as the nature of the fluid and shape of the container, but which configurations are actually realized is due to random perturbations (thus these systems exhibit a form of symmetry breaking).

In some theories of particle physics, even such basic structures as mass, space, and time are viewed as emergent phenomena, arising from more fundamental concepts such as the Higgs boson or strings. In some interpretations of quantum mechanics, the perception of a deterministic reality, in which all objects have a definite position, momentum, and so forth, is actually an emergent phenomenon, with the true state of matter being described instead by a wavefunction which need not have a single position or momentum. Most of the laws of physics as we experience them today appear themselves to have emerged over the course of time, which makes emergence arguably the most fundamental principle in the universe, and raises the question of what the most fundamental law of physics might be, from which all others emerged. Chemistry can in turn be viewed as an emergent property of the laws of physics. Biology (including biological evolution) can be viewed as an emergent property of the laws of chemistry. Finally, psychology could at least theoretically be understood as an emergent property of neurobiological laws.

Living, biological systems

Life is a major source of complexity, and evolution is the major principle or driving force behind life. In this view, evolution is the main reason for the growth of complexity in the natural world. When we speak of the emergence of complex living beings and life-forms, we are therefore referring to processes of sudden change in evolution.

Flocking is a well-known behaviour in many animal species from swarming locusts to schools of fish to flocks of birds. Emergent structures are a common strategy found in many animal groups: colonies of ants, mounds built by termites, swarms of bees, shoals/schools of fish, flocks of birds, and herds/packs of mammals.

An example to consider in detail is an ant colony. The queen does not give direct orders and does not tell the ants what to do. Instead, each ant reacts to stimuli in the form of chemical scent from larvae, other ants, intruders, food, and the build-up of waste, and leaves behind a chemical trail which, in turn, provides a stimulus to other ants. Here each ant is an autonomous unit that reacts depending only on its local environment and the genetically encoded rules for its variety of ant. Despite the lack of centralized decision making, ant colonies exhibit complex behaviour and have even demonstrated the ability to solve geometric problems. For example, colonies routinely find the maximum distance from all colony entrances at which to dispose of dead bodies.
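
The mechanism can be caricatured in a few lines of code. The following sketch (a toy two-path model of our own, not drawn from the source) shows how purely local pheromone rules let a colony "decide" on one path with no central coordination:

    import random

    # Two equally good paths from nest to food.  Each ant picks a path
    # with probability proportional to its pheromone level, then deposits
    # pheromone on the path it used; pheromone slowly evaporates.
    pheromone = [1.0, 1.0]     # initially no preference
    EVAPORATION = 0.99

    for ant in range(1000):
        total = pheromone[0] + pheromone[1]
        path = 0 if random.random() < pheromone[0] / total else 1
        pheromone[path] += 1.0                    # local reinforcement
        pheromone = [p * EVAPORATION for p in pheromone]

    print(pheromone)   # one path typically ends up strongly dominant

Although no individual ant "knows" about the other path, the positive feedback between choice and trail strength breaks the initial symmetry; this is the same stigmergic logic the colony's real behaviour exploits.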

A broader example of emergent properties in biology is the combination of individual atoms to form molecules such as polypeptide chains, which in turn fold and refold to form proteins. These proteins, assuming their functional status from their spatial conformation, interact together to achieve higher biological functions and eventually create organelles, cells, tissues, organs, organ systems, and organisms. Cascading phenotype reactions, as described in chaos theory, may arise from mutations in individual genes altering their relative positioning. In turn, all the biological communities in the world form the biosphere, in which human participants form societies and, through their complex interactions, meta-social systems such as the stock market.


Emergence in culture and engineering

Emergent processes or behaviours can be seen in many places, such as traffic patterns, cities, political systems of governance, cabal and market-dominant-minority phenomena in politics and economics, and organizational phenomena in computer simulations and cellular automata.

Economics

The stock market is an example of emergence on a grand scale. As a whole it precisely regulates the relative security prices of companies across the world, yet it has no leader; there is no one entity which controls the workings of the entire market. Agents, or investors, have knowledge of only a limited number of companies within their portfolio, and must follow the regulatory rules of the market and analyse the transactions individually or in large groupings. Trends and patterns emerge which are studied intensively by technical analysts.

World Wide Web & Internet

The World Wide Web (WWW) is a popular example of a decentralized system exhibiting emergent properties. There is no central organization rationing the number of links, yet the number of links pointing to each page follows a power law in which a few pages are linked to many times and most pages are seldom linked to. A related property of the network of links in the World Wide Web is that almost any pair of pages can be connected to each other through a relatively short chain of links. Although relatively well known now, this property was initially unexpected in an unregulated network. It is shared with many other types of networks called small-world networks.(Barabasi, Jeong, & Albert 1999, pp. 130-131)
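
One standard mechanism proposed for such power laws is preferential attachment, in which new pages tend to link to already well-linked pages. The following sketch (a minimal Barabási–Albert-style model, offered as an illustration of the mechanism rather than a claim about how the Web actually grew) reproduces the skewed link distribution with no central coordination:

    import random
    from collections import Counter

    # Growth by preferential attachment: each new page links to one
    # existing page, chosen in proportion to its current number of links.
    # The `targets` list holds one entry per link endpoint, so sampling
    # uniformly from it samples pages in proportion to their link counts.
    targets = [0, 1]
    in_links = Counter({0: 1, 1: 1})

    for new_page in range(2, 100_000):
        chosen = random.choice(targets)
        in_links[chosen] += 1
        targets.extend([chosen, new_page])

    counts = Counter(in_links.values())
    for k in sorted(counts)[:10]:
        print(f"{counts[k]} pages have {k} incoming links")
    # Most pages have one or two links; a handful accumulate thousands.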

Internet traffic also exhibits several emergent properties. The burstiness of Internet traffic has been widely noted as a fractal or self-similar distribution (Leland et al. 1994, pp. 1-15). This self-similar traffic has been found to have a similar Hurst exponent of approximately 0.8 (fractal dimension 1.2) in measurements of various Internet links around the world, regardless of user profiles or traffic types (web, email, P2P, etc.). In addition, due to the congestion control mechanism, TCP flows can become globally synchronized at bottlenecks, simultaneously increasing and then decreasing throughput in coordination. Congestion, widely regarded as a nuisance, is an emergent property of the spreading of bottlenecks across a network in high traffic flows, which can be considered as a phase transition (see review of related research in (Smith 2008, pp. 1-31)).
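
As an illustration of how such self-similarity is measured, the sketch below estimates a Hurst exponent with the aggregated-variance method (a standard estimator; the synthetic input here is uncorrelated noise, so the estimate comes out near 0.5 rather than the 0.8 reported for real traffic):

    import math
    import random

    def hurst_aggregated_variance(series, scales=(1, 2, 4, 8, 16, 32)):
        # For self-similar traffic, Var(X^(m)) ~ m^(2H - 2), where X^(m)
        # is the series averaged over non-overlapping blocks of size m.
        points = []
        for m in scales:
            blocks = [sum(series[i:i + m]) / m
                      for i in range(0, len(series) - m + 1, m)]
            mean = sum(blocks) / len(blocks)
            var = sum((b - mean) ** 2 for b in blocks) / len(blocks)
            points.append((math.log(m), math.log(var)))
        # Least-squares slope of log-variance vs. log-m equals 2H - 2.
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        return 1 + slope / 2

    trace = [random.random() for _ in range(10_000)]  # uncorrelated "traffic"
    print(hurst_aggregated_variance(trace))           # close to 0.5, not 0.8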

Architecture and cities

Bangkok can be seen as an example of spontaneous order.

Emergent structures appear at many different levels of organization or as spontaneous order. Emergent self-organization appears frequently in cities where no planning or zoning entity predetermines the layout of the city. (Krugman 1996, pp. 9-29) The interdisciplinary study of emergent behaviors is not generally considered a homogeneous field, but divided across its application or problem domains.

Often architects and landscapers will not design all the pathways of a complex of buildings. Instead they will let usage patterns emerge and then place pavement where pathways have become worn in.

The on-course action and vehicle progression of the 2007 DARPA Urban Challenge could possibly be regarded as an example of cybernetic emergence. Patterns of road use, indeterministic obstacle clearance times, and similar factors work together to form a complex emergent pattern that cannot be deterministically planned in advance.

Architecture firms that work specifically with the concept of Emergence as it relates to the built environment include Emergent Architecture, founded by Tom Wiscombe in 1999.

Mathematics

A Möbius strip: a mathematical demonstration of emergence.

Although the above examples of emergence are often contentious, mathematics provides a rigorous basis for defining and demonstrating emergence. In "Emergence is coupled to scope, not level", Alex Ryan shows that a Möbius strip has emergent properties (Ryan 2006). The Möbius strip is a one-sided, one-edged surface. Further, a Möbius strip can be constructed from a set of two-sided, three-edged triangular surfaces. Only the complete set of triangles is one-sided and one-edged: any proper subset does not share these properties. Therefore, the emergent property can be said to emerge precisely when the final piece of the Möbius strip is put in place. An emergent property is a spatially or temporally extended feature: it is coupled to a definite scope, and cannot be found in any component because the components are associated with a narrower scope.

Pithily, emergent properties are those that are global, topological properties of the whole.
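
This claim can actually be checked by computation. The sketch below (our own illustration, using the standard minimal five-triangle triangulation of the Möbius strip rather than anything taken from Ryan's paper) tests whether a set of triangles admits a consistent orientation; the full strip does not (it is one-sided), while every proper subset does:

    from itertools import combinations

    # Minimal triangulation of the Mobius strip: vertices 0..4 and the
    # five triangles (i, i+1, i+2) mod 5.  Each triangle alone is an
    # ordinary two-sided, three-edged surface.
    TRIANGLES = [(i, (i + 1) % 5, (i + 2) % 5) for i in range(5)]

    def directed_edges(tri, flipped):
        # Directed boundary edges of a triangle in its chosen orientation.
        a, b, c = (tri[0], tri[2], tri[1]) if flipped else tri
        return {(a, b), (b, c), (c, a)}

    def orientable(tris):
        # Try to orient every triangle so that each shared edge is traversed
        # in opposite directions by its two triangles.  A surface admitting
        # such an assignment is two-sided; a Mobius strip is not.
        flip = {}
        for start in range(len(tris)):
            if start in flip:
                continue
            flip[start] = False
            stack = [start]
            while stack:
                i = stack.pop()
                for j in range(len(tris)):
                    shared = set(tris[i]) & set(tris[j])
                    if i == j or len(shared) != 2:
                        continue
                    u, v = shared
                    same = ((u, v) in directed_edges(tris[i], flip[i])) == \
                           ((u, v) in directed_edges(tris[j], False))
                    if j not in flip:
                        flip[j] = same       # same direction => j must flip
                        stack.append(j)
                    elif flip[j] != same:
                        return False         # no consistent orientation
        return True

    print(orientable(TRIANGLES))             # False: the whole is one-sided
    print(all(orientable(list(sub))          # every proper subset is two-sided
              for r in range(1, len(TRIANGLES))
              for sub in combinations(TRIANGLES, r)))

The one-sidedness is detected only when the last triangle closes the loop, matching Ryan's point that the property belongs to the full scope, not to any component.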

Computer AI

Some artificially intelligent computer applications exhibit emergent behaviour. One example is Boids, which simulate the flocking of birds.
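
A minimal Boids-style sketch is shown below (a simplification of Reynolds' original model: for brevity it measures cohesion and alignment against the whole flock rather than against local neighbours, and all weights are arbitrary):

    import random

    # Each bird follows three local rules: steer toward the flock's centre
    # (cohesion), match the average heading (alignment), and move away
    # from birds that are too close (separation).  No bird leads.
    N, STEPS = 30, 100
    pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(N)]
    vel = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(N)]

    def step():
        for i in range(N):
            cx = sum(p[0] for p in pos) / N - pos[i][0]   # cohesion
            cy = sum(p[1] for p in pos) / N - pos[i][1]
            ax = sum(v[0] for v in vel) / N - vel[i][0]   # alignment
            ay = sum(v[1] for v in vel) / N - vel[i][1]
            sx = sy = 0.0                                 # separation
            for j in range(N):
                dx = pos[i][0] - pos[j][0]
                dy = pos[i][1] - pos[j][1]
                if i != j and dx * dx + dy * dy < 25:
                    sx += dx
                    sy += dy
            vel[i][0] += 0.01 * cx + 0.05 * ax + 0.05 * sx
            vel[i][1] += 0.01 * cy + 0.05 * ay + 0.05 * sy

    for _ in range(STEPS):
        step()
        for i in range(N):
            pos[i][0] += vel[i][0]
            pos[i][1] += vel[i][1]

    print(pos[:3])  # after some steps the birds move as one coherent group

The flock's shape and motion are properties of the collective: nothing in any individual rule mentions a flock at all.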

Language

It has been argued that language, or at least language change, is an emergent phenomenon. While each speaker merely tries to reach his or her own communicative goals, each uses language in a particular way. If enough speakers behave in that way, the language is changed (Keller 1994).

Fads and beliefs

An emergent concept (EC) is a slight variation on consensus reality that is accepted as plausible. The hallmark of an emergent concept, as opposed to some categories of Internet memes/phenomena, urban myths, or the like, is that ECs are increasingly accepted as true or plausible, based upon empirical or anecdotal evidence, in the mind of the believer or of society (in its subsets) as a whole.


Emergence in political philosophy

Economist and philosopher Friedrich Hayek wrote about emergence in the context of law, politics, and markets. His theories are most fully developed in Law, Legislation and Liberty, which sets out the difference between cosmos or "grown order" (that is, emergence) and taxis or "made order". Hayek dismisses philosophies that do not adequately recognize the emergent nature of society, and which describe it as the conscious creation of a rational agent (be it God, the Sovereign, or any kind of personified body politic, such as Hobbes's Leviathan). The most important social structures, including the laws ("nomos") governing the relations between individual persons, are emergent, according to Hayek. While the idea of laws and markets as emergent phenomena comes fairly naturally to an economist, and was indeed present in the works of early economists such as Bernard Mandeville, David Hume, and Adam Smith, Hayek traces the development of ideas based on spontaneous order throughout the history of Western thought, occasionally going as far back as the pre-Socratics. In this, he follows Karl Popper, who blamed the idea of the state as a made order on Plato in The Open Society and Its Enemies.

Emergence in organisational theory

In organisational theory, emergence refers to the complex process whereby the right person or idea emerges at exactly the right moment: just when a problem or necessity arises, the potential solutions also emerge.