Concepts of Systems Thinking
- 1 What is a Concept
- 2 System and Structure
- 3 State
- 4 Behavior and Function
- 5 Entropy & Homeostasis
- 6 Regulation and Control
- 7 Effectiveness, Adaptation and Learning
- 8 Completeness
- 9 Hierarchy and Emergence
- 10 References
- 11 Signatures
What is a Concept
General System Theory (von Bertalanffy, 1968) considers the similarities between systems from different domains as a set of common systems concepts. GST allows us to make comparisons between systems based on different technologies; to judge the goodness or completeness of a system; and to develop domain-independent systems approaches which can form the basis of disciplines such as Systems Engineering.
GST tends to concentrate on the principles and philosophy behind this idea. “Despite the importance of system concepts … we do not yet have a unified or integrated set (i.e. a system) of such concepts” (Ackoff, 1971). While many researchers and practitioners have created generic concepts, these tend to be a stepping stone to theories and approaches. This situation is made worse by the variety of domains and disciplines in which systems research is conducted and reported. Ackoff proposes a System of System Concepts which brings the wide variety of concepts that have been proposed together into 30 distinct concepts, grouped under the headings of Systems, System Change, Classifications of Behavior, and Adaptation and Learning. It is written from a systems research perspective and can be a little abstract and hard to relate to practice. (Skyttner, 2001) describes the main GST concepts proposed by a number of authors; (Flood and Carson, 1993) give a description of concepts as an overview of systems thinking; and (Hitchins, 2007) relates the concepts to Systems Engineering practice.
System and Structure
The definition of System (Glossary) includes the fundamental concepts of a set of Elements which exhibit sufficient Cohesion (Hitchins, 2007) or Togetherness (Boardman and Sauser, 2008) to form a Bounded whole.
A system exists in an Environment which contains related systems and conditions:
- Closed System, has no relationships with the environment.
- Open System, shares Inputs and Outputs with its environment across the boundary.
System elements may be conceptual organisations of ideas in symbolic form, or real objects, e.g. people, data, physical artifacts, etc.
- Abstract system, all elements are conceptual.
- Concrete system, contains at least two elements which are objects.
Unless otherwise stated, the remaining concepts below apply to open, concrete systems.
Any quality or property of a system element is called an Attribute. The State of a system is the set of its attributes at a given time. A System Event describes any change to the attributes of a system (or its environment), and hence to its state:
- Static, a single state exists with no events.
- Dynamic, multiple possible Stable states exist. A stable state is one in which a system will remain until another event occurs.
- Homeostatic, system is static but its elements are dynamic. The system maintains its state by internal adjustments.
State can be monitored using State Variables, attributes which indicate system state. The set of possible combinations of state over time is called the State Space. State is generally Continuous, but can be modelled using a Finite State Model (or State Machine).
- Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
- Non-Deterministic systems have a many-to-many mapping of state variables, so future states cannot be predicted. This may be due to random changes in state, or because the system's structure is sufficiently complex that, while it is deterministic, it may take up different states due to very small (below our ability to measure) differences in its starting state.
The latter is one definition of a Chaotic system. Chaotic systems, e.g. the stock market or the weather, are examples whose past states can be explained using deterministic reasoning, but whose future states cannot be predicted with any certainty.
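The idea of approximating continuous state with a Finite State Model can be sketched in code. The states and events below (a hypothetical heating controller) are illustrative assumptions, not drawn from the text; the point is the one-to-one mapping from (state, event) pairs to next states, which makes the model deterministic.

```python
# A minimal finite state model for a hypothetical heating controller.
# Each (state, event) pair maps to exactly one next state, so future
# states are fully predictable from past states and the event sequence.
TRANSITIONS = {
    ("off", "switch_on"): "idle",
    ("idle", "temp_low"): "heating",
    ("heating", "temp_ok"): "idle",
    ("idle", "switch_off"): "off",
    ("heating", "switch_off"): "off",
}

def step(state, event):
    """Return the next state; remain in the current state if the
    event is not recognised from this state."""
    return TRANSITIONS.get((state, event), state)

state = "off"
for event in ["switch_on", "temp_low", "temp_ok", "switch_off"]:
    state = step(state, event)
print(state)  # the system has returned to "off"
```

A non-deterministic model would instead map some (state, event) pairs to a set of possible next states, from which prediction is no longer possible.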
Behavior and Function
(Ackoff, 1971) considers Change to be how a system is affected by events, and Behavior as the effect a system has upon its environment.
Three kinds of System Change are described: we React to a request by turning on a light; we Respond to darkness by deciding to turn on the light; or we Act to turn on the lights at a fixed time, randomly, or with discernible reasoning.
System Behavior is a change which leads to events in itself or other systems. Thus, an action, reaction or response may constitute behavior in some cases. Systems have varying levels of Goal Seeking behavior:
- A system's goal may be simply to exist, behaving so as to sustain itself in its current state or in one or more alternative viable states. Many natural or social systems have this goal.
- Purposive systems have multiple goals, with some shared outcome. Such a system can be asked/used to provide pre-determined outcomes, within an agreed time period. Such a system may have some freedom to choose how to achieve the goal. If it has memory it may develop Processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
- Purposeful systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue an Objective over a longer time through a series of goals. Humans, and sufficiently complex machines, are purposeful.
Ackoff defines Function as outcomes which contribute to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways (this is called Equifinality).
This view of function and behavior is common in system science. In this paradigm all system elements have behavior of some kind, but to be capable of functioning in certain ways requires a certain richness of behaviors.
In most hard systems approaches (Flood and Carson, 1993), a set of functions is described from the problem statement and then associated with one or more alternative element structures. This process may be repeated until system Components (implementable combinations of function and structure) have been defined (Martin, 1997). Here Function is defined as a task or activity that must be performed to achieve a desired outcome, or as a Transformation of Inputs to Outputs:
- Synchronous, a regular interaction with a closely related system.
- Asynchronous, an irregular response to a demand from another system, often triggering a set response.
The behavior of the resulting system is then assessed. In this case behavior is seen as an external property of the system as a whole, and is often described as analogous to human or organic behavior (Hitchins, 2007).
Entropy & Homeostasis
Entropy is the tendency of systems to move towards disorder or disorganisation. In physics, entropy is used to describe how “organised” heat energy is “lost” into the “random” background energy of the surrounding environment, as described by the 2nd Law of Thermodynamics.
A similar effect can be seen in engineered systems. What happens to a building or garden which is left unused for any length of time? Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.
Negentropy describes the forces working in a system to hold off entropy. Homeostasis is the biological equivalent of this, describing behavior which maintains a Steady State or Dynamic Equilibrium. Examples of this process in nature include human cells, which maintain the same function while replacing their physical content at regular intervals.
Again, this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc. Homeostasis may be designed into a system, or it can be Self-Organising, arising from the interaction between elements.
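The interplay of entropy and negentropy can be illustrated with a toy simulation; the "condition" attribute and all numeric values below are assumptions made purely for illustration.

```python
import random

# Toy sketch: a single "condition" attribute of a system, e.g. the
# state of repair of a building. Random disturbances erode it
# (entropy); an optional corrective effort pulls it back towards the
# viable level of 100 each step (negentropy / homeostasis).
def run(steps, maintain, seed=0):
    rng = random.Random(seed)              # fixed seed: repeatable run
    condition = 100.0                      # starting (viable) condition
    for _ in range(steps):
        condition -= rng.uniform(0.0, 2.0)          # disorder accumulates
        if maintain:
            condition += 0.5 * (100.0 - condition)  # corrective effort
    return condition

neglected = run(1000, maintain=False)   # drifts far below viability
maintained = run(1000, maintain=True)   # held close to 100
```

With maintenance the condition stays within a couple of units of its viable level no matter how long the run; without it, disorder accumulates without bound.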
Regulation and Control
Cybernetics, the science of control, defines two basic control mechanisms:
- Negative feedback, maintaining system state against set objectives or levels.
- Positive feedback, forced growth or contraction to new levels.
One of the main concerns of cybernetics is the balance between stability and speed of response. Cybernetics considers systems in three ways:
- A Black-Box view looks at the whole system; control can only be achieved by carefully balancing inputs with outputs, which militates against responding quickly.
- A White-Box view considers the system elements and their relationships; here control mechanisms can be embedded into this structure, giving more responsive control with associated risks to stability.
- A Grey-Box view sits between these two, with control exerted at the major sub-system level.
Another useful control concept is that of a Meta-System, which sits over the system and is responsible for controlling its functions, either as a black-box or a white-box. In this case behavior arises from the combination of system and meta-system.
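The two feedback mechanisms, and the balance between stability and speed, can be sketched with a toy discrete-time control loop; the setpoint, start level and gain values are illustrative assumptions.

```python
# Toy discrete-time control loop. With a positive gain each step
# reduces the error towards the setpoint (negative feedback); flipping
# the sign of the gain makes each step amplify the error instead
# (positive feedback), forcing runaway growth.
def simulate(gain, setpoint=20.0, start=15.0, steps=50):
    level = start
    history = [level]
    for _ in range(steps):
        error = setpoint - level
        level += gain * error      # correction proportional to error
        history.append(level)
    return history

stable = simulate(gain=0.5)    # converges on the setpoint
runaway = simulate(gain=-0.5)  # deviation grows without limit
```

The stability/speed balance shows up in the gain: values near 1 correct quickly, values between 1 and 2 overshoot and oscillate before settling, and values above 2 diverge even though the feedback is nominally negative.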
System behavior is influenced by Variety, particularly in its control functions. The law of requisite variety (Ashby, 1956) states that a control system must have variety greater than or equal to that of the system it is controlling. The effect of variety on system behavior can often be seen in the relationship between:
- Specialisation, the focus of system behaviour to exploit particular features of its environment.
- Flexibility, the ability of a system to adapt quickly to environmental change.
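The law of requisite variety can be illustrated with a minimal counting sketch, under the simplifying assumption that each distinct disturbance needs its own distinct cancelling response.

```python
import math

# Counting sketch of the law of requisite variety. Assume the system
# faces one of n_disturbances distinct disturbances and the regulator
# answers with one of n_responses distinct responses, where only a
# matched response fully neutralises a disturbance. Disturbances that
# are forced to share a response then produce distinct outcomes.
def min_outcomes(n_disturbances, n_responses):
    # Even a best-possible mapping leaves at least ceil(D/R) distinct
    # outcomes, so holding the system to a single controlled outcome
    # requires n_responses >= n_disturbances.
    return math.ceil(n_disturbances / n_responses)

print(min_outcomes(8, 8))  # 1: variety matches, full regulation
print(min_outcomes(8, 4))  # 2: too little variety, control leaks
```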
Effectiveness, Adaptation and Learning
A system's Effectiveness is a measure of its ability to perform the functions necessary to achieve goals or objectives. (Ackoff, 1971) defines this as the product of the number of combinations of behavior which can reach a function and the efficiency of each combination.
(Hitchins, 2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed) and survivability (how likely is it that the system will be able to use the function fully).
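Hitchins' combination of factors can be sketched numerically; treating effectiveness as a simple product of three 0..1 ratios is an assumption made here for illustration, not a formula from the text.

```python
# Sketch of effectiveness as a combination of Hitchins' three factors:
# performance, availability and survivability, each expressed as a
# ratio between 0 and 1. The multiplicative combination is an assumed,
# illustrative model.
def effectiveness(performance, availability, survivability):
    for factor in (performance, availability, survivability):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each factor is a ratio between 0 and 1")
    return performance * availability * survivability

# A function done very well but often unavailable can score lower
# overall than a modest but dependable one.
high_perf = effectiveness(0.9, 0.5, 0.9)    # ~0.40
dependable = effectiveness(0.7, 0.95, 0.9)  # ~0.60
```

Because the factors multiply, a weakness in any one of them caps the whole: raising performance cannot compensate for a function that is rarely available or unlikely to survive.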
An Adaptive System is one which is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future goals or objectives. Ackoff defines four types of adaptation: changing the environment or the system, in response to internal or external factors.
A system may also Learn, improving its effectiveness over time, without any change in state or goal.
Completeness
How do we apply system concepts to an engineered system? (Hitchins, 2009) proposes a set of necessary and sufficient questions to help ensure all systemic issues have been considered when assessing an existing or proposed system description.
- System Reactions. This describes the ability of a system to rearrange itself.
- Adaptation. This describes the necessity for a system to adapt faster than the rate of change of the environment.
- Connected Variety. This describes the ability of connected systems to achieve stability through increased variety.
- Preferred Patterns. This describes how the stability of interacting systems is enhanced by increased cohesion.
- Cyclic Progression. This describes how systems, primarily political and economic systems, cycle in response to variations in input energy and feedback loops.
Hitchins' Generic Reference Model asks questions under six headings based on these concepts, related to a system's Function (what it does) and its Form (what it is).
Hierarchy and Emergence
System behavior is related to the combinations of element behaviors. Most systems exhibit increasing variety; that is, they have behavior resulting from the combination of element behaviors. The term Synergy, or weak emergence, is used to describe the idea of “the whole being greater than the sum of the parts”. While this is generally true, it is also possible to get reducing variety, in which the whole functions at less than the sum of the parts.
Open Systems tend to form Hierarchies of coherent System Elements, or Sub-Systems. In natural systems, hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together to form structures which reduce complexity and increase robustness (Simon, 1962). Socio-technical systems form Control Hierarchies, with systems at a higher level having some ownership or control over those at lower levels.
We take advantage of this by viewing systems using Systemic Reduction. A system is characterised by its behaviour in a wider system or environment and considered in detail as a set of sub-system Structures and Functions. This system description is focused at a particular level of resolution. We can change this level of resolution by focusing upon the wider system, or upon one of the sub-systems. While this allows us to focus on a given system-of-interest we must take care to continue to take a holistic view of the wider system and environment.
When we look across a hierarchy of systems we generally see increasing Complexity at the higher levels, relating to both the structure of the system and how it is used. The terms Emergence and Emergent Properties are generally used to describe behaviors which emerge across a complex system hierarchy. These last two ideas are fundamental to Engineered Systems and the Systems Approach, and are discussed in more detail in the related topics.
References
Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science, Vol. 17, No. 11.
Ashby, W.R. 1956. An Introduction to Cybernetics. London: Wiley (Chapter 11).
von Bertalanffy, L. 1968. General System Theory: Foundations, Development, Applications. Revised ed. New York, NY: Braziller.
Boardman, J. and Sauser, B. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL: CRC Press.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY: John Wiley & Sons.
Edson, R. 2008. Systems Thinking. Applied. A Primer. Edited by the AsysT Institute. Arlington, VA: Analytic Services.
Flood, R.L. and Carson, E.R. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY: Plenum Press.
Hitchins, D.K. 2007. Systems Engineering: A 21st Century Systems Methodology. Wiley Series in Systems Engineering and Management. Hoboken, NJ: John Wiley & Sons.
Hitchins, D.K. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight, pp. 59-63.
Jackson, S., Hitchins, D., and Eisner, H. 2010. "What is the Systems Approach?" INCOSE Insight, April 2010, pp. 41-43.
Lawson, H. 2010. A Journey Through the Systems Landscape. London: College Publications, Kings College.
Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL: CRC Press.
Simon, H.A. 1962. "The Architecture of Complexity." Proceedings of the American Philosophical Society, Vol. 106, No. 6 (Dec. 12, 1962), pp. 467-482.
Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co. (pp. 53-69).
Waring, A. 1996. Practical Systems Thinking. London: International Thomson Business Press (Chapter 1).
Signatures
Peter Checkland is an INCOSE Pioneer and one of the most respected authorities on systems theory.
Derek Hitchins is an INCOSE Fellow and the author of several books on systems theory and systems engineering. He is a respected authority on both subjects.