Concepts of Systems Thinking

From SEBoK

Revision as of 17:59, 30 July 2012

This article forms part of the Systems Thinking Knowledge Area. It defines a system of systems concepts: knowledge that can be used to understand problems and solutions in support of Systems Thinking.

The concepts below have been synthesised from a number of sources. Ackoff (1971) proposed a system of "system concepts" as part of General Systems Theory (GST); Skyttner (2001) describes the main GST concepts from a number of systems science authors; Flood and Carson (1993) give a description of concepts as an overview of systems thinking; and Hitchins (2007) relates the concepts to systems engineering practice.

Wholeness and Interaction

A system is defined by a set of elements which exhibit sufficient cohesion (Hitchins 2007) or "togetherness" (Boardman and Sauser 2008) to form a "bounded" whole.

According to Hitchins (2009, p. 60), “interaction” between elements is the key system concept. The focus on interactions and holism is a push-back against the perceived reductionist focus on parts, and a recognition that in complex systems the interactions among parts are at least as important as the parts themselves.

An open system is defined by the interactions between System Elements (glossary) within a system boundary and by the interaction between system elements and other systems within an environment (see What is a System?).

The remaining concepts below apply to open systems.


Regularity

Regularity is a uniformity or similarity (Bertalanffy 1968) that exists in multiple entities or at multiple times. Regularities are important because they make science possible and engineering efficient and effective. Without regularities, we would be forced to consider every natural and artificial system problem and solution as unique; we would have no scientific laws and no categories or taxonomies, and each engineering effort would start from a clean slate.

“Similarities” and “differences” exist in any set or population. Every system problem or solution can be regarded as unique, but no problem or solution is entirely unique. The nomothetic approach assumes regularities among entities and investigates what those regularities are; the idiographic approach assumes each entity is unique and investigates its unique qualities (Bertalanffy 1975).

A great deal of regularity exists in both natural systems and engineered systems. The Principles and Patterns of Systems Thinking capture and exploit that regularity.

Separation of Concerns

Abstraction is the process of taking away characteristics from something in order to reduce it to a set of essential characteristics (SearchCIO 2012). In attempting to understand complex situations, it is easier to focus on bounded problems whose solutions can be developed while "still remaining agnostic to the greater problem" (Erl 2012). This sounds reductionist, but it is applied effectively in both natural and engineered systems. The key is that one of the selected problems must be the concern of the system as a whole. This balance between using abstraction to focus on specific concerns, while ensuring we continue to consider the whole, is at the centre of Systems Approaches.

A view is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the viewpoint, which can be motivated by one or more observer concerns. Different views of the same target must be both separated, to reflect separation of concerns, and integrated, so that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Sample views of a system include: internal (what does it consist of?); external (what are its properties and behavior as a whole?); static (parts, structures); and dynamic (interactions).

Encapsulation is the enclosing of one thing within another, or the degree to which it is enclosed. System encapsulation hides system elements and their interactions from the external environment, and usually involves a system boundary that separates the internal from the external. Encapsulation is associated with modularity, the degree to which a system's components may be separated and recombined (Griswold 1995). Modularity applies to systems in many domains: natural, social, and engineered. In engineering, encapsulation is the isolation of a system function within a module, with precise specifications provided for the module (IEEE Std. 610.12-1990).
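Encapsulation and modularity can be sketched in code. The following Python fragment is a minimal, hypothetical illustration (the Thermostat module and its interface are assumptions, not from the source): internal elements sit behind a boundary, and only a precisely specified interface is exposed.

```python
class Thermostat:
    """A module whose internal elements are hidden behind its boundary."""

    def __init__(self, setpoint: float):
        # Internal state: not part of the module's external interface.
        self._setpoint = setpoint
        self._history: list[float] = []  # hidden internal element

    def report(self, reading: float) -> bool:
        """The module's precise specification: accept a temperature
        reading and return whether heating is required."""
        self._history.append(reading)
        return reading < self._setpoint


t = Thermostat(setpoint=20.0)
print(t.report(18.5))  # True: reading below setpoint, heating required
print(t.report(21.0))  # False: reading above setpoint
```

Because callers depend only on `report`, the hidden internals can be separated and recombined (modularity) without affecting the rest of the system.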

State and Behavior

Any quality or property of a system element is called an attribute. The state of a system is the set of its attributes at a given time. A "System Event" describes any change to the attributes of a system (or its environment), and hence to its state:

  • Static - a single state exists with no events.
  • Dynamic - multiple possible stable states exist.
  • Homeostatic - system is static but its elements are dynamic. The system maintains its state by internal adjustments.

A stable state is one in which a system will remain until another event occurs.

State can be monitored using "State Variables", values of attributes which indicate the system state. The set of possible values of state variables over time is called the "state space". State variables are generally continuous, but can be modeled using a "Finite State Model" (or "State Machine").
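A finite state model can be sketched as a transition table over discrete states and events. The door states and events below are illustrative assumptions, not from the source; the point is that the state variable, the state space, and stable states are all explicit.

```python
# A minimal finite state model: a state variable, a finite state space,
# and a transition table mapping (state, event) pairs to next states.
transitions = {
    ("closed", "open_event"): "open",
    ("open", "close_event"): "closed",
    ("closed", "lock_event"): "locked",
    ("locked", "unlock_event"): "closed",
}

def step(state: str, event: str) -> str:
    """Return the next state; events with no transition leave the
    system in its current (stable) state."""
    return transitions.get((state, event), state)

state = "closed"
for event in ["open_event", "close_event", "lock_event", "unlock_event"]:
    state = step(state, event)
print(state)  # "closed": the event sequence returns to the initial state
```

Here the state space is {"closed", "open", "locked"}; each state is stable in the sense that it persists until an event with a defined transition occurs.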

Ackoff (1971) considers "change" to be how a system is affected by events, and system behavior to be the effect a system has upon its environment. A system can:

  • react to a request by turning on a light;
  • respond to darkness by deciding to turn on the light; or
  • act to turn on the lights at a fixed time, randomly, or with discernible reasoning.

A “Stable System” is one which has one or more stable states within an environment for a range of possible events:

  • Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
  • Non-Deterministic systems have a many-to-many mapping of state variables; future states cannot be reliably predicted.
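The difference between the two can be sketched with a pair of toy update rules (the four-state cycle below is an illustrative assumption): a deterministic trajectory can be replayed exactly from its initial state, while a non-deterministic one cannot.

```python
import random

def deterministic_step(state: int) -> int:
    """One-to-one: the same state always maps to the same next state."""
    return (state + 1) % 4

def nondeterministic_step(state: int) -> int:
    """One-to-many: the same state may map to several next states."""
    return random.choice([(state + 1) % 4, (state + 2) % 4])

# Two deterministic trajectories from the same initial state are identical,
# so future states are predictable from past states.
traj1 = [0]
traj2 = [0]
for _ in range(5):
    traj1.append(deterministic_step(traj1[-1]))
    traj2.append(deterministic_step(traj2[-1]))
print(traj1 == traj2)  # True

# Two non-deterministic trajectories from the same state may diverge,
# so the future state cannot be reliably predicted.
```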

The relationship between determinism and system complexity, including the idea of Chaotic systems is further discussed in the Complexity article.

Survival Behavior

Systems often act to continue to exist, behaving to sustain themselves in one or more alternative viable states. Many natural or social systems have this goal, either consciously or as "self-organizing" systems in which the goal arises from the interaction between elements.

Entropy is the tendency of systems to move towards disorder or disorganization. In physics, entropy describes how "organized" heat energy is "lost" into the "random" background energy of the surrounding environment (the second law of thermodynamics). A similar effect can be seen in engineered systems: consider what happens to a building or garden left unused for any length of time. Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.

"Negentropy" describes the forces working in a system to hold off entropy. Homeostasis is the biological equivalent, describing behavior which maintains a "steady state" or "dynamic equilibrium". Examples in nature include human cells, which maintain the same function while replacing their physical content at regular intervals. This too can be used as a metaphor for the fight against entropy, e.g., training, discipline, maintenance, etc.

(Hitchins 2007) describes the relationship between the viability of a system and the number of connections between its elements. Hitchins' concept of "connected variety" states that stability of a system increases with its connectivity (both internally and with its environment). See variety below.

Goal Seeking Behavior

Some systems have reasons for existence beyond simple survival. Goal seeking is one of the defining characteristics of Engineered Systems:

  • A goal is a specific outcome which a system can achieve in a specified time.
  • An objective is a longer term outcome which can be achieved through a series of goals.
  • An ideal is an objective which cannot be achieved with any certainty, but for which progress towards the objective has value.

Systems may be single-goal seeking (perform set tasks), multi-goal seeking (perform related tasks), or reflective (set goals to tackle objectives or ideals). There are two types of goal-seeking systems:

  • Purposive systems have multiple goals with some shared outcome. Such a system can be used to provide pre-determined outcomes within an agreed time period, and may have some freedom to choose how to achieve the goal. If it has memory, it may develop processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
  • Purposeful systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue objectives or ideals over a longer time through a series of goals. Humans and sufficiently complex machines are purposeful.

Control Behavior

Cybernetics, the science of control, defines two basic control mechanisms:

  • Negative feedback, which maintains the system state against set objectives or levels.
  • Positive feedback, which forces growth or contraction to new levels.
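The two mechanisms can be sketched as simple update rules. This is a minimal illustration, not a cybernetics implementation: the proportional-adjustment form and the gain value 0.5 are assumptions chosen for clarity.

```python
def negative_feedback(state: float, setpoint: float, gain: float = 0.5) -> float:
    """Correct the state toward the setpoint in proportion to the error."""
    return state + gain * (setpoint - state)

def positive_feedback(state: float, gain: float = 0.5) -> float:
    """Amplify the current state, forcing growth away from equilibrium."""
    return state + gain * state

neg, pos = 10.0, 10.0
for _ in range(20):
    neg = negative_feedback(neg, setpoint=20.0)  # error shrinks each step
    pos = positive_feedback(pos)                 # state grows each step

print(round(neg, 3))  # converges to the setpoint, 20.0
print(pos > 1000)     # True: grows without bound
```

The gain parameter embodies the stability/responsiveness trade-off discussed below: a larger gain corrects faster but, in systems with delay, risks overshoot and instability.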

One of the main concerns of cybernetics is the balance between stability and speed of response. A black-box system view looks at the whole system; control can only be achieved by carefully balancing inputs with outputs, which reduces speed of response. A white-box system view considers the system elements and their relationships; here, control mechanisms can be embedded into the structure, giving more responsive control but with associated risks to stability.

Another useful control concept is that of a "meta-system", which sits over the system and is responsible for controlling its functions, either as a black-box or white-box. In this case, behavior arises from the combination of system and meta-system.

Variety describes the number of different ways elements can be controlled, which depends on the different ways in which they can be combined. The law of requisite variety (Ashby 1956) states that a control system must have at least as much variety as the system it is controlling.
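The law of requisite variety can be sketched as a matching problem. The disturbance/response table below is hypothetical: it simply assumes each disturbance has exactly one response that restores the goal state, so a controller with fewer responses than disturbances cannot regulate.

```python
# Variety of the controlled system: the disturbances it can present.
disturbances = ["d1", "d2", "d3"]

# Hypothetical outcome table: which response neutralises which disturbance.
neutralises = {"d1": "r1", "d2": "r2", "d3": "r3"}

def can_regulate(responses: list[str]) -> bool:
    """True if every disturbance can be matched by an available response,
    i.e. the controller's variety is sufficient."""
    return all(neutralises[d] in responses for d in disturbances)

print(can_regulate(["r1", "r2", "r3"]))  # True: variety matches
print(can_regulate(["r1", "r2"]))        # False: too little variety
```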


Ackoff defines function as outcomes which contribute to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways (this is called Equifinality).

This view of function and behavior is common in systems science. In this paradigm all system elements have behavior of some kind, but to be capable of functioning in certain ways requires a certain richness of behaviors.

In most hard systems approaches (Flood and Carson 1993), a set of functions is described from the problem statement and then associated with one or more alternative element structures. This process may be repeated until a system component (an implementable combination of function and structure) has been defined (Martin 1997). Here, "function" is defined as a task or activity that must be performed to achieve a desired outcome, or as a "transformation" of inputs to outputs. The transformation may be:

  • Synchronous, a regular interaction with a closely related system.
  • Asynchronous, an irregular response to a demand from another system, often triggering a set response.
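A function as a transformation of inputs to outputs, invoked either synchronously or asynchronously, can be sketched as follows. The averaging transformation and the callback-style asynchronous demand are illustrative assumptions.

```python
def transform(inputs: list[float]) -> float:
    """Illustrative transformation: aggregate input readings into one output."""
    return sum(inputs) / len(inputs)

# Synchronous: a regular, clocked interaction with a closely related system,
# producing one output per input cycle.
readings = [[1.0, 3.0], [2.0, 4.0]]
sync_outputs = [transform(r) for r in readings]

# Asynchronous: an irregular demand from another system triggers a set
# response, delivered via a callback rather than on a fixed cycle.
def on_demand(inputs: list[float], respond) -> None:
    respond(transform(inputs))

results: list[float] = []
on_demand([5.0, 7.0], results.append)  # demand arrives at an arbitrary time

print(sync_outputs, results)  # [2.0, 3.0] [6.0]
```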

The behavior of the resulting system is then assessed as a combination of function and effectiveness. In this case, behavior is seen as an external property of the system as a whole, and is often described as analogous to human or organic behavior (Hitchins 2009).

Hierarchy, Emergence and Complexity

System behavior is related to combinations of element behaviors. Most systems exhibit "increasing variety"; i.e., they have behavior resulting from the combination of element behaviors. The term "synergy", or weak emergence, describes the idea that “the whole is greater than the sum of the parts”. While this is generally true, it is also possible to get reducing variety, in which the function of the whole is less than the sum of its parts.

Complexity frequently takes the form of hierarchies; hierarchic systems have some common properties independent of their specific content, and hierarchic systems evolve far more quickly than non-hierarchic systems of comparable size (Simon 1996). A natural system hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together to form structures which reduce complexity and increase robustness (Simon 1962).

Socio-technical systems form "control hierarchies", with systems at a higher level having some ownership of control over those at lower levels. Hitchins (2009) describes how systems form "preferred patterns" which can be used to enhance the stability of interacting system hierarchies.

Looking across a hierarchy of systems generally reveals increasing complexity at the higher levels, relating both to the structure of the system and to how it is used. The term emergence describes behaviors which emerge across a complex system hierarchy.

Effectiveness, Adaptation and Learning

Systems effectiveness is a measure of the system's ability to perform the functions necessary to achieve goals or objectives. Ackoff (1971) defines this as the product of the number of combinations of behavior to reach a function and the efficiency of each combination.

Hitchins (2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed), and survivability (how likely it is that the system will be able to use the function fully).
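Hitchins' three factors can be combined in a small worked example. Treating each factor as a score in [0, 1] and combining them multiplicatively is an assumed modeling choice for illustration, not a formula stated by Hitchins.

```python
def effectiveness(performance: float, availability: float,
                  survivability: float) -> float:
    """Assumed multiplicative combination of Hitchins' three factors,
    each a probability-like score in [0, 1]."""
    return performance * availability * survivability

# A function that performs well in ideal conditions but is rarely
# available when needed still scores poorly overall.
print(round(effectiveness(0.9, 0.5, 0.8), 2))  # 0.36
```

The multiplicative form captures the intuition that a weakness in any one factor drags down overall effectiveness, since each factor scales the others.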

Duality is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009). The yin yang concept in Chinese philosophy emphasizes the interaction between dual elements and their harmonization, ensuring a constant dynamic balance often through a cyclic dominance of one element and then the other, such as day and night (Internet Encyclopedia of Philosophy 2006).

From a systems perspective, the interaction, harmonization, and balance are important. For example, control behavior as discussed above is a balance between:

  • Specialization, the focus of system behavior to exploit particular features of its environment.
  • Flexibility, the ability of a system to adapt quickly to environmental change.

Hybertson (2009) defines leverage as the duality between:

  • Power, the extent to which a system solves a specific problem.
  • Generality, the extent to which a system solves a whole class of problems.

While some systems or elements may be optimised for one extreme of such dualities, a dynamic balance is needed to be effective in solving complex problems.

System elements and their environment change, and in individual situations these changes may be positive, neutral, or negative. An adaptive system is one that is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future goals or objectives. Ackoff defines four types of adaptation, changing either the environment or the system, in response to either internal or external factors.

A system may also "learn", improving its effectiveness over time, without any change in state or goal.


Works Cited

Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science. 17(11).

Ashby, W.R. 1956. "Chapter 11." Introduction to Cybernetics. London, UK: Wiley.

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, Revised ed. New York, NY, USA: Braziller.

Flood, R.L. and E.R. Carson. 1993. Dealing With Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY, USA: Plenum Press.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley and Sons.

Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.

Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL, USA: CRC Press.

Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co. pp. 53-69.

Simon, H. A. 1962. "The Architecture of Complexity." Proceedings of the American Philosophical Society. 106(6) (Dec. 12, 1962): 467-482.

Primary References

Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science. 17(11).

Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.

Additional References

Edson, Robert. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking Institute (ASysT), Analytic Services Inc.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.

Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April 2010): 41-43.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.

Waring, A. 1996. "Chapter 1." Practical Systems Thinking. London, UK: International Thomson Business Press.


Comments from SEBoK 0.5 Wiki

Please note that in version 0.5, this article was titled "Overview of System Concepts”.


SEBoK v. 1.9.1 released 30 September 2018
