Concepts of Systems Thinking

From SEBoK
----
'''''Lead Author:''''' ''Rick Adcock'', '''''Contributing Authors:''''' ''Scott Jackson, Janet Singer, Duane Hybertson''
----
This article forms part of the [[Systems Thinking]] knowledge area (KA). It describes {{Term|Systems Concept (glossary)|systems concepts}}, knowledge that can be used to understand {{Term|Problem (glossary)|problems}} and {{Term|Solution (glossary)|solutions}} to support [[Systems Thinking|systems thinking]].
  
The {{Term|Concept (glossary)|concepts}} below have been synthesized from a number of sources, which are themselves summaries of concepts from other authors. Ackoff (1971) proposed a system of system concepts as part of {{Term|General System Theory (glossary)|general system theory}} (GST); Skyttner (2001) describes the main GST concepts from a number of {{Term|Systems Science (glossary)|systems science}} authors; Flood and Carson (1993) give a description of concepts as an overview of systems thinking; Hitchins (2007) relates the concepts to {{Term|Systems Engineering (glossary)|systems engineering}} practice; and Lawson (2010) describes a system of system concepts where systems are categorized according to fundamental concepts, types, topologies, focus, {{Term|Complexity (glossary)|complexity}}, and roles.
  
==Wholeness and Interaction==
A {{Term|System (glossary)|system}} is defined by a set of {{Term|Element (glossary)|elements}} which exhibit sufficient {{Term|Cohesion (glossary)|cohesion}}, or "togetherness", to form a bounded whole (Hitchins 2007; Boardman and Sauser 2008).
  
According to Hitchins, interaction between elements is the "key" system concept (Hitchins 2009, 60). The focus on interactions and {{Term|Holism (glossary)|holism}} is a push-back against the perceived {{Term|Reductionism (glossary)|reductionist}} focus on parts and provides recognition that in {{Term|Complex (glossary)|complex}} systems, the interactions among parts are at least as important as the parts themselves.
  
An {{Term|Open System (glossary)|open system}} is defined by the interactions between {{Term|System Element (glossary)|system elements}} within a {{Term|System Boundary (glossary)|system boundary}} and by the interaction between system elements and other systems within an {{Term|Environment (glossary)|environment}} (see [[What is a System?]]). The remaining concepts below apply to open systems.
  
==Regularity==
{{Term|Regularity (glossary)|Regularity}} is a uniformity or similarity that exists in multiple entities or at multiple times (Bertalanffy 1968). Regularities make science possible and {{Term|Engineering (glossary)|engineering}} efficient and effective. Without regularities, we would be forced to consider every natural and artificial system problem and solution as unique. We would have no scientific laws, no categories or taxonomies, and each engineering effort would start from a clean slate.
  
Similarities and differences exist in any set or population. Every system problem or solution can be regarded as unique, but no problem or solution is in fact entirely unique. The nomothetic approach assumes regularities among entities and investigates what the regularities are. The idiographic approach assumes each entity is unique and investigates the unique qualities of entities (Bertalanffy 1975).
  
A very large amount of regularity exists in both natural systems and {{Term|Engineered System (glossary)|engineered systems}}. [[Patterns of Systems Thinking|Patterns of systems thinking]] capture and exploit that regularity.
  
==State and Behavior==
Any quality or property of a {{Term|System Element (glossary)|system element}} is called an {{Term|Attribute (glossary)|attribute}}. The {{Term|State (glossary)|state}} of a system is a set of system attributes at a given time. A '''system event''' describes any change to the {{Term|Environment (glossary)|environment}} of a system, and hence its state:

*'''Static''' - A single state exists with no events.

*'''Dynamic''' - Multiple possible stable states exist.

*'''Homeostatic''' - The system is static but its elements are dynamic. The system maintains its state by internal adjustments.
  
A stable state is one in which a system will remain until another event occurs.
  
State can be monitored using state variables, {{Term|Value (glossary)|values}} of attributes which indicate the system state. The set of possible values of state variables over time is called the '''state space'''. State variables are generally continuous, but can be modeled using a finite state model (or "state machine").
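The move from continuous state variables to a finite state model can be sketched in a few lines of code. This is an illustrative sketch only; the states and events below are invented for the example, not drawn from the cited sources.

```python
# A minimal finite state model (state machine) sketch.
# The states and events are hypothetical, chosen only to illustrate the concept.
TRANSITIONS = {
    ("off", "switch_on"): "on",
    ("on", "switch_off"): "off",
    ("on", "overheat"): "failed",
}

def next_state(state, event):
    """Return the state after an event; an event with no defined
    transition leaves the system in its current (stable) state."""
    return TRANSITIONS.get((state, event), state)

# The state space is the set of states reachable over time.
state = "off"
for event in ["switch_on", "overheat", "switch_off"]:
    state = next_state(state, event)
```

Note how a continuous quantity (e.g., temperature) has been abstracted into the discrete event "overheat"; this loss of resolution is the usual price of moving from continuous state variables to a finite state model.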
  
Ackoff (Ackoff 1971) considers "change" to be how a system is affected by events, and system {{Term|Behavior (glossary)|behavior}} as the effect a system has upon its environment. A system can:

*'''react''' to a request by turning on a light,

*'''respond''' to darkness by deciding to turn on the light, or

*'''act''' to turn on the lights at a fixed time, randomly, or with discernible reasoning.
  
A stable system is one which has one or more stable states within an environment for a range of possible events:

* '''Deterministic''' systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.

* '''Non-Deterministic''' systems have a many-to-many mapping of state variables; future states cannot be reliably predicted.
The relationship between determinism and system complexity, including the idea of {{Term|Chaos (glossary)|chaotic}} systems, is further discussed in the [[Complexity]] article.
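The gap between determinism and predictability can be seen in the logistic map, a standard textbook example of chaotic behavior (a generic illustration, not taken from the cited sources): the update rule is fully deterministic, yet two starting states differing by less than any practical measurement soon diverge.

```python
def logistic(x, r=4.0):
    # Fully deterministic update rule: the next state depends only on the current state.
    return r * x * (1.0 - x)

def trajectory(x0, steps):
    """Iterate the map from an initial state, recording every state."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

a = trajectory(0.2, 40)
b = trajectory(0.2 + 1e-10, 40)  # a difference far below any realistic measurement
# The same start always gives the same path (deterministic), but the two
# nearly identical starts no longer agree after a few dozen steps (chaotic).
```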
  
==Survival Behavior==
  
Systems often behave in a manner that allows them to sustain themselves in one or more alternative viable states.  Many natural or social systems have this goal, either consciously or as a "self organizing" system, arising from the interaction between {{Term|Element (glossary)|elements}}.
  
{{Term|Entropy (glossary)|Entropy}} is the tendency of systems to move towards disorder or disorganization.  In physics, entropy is used to describe how organized heat energy is “lost” into the random background energy of the surrounding environment (the 2nd Law of Thermodynamics).  A similar effect can be seen in {{Term|Engineered System (glossary)|engineered systems}}.  What happens to a building or garden left unused for any time?  Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.
  
"Negentropy" describes the forces working in a system to hold off entropy. {{Term|Homeostasis (glossary)|Homeostasis}} is the biological equivalent of this, describing behavior which maintains a "steady state" or "dynamic equilibrium". Examples in nature include human cells, which maintain the same function while replacing their physical content at regular intervals. Again, this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc.
  
Hitchins (Hitchins 2007) describes the relationship between the viability of a system and the number of connections between its elements. Hitchins's concept of connected variety states that stability of a system increases with its connectivity (both internally and with its environment).  (See {{Term|Variety (glossary)|variety}}.)
==Goal Seeking Behavior==
Some systems have reasons for existence beyond simple survival. Goal seeking is one of the defining characteristics of {{Term|Engineered System (glossary)|engineered systems}}:
*A '''goal''' is a specific outcome which a system can achieve in a specified time.

*An '''objective''' is a longer-term outcome which can be achieved through a series of goals.

*An '''ideal''' is an objective which cannot be achieved with any certainty, but for which progress towards the objective has {{Term|Value (glossary)|value}}.
  
Systems may be single goal seeking (perform set tasks), multi-goal seeking (perform related tasks), or reflective (set goals to tackle objectives or ideals). There are two types of goal seeking systems:
*{{Term|Purposive (glossary)}} systems have multiple goals with some shared outcome. Such a system can be used to provide pre-determined outcomes within an agreed time period. This system may have some freedom to choose how to achieve the goal. If it has memory, it may develop {{Term|Process (glossary)|processes}} describing the behaviors needed for defined goals. Most machines or {{Term|Software (glossary)|software}} systems are purposive.
*{{Term|Purposeful (glossary)}} systems are free to determine the goals needed to achieve an outcome.  Such a system can be tasked to pursue objectives or ideals over a longer time through a series of goals.  Humans and sufficiently complex machines are purposeful.
==Control Behavior==
{{Term|Cybernetics (glossary)|Cybernetics}}, the science of {{Term|Control (glossary)|control}}, defines two basic control mechanisms:
  
*'''Negative feedback''', maintaining system state against set objectives or levels.
  
*'''Positive feedback''', forced growth or contraction to new levels.
  
One of the main concerns of cybernetics is the balance between stability and speed of response. A {{Term|Black-Box System (glossary)|black-box system}} view looks at the whole system; control can only be achieved by carefully balancing inputs with outputs, which reduces speed of response. A {{Term|White-Box System (glossary)|white-box system}} view considers the {{Term|System Element (glossary)|system elements}} and their relationships; control mechanisms can be embedded into this structure, providing more responsive control with associated risks to stability.
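Negative feedback, and the balance between stability and speed of response, can be sketched with a simple proportional controller that pushes the system state back toward a set point. The gain and figures are invented for illustration; they are not drawn from the cited sources.

```python
def regulate(state, setpoint, gain, steps):
    """Negative feedback: each step applies a correction that opposes
    the deviation of the state from the set point."""
    history = [state]
    for _ in range(steps):
        error = setpoint - state
        state += gain * error  # corrective action proportional to the error
        history.append(state)
    return history

# A system starting far from the desired level settles toward it.
run = regulate(state=0.0, setpoint=20.0, gain=0.3, steps=25)
```

In this discrete loop, a gain above 1.0 overshoots the set point on every step and a gain above 2.0 diverges, which illustrates the stability-versus-responsiveness balance described above.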
  
Another useful control concept is that of a "meta-system", which sits over the system and is responsible for controlling its functions, either as a black-box or white-box.  In this case, behavior arises from the combination of system and meta-system.   
  
Control behavior is a trade-off between:
  
*'''Specialization''', the focus of system behavior to exploit particular features of its environment, and
*{{Term|Flexibility (glossary)}}, the ability of a system to adapt quickly to environmental change.
  
While some system elements may be optimized for either specialization (e.g., a temperature-sensitive switch) or flexibility (e.g., an autonomous human controller), complex systems must strike a balance between the two for best results. This is an example of the concept of {{Term|Dualism (glossary)|dualism}}, discussed in more detail in [[Principles of Systems Thinking]].
  
{{Term|Variety (glossary)|Variety}} describes the number of different ways elements can be controlled, and is dependent on the different ways in which they can then be combined.  The Law of Requisite Variety states that a control system must have at least as much variety as the system it is controlling (Ashby 1956).
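A quantitative reading of the Law of Requisite Variety can be sketched as a counting argument: a regulator with fewer distinct responses than the disturbances it must counter cannot hold the outcome to a single value. The numbers below are invented purely for illustration.

```python
import math

def best_achievable_outcomes(disturbance_variety, control_variety):
    """Requisite variety as a counting argument: with control_variety
    responses against disturbance_variety disturbances, at least
    ceil(disturbance_variety / control_variety) outcomes remain."""
    return math.ceil(disturbance_variety / control_variety)

# A controller with only 2 responses against 10 disturbance conditions leaves
# at least 5 distinguishable outcomes; only matching variety gives full control.
low_variety = best_achievable_outcomes(10, 2)
matched = best_achievable_outcomes(10, 10)
```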
  
==Function==
  
Ackoff defines {{Term|Function (glossary)|function}} as outcomes which contribute to goals or objectives.  To have a function, a system must be able to provide the outcome in two or more different ways. (This is called '''equifinality''').  
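Equifinality, as defined here, can be checked mechanically: a system provides a function for an outcome only if two or more of its behaviors lead to that outcome. The behavior-to-outcome map below is hypothetical, invented purely for the sketch.

```python
def provides_function(behaviors, outcome):
    """True if the outcome can be produced in two or more different
    ways (equifinality), i.e. the system has that function."""
    ways = [name for name, result in behaviors.items() if result == outcome]
    return len(ways) >= 2

# Hypothetical lighting system: three behaviors reach "light_on",
# but only one reaches "light_off".
behaviors = {
    "manual_switch": "light_on",
    "motion_sensor": "light_on",
    "timer": "light_on",
    "pull_fuse": "light_off",
}
```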
  
This view of function and {{Term|Behavior (glossary)|behavior}} is common in systems science.  In this {{Term|Paradigm (glossary)|paradigm}}, all system elements have behavior of some kind; however, to be capable of functioning in certain ways requires a certain richness of behaviors.
In most {{Term|Hard System (glossary)|hard systems}} approaches, a set of functions is described from the problem statement and then associated with one or more alternative element {{Term|Structure (glossary)|structures}} (Flood and Carson 1993). This process may be repeated until a system {{Term|Component (glossary)|component}} (an implementable combination of function and structure) has been defined (Martin 1997). Here, function is defined either as a task or activity that must be performed to achieve a desired outcome, or as a transformation of inputs to outputs. This transformation may be:
  
*'''Synchronous''', a regular interaction with a closely related system, or
*'''Asynchronous''', an irregular response to a demand from another system that often triggers a set response.
  
The behavior of the resulting system is then assessed as a combination of function and {{Term|Effectiveness (glossary)|effectiveness}}. In this case, behavior is seen as an external property of the system as a whole and is often described as analogous to human or organic behavior (Hitchins 2009).
  
==Hierarchy, Emergence and Complexity==
  
System behavior is related to combinations of element behaviors. Most systems exhibit '''increasing variety'''; i.e., they have behavior resulting from the combination of element behaviors. The term "synergy", or weak {{Term|Emergence (glossary)|emergence}}, is used to describe the idea that the whole is greater than the sum of the parts. This is generally true; however, it is also possible to get '''reducing variety''', in which the function of the whole is less than the sum of the parts (Hitchins 2007).
  
Complexity frequently takes the form of {{Term|Hierarchy (glossary)|hierarchies}}. Hierarchic systems have some common properties independent of their specific content, and they will evolve far more quickly than non-hierarchic systems of comparable size (Simon 1996). A natural system hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together, forming structures which reduce complexity and increase {{Term|Robustness (glossary)|robustness}} (Simon 1962).
  
{{Term|Encapsulation (glossary)|Encapsulation}} is the enclosing of one thing within another. It may also be described as the degree to which it is enclosed. System encapsulation encloses system elements and their interactions from the external environment, and usually involves a system boundary that hides the internal from the external; for example, the internal organs of the human body can be optimized to work effectively within tightly defined conditions because they are protected from extremes of environmental change.   
  
Socio-technical systems form what are known as control hierarchies, with systems at a higher level having some ownership of control over those at lower levels. Hitchins (2009) describes how systems form "preferred patterns" which can be used to enhance the stability of interacting systems hierarchies.
  
Looking across a hierarchy of systems generally reveals increasing complexity at the higher level, relating to both the structure of the system and how it is used.  The term {{Term|Emergence (glossary)|emergence}} describes behaviors emerging across a complex system hierarchy.
  
==Effectiveness, Adaptation and Learning==
  
Systems {{Term|Effectiveness (glossary)|effectiveness}} is a measure of the system's ability to perform the functions necessary to achieve goals or objectives.  Ackoff (Ackoff 1971) defines this as the product of the number of combinations of behavior to reach a function and the efficiency of each combination.
Hitchins (2007) describes effectiveness as a combination of '''performance''' (how well a function is done in ideal conditions), '''availability''' (how often the function is there when needed), and '''survivability''' (how likely is it that the system will be able to use the function fully).
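One simple way to operationalize this description, assuming each factor is normalized to a 0..1 scale, is as a product of the three factors. This reading is an assumption made for the sketch, not a formula prescribed by the cited sources, and the figures are invented.

```python
def effectiveness(performance, availability, survivability):
    """Illustrative model (an assumption, not Hitchins's formula):
    effectiveness as the product of three factors, each normalized to 0..1."""
    for factor in (performance, availability, survivability):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("factors must be normalized to the range 0..1")
    return performance * availability * survivability

# A function performed well (0.9) but available only half the time (0.5)
# scores poorly overall, even with good survivability (0.8).
score = effectiveness(0.9, 0.5, 0.8)
```

A multiplicative reading captures the intuition that any one weak factor drags down the whole: a perfectly performed function that is rarely available is still ineffective.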
  
System elements and their environment change in a positive, neutral or negative way in individual situations.  An {{Term|Adaptability (glossary)|adaptive}} system is one that is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future objectives.  Ackoff (Ackoff 1971) defines four types of adaptation, changing the environment or the system in response to internal or external factors.  
  
A system may also '''learn''', improving its effectiveness over time, without any change in state or goal.
  
==References==
 
 
===Works Cited===
  
Ackoff, R.L. 1971. "Towards a System of Systems Concepts." ''Management Science'' 17(11).
  
Ackoff, R. 1979. "The Future of Operational Research is Past." ''Journal of the Operational Research Society''. 30(2): 93–104, Pergamon Press.
  
 +
Ashby, W R. 1956. "Chapter 11".  ''Introduction to Cybernetics''. London, UK: Wiley.
  
We take advantage of this by viewing systems using '''Systemic Reduction'''. A system is characterized by its behavior in a wider system or environment and considered in detail as a set of sub-system Structures and Functions.  This system description is focused at a particular level of resolution.  We can change this level of resolution by focusing upon the wider system, or upon one of the sub-systemsWhile this allows us to focus on a given system-of-interest we must take care to continue to take a '''holistic''' view of the wider system and environment.
+
Bertalanffy, L. von. 1968. ''General System Theory: Foundations, Development, Applications,'' Revised ed. New York, NY, USA: Braziller.   
  
When we look across a hierarchy of system we generally see increasing '''Complexity''' at the higher level, relating to both the structure of the system and how it is used. The terms '''Emergence''' and '''Emergent Properties''' are generally used to describe behaviors emerging across a complex system hierarchy. These last two ideas are fundamental to Engineered System and the Systems Approach, and are discussed in more detail in the related topics.
+
Bertalanffy, L. von. 1975. 'Perspectives on General System Theory.'' E. Taschdjian, ed. New York: George Braziller.
  
==Completeness==
+
Boardman, J. and B. Sauser. 2008. ''Systems Thinking: Coping with 21st Century Problems.'' Boca Raton, FL, USA: Taylor & Francis.
  
How do we apply system concepts to an Engineered System?  (Hitchins, 2007) proposes a set of necessary and sufficient questions to help ensure all systemic issues have been considered when assessing an existing or proposed system description.
+
Flood, R.L. and E.R. Carson. 1993. ''Dealing With Complexity: An Introduction to the Theory and Application of Systems Science''. New York, NY, USA: Plenum Press.
  
Hitchins Generic Reference Model asks questions under six heading based on these concepts, related to its Function (what it does) and Form (what the system is).  
+
Hitchins, D. 2007. ''Systems Engineering: A 21st Century Systems Methodology''. Hoboken, NJ, USA: John Wiley and Sons.
  
 +
Hitchins, D. 2009. "[[What are the General Principles Applicable to Systems?]]" INCOSE ''Insight.'' 12(4): 59-63.
  
==References==
+
Lawson, H. 2010. ''A Journey Through the Systems Landscape''. London, UK: College  Publications, Kings College.
  
===Citations===
+
Martin J. N. 1997.  ''Systems Engineering Guidebook.''  Boca Raton, FL, USA: CRC Press.
  
von Bertalanffy, L. 1968. General system theory: Foundations, development, applications. Revised ed. New York, NY: Braziller.
+
Skyttner, L. 2001. ''General Systems Theory: Ideas and Applications.'' Singapore: World Scientific Publishing Co. p. 53-69.
  
Ackoff, R.L. 1971. Towards a System of Systems Concepts, Management Science, Vol.17 No. 11, USA.
+
Simon, H.A. 1962. "The Architecture of Complexity." ''Proceedings of the American Philosophical Society.'' 106(6) (Dec. 12, 1962): 467-482.
  
Flood, R.L. and Carson, E.R. (1993) Dealing With Complexity: An Introduction to the Theory and Application of Systems Science. New York: Plenum Press
+
Simon, H. 1996. ''The Sciences of the Artificial'', 3rd ed. Cambridge, MA: MIT Press.
  
Hitchins, D. (2007) Systems Engineering: A 21st Century Systems Methodology: Wiley
 
 
Hitchins, Derek. 2009. What are the General Principles Applicable to Systems? Insight, 59-63.
 
 
Skyttner, L. (2001) General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co. (Pages 53 - 69)
 
 
Ashby, W R. 1956. Introduction to Cybernetics London; Wiley (chapter 11)
 
 
Simon, H. A. 1962.  The Architecture of Complexity. Proceedings of the American Philosophical Society, Vol. 106, No. 6. (Dec. 12, 1962), pp. 467-482.
 
 
Martin J. N. 1997. Systems Engineering Guidebook. CRC Press, 1997. ISBN 0849378370
 
 
===Primary References===
 
===Primary References===
Checkland, Peter. 1999. Systems Thinking, Systems Practice. New York: John Wiley & Sons.
+
Ackoff, R.L. 1971. "[[Towards a System of Systems Concept]]." ''Management Science.'' 17(11).  
  
Hitchins, Derek. 2009. What are the General Principles Applicable to Systems? Insight, 59-63.
+
Hitchins, D. 2009. "[[What are the General Principles Applicable to Systems?]]" INCOSE ''Insight.'' 12(4): 59-63.
  
 
===Additional References===
 
===Additional References===
Waring, A. (1996) Practical Systems Thinking. London: International Thomson Business Press (Chapter 1)
+
Edson, Robert. 2008. ''Systems Thinking. Applied. A Primer''. Arlington, VA, USA: Applied Systems Thinking Institute (ASysT), Analytic Services Inc.
 
 
Edson, Robert. 2008. Systems Thinking. Applied. A Primer. edited by AsysT Institute. Arlington, VA: Analytic Services.
 
  
Hitchins, Derek K. 2007. Systems Engineering: A 21st Century Systems Methodology Edited by A. P. Sage, Wiley Series in Systems Engineering and Management. Hoboken, NJ: John Wiley & Sons.
+
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE ''Insight.'' 13(1) (April 2010): 41-43.  
  
Jackson, Scott, Derek Hitchins, and Howard Eisner. 2010. What is the Systems Approach? INCOSE Insight, April, 41-43.
+
Waring, A. 1996. "Chapter 1."  ''Practical Systems Thinking''.  London, UK: International Thomson Business Press.
  
Lawson, Harold. 2010. A Journey Through the Systems Landscape. London: College  Publications, Kings College.
+
----
  
===Article Discussion===
+
<center>[[What is Systems Thinking?|< Previous Article]] | [[Systems Thinking|Parent Article]] | [[Principles of Systems Thinking|Next Article >]]</center>
Peter Checkland is an INCOSE Pioneer and one of the  most respected authorities on systems theory.
 
  
Derek Hitchins is an INCOSE Fellow and an author of several books on systems theory and systems engineering. He is a respected authority on both subjects.]]]
+
<center>'''SEBoK v. 2.1, released 31 October 2019'''</center>
<center>[[System Concepts|<- Previous Article]] | [[System Concepts|Parent Article]] | [[Complexity|Next Article ->]]</center>
 
  
==Signatures==
 
 
[[Category:Part 2]][[Category:Topic]]
 
[[Category:Part 2]][[Category:Topic]]
 +
[[Category:Systems Thinking]]

Latest revision as of 00:43, 26 October 2019


Lead Author: Rick Adcock, Contributing Authors: Scott Jackson, Janet Singer, Duane Hybertson


This article forms part of the Systems Thinking knowledge area (KA). It describes systems concepts, knowledge that can be used to understand problems and solutions to support systems thinking.

The concepts below have been synthesized from a number of sources, which are themselves summaries of concepts from other authors. Ackoff (1971) proposed a system of system concepts as part of general system theory (GST); Skyttner (2001) describes the main GST concepts from a number of systems science authors; Flood and Carson (1993) give a description of concepts as an overview of systems thinking; Hitchins (2007) relates the concepts to systems engineering practice; and Lawson (2010) describes a system of system concepts where systems are categorized according to fundamental concepts, types, topologies, focus, complexity, and roles.

Wholeness and Interaction

A system is defined by a set of elements which exhibit sufficient cohesion, or "togetherness", to form a bounded whole (Hitchins 2007; Boardman and Sauser 2008).

According to Hitchins, interaction between elements is the "key" system concept (Hitchins 2009, 60). The focus on interactions and holism is a push-back against the perceived reductionist focus on parts, and recognizes that in complex systems the interactions among parts are at least as important as the parts themselves.

An open system is defined by the interactions between system elements within a system boundary and by the interaction between system elements and other systems within an environment (see What is a System?). The remaining concepts below apply to open systems.

Regularity

Regularity is a uniformity or similarity that exists in multiple entities or at multiple times (Bertalanffy 1968). Regularities make science possible and engineering efficient and effective. Without regularities, we would be forced to consider every natural and artificial system problem and solution as unique. We would have no scientific laws, no categories or taxonomies, and each engineering effort would start from a clean slate.

Similarities and differences exist in any set or population. Every system problem or solution can be regarded as unique, but no problem/solution is in fact entirely unique. The nomothetic approach assumes regularities among entities and investigates what the regularities are. The idiographic approach assumes each entity is unique and investigates the unique qualities of entities (Bertalanffy 1975).

A very large amount of regularity exists in both natural systems and engineered systems. Patterns of systems thinking capture and exploit that regularity.

State and Behavior

Any quality or property of a system element is called an attribute. The state of a system is a set of system attributes at a given time. A system event describes any change to the environment of a system, and hence its state:

  • Static - A single state exists with no events.
  • Dynamic - Multiple possible stable states exist.
  • Homeostatic - System is static but its elements are dynamic. The system maintains its state by internal adjustments.

A stable state is one in which a system will remain until another event occurs.

State can be monitored using state variables, values of attributes which indicate the system state. The set of possible values of state variables over time is called the "state space". State variables are generally continuous, but can be modeled using a finite state model (or "state machine").
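A finite state model can be sketched as a transition table mapping (state, event) pairs to next states. The light-switch states and events below are illustrative assumptions of this example, not drawn from the sources above:

```python
# A minimal finite state model ("state machine") sketch. The states and
# events (a simple light switch) are illustrative examples only.
TRANSITIONS = {
    ("off", "switch_on"): "on",
    ("on", "switch_off"): "off",
}

def step(state, event):
    """Return the next state; an event with no defined transition
    leaves the system in its current (stable) state."""
    return TRANSITIONS.get((state, event), state)

state = "off"
state = step(state, "switch_on")  # now "on"
state = step(state, "dim")        # no transition defined: stays "on"
```

Here the state space is the finite set {"off", "on"}; a continuous state variable (e.g., brightness) would be discretized before being modeled this way.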

Ackoff (1971) considers "change" to be how a system is affected by events, and system behavior to be the effect a system has upon its environment. A system can:

  • react to a request by turning on a light;
  • respond to darkness by deciding to turn on the light; or
  • act to turn on the lights at a fixed time, randomly, or with discernible reasoning.

A stable system is one which has one or more stable states within an environment for a range of possible events:

  • Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
  • Non-Deterministic systems have a many-to-many mapping of state variables; future state cannot be reliably predicted.
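The distinction between the two can be sketched with toy state mappings; the states and transitions here are invented for the example:

```python
import random

# Deterministic: each state has exactly one successor, so the future
# trajectory is fully predictable from the current state.
DET = {"s0": "s1", "s1": "s2", "s2": "s0"}

def det_next(state):
    return DET[state]

# Non-deterministic: a state may have several successors, so a future
# state cannot be reliably predicted, only bounded to a candidate set.
NONDET = {"s0": ["s1", "s2"], "s1": ["s0", "s2"], "s2": ["s0"]}

def nondet_next(state, rng=random):
    return rng.choice(NONDET[state])
```

From "s0" the deterministic system always reaches "s2" in two steps; the non-deterministic one may not.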

The relationship between determinism and system complexity, including the idea of chaotic systems, is further discussed in the Complexity article.

Survival Behavior

Systems often behave in a manner that allows them to sustain themselves in one or more alternative viable states. Many natural or social systems have this goal, either consciously or as a "self-organizing" system, arising from the interaction between elements.

Entropy is the tendency of systems to move towards disorder or disorganization. In physics, entropy is used to describe how organized heat energy is "lost" into the random background energy of the surrounding environment (the second law of thermodynamics). A similar effect can be seen in engineered systems. What happens to a building or garden left unused for any length of time? Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.

"Negentropy" describes the forces working in a system to hold off entropy. Homeostasis is the biological equivalent of this, describing behavior which maintains a "steady state" or "dynamic equilibrium". Examples in nature include human cells, which maintain the same function while replacing their physical content at regular intervals. Again, this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc.

Hitchins (2007) describes the relationship between the viability of a system and the number of connections between its elements. Hitchins's concept of connected variety states that the stability of a system increases with its connectivity (both internally and with its environment). (See variety.)

Goal Seeking Behavior

Some systems have reasons for existence beyond simple survival. Goal seeking is one of the defining characteristics of engineered systems:

  • A goal is a specific outcome which a system can achieve in a specified time.
  • An objective is a longer-term outcome which can be achieved through a series of goals.
  • An ideal is an objective which cannot be achieved with any certainty, but for which progress towards the objective has value.

Systems may be single goal seeking (perform set tasks), multi-goal seeking (perform related tasks), or reflective (set goals to tackle objectives or ideals). There are two types of goal seeking systems:

  • Purposive systems have multiple goals with some shared outcome. Such a system can be used to provide pre-determined outcomes within an agreed time period. This system may have some freedom to choose how to achieve the goal. If it has memory, it may develop processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
  • Purposeful systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue objectives or ideals over a longer time through a series of goals. Humans and sufficiently complex machines are purposeful.

Control Behavior

Cybernetics, the science of control, defines two basic control mechanisms:

  • Negative feedback, maintaining the system state against set objectives or levels.
  • Positive feedback, forcing growth or contraction to new levels.
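The two mechanisms can be sketched numerically; the thermostat setting, gain, and starting values are arbitrary illustrations, not values from the sources:

```python
# Negative feedback: a correction proportional to the error pulls the
# system state back toward a set point (a thermostat-style controller).
def negative_feedback(state, set_point, gain=0.5):
    return state + gain * (set_point - state)

temp = 15.0
for _ in range(20):
    temp = negative_feedback(temp, set_point=20.0)
# temp is now very close to the 20.0 set point

# Positive feedback: deviation from a reference is amplified, forcing
# growth or contraction toward a new level.
def positive_feedback(state, reference, gain=0.5):
    return state + gain * (state - reference)
```

Each negative-feedback step halves the remaining error (with gain 0.5), which is why the state converges on the set point; positive feedback applied to the same error would instead drive the state away from the reference.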

One of the main concerns of cybernetics is the balance between stability and speed of response. A black-box system view looks at the whole system. Control can only be achieved by carefully balancing inputs with outputs, which reduces the speed of response. A white-box system view considers the system elements and their relationships; control mechanisms can be embedded into this structure to provide more responsive control, with associated risks to stability.

Another useful control concept is that of a "meta-system", which sits over the system and is responsible for controlling its functions, either as a black-box or white-box. In this case, behavior arises from the combination of system and meta-system.

Control behavior is a trade-off between:

  • Specialization, the focus of system behavior to exploit particular features of its environment, and
  • Flexibility, the ability of a system to adapt quickly to environmental change.

While some system elements may be optimized for specialization (e.g., a temperature-sensitive switch) or for flexibility (e.g., an autonomous human controller), complex systems must strike a balance between the two for best results. This is an example of the concept of dualism, discussed in more detail in Principles of Systems Thinking.

Variety describes the number of different ways elements can be controlled, and is dependent on the different ways in which they can then be combined. The Law of Requisite Variety states that a control system must have at least as much variety as the system it is controlling (Ashby 1956).
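The counting behind the law can be made concrete with a simple sketch; the disturbance and response names are invented for the example:

```python
# Requisite variety sketch: a regulator needs at least as many
# distinguishable responses as there are disturbances to counter.
def variety(items):
    """Variety = number of distinguishable states or responses."""
    return len(set(items))

disturbances = ["gust", "drift", "load_change", "sensor_noise"]
responses = ["trim", "throttle", "brake"]

def has_requisite_variety(responses, disturbances):
    return variety(responses) >= variety(disturbances)

has_requisite_variety(responses, disturbances)  # False here: 3 < 4
```

With only three distinct responses against four distinct disturbances, at least one disturbance cannot be countered; adding a fourth response restores requisite variety.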

Function

Ackoff defines functions as outcomes which contribute to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways (this is called equifinality).

This view of function and behavior is common in systems science. In this paradigm, all system elements have behavior of some kind; however, to be capable of functioning in certain ways requires a certain richness of behaviors.

In most hard systems approaches, a set of functions is described from the problem statement and then associated with one or more alternative element structures (Flood and Carson 1993). This process may be repeated until a system component (an implementable combination of function and structure) has been defined (Martin 1997). Here, function is defined either as a task or activity that must be performed to achieve a desired outcome, or as a transformation of inputs to outputs. This transformation may be:

  • Synchronous, a regular interaction with a closely related system, or
  • Asynchronous, an irregular response to a demand from another system that often triggers a set response.

The behavior of the resulting system is then assessed as a combination of function and effectiveness. In this case, behavior is seen as an external property of the system as a whole and is often described as analogous to human or organic behavior (Hitchins 2009).

Hierarchy, Emergence and Complexity

System behavior is related to combinations of element behaviors. Most systems exhibit increasing variety; i.e., they have behavior resulting from the combination of element behaviors. The term "synergy", or weak emergence, is used to describe the idea that the whole is greater than the sum of the parts. This is generally true; however, it is also possible to get reducing variety, in which the whole is less than the sum of the parts (Hitchins 2007).

Complexity frequently takes the form of hierarchies. Hierarchic systems have some common properties independent of their specific content, and they will evolve far more quickly than non-hierarchic systems of comparable size (Simon 1996). A natural system hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together, forming structures which reduce complexity and increase robustness (Simon 1962).

Encapsulation is the enclosing of one thing within another; it may also be described as the degree to which something is enclosed. System encapsulation encloses system elements and their interactions from the external environment, and usually involves a system boundary that hides the internal from the external. For example, the internal organs of the human body can be optimized to work effectively within tightly defined conditions because they are protected from extremes of environmental change.

Socio-technical systems form what are known as control hierarchies, with systems at a higher level having some ownership of control over those at lower levels. Hitchins (2009) describes how systems form "preferred patterns" which can be used to enhance the stability of interacting systems hierarchies.

Looking across a hierarchy of systems generally reveals increasing complexity at the higher levels, relating to both the structure of the system and how it is used. The term emergence describes behaviors emerging across a complex system hierarchy.

Effectiveness, Adaptation and Learning

A system's effectiveness is a measure of its ability to perform the functions necessary to achieve goals or objectives. Ackoff (1971) defines this as the product of the number of combinations of behavior available to reach a function and the efficiency of each combination.

Hitchins (2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed), and survivability (how likely it is that the system will be able to use the function fully).
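One simple way to combine those three factors is multiplicatively, treating each as a probability-like value in [0, 1]. The multiplicative model and the scaling are assumptions of this sketch, not a formula stated by Hitchins:

```python
# Sketch: combine performance, availability, and survivability into a
# single effectiveness figure. The multiplicative model and the [0, 1]
# scaling of each factor are assumptions of this example.
def effectiveness(performance, availability, survivability):
    for factor in (performance, availability, survivability):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each factor must lie in [0, 1]")
    return performance * availability * survivability

effectiveness(0.9, 0.95, 0.8)  # ~0.684: the weakest factor caps the result
```

A multiplicative combination captures the intuition that effectiveness collapses if any one factor is near zero: a high-performing function that is rarely available contributes little.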

System elements and their environment change in a positive, neutral, or negative way in individual situations. An adaptive system is one that is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future objectives. Ackoff (1971) defines four types of adaptation, changing either the environment or the system in response to internal or external factors.

A system may also learn, improving its effectiveness over time, without any change in state or goal.

References

Works Cited

Ackoff, R.L. 1971. "Towards a System of Systems Concepts". Management Science. 17(11).

Ackoff, R. 1979. "The Future of Operational Research is Past." Journal of the Operational Research Society. 30(2): 93–104, Pergamon Press.

Ashby, W R. 1956. "Chapter 11". Introduction to Cybernetics. London, UK: Wiley.

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, Revised ed. New York, NY, USA: Braziller.

Bertalanffy, L. von. 1975. Perspectives on General System Theory. E. Taschdjian, ed. New York, NY, USA: George Braziller.

Boardman, J. and B. Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL, USA: Taylor & Francis.

Flood, R.L. and E.R. Carson. 1993. Dealing With Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY, USA: Plenum Press.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley and Sons.

Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.

Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL, USA: CRC Press.

Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co. pp. 53-69.

Simon, H.A. 1962. "The Architecture of Complexity." Proceedings of the American Philosophical Society. 106(6) (Dec. 12, 1962): 467-482.

Simon, H. 1996. The Sciences of the Artificial, 3rd ed. Cambridge, MA: MIT Press.

Primary References

Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science. 17(11).

Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.

Additional References

Edson, Robert. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking Institute (ASysT), Analytic Services Inc.

Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April 2010): 41-43.

Waring, A. 1996. "Chapter 1." Practical Systems Thinking. London, UK: International Thomson Business Press.


< Previous Article | Parent Article | Next Article >
SEBoK v. 2.1, released 31 October 2019