= Stakeholder Requirements Definition =
<br />
The System Requirements are all of the [[Requirement (glossary)|requirements]] at the “system level” that have been properly translated from the stakeholders' requirements. The System Requirements form the basis of system [[Design (glossary) |design]], verification, and stakeholder acceptance. <br />
<br />
System Requirements play major roles in the engineering activities. They serve:<br />
*as essential inputs to the system design activities.<br />
*as a reference for the system validation activities.<br />
*as a means of communication between the various technical staff who interact within the project.<br />
<br />
==Principles Governing System Requirements==<br />
===Origin in Stakeholder Requirements===<br />
The set of Stakeholder Requirements may contain vague, ambiguous, and qualitative “user-oriented” need statements that are difficult to use for design or to verify. Each of these requirements may need to be further clarified and translated into “engineering-oriented” language to enable proper design and verification activities. The System Requirements resulting from this translation are expressed in technical language useful for design: they must be unambiguous, consistent, coherent, exhaustive, and verifiable. Of course, close coordination with the stakeholders is necessary to ensure the translation is accurate.<br />
<br />
As an example, a need or an expectation such as "to easily maneuver a car in order to park it" will be transformed into a set of Stakeholder Requirements such as "increase the drivability of the car", "decrease the effort for handling", "assist the piloting", "protect the coachwork against shocks or scratches", etc. Translating, for example, the Stakeholder Requirement "increase the drivability of the car" results in a set of System Requirements specifying measurable characteristics such as the turning circle (steering lock), the wheelbase, etc.<br />
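<br />
To keep such a translation traceable, each measurable System Requirement can carry a link back to the stakeholder need from which it was derived. The sketch below (in Python) illustrates one way to record this; the identifiers, statements, and threshold values are hypothetical.<br />
<pre>
# A minimal sketch of recording the translation of a stakeholder requirement
# into measurable system requirements with upward traceability.
from dataclasses import dataclass, field

@dataclass
class SystemRequirement:
    req_id: str
    statement: str       # engineering-oriented, verifiable wording
    characteristic: str  # the measurable characteristic constrained
    trace_to: str        # id of the originating stakeholder requirement

@dataclass
class StakeholderRequirement:
    req_id: str
    statement: str       # user-oriented wording
    derived: list = field(default_factory=list)

stk = StakeholderRequirement("STK-12", "Increase the drivability of the car")
stk.derived = [
    SystemRequirement("SYS-31", "The turning circle shall not exceed 11 m.",
                      "turning circle", stk.req_id),
    SystemRequirement("SYS-32", "The wheelbase shall be 2.60 m +/- 0.05 m.",
                      "wheelbase", stk.req_id),
]
for sys_req in stk.derived:
    print(f"{sys_req.req_id} <- {sys_req.trace_to}: {sys_req.statement}")
</pre>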
<br />
===Traceability and Assignment of System Requirements during Design===<br />
Requirements traceability provides the ability to trace information from the origin of the Stakeholder Requirements at the top level down to the lowest level of the system hierarchy - see the section "Top-down and recursive approach to system decomposition" in the [[System Definition]] article. Traceability also shows the extent of a change, serving as an input to impact analyses of proposed engineering improvements or requests for change.<br />
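<br />
As a simple illustration of how trace links support impact analysis, the sketch below walks a hypothetical set of downward links and lists every requirement affected by a change to one requirement; the identifiers and links are illustrative only.<br />
<pre>
# A minimal sketch of impact analysis over downward trace links: given a
# changed requirement, list every lower-level requirement derived from it.
trace = {                      # parent requirement -> derived requirements
    "STK-12": ["SYS-31", "SYS-32"],
    "SYS-31": ["SUB-07", "SUB-08"],
    "SYS-32": ["SUB-09"],
}

def impacted(req_id, links):
    """Return all requirements reachable from req_id via trace links."""
    result, stack = set(), [req_id]
    while stack:
        for child in links.get(stack.pop(), []):
            if child not in result:
                result.add(child)
                stack.append(child)
    return result

print(sorted(impacted("STK-12", trace)))
# ['SUB-07', 'SUB-08', 'SUB-09', 'SYS-31', 'SYS-32']
</pre>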
<br />
During [[Design (glossary) |design]], the assignment of requirements from one level to lower levels in the system hierarchy can be accomplished using several methods, as appropriate - see Table 1.<br />
<br />
<center>'''Table 1. Assignment types for a System Requirement (SEBoK Original)'''</center><br />
[[File:SEBoKv05_KA-SystDef_Assignment_Type_System_Requirement.png|thumb|center|600px]]<br />
<br />
===Classification of System Requirements===<br />
Several classifications of System Requirements are possible, depending on the requirements definition methods and/or the design methods used. (ISO/IEC/IEEE 2011) provides a classification, summarized below - see the references for other useful classifications. An example is given in Table 2.<br />
<br />
<center>'''Table 2. Example of System Requirements Classification (SEBoK Original)'''</center><br />
[[File:SEBoKv05_KA-SystDef_Example_Sys_Requirements_Classification.png|thumb|700px|center]]<br />
<br />
==Process Approach – System Requirements==<br />
===Purpose and Principle of the Approach===<br />
The purpose of the system requirements analysis process is to transform the stakeholder, user-oriented view of desired services and properties into a technical view of a product that meets the operational needs of the user. This process builds a representation of the system that will meet Stakeholder Requirements and that, as far as constraints permit, does not imply any specific implementation. It results in measurable System Requirements that specify, from the supplier’s perspective, what performance and non-performance characteristics the system must possess in order to satisfy the stakeholders' requirements (ISO/IEC/IEEE 2008).<br />
<br />
===Activities of the Process===<br />
Major activities and tasks performed during this process include:<br />
#Analyze the Stakeholder Requirements to check the completeness of the expected services and [[Operational Scenario (glossary)|operational scenarios]], conditions, operational modes, and constraints.<br />
#Define the System Requirements and their [[Rationale (glossary)|rationale]].<br />
#Classify the System Requirements using suggested classifications – see examples above.<br />
#Incorporate the derived requirements (coming from design) into the System Requirements baseline.<br />
#Establish the upward traceability with the Stakeholder Requirements (a sketch of this check follows the list).<br />
#Verify the quality and completeness of each System Requirement and the consistency of the set of System Requirements.<br />
#Validate the content and relevance of each System Requirement against the set of Stakeholder Requirements.<br />
#Identify potential [[Risk (glossary)|risks]] (or threats and hazards) that could be generated by the System Requirements.<br />
#Synthesize, record, and manage the System Requirements and potential associated Risks.<br />
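<br />
As a simple illustration of the upward traceability check in activity 5, the sketch below flags System Requirements that lack a trace to any Stakeholder Requirement; the identifiers are hypothetical.<br />
<pre>
# A minimal sketch of checking upward traceability: flag any System
# Requirement that does not trace to a Stakeholder Requirement.
system_reqs = {"SYS-31": "STK-12", "SYS-32": "STK-12", "SYS-33": None}

orphans = [rid for rid, parent in system_reqs.items() if parent is None]
if orphans:
    print("Requirements lacking upward traceability:", ", ".join(orphans))
</pre>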
<br />
===Artifacts and Ontology Elements===<br />
This process may create several artifacts such as:<br />
#System Requirements Document<br />
#System Requirements Justification Document (for traceability purposes)<br />
#System Requirements Database, including traceability, analysis, rationale, decisions, and attributes, where appropriate.<br />
#System External Interface Requirements Document (this document describes the interfaces of the system with the external elements of its context of use; the interface requirements may or may not be integrated into the System Requirements Document above).<br />
<br />
This process handles the ontology elements of Table 3.<br />
<br />
<center>'''Table 3. Main ontology elements as handled within system requirements definition (SEBoK Original)'''</center><br />
[[File:SEBoKv05_KA-SystDef_ontology_elements_system_requirements.png|thumb|600px|center]]<br />
<br />
===Checking and Correctness of System Requirements===<br />
System Requirements should be checked to gauge whether they are well expressed and appropriate. There are a number of characteristics that can be used to check System Requirements. The set of System Requirements can be verified using standard peer review techniques and by comparing each requirement against the characteristics listed in Table 4 and Table 5 of the section "Presentation and Quality of Requirements".<br />
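<br />
As a simple illustration of such checks, the sketch below flags wording that commonly violates the “unambiguous” and “verifiable” characteristics; the term list is illustrative, not the normative list of any standard.<br />
<pre>
# A minimal sketch of screening requirement statements for vague wording.
VAGUE_TERMS = ["easily", "user-friendly", "adequate", "as appropriate",
               "minimize", "maximize", "etc", "tbd"]

def check_requirement(statement):
    """Return the vague terms found in a single requirement statement."""
    lowered = statement.lower()
    return [term for term in VAGUE_TERMS if term in lowered]

for req in ["The car shall be easily maneuverable in parking situations.",
            "The turning circle shall not exceed 11 m."]:
    issues = check_requirement(req)
    print("FLAG" if issues else "PASS", "-", req, issues if issues else "")
</pre>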
<br />
The requirements can be further validated using the requirements elicitation and rationale capture described in section "Methods and Modeling Techniques."<br />
<br />
===Methods and Modeling Techniques===<br />
====Requirements Elicitation and Prototyping====<br />
Requirements elicitation requires user involvement and can be effective in gaining stakeholder buy-in. Quality function deployment (QFD) and prototyping are two common techniques that can be applied and are described in this section. In addition, interviews, focus groups, and Delphi techniques are often applied to elicit requirements.<br />
<br />
QFD is a powerful technique to elicit requirements and compare design characteristics against user needs (Hauser and Clausing, 1988). The inputs to the QFD application are user needs and operational concepts, so it is essential that the users participate. Users from across the life cycle should be included so that all aspects of user needs are accounted for and prioritized.<br />
<br />
Early prototyping can help the users and developers interactively identify functional and operational requirements and user interface constraints. The prototyping allows for realistic user interaction, discovery, and feedback, as well as some sensitivity analysis. This improves the user's understanding of the requirements and increases the probability of satisfying their actual needs.<br />
<br />
====Capturing Requirements Rationale====<br />
One powerful and cost-effective technique to translate Stakeholder Requirements into System Requirements is to capture the rationale for each requirement. A requirement's rationale is simply a statement of why the requirement exists, any assumptions made, the results of related design studies, or any other related supporting information. This supports further requirements analysis and decomposition. The rationale can be captured directly in the requirements database (Hull, Jackson, and Dick 2010).<br />
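<br />
One way to hold the rationale alongside each requirement is a dedicated column in the requirements database. The sketch below is a minimal illustration using SQLite; the schema and the entry are assumptions for illustration, not a prescribed structure.<br />
<pre>
# A minimal sketch of capturing rationale directly in a requirements database.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE requirement (
    req_id    TEXT PRIMARY KEY,
    statement TEXT NOT NULL,
    rationale TEXT,    -- why the requirement exists, assumptions made
    trace_to  TEXT     -- originating stakeholder requirement
)""")
db.execute("INSERT INTO requirement VALUES (?, ?, ?, ?)",
           ("SYS-31",
            "The turning circle shall not exceed 11 m.",
            "Derived from the parking-maneuver need; 11 m bounds typical "
            "urban parking bays.",
            "STK-12"))
for row in db.execute("SELECT req_id, rationale FROM requirement"):
    print(row)
</pre>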
<br />
Some of the benefits include:<br />
*'''Reducing the total number of requirements'''. The process aids in identifying duplicates. Reducing requirements count will reduce project cost and risk.<br />
*'''Early exposure of bad assumptions'''. Recording the assumptions behind each requirement exposes flawed ones while they are still cheap to correct.<br />
*'''Removes design implementation'''. Many poorly written stakeholder requirements are design requirements in disguise, in that the customer is intentionally or unintentionally specifying a candidate implementation.<br />
*'''Improves communication with the stakeholder community'''. By capturing the requirements rationale for all Stakeholder Requirements, the line of communication between the users and the designers is greatly improved. Adapted from (Hooks and Farry 2000), Chapter 8.<br />
<br />
====Modeling Techniques====<br />
Modeling techniques that can be used when requirements must be detailed or refined, or when they address topics not considered during the Stakeholder Requirements Definition and Mission Analysis include:<br />
*State-chart models (ISO/IEC/IEEE 2011, Section 8.4.2)<br />
*Scenario modeling (ISO/IEC/IEEE 2011, Section 6.2.3.1)<br />
*Simulations and prototyping (ISO/IEC/IEEE 2011, Section 6.3.3.2)<br />
*Quality function deployment (INCOSE 2011, p. 83)<br />
*Sequence diagram, Activity diagram, Use case, State machine diagram, Requirements diagram of [[Acronyms|SysML]]<br />
*Functional Flow Block Diagram for Operational Scenario<br />
<br />
====Presentation and Quality of Requirements====<br />
Generally, requirements are provided in textual form. Guidelines exist for writing good requirements; they include recommendations about the syntax of requirements statements, wording (exclusions, representation of concepts, etc.), and characteristics (specific, measurable, achievable, feasible, testable, etc.). Refer to (INCOSE 2011), section 4.2.2.2, and (ISO/IEC/IEEE 2011).<br />
<br />
There are several characteristics of requirements and sets of requirements that ensure the quality of requirements. These are used both to aid the development of the requirements and to verify the implementation of requirements into the solution. Table 4 provides a list and descriptions of the characteristics of individual requirements, and Table 5 provides a list and descriptions of the characteristics of a set of requirements, as adapted from (ISO/IEC/IEEE 2011), sections 5.2.5 and 5.2.6.<br />
<br />
<center>'''Table 4. Characteristics of Individual Requirements (SEBoK Original)'''</center> <br />
[[File:Table._Charac_of_Ind_Req_AF_052312.png|thumb|center|650px]]<br />
<br />
<center>'''Table 5. Characteristics of a Set of Requirements (SEBoK Original)'''</center><br />
[[File:Table._Charac_of_Set_of_Req_AF_052312.png|thumb|center|650px]]<br />
<br />
====Requirements in Tables====<br />
Requirements may be provided in a table, especially when specifying a set of parameters for the system or a system element. It is good practice to make standard table templates available. For tables, the following conventions apply: <br />
*Invoke each requirements table with a statement in the requirements set that clearly points to the table.<br />
*Identify each table with a unique title and table number.<br />
*Include the word “requirements” in the table title.<br />
*Identify the purpose of the table in the text immediately preceding it and include an explanation of how to read and use the table, including context and units.<br />
*For independent-dependent variable situations, organize the table in a way that best accommodates the use of the information.<br />
*Each cell should contain, at most, a single requirement.<br />
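<br />
As an illustration of these conventions, a parameter table might look like the following; the title, parameters, and values are hypothetical.<br />
<br />
<center>'''Table A. Operating Temperature Requirements (illustrative)'''</center><br />
{| class="wikitable"<br />
|-<br />
! Parameter<br />
! Minimum<br />
! Maximum<br />
! Units<br />
|-<br />
| Operating temperature<br />
| -20<br />
| +55<br />
| °C<br />
|-<br />
| Storage temperature<br />
| -40<br />
| +70<br />
| °C<br />
|}<br />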
<br />
====Requirements in Flow Charts====<br />
Flow charts often contain requirements in a graphical form. These requirements may include logic that must be incorporated into the system, operational requirements, process or procedural requirements, or other situations that are best defined graphically by a sequence of interrelated steps. For flow charts, the following conventions apply:<br />
*Invoke each flow chart with a statement in the requirements set that clearly points to the flow chart.<br />
*Identify each flow chart with a unique title and figure number.<br />
*Include the word “requirements” in the title of the flow chart.<br />
*Clearly indicate and explain unique symbols that represent requirements in the flow chart.<br />
<br />
====Requirements in Drawings====<br />
Drawings also provide a graphical means to define requirements. The type of requirement defined in a drawing depends on the type of drawing. The following conventions apply:<br />
*Drawings are used when they can aid in the description of the following:<br />
**Spatial requirements<br />
**Interface requirements<br />
**Layout requirements<br />
*Invoke each drawing with a statement in the requirements set that clearly points to the drawing.<br />
<br />
==Practical Considerations about System Requirements==<br />
There are several '''pitfalls''' that can inhibit the generation and management of an optimal set of System Requirements. See Table 6.<br />
<br />
<center>'''Table 6. Major pitfalls with definition of System Requirements (SEBoK Original)'''</center><br />
[[File:SEBoKv05_KA-SystDef_pitfalls_System_Requirements.png|thumb|center|600px]]<br />
<br />
The following '''proven practices''' in system requirements engineering have repeatedly been shown to reduce project risk and cost, foster customer satisfaction, and produce successful system development. See Table 7.<br />
<br />
<center>'''Table 7. Proven practices with definition of System Requirements (SEBoK Original)'''</center><br />
[[File:SEBoKv05_KA-SystDef_practices_System_Requirements.png|thumb|600px|center]]<br />
<br />
==References== <br />
<br />
===Works Cited===<br />
Hauser, J. and D. Clausing. 1988. "The House of Quality." ''Harvard Business Review.'' (May - June 1988). <br />
<br />
Hooks, I.F., and K.A. Farry. 2000. ''Customer-Centered Products: Creating Successful Products through Smart Requirements Management''. New York, NY, USA: American Management Association.<br />
<br />
Hull, M.E.C., K. Jackson, and A.J.J. Dick. 2010. ''Systems Engineering''. 3rd ed. London, UK: Springer.<br />
<br />
INCOSE. 2011. ''Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities''. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.<br />
<br />
ISO/IEC/IEEE. 2011. ''Systems and Software Engineering - Requirements Engineering''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 29148. <br />
<br />
ISO/IEC/IEEE. 2008. ''Systems and Software Engineering - System Life Cycle Processes''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 15288:2008 (E).<br />
<br />
===Primary References===<br />
<br />
ISO/IEC/IEEE. 2011. ''[[ISO/IEC/IEEE 29148|Systems and Software Engineering - Requirements Engineering]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), [[ISO/IEC/IEEE 29148]].<br />
<br />
ISO/IEC/IEEE. 2008. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering - System Life Cycle Processes]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), [[ISO/IEC/IEEE 15288]]:2008 (E).<br />
<br />
INCOSE. 2011. ''[[INCOSE Systems Engineering Handbook]]: A Guide for System Life Cycle Processes and Activities''. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.<br />
<br />
Lamsweerde, A. van. 2009. ''[[Requirements Engineering]]''. New York, NY, USA: Wiley.<br />
<br />
===Additional References===<br />
<br />
Faisandier, A. 2011 (unpublished). ''Engineering and Architecting Multidisciplinary Systems''. <br />
<br />
Hooks, I.F., and K.A. Farry. 2000. ''Customer-Centered Products: Creating Successful Products through Smart Requirements Management.'' New York, NY, USA: American Management Association. <br />
<br />
Hull, M.E.C., K. Jackson, A.J.J. Dick. 2010. ''Systems Engineering''. 3rd ed. London, UK: Springer.<br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''Systems Engineering Leading Indicators Guide''. Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03.<br />
----<br />
<br />
<center>[[System Definition|< Previous Article]] | [[System Definition|Parent Article]] | [[Architectural Design: Logical|Next Article >]]</center><br />
<br />
{{5comments}}<br />
<br />
[[Category: Part 3]][[Category:Topic]]<br />
[[Category:System Definition]]<br />
{{DISQUS}}
<br />
= Measurement =
<br />
Systems engineering ([[Acronyms|SE]]) [[measurement (glossary)|measurement]] and the accompanying analysis are fundamental elements of SE and technical management. SE measurement provides information relating to the products developed, services provided, and processes implemented to support effective management of the processes and to objectively evaluate product or service quality. Measurement supports realistic planning, provides insight into actual performance, and facilitates assessment of suitable actions (Roedler and Jones 2005, 1-65; Frenz et al. 2010). <br />
<br />
Appropriate measures and indicators are essential inputs to tradeoff analyses to balance cost, schedule, and technical objectives. Periodic analysis of the relationships between measurement results and the requirements and attributes of the system provides insight that helps identify issues early, when they can be resolved with less impact. Historical data, together with project or organizational context information, forms the basis for predictive models and methods that should be used.<br />
<br />
==Fundamental Concepts==<br />
The discussion of measurement here is based on some fundamental concepts. Roedler and Jones (2005, 1-65) state three key SE measurement concepts, paraphrased here:<br />
<br />
#'''SE measurement is a consistent but flexible process''' that is tailored to the unique information needs and characteristics of a particular project or organization and revised as information needs change. <br />
#'''Decision makers must understand what is being measured.''' Key decision makers must be able to connect “what is being measured” to “what they need to know." <br />
#'''Measurement must be used to be effective'''.<br />
<br />
==Measurement Process Overview==<br />
The measurement process as presented here consists of four activities from [[ISO/IEC/IEEE 15939|Practical Software and Systems Measurement (PSM)]], described in ISO/IEC/IEEE 15939 (ISO/IEC/IEEE 2007; McGarry et al. 2002; Murdoch 2006, 67). <br />
<br />
This process has been the basis for establishing a common process across the software and systems engineering communities. This measurement approach has been adopted by the Capability Maturity Model Integration (CMMI) measurement and analysis process area (SEI 2006, 10) and by international systems and software engineering standards, such as ISO/IEC/IEEE 15939 (ISO/IEC/IEEE 2007) and ISO/IEC/IEEE 15288 (ISO/IEC/IEEE 2008). The International Council on Systems Engineering ([[Acronyms|INCOSE]]) Measurement Working Group has also adopted this measurement approach for several of its measurement assets, such as the [[Systems Engineering Measurement Primer|INCOSE SE Measurement Primer]] (Frenz et al. 2010) and the [[Technical Measurement Guide]] (Roedler and Jones 2005). This approach has provided a consistent treatment of measurement that allows the engineering community to communicate more effectively about measurement. The process is illustrated in Figure 1, from Roedler and Jones (2005) and McGarry et al. (2002). <br />
<br />
[[File:Measurement_Process_Model-Figure_1.png|thumb|600px|center|Figure 1. Four Key Measurement Process Activities (PSM 2011). Reprinted with permission of Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]] <br />
<br />
<br />
===Establish and Sustain Commitment===<br />
This activity focuses on establishing the resources, training, and tools to implement a measurement process and ensure that there is a management commitment to use the information that is produced. Refer to PSM (August 18, 2011) and SPC (2011) for additional detail.<br />
<br />
===Plan Systems Engineering Measurement===<br />
This activity focuses on defining measures that provide insight into project or organization [[Information Need (glossary)|information needs]]. This includes identifying what the decision makers need to know and when they need to know it, relating these information needs to those entities that can be measured, and then identifying, prioritizing, selecting, and specifying [[Measure (glossary)|measures]] based on project and organization processes (Jones 2003, 15-19). This activity also identifies the reporting format, forums, and target audience for the information provided by the measures.<br />
<br />
Here are a few widely used approaches to identify the information needs and derive associated measures. Each can be focused on identifying measures that are needed for SE management. These include:<br />
<br />
*The [[Acronyms|PSM]] approach, which uses a set of [[Information Category (glossary)|information categories]], [[Measurable Concept (glossary)|measurable concepts]], and candidate measures to aid the user in determining relevant information needs and aspects about the information needs on which to focus (PSM August 18, 2011). <br />
<br />
*The [[Acronyms|goal-question-metric (GQM)]] approach, which identifies explicit measurement goals. Each goal is decomposed into several questions that help in the selection of measures that address the question and provide insight into goal achievement (Park, Goethert, and Florac 1996) - see the sketch following this list. <br />
<br />
*Software Productivity Center’s 8-step Metrics Program, which also includes stating the goals and defining measures needed to gain insight for achieving the goals (SPC 2011). <br />
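<br />
As a minimal illustration of the GQM decomposition, the sketch below links one hypothetical goal to questions and candidate measures; the content is illustrative, not drawn from the cited guides.<br />
<pre>
# A minimal sketch of a goal-question-metric (GQM) decomposition.
gqm = {
    "goal": "Deliver a system that meets stakeholder requirements on schedule",
    "questions": {
        "Are the requirements stabilizing?":
            ["requirements volatility (changes per month)"],
        "Is verification progressing as planned?":
            ["requirements verified vs. planned (%)"],
    },
}

print("Goal:", gqm["goal"])
for question, measures in gqm["questions"].items():
    print(" ", question)
    for measure in measures:
        print("   ->", measure)
</pre>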
<br />
The following are good sources for candidate measures that address the information needs and measurable concepts/questions:<br />
*PSM Web Site (PSM August 18, 2011)<br />
*PSM Guide, Version 4.0, Chapters 3 and 5 (PSM 2000)<br />
*SE Leading Indicators Guide, Version 2.0, Section 3 (Roedler et al. 2010)<br />
*Technical Measurement Guide, Version 1.0, Section 10 (Roedler and Jones 2005, 1-65)<br />
*Safety Measurement (PSM White Paper), Version 3.0, Section 3.4 (Murdoch 2006, 60)<br />
*Security Measurement (PSM White Paper), Version 3.0, Section 7 (Murdoch 2006, 67)<br />
*Measuring Systems Interoperability, Section 5 and Appendix C (Kasunic and Anderson 2004)<br />
*Measurement for Process Improvement (PSM Technical Report), Version 1.0, Appendix E (Statz 2005)<br />
<br />
The INCOSE SE Measurement Primer (Frenz et al. 2010) provides a list of attributes of a good measure with definitions for each [[Attribute (glossary)|attribute]]. The attributes include ''relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy.'' Evaluating candidate measures against these attributes can help assure the selection of more effective measures. <br />
<br />
The details of each [[measure (glossary)|measure]] need to be unambiguously defined and documented. Templates for the specification of measures and indicators are available on the PSM website and in (Goethert and Siviy 2004).<br />
<br />
===Perform Systems Engineering Measurement===<br />
This activity focuses on the collection and preparation of measurement data, measurement analysis, and the presentation of the results to inform decision makers. The preparation of the measurement data includes verification, normalization, and aggregation of the data, as applicable. Analysis includes estimation, feasibility analysis of plans, and performance analysis of actual data against plans. <br />
<br />
The quality of the measurement results depends on the collection and preparation of valid, accurate, and unbiased data. Data verification, validation, preparation, and analysis techniques are discussed in PSM (August 18, 2011), Chapters 1 and 4, and in SEI (2006, 10). Per TL 9000, Quality Management System Guidance, “The analysis step should integrate quantitative measurement results and other qualitative project information, in order to provide managers the feedback needed for effective decision making” (Quest 2010, 5-10). This provides richer information that gives users the broader picture and puts the information in the appropriate context. <br />
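<br />
As a simple illustration of performance analysis of actual data against plans, the sketch below computes variances and flags those beyond a threshold; the measures, values, and the 10% threshold are illustrative.<br />
<pre>
# A minimal sketch of the analysis step: compare actuals against the plan
# and flag variances beyond a threshold.
planned = {"requirements approved": 120, "interfaces defined": 30}
actual  = {"requirements approved": 96,  "interfaces defined": 29}
THRESHOLD = 0.10   # flag variances beyond 10%

for item, plan in planned.items():
    variance = (actual[item] - plan) / plan
    status = "INVESTIGATE" if abs(variance) > THRESHOLD else "on track"
    print(f"{item}: plan={plan}, actual={actual[item]}, "
          f"variance={variance:+.0%} -> {status}")
</pre>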
<br />
There is a significant body of guidance available on good ways to present quantitative information. Edward Tufte has several books focused on the visualization of information, including ''The Visual Display of Quantitative Information'' (Tufte 2001). <br />
<br />
More information about understanding and using measurement results can be found in:<br />
*PSM (August 18, 2011)<br />
*ISO/IEC/IEEE 15939, clauses 4.3.3 and 4.3.4<br />
*Roedler and Jones (2005), sections 6.4, 7.2, and 7.3<br />
<br />
===Evaluate Systems Engineering Measurement===<br />
This activity covers the periodic evaluation and improvement of the measurement process and of the specific measures. One objective is to ensure that the measures continue to align with the business goals and information needs and provide useful insight. This activity should also evaluate the SE measurement activities, resources, and infrastructure to make sure they support the needs of the project and organization. Refer to PSM (August 18, 2011) and <br />
''Practical Software Measurement: Objective Information for Decision Makers'' (McGarry et al. 2002) for additional detail.<br />
<br />
==Systems Engineering Leading Indicators==<br />
Leading indicators are aimed at providing predictive insight regarding an information need. A systems engineering leading indicator “is a measure for evaluating the effectiveness of how a specific activity is applied on a project in a manner that provides information about impacts that are likely to affect the system performance objectives” (Roedler et al. 2010). Leading indicators may be individual measures or collections of measures and associated analysis that provide future systems engineering performance insight throughout the life cycle of the system. “Leading indicators support the effective management of systems engineering by providing visibility into expected project performance and potential future states” (Roedler et al. 2010).<br />
<br />
As shown in Figure 2, a leading indicator is composed of characteristics, a condition, and a predicted behavior. The characteristics and condition are analyzed on a periodic or as-needed basis to predict behavior within a given confidence and within an accepted time range into the future. More information is found in (Roedler et al. 2010).<br />
<br />
[[File:Composition_of_Leading_Indicator-Figure_2.png|thumb|600px|center|Figure 2. Composition of a Leading Indicator (Roedler et al. 2010). Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]]<br />
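<br />
As a minimal illustration of the predictive intent of a leading indicator, the sketch below fits a linear trend to hypothetical requirements-volatility observations and extrapolates it to a future review point; a real indicator would also report confidence bounds.<br />
<pre>
# A minimal sketch of a leading indicator: fit a trend to recent observations
# and extrapolate to a future point. All data is illustrative.
months = [1, 2, 3, 4, 5]
volatility = [14, 12, 11, 9, 8]   # requirement changes per month

n = len(months)
mean_x = sum(months) / n
mean_y = sum(volatility) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, volatility))
den = sum((x - mean_x) ** 2 for x in months)
slope = num / den
intercept = mean_y - slope * mean_x

month_at_review = 8
predicted = slope * month_at_review + intercept
print(f"Projected volatility at month {month_at_review}: "
      f"{predicted:.1f} changes/month")
</pre>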
<br />
==Technical Measurement==<br />
Technical measurement is the set of measurement activities used to provide information about progress in the definition and development of the technical solution, ongoing assessment of the associated risks and issues, and the likelihood of meeting the critical objectives of the [[Acquirer (glossary)|acquirer]]. This insight helps an engineer make better decisions throughout the life cycle of a system and increases the probability of delivering a technical solution that meets both the specified requirements and the mission needs. The insight is also used in trade-off decisions when performance is not within the thresholds or goals.<br />
<br />
Technical measurement includes [[Measure of Effectiveness (MoE) (glossary)|measures of effectiveness]] ([[Acronyms|MOE]]s), [[Measure of Performance (MoP) (glossary)|measures of performance]] ([[Acronyms|MOP]]s), and [[Technical Performance Measure (TPM) (glossary)|technical performance measures]] ([[Acronyms|TPM]]s) (Roedler and Jones 2005, 1-65). The relationships between these types of technical measures are shown in Figure 3 and explained in the reference. Using the measurement process described above, technical measurement can be planned early in the life cycle and then performed throughout the life cycle with increasing levels of fidelity as the technical solution is developed, facilitating predictive insight and preventive or corrective actions. More information about technical measurement can be found in the ''[[NASA Systems Engineering Handbook]]'' (NASA 2007, Section 6.7.2.2), ''System Analysis, Design, Development: Concepts, Principles, and Practices'' (Wasson 2006, Chapter 34), and the ''[[Systems Engineering Leading Indicators Guide]]'' (Roedler et al. 2010).<br />
<br />
[[File:Technical_Measures_Relationship-Figure_3.png|thumb|600px|center|Figure 3. Relationship of the Technical Measures (Roedler et al. 2010). Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]]<br />
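<br />
As a simple illustration of tracking a TPM against its threshold and goal, the sketch below computes the remaining margin at successive assessments; the measure (vehicle weight) and all values are hypothetical.<br />
<pre>
# A minimal sketch of tracking a technical performance measure (TPM) against
# its threshold and goal at successive assessments.
THRESHOLD = 1500.0   # maximum acceptable weight (kg)
GOAL      = 1350.0   # design goal (kg)

assessments = [("SRR", 1480.0), ("PDR", 1510.0), ("CDR", 1495.0)]

for milestone, measured in assessments:
    margin = THRESHOLD - measured
    status = "within threshold" if margin >= 0 else "THRESHOLD BREACHED"
    print(f"{milestone}: weight={measured:.0f} kg, margin={margin:+.0f} kg,"
          f" to-goal={measured - GOAL:+.0f} kg ({status})")
</pre>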
<br />
==Service Measurement==<br />
The same measurement activities can be applied to service measurement; however, the context and measures will be different. Service providers have a need to balance efficiency and effectiveness, which may be opposing objectives. Good service measures are outcome-based; focus on elements important to the customer, such as service availability, reliability, and performance; and provide timely, forward-looking information. <br />
<br />
For services, the terms [[Acronyms|critical success factors (CSF)]] and [[Acronyms|key performance indicators (KPI)]] are used often when discussing measurement. [[Acronyms|CSF]]s are the key elements of the service or service infrastructure that are most important to achieve the business objectives. Key performance indicators are specific values or characteristics measured to assess achievement of those objectives.<br />
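<br />
As a simple illustration, the sketch below computes a common service KPI, availability, against a hypothetical CSF target of 99.9%; the downtime figures are illustrative.<br />
<pre>
# A minimal sketch of a service KPI: monthly availability computed from
# downtime and compared with a target.
MINUTES_PER_MONTH = 30 * 24 * 60
TARGET = 0.999                      # CSF: availability >= 99.9%
downtime_minutes = [50, 0, 15]      # illustrative monthly downtime

for month, downtime in enumerate(downtime_minutes, start=1):
    availability = 1 - downtime / MINUTES_PER_MONTH
    verdict = "meets target" if availability >= TARGET else "MISSES target"
    print(f"Month {month}: availability = {availability:.4%} ({verdict})")
</pre>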
More information about service measurement can be found in the Service Design and Continual Service Improvement volumes of BMP (2010). Service SE is covered in the [[Service Systems Engineering]] article.<br />
<br />
==Linkages to Other Systems Engineering Management Topics==<br />
SE measurement has linkages to other [[Acronyms|SEM]] topics. The following are a few key linkages adapted from Roedler and Jones (2005):<br />
*[[Planning]] – SE measurement provides the historical data and supports the estimation for, and feasibility analysis of, the plans for realistic planning. <br />
*[[Assessment and Control]] – SE measurement provides the objective information needed to perform the assessment and determination of appropriate control actions. The use of leading indicators allows for early assessment and control actions that identify risks and/or provide insight to allow early treatment of risks to minimize potential impacts.<br />
*[[Risk Management]] – SE risk management identifies the information needs that can impact project and organizational performance. SE measurement data helps to quantify risks and subsequently provides information about whether risks have been successfully managed.<br />
*[[Decision Management]] – SE Measurement results inform decision making by providing objective insight.<br />
<br />
==Practical Considerations==<br />
Key pitfalls and good practices related to systems engineering measurement are described in the next two sections.<br />
<br />
===Pitfalls===<br />
Some of the key pitfalls encountered in planning and performing SE Measurement are provided in Table 1 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Pitfall Name<br />
! Pitfall Description<br />
|-<br />
| Golden Measures<br />
|<br />
*Looking for the one measure or small set of measures that applies to all projects. <br />
*No one-size-fits-all measure or measurement set exists. <br />
*Each project has unique information needs (e.g., objectives, risks, and issues). <br />
*The one exception is that, in some cases with consistent product lines, processes, and information needs, a small core set of measures may be defined for use across an organization.<br />
|-<br />
|Single-pass Perspective<br />
|<br />
*Viewing measurement as a single-pass activity.<br />
*To be effective, measurement needs to be performed continuously, including the periodic identification and prioritization of information needs and associated measures. <br />
|-<br />
|Unknown Information Need<br />
|<br />
*Performing measurement activities without the understanding of why the measures are needed and what information they provide. <br />
*This can lead to wasted effort. <br />
|-<br />
|Inappropriate Usage<br />
|<br />
*Using measurement inappropriately, to measure performance of individuals or make interpretations without context information. <br />
*This can lead to bias in the results or incorrect interpretations. <br />
|}<br />
<br />
===Good Practices===<br />
Some good practices, gathered from the references are provided in Table 2 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Good Practice Name<br />
! Good Practice Description<br />
|-<br />
| Periodic Review<br />
|<br />
*Regularly review each measure collected. <br />
|-<br />
|Action Driven<br />
|<br />
*Measurement by itself does not control or improve process performance. <br />
*Measurement results should be provided to decision makers for appropriate action. <br />
|-<br />
|Integration into Project Processes<br />
|<br />
*SE Measurement should be integrated into the project as part of the ongoing project business rhythm.<br />
*Data should be collected as processes are performed, not recreated as an afterthought. <br />
|-<br />
|Timely Information<br />
|<br />
*Information should be obtained early enough to allow necessary action to control or treat risks, adjust tactics and strategies, etc. <br />
*When such actions are not successful, measurement results need to help decision makers determine contingency actions or correct problems. <br />
|-<br />
|Relevance to Decision Makers<br />
|<br />
*Successful measurement requires the communication of meaningful information to the decision makers. <br />
*Results should be presented in the decision maker’s preferred format. <br />
*Allows accurate and expeditious interpretation of the results. <br />
|-<br />
|Data Availability<br />
|<br />
*Decisions can rarely wait for a complete or perfect set of data, so measurement information often needs to be derived from analysis of the best available data, complemented by real-time events and qualitative insight (including experience). <br />
|-<br />
|Historical Data<br />
|<br />
*Use historical data as the basis of plans, measure what is planned versus what is achieved, archive actual achieved results, and use archived data as a historical basis for the next planning effort. <br />
|-<br />
|Information Model<br />
|<br />
*The information model defined in ISO/IEC/IEEE (2007) provides a means to link the entities that are measured to the associated measures and to the identified information need, as well as how the measures are converted into indicators that provide insight to decision makers. <br />
|}<br />
<br />
Additional information can be found in the ''[[Systems Engineering Measurement Primer]]'', Section 4.2 (Frenz et al. 2010), and in INCOSE (2010), Section 5.7.1.5.<br />
<br />
==References== <br />
<br />
===Works Cited===<br />
Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''Systems Engineering Measurement Primer: A Basic Introduction to Measurement Concepts and Use for Systems Engineering''. Version 2.0. San Diego, CA: International Council on System Engineering (INCOSE). INCOSE‐TP‐2010‐005‐02. Available at: http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf. <br />
<br />
ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), [[ISO/IEC/IEEE 15939]]:2007. <br />
<br />
Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley. <br />
<br />
Murdoch, J. et al. 2006. ''Safety Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.<br />
<br />
Murdoch, J. et al. 2006. ''Security Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.<br />
<br />
NASA. 2007. [[NASA Systems Engineering Handbook]]. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105. <br />
<br />
Park, R., W. Goethert, and W. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002. <br />
<br />
PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.<br />
<br />
PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]]''. Version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com/PSMGuide.asp. <br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]]''. Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03. <br />
<br />
Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]]''. Version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.<br />
<br />
Software Productivity Center, Inc. 2011. "Software Productivity Center" web site. Accessed August 20, 2011. Available at: http://www.spc.ca/.<br />
<br />
Statz, J. et al. 2005. ''Measurement for Process Improvement''. Version 1.0. York, UK: Practical Software and Systems Measurement (PSM).<br />
<br />
Tufte, E. 2001. ''The Visual Display of Quantitative Information.'' Cheshire, CT, USA: Graphics Press.<br />
<br />
Wasson, C. 2006. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.<br />
<br />
===Primary References===<br />
<br />
Frenz, P., G. Roedler, D.J. Gantzer, and P. Baxter. 2010. ''[[Systems Engineering Measurement Primer]]: A Basic Introduction to Measurement Concepts and Use for Systems Engineering.'' Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE‐TP‐2010‐005‐02. Available at: http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf. <br />
<br />
ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007. <br />
<br />
PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]].'' Version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com.<br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]].'' Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03. <br />
<br />
Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]]''. Version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.<br />
<br />
===Additional References===<br />
<br />
Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley. <br />
<br />
Murdoch, J. et al. 2006. ''Safety Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.<br />
<br />
Murdoch, J. et al. 2006. ''Security Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.<br />
<br />
NASA. 2007. ''NASA Systems Engineering Handbook''. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105. <br />
<br />
Park, R., W. Goethert, and W. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002. <br />
<br />
PSM. 2011. ''Practical Software and Systems Measurement (PSM) web site''. Accessed August 18, 2011. Available at: http://www.psmsc.com/.<br />
<br />
SEI. 2006. "Measurement and Analysis Process Area" in ''Capability Maturity Model Integration (CMMI) for Development'', version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
Software Productivity Center, Inc. 2011. "Software Productivity Center" main web page. Accessed on August 20, 2011. Available at: http://www.spc.ca/.<br />
<br />
Statz, J. et al. 2005. ''Measurement for Process Improvement''. Version 1.0. York, UK: Practical Software and Systems Measurement (PSM).<br />
<br />
Tufte, E. 2001. ''The Visual Display of Quantitative Information.'' Cheshire, CT, USA: Graphics Press.<br />
<br />
Wasson, C. 2006. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.<br />
<br />
----<br />
<center>[[Risk Management|< Previous Article]] | [[Systems Engineering Management|Parent Article]] | [[Decision Management|Next Article >]]</center><br />
<br />
{{5comments}}<br />
<br />
[[Category: Part 3]][[Category:Topic]]<br />
[[Category:Systems Engineering Management]]<br />
{{DISQUS}}</div>Cjones
<hr />
<div>Systems engineering [[Acronyms|SE]] [[measurement (glossary)|measurement]] and the accompanying analysis are fundamental elements of SE and technical management. SE measurement provides information relating to the products developed, services provided, and processes implemented to support effective management of the processes and to objectively evaluate product or service quality. Measurement supports realistic planning, provides insight into actual performance, and facilitates assessment of suitable actions (Roedler and Jones 2005, 1-65; Frenz et al. 2010). <br />
<br />
Appropriate measures and indicators are essential inputs to tradeoff analyses to balance cost, schedule, and technical objectives. Periodic analysis of the relationships between measurement results and the requirements and attributes of the system provides insight that helps identify issues early, when they can be resolved with less impact. Historical data, together with project or organizational context information, forms the basis for predictive models and methods that should be used.<br />
<br />
==Fundamental Concepts==<br />
The discussion of measurement here is based on some fundamental concepts. Roedler, et al. states three key SE measurement concepts that are paraphrased here (Roedler and Jones 2005, 1-65):<br />
<br />
#'''SE measurement is a consistent but flexible process''' that is tailored to the unique information needs and characteristics of a particular project or organization and revised as information needs change. <br />
#'''Decision makers must understand what is being measured.''' Key decision makers must be able to connect “what is being measured” to “what they need to know." <br />
#'''Measurement must be used to be effective'''.<br />
<br />
==Measurement Process Overview==<br />
The measurement process as presented here consists of four activities from [[ISO/IEC/IEEE 15939|Practical Software and Systems Measurement (PSM)]] and described in (ISO/IEC/IEEE 15939)(McGarry et al. 2002; Murdoch 2006, 67). <br />
<br />
This process has been the basis for establishing a common process across the software and systems engineering communities. This measurement approach has been adopted by the Capability Maturity Model Integration (CMMI) measurement and analysis process area (SEI 2006, 10), and by international systems and software engineering standards, such as (ISO/IEC/IEEE 2008; ISO/IEC/IEEE 15939; ISO/IEC/IEEE 15288, 1). The International Council on Systems Engineering ([[Acronyms|INCOSE]]) Measurement Working Group has also adopted this measurement approach for several of their measurement assets, such as the [[Systems Engineering Measurement Primer|INCOSE SE Measurement Primer]] (Frenz et al. 2010) and [[Technical Measurement Guide]] (Roedler and Jones 2005). This approach has provided a consistent treatment of measurement that allows the engineering community to communicate more effectively about measurement. The process is illustrated in Figure 1 from Roedler and Jones (2005) and McGarry et al. (2002). <br />
<br />
[[File:Measurement_Process_Model-Figure_1.png|thumb|600px|center|Figure 1. Four Key Measurement Process Activities (PSM 2011)Reprinted with permission of Practical Software and Systems Measurement ([http://www.psmsc.com PSM])]] <br />
<br />
<br />
===Establish and Sustain Commitment===<br />
This activity focuses on establishing the resources, training, and tools to implement a measurement process and ensure that there is a management commitment to use the information that is produced. Refer to PSM (August 18, 2011) and SPC (2011) for additional detail.<br />
<br />
===Plan Systems Engineering Measurement===<br />
This activity focuses on defining measures that provide insight into project or organization [[Information Need (glossary)|information needs]]. This includes identifying what the decision makers need to know and when they need to know it, relating these information needs to those entities that can be measured, and then identifying, prioritizing, selecting, and specifying [[Measure (glossary)|measures]] based on project and organization processes (Jones 2003, 15-19). This activity also identifies the reporting format, forums, and target audience for the information provided by the measures.<br />
<br />
Here are a few widely used approaches to identify the information needs and derive associated measures. Each can be focused on identifying measures that are needed for SE management. These include:<br />
<br />
*The [[Acronyms|PSM]] approach, which uses a set of [[Information Category (glossary)|information categories]], [[Measurable Concept (glossary)|measurable concepts]], and candidate measures to aid the user in determining relevant information needs and aspects about the information needs on which to focus (PSM August 18, 2011). <br />
<br />
*The [[Acronyms|goal-question-metric (GQM)]] approach, which identifies explicit measurement goals. Each goal is decomposed into several questions that help in the selection of measures that address the question and provide insight into the goal achievement (Park, Goethert, and Florac 1996). <br />
<br />
*Software Productivity Center’s 8-step Metrics Program, which also includes stating the goals and defining measures needed to gain insight for achieving the goals (SPC 2011). <br />
<br />
The following are good sources for candidate measures that address the information needs and measurable concepts/questions:<br />
*PSM Web Site (PSM August 18, 2011)<br />
*PSM Guide, Version 4.0, Chapters 3 and 5 (PSM 2000)<br />
*SE Leading Indicators Guide, Version 2.0, Section 3 (Roedler et al. 2010)<br />
*Technical Measurement Guide, Version 1.0, Section 10 (Roedler and Jones 2005, 1-65)<br />
*Safety Measurement (PSM White Paper), Version 3.0, Section 3.4 (Murdoch 2006, 60)<br />
*Security Measurement (PSM White Paper), Version 3.0, Section 7 (Murdoch 2006, 67)<br />
*Measuring Systems Interoperability, Section 5 and Appendix C (Kasunic and Anderson 2004)<br />
*Measurement for Process Improvement (PSM Technical Report), Version 1.0, Appendix E (Statz 2005)<br />
<br />
The INCOSE SE Measurement Primer (Frenz et al. 2010) provides a list of attributes of a good measure with definitions for each [[Attribute (glossary)|attribute]]. The attributes include ''relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy.'' Evaluating candidate measures against these attributes can help assure the selection of more effective measures. <br />
<br />
The details of the [[measure (glossary)|measure]] need to be unambiguously defined and documented. Templates for the specification of measures and indicators are available on the PSM website and in (Goethert and Siviy 2004).<br />
<br />
===Perform Systems Engineering Measurement===<br />
This activity focuses on the collection and preparation of measurement data, measurement analysis, and the presentation of the results to inform decision making. The preparation of the measurement data includes verification, normalization, and aggregation of the data, as applicable. Analysis includes estimation, feasibility analysis of plans, and performance analysis of actual data against plans. <br />
<br />
The quality of the measurement results is dependent on the collection and preparation of valid, accurate, and unbiased data. Data verification, validation, preparation, and analysis techniques are discussed in PSM (August 18, 2011)Chapters 1 and 4 and SEI (2006, 10). Per TL 9000, Quality Management System Guidance, “The analysis step should integrate quantitative measurement results and other qualitative project information, in order to provide managers the feedback needed for effective decision making” (Quest 2010, 5-10). This provides richer information that gives the users the broader picture and puts the information in the appropriate context. <br />
<br />
There is a significant body of guidance available on good ways to present quantitative information. Edward Tufte has several books focused on the visualization of information, including ''The Visual Display of Quantitative Information'' (Tufte 2001). <br />
<br />
More information about understanding and using measurement results can be found in:<br />
*PSM (August 18, 2011)<br />
*ISO/IEC/IEEE 15939, clauses 4.3.3 and 4.3.4<br />
*Roedler and Jones (2005), sections 6.4, 7.2, and 7.3<br />
<br />
===Evaluate Systems Engineering Measurement===<br />
This activity includes the knowledge explaining the periodic evaluation and improvement of the measurement process and specific measures. One objective is to ensure that the measures continue to align with the business goals and information needs, and provide useful insight. Refer to PSM (August 18, 2011) and <br />
''Practical Software Measurement: Objective Information for Decision Makers'' (McGarry et al. 2002) for additional detail.<br />
<br />
==Systems Engineering Leading Indicators==<br />
Leading indicators are aimed at providing predictive insight regarding an information need. A systems engineering leading indicator “is a measure for evaluating the effectiveness of a how a specific activity is applied on a project in a manner that provides information about impacts that are likely to affect the system performance objectives.” Leading indicators may be individual measures or collections of measures and associated analysis that provide future systems engineering performance insight throughout the life cycle of the system. “Leading indicators support the effective management of systems engineering by providing visibility into expected project performance and potential future states.” <br />
<br />
As shown in Figure 2, a leading indicator is composed of characteristics, a condition, and a predicted behavior. The characteristics and condition are analyzed on a periodic or as-needed basis to predict behavior within a given confidence and within an accepted time range into the future. More information is found in (Roedler et al. 2010).<br />
<br />
[[File:Composition_of_Leading_Indicator-Figure_2.png|thumb|600px|center|Figure 2. Composition of a Leading Indicator (Roedler et al. 2010) Reprinted with permission of the International Council on Systems Engineering ([http://www.psmsc.com INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM])]]<br />
<br />
==Technical Measurement==<br />
Technical measurement is the set of measurement activities used to provide information about progress in the definition and development of the technical solution, ongoing assessment of the associated risks and issues, and the likelihood of meeting the critical objectives of the [[Acquirer (glossary)|acquirer]]. This insight helps an engineer make better throughout the life cycle of a system to increase the probability of delivering a technical solution that meets both the specified requirements and the mission needs. The insight is also used in trade-off decisions when performance is not within the thresholds or goals.<br />
<br />
Technical measurement includes [[Measure of Effectiveness (MoE) (glossary)|measures of effectiveness]] ([[Acronyms|MOE]]s), [[Measure of Performance (MoP) (glossary)|measures of performance]] ([[Acronyms|MOP]]s), and [[Technical Performance Measure (TPM) (glossary)|technical performance measures]] ([[Acronyms|TPM]]s). (Roedler and Jones 2005, 1-65) The relationships between these types of technical measures are shown in Figure 3 and explained in the reference. Using the measurement process described above, technical measurement can be planned early in the life cycle and then performed throughout the life cycle with increasing levels of fidelity as the technical solution is developed, facilitating predictive insight and preventive or corrective actions. More information about technical measurement can be found in the ''[[NASA Systems Engineering Handbook]]'', ''System Analysis, Design, Development: Concepts, principles, and practices'', and the ''[[Systems Engineering Leading Indicators Guide]]'' (NASA December 2007, 1-360, Section 6.7.2.2; Wasson 2006, Chapter 34; Roedler and Jones 2005).<br />
<br />
[[File:Technical_Measures_Relationship-Figure_3.png|thumb|600px|center|Figure 3. Relationship of the Technical Measures (Roedler and Jones 2010) Reprinted with permission of the International Council on Systems Engineering ([http://www.psmsc.com INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM])]]<br />
<br />
==Service Measurement==<br />
The same measurement activities can be applied for service measurement; however, the context and measures will be different. Service providers have a need to balance efficiency and effectiveness, which may be opposing objectives. Good service measures are outcome-based; focus on elements important to the customer; such as service availability, reliability, and performance; and provide timely, forward-looking information. <br />
<br />
For services, the terms [[Acronyms|critical success factors (CSF)]] and [[Acronyms|key performance indicators (KPI)]] are used often when discussing measurement. [[Acronyms|CSF]]s are the key elements of the service or service infrastructure that are most important to achieve the business objectives. Key performance indicators are specific values or characteristics measured to assess achievement of those objectives.<br />
More information about service measurement can be found in the Service Design and Continual Service Improvement volumes of BMP (2010, 1). Service SE can be found in the [[Service Systems Engineering]] article.<br />
<br />
==Linkages to Other Systems Engineering Management Topics==<br />
SE measurement has linkages to other [[Acronyms|SEM]] topics. The following are a few key linkages adapted from Roedler and Jones (2005):<br />
*[[Planning]] – SE measurement provides the historical data and supports the estimation for, and feasibility analysis of, the plans for realistic planning. <br />
*[[Assessment and Control]] – SE measurement provides the objective information needed to perform the assessment and determination of appropriate control actions. The use of leading indicators allows for early assessment and control actions that identify risks and/or provide insight to allow early treatment of risks to minimize potential impacts.<br />
*[[Risk Management]] – SE risk management identifies the information needs that can impact project and organizational performance. SE measurement data helps to quantify risks and subsequently provides information about whether risks have been successfully managed.<br />
*[[Decision Management]] – SE Measurement results inform decision making by providing objective insight.<br />
<br />
==Practical Considerations==<br />
Key pitfalls and good practices related to systems engineering measurement are described in the next two sections.<br />
<br />
===Pitfalls===<br />
Some of the key pitfalls encountered in planning and performing SE Measurement are provided in Table 1 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Pitfall Name<br />
! Pitfall Description<br />
|-<br />
| Golden Measures<br />
|<br />
*Looking for the one measure or small set of measures that applies to all projects. <br />
*No one-size-fits-all measure or measurement set exists. <br />
*Each project has unique information needs (e.g., objectives, risks, and issues). <br />
*The one exception is that, in some cases with consistent product lines, processes, and information needs, a small core set of measures may be defined for use across an organization.<br />
|-<br />
|Single-pass Perspective<br />
|<br />
*Viewing measurement as a single-pass activity.<br />
*To be effective, measurement needs to be performed continuously, including the periodic identification and prioritization of information needs and associated measures. <br />
|-<br />
|Unknown Information Need<br />
|<br />
*Performing measurement activities without the understanding of why the measures are needed and what information they provide. <br />
*This can lead to wasted effort. <br />
|-<br />
|Inappropriate Usage<br />
|<br />
*Using measurement inappropriately, to measure performance of individuals or make interpretations without context information. <br />
*This can lead to bias in the results or incorrect interpretations. <br />
|}<br />
<br />
===Good Practices===<br />
Some good practices, gathered from the references are provided in Table 2 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Good Practice Name<br />
! Good Practice Description<br />
|-<br />
| Periodic Review<br />
|<br />
*Regularly review each measure collected. <br />
|-<br />
|Action Driven<br />
|<br />
*Measurement by itself does not control or improve process performance. <br />
*Measurement results should be provided to decision makers for appropriate action. <br />
|-<br />
|Integration into Project Processes<br />
|<br />
*SE Measurement should be integrated into the project as part of the ongoing project business rhythm.<br />
*Data should be collected as processes are performed, not recreated as an afterthought. <br />
|-<br />
|Timely Information<br />
|<br />
*Information should be obtained early enough to allow necessary action to control or treat risks, adjust tactics and strategies, etc. <br />
*When such actions are not successful, measurement results need to help decision makers determine contingency actions or correct problems. <br />
|-<br />
|Relevance to Decision Makers<br />
|<br />
*Successful measurement requires the communication of meaningful information to the decision makers. <br />
*Results should be presented in the decision maker’s preferred format. <br />
*Allows accurate and expeditious interpretation of the results. <br />
|-<br />
|Data Availability<br />
|<br />
*Decisions can rarely wait for a complete or perfect set of data, so measurement information often needs to be derived from analysis of the best available data, complemented by real-time events and qualitative insight (including experience). <br />
|-<br />
|Historical Data<br />
|<br />
*Use historical data as the basis of plans, measure what is planned versus what is achieved, archive actual achieved results, and use archived data as a historical basis for the next planning effort. <br />
|-<br />
|Information Model<br />
|<br />
*The information model defined in ISO/IEC/IEEE (2007) provides a means to link the entities that are measured to the associated measures and to the identified information need, as well as how the measures are converted into indicators that provide insight to decision makers. <br />
|}<br />
<br />
Additional information can be found in the ''[[Systems Engineering Measurement Primer]]'', Section 4.2 (Frenz et al. 2010), and INCOSE, Section 5.7.1.5 (2010).<br />
<br />
==References== <br />
<br />
<br />
===Works Cited===<br />
Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''Systems Engineering Measurement Primer: A Basic Introduction to Measurement Concepts and Use for Systems Engineering''. Version 2.0. San Diego, CA: International Council on System Engineering (INCOSE). INCOSE‐TP‐2010‐005‐02. Available at: http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf. <br />
<br />
ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and software engineering - Measurement process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007. <br />
<br />
Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley <br />
<br />
Murdoch, J. et al. 2006. ''Safety Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.<br />
<br />
Murdoch, J. et al. 2006. ''Security Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.<br />
<br />
NASA. 2007. [[NASA Systems Engineering Handbook]]. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105. <br />
<br />
Park, Goethert, and Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-BH-002. <br />
<br />
PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.<br />
<br />
PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]]''. Version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com/PSMGuide.asp. <br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]]''. Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03. <br />
<br />
Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]]''. Version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.<br />
<br />
Software Productivity Center, Inc. 2011. Software Productivity Center web site. August 20, 2011. Available at: http://www.spc.ca/<br />
<br />
Systems engineering [[Acronyms|SE]] [[measurement (glossary)|measurement]] and the accompanying analysis are fundamental elements of SE and technical management. SE measurement provides information relating to the products developed, services provided, and processes implemented, supporting effective management of those processes and objective evaluation of product or service quality. Measurement supports realistic planning, provides insight into actual performance, and facilitates assessment of suitable actions (Roedler and Jones 2005; Frenz et al. 2010). <br />
<br />
Appropriate measures and indicators are essential inputs to tradeoff analyses that balance cost, schedule, and technical objectives. Periodic analysis of the relationships between measurement results and the requirements and attributes of the system provides insight that helps identify issues early, when they can be resolved with less impact. Historical data, together with project or organizational context information, forms the basis for the predictive models and methods used in estimation and planning.<br />
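<br />
As a simple illustration of using historical data for prediction, the following sketch fits an ordinary least-squares trend to hypothetical historical values and projects the next period; the data and the choice of model are illustrative assumptions, not a method prescribed by the cited guides.<br />
<syntaxhighlight lang="python">
from statistics import mean

# Hypothetical historical actuals: (reporting period, requirements verified).
history = [(1, 12.0), (2, 14.5), (3, 15.1), (4, 17.8)]

xs = [x for x, _ in history]
ys = [y for _, y in history]
x_bar, y_bar = mean(xs), mean(ys)

# Ordinary least-squares trend fitted to the historical data.
slope = sum((x - x_bar) * (y - y_bar) for x, y in history) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

# Prediction for the next period, to be compared against the plan.
print(f"Predicted for period 5: {intercept + slope * 5:.1f}")
</syntaxhighlight>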
<br />
==Fundamental Concepts==<br />
The discussion of measurement here is based on some fundamental concepts. Roedler and Jones state three key SE measurement concepts, paraphrased here (Roedler and Jones 2005):<br />
<br />
#'''SE measurement is a consistent but flexible process''' that is tailored to the unique information needs and characteristics of a particular project or organization and revised as information needs change. <br />
#'''Decision makers must understand what is being measured.''' Key decision makers must be able to connect “what is being measured” to “what they need to know.” <br />
#'''Measurement must be used to be effective'''.<br />
<br />
==Measurement Process Overview==<br />
The measurement process as presented here consists of four activities from [[ISO/IEC/IEEE 15939|Practical Software and Systems Measurement (PSM)]], as described in ISO/IEC/IEEE 15939 (McGarry et al. 2002; Murdoch et al. 2006). <br />
<br />
This process has been the basis for establishing a common measurement process across the software and systems engineering communities. The approach has been adopted by the Capability Maturity Model Integration (CMMI) measurement and analysis process area (SEI 2006) and by international systems and software engineering standards such as ISO/IEC/IEEE 15939 and ISO/IEC/IEEE 15288. The International Council on Systems Engineering ([[Acronyms|INCOSE]]) Measurement Working Group has also adopted this measurement approach for several of its measurement assets, such as the [[Systems Engineering Measurement Primer|INCOSE SE Measurement Primer]] (Frenz et al. 2010) and the [[Technical Measurement Guide]] (Roedler and Jones 2005). This approach has provided a consistent treatment of measurement that allows the engineering community to communicate more effectively about measurement. The process is illustrated in Figure 1 (Roedler and Jones 2005; McGarry et al. 2002). <br />
<br />
[[File:Measurement_Process_Model-Figure_1.png|thumb|600px|center|Figure 1. Four Key Measurement Process Activities (PSM 2011). Reprinted with permission of Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]] <br />
<br />
<br />
===Establish and Sustain Commitment===<br />
This activity focuses on establishing the resources, training, and tools needed to implement a measurement process and on ensuring that there is a management commitment to use the information that is produced. Refer to PSM (2011) and SPC (2011) for additional detail.<br />
<br />
===Plan Systems Engineering Measurement===<br />
This activity focuses on defining measures that provide insight into project or organization [[Information Need (glossary)|information needs]]. This includes identifying what the decision makers need to know, relating these information needs to those entities that can be measured, and then identifying, prioritizing, selecting, and specifying [[Measure (glossary)|measures]] based on project and organization processes (Jones 2003, 15-19). <br />
<br />
There are a few widely used approaches for identifying information needs and deriving the associated measures, each focused on identifying the measures needed for SE management. These include:<br />
<br />
*The [[Acronyms|PSM]] approach, which uses a set of [[Information Category (glossary)|information categories]], [[Measurable Concept (glossary)|measurable concepts]], and candidate measures to aid the user in determining relevant information needs and the aspects of those needs on which to focus (PSM 2011). <br />
<br />
*The [[Acronyms|goal-question-metric (GQM)]] approach, which identifies explicit measurement goals. Each goal is decomposed into several questions that help in the selection of measures that address the question and provide insight into goal achievement (Park, Goethert, and Florac 1996); a minimal sketch of this decomposition appears after this list. <br />
<br />
*Software Productivity Center’s 8-step Metrics Program, which also includes stating the goals and defining measures needed to gain insight for achieving the goals (SPC 2011). <br />
<br />
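As promised above, the following is a minimal, illustrative sketch of a GQM-style goal-to-question-to-measure decomposition; the goal, questions, and candidate measures are hypothetical examples, not taken from Park, Goethert, and Florac (1996).<br />
<syntaxhighlight lang="python">
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A measurement goal with its questions and candidate measures."""
    statement: str
    questions: dict[str, list[str]] = field(default_factory=dict)

gqm = Goal(
    statement="Deliver the system increment on schedule",
    questions={
        "Is requirements definition converging?": [
            "requirements volatility (changes per month)",
            "percent of requirements approved",
        ],
        "Is integration progressing as planned?": [
            "interfaces verified vs. planned",
        ],
    },
)

for question, measures in gqm.questions.items():
    print(question)
    for measure in measures:
        print("  candidate measure:", measure)
</syntaxhighlight>
<br />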
The following are good sources for candidate measures that address the information needs and measurable concepts/questions:<br />
*PSM Web Site (PSM 2011)<br />
*PSM Guide, Version 4.0, Chapters 3 and 5 (PSM 2000)<br />
*SE Leading Indicators Guide, Version 2.0, Section 3 (Roedler et al. 2010)<br />
*Technical Measurement Guide, Version 1.0, Section 10 (Roedler and Jones 2005)<br />
*Safety Measurement (PSM White Paper), Version 3.0, Section 3.4 (Murdoch et al. 2006)<br />
*Security Measurement (PSM White Paper), Version 3.0, Section 7 (Murdoch et al. 2006)<br />
*Measuring Systems Interoperability, Section 5 and Appendix C (Kasunic and Anderson 2004)<br />
*Measurement for Process Improvement (PSM Technical Report), Version 1.0, Appendix E (Statz et al. 2005)<br />
<br />
The INCOSE SE Measurement Primer (Frenz et al. 2010) provides a list of attributes of a good measure with definitions for each [[Attribute (glossary)|attribute]]. The attributes include ''relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy.'' Evaluating candidate measures against these attributes can help assure the selection of more effective measures. <br />
<br />
The details of each [[measure (glossary)|measure]] need to be unambiguously defined and documented. Templates for the specification of measures and indicators are available on the PSM website and in Goethert and Siviy (2004).<br />
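<br />
As an illustration of what an unambiguous measure definition might capture, the following sketch records one measure as a structured specification; the field set and example values are assumptions loosely in the spirit of the templates, not the templates themselves.<br />
<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass(frozen=True)
class MeasureSpecification:
    """An unambiguous record of one measure and how it will be used."""
    name: str
    information_need: str
    base_measures: tuple[str, ...]
    unit: str
    collection_frequency: str
    decision_criteria: str

spec = MeasureSpecification(
    name="Requirements volatility",
    information_need="Is the baseline stable enough to start detailed design?",
    base_measures=("added", "changed", "deleted", "total requirements"),
    unit="percent of baseline per month",
    collection_frequency="monthly, at the requirements board review",
    decision_criteria="investigate if volatility exceeds 5% in any month",
)
print(spec.name, "-", spec.decision_criteria)
</syntaxhighlight>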
<br />
===Perform Systems Engineering Measurement===<br />
This activity focuses on the collection and preparation of measurement data, measurement analysis, and the presentation of the results to inform decision making. The preparation of the measurement data includes verification, normalization, and aggregation of the data, as applicable. Analysis includes estimation, feasibility analysis of plans, and performance analysis of actual data against plans. <br />
<br />
The quality of the measurement results depends on the collection and preparation of valid, accurate, and unbiased data. Data verification, validation, preparation, and analysis techniques are discussed in PSM (2011, chapters 1 and 4) and SEI (2006). Per TL 9000, ''Quality Management System Guidance'', “The analysis step should integrate quantitative measurement results and other qualitative project information, in order to provide managers the feedback needed for effective decision making” (Quest 2010, 5-10). This provides richer information that gives users the broader picture and puts the information in the appropriate context. <br />
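<br />
The following sketch illustrates the preparation steps named above (verification, normalization, aggregation) on hypothetical defect counts reported by several teams; the validity rule and the per-KSLOC normalization basis are illustrative assumptions.<br />
<syntaxhighlight lang="python">
# Hypothetical defect counts reported by three teams.
records = [
    {"team": "A", "defects": 14, "ksloc": 21.0},
    {"team": "B", "defects": 9,  "ksloc": 12.5},
    {"team": "C", "defects": -3, "ksloc": 8.0},   # fails the validity check
]

# Verification: discard records that fail basic validity checks.
valid = [r for r in records if r["defects"] >= 0 and r["ksloc"] > 0]

# Normalization: express each team's count on a common basis (defects/KSLOC).
for r in valid:
    r["density"] = r["defects"] / r["ksloc"]

# Aggregation: roll the normalized data up to a project-level figure.
project_density = sum(r["defects"] for r in valid) / sum(r["ksloc"] for r in valid)
print(f"Project defect density: {project_density:.2f} defects/KSLOC")
</syntaxhighlight>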
<br />
There is a significant body of guidance available on good ways to present quantitative information. Edward Tufte has several books focused on the visualization of information, including ''The Visual Display of Quantitative Information'' (Tufte 2001). <br />
<br />
More information about understanding and using measurement results can be found in:<br />
*PSM (2011)<br />
*ISO/IEC/IEEE 15939, clauses 4.3.3 and 4.3.4<br />
*Roedler and Jones (2005), sections 6.4, 7.2, and 7.3<br />
<br />
===Evaluate Systems Engineering Measurement===<br />
This activity covers the periodic evaluation and improvement of both the measurement process and the specific measures. One objective is to ensure that the measures continue to align with the business goals and information needs and provide useful insight. Refer to PSM (2011) and ''Practical Software Measurement: Objective Information for Decision Makers'' (McGarry et al. 2002) for additional detail.<br />
<br />
==Systems Engineering Leading Indicators==<br />
Leading indicators are aimed at providing predictive insight regarding an information need. A systems engineering leading indicator “is a measure for evaluating the effectiveness of how a specific activity is applied on a project in a manner that provides information about impacts that are likely to affect the system performance objectives” (Roedler et al. 2010). Leading indicators may be individual measures or collections of measures and associated analysis that provide future systems engineering performance insight throughout the life cycle of the system. “Leading indicators support the effective management of systems engineering by providing visibility into expected project performance and potential future states” (Roedler et al. 2010). <br />
<br />
As shown in Figure 2, a leading indicator is composed of characteristics, a condition, and a predicted behavior. The characteristics and condition are analyzed on a periodic or as-needed basis to predict behavior within a given confidence and within an accepted time range into the future. More information can be found in Roedler et al. (2010); a small numeric sketch follows the figure.<br />
<br />
[[File:Composition_of_Leading_Indicator-Figure_2.png|thumb|600px|center|Figure 2. Composition of a Leading Indicator (Roedler et al. 2010). Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]]<br />
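<br />
The following minimal sketch shows how such an indicator might combine a measured characteristic (an approval trend), a condition (a threshold on the projected value), and a predicted behavior; the data, threshold, and look-ahead window are hypothetical assumptions, not values from the Leading Indicators Guide.<br />
<syntaxhighlight lang="python">
# Characteristic: percent of requirements approved at the last four reviews.
approved_pct = [40, 48, 55, 61]
lookahead = 2      # review periods into the future
threshold = 75     # percent needed before design start (the condition)

# Average growth per review period, projected ahead.
growth = (approved_pct[-1] - approved_pct[0]) / (len(approved_pct) - 1)
projection = approved_pct[-1] + growth * lookahead

# Predicted behavior: compare the projection against the condition.
if projection < threshold:
    print(f"Projected {projection:.0f}% < {threshold}%: design start at risk")
else:
    print(f"Projected {projection:.0f}%: on track for design start")
</syntaxhighlight>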
<br />
==Technical Measurement==<br />
Technical measurement is the set of measurement activities used to provide information about progress in the definition and development of the technical solution, about ongoing assessment of the associated risks and issues, and about the likelihood of meeting the critical objectives of the [[Acquirer (glossary)|acquirer]]. This insight helps an engineer make better decisions throughout the life cycle of a system and increases the probability of delivering a technical solution that meets both the specified requirements and the mission needs. The insight is also used in trade-off decisions when performance is not within the thresholds or goals.<br />
<br />
Technical measurement includes [[Measure of Effectiveness (MoE) (glossary)|measures of effectiveness]] ([[Acronyms|MOE]]s), [[Measure of Performance (MoP) (glossary)|measures of performance]] ([[Acronyms|MOP]]s), and [[Technical Performance Measure (TPM) (glossary)|technical performance measures]] ([[Acronyms|TPM]]s) (Roedler and Jones 2005). The relationships between these types of technical measures are shown in Figure 3 and explained in the reference. Using the measurement process described above, technical measurement can be planned early in the life cycle and then performed throughout the life cycle with increasing levels of fidelity as the technical solution is developed, facilitating predictive insight and preventive or corrective actions. More information about technical measurement can be found in the ''[[NASA Systems Engineering Handbook]]'' (NASA 2007, section 6.7.2.2), ''System Analysis, Design, Development: Concepts, Principles, and Practices'' (Wasson 2005, chapter 34), and the ''[[Systems Engineering Leading Indicators Guide]]'' (Roedler et al. 2010). A margin-tracking sketch follows Figure 3.<br />
<br />
[[File:Technical_Measures_Relationship-Figure_3.png|thumb|600px|center|Figure 3. Relationship of the Technical Measures (Roedler and Jones 2005). Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]).]]<br />
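<br />
As a minimal sketch of tracking one TPM against its threshold and goal, the following example computes the remaining margin for a hypothetical mass parameter; the parameter, values, and the lower-is-better convention are illustrative assumptions.<br />
<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class TechnicalPerformanceMeasure:
    """Tracks a parameter where lower values are better (e.g., mass)."""
    name: str
    goal: float        # value the design aims for
    threshold: float   # worst acceptable value
    current: float     # current best estimate

    def margin(self) -> float:
        """Remaining margin to the threshold; negative means out of bounds."""
        return self.threshold - self.current

mass = TechnicalPerformanceMeasure(
    name="vehicle mass (kg)", goal=950.0, threshold=1000.0, current=975.0
)
print(f"{mass.name}: margin to threshold = {mass.margin():.0f}")
</syntaxhighlight>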
<br />
==Service Measurement==<br />
The same measurement activities can be applied to service measurement; however, the context and measures will differ. Service providers must balance efficiency and effectiveness, which may be opposing objectives. Good service measures are outcome-based, focus on elements important to the customer (such as service availability, reliability, and performance), and provide timely, forward-looking information. <br />
<br />
For services, the terms [[Acronyms|critical success factors (CSF)]] and [[Acronyms|key performance indicators (KPI)]] are often used when discussing measurement. [[Acronyms|CSF]]s are the key elements of the service or service infrastructure that are most important for achieving the business objectives. [[Acronyms|KPI]]s are specific values or characteristics measured to assess achievement of those objectives.<br />
More information about service measurement can be found in the Service Design and Continual Service Improvement volumes of BMP (2010). More on service systems engineering can be found in the [[Service Systems Engineering]] article.<br />
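<br />
As an illustration of a KPI tied to a customer-facing CSF, the following sketch computes service availability over a reporting period; the outage data and target are hypothetical.<br />
<syntaxhighlight lang="python">
# Hypothetical 30-day reporting period and outage log (minutes).
period_minutes = 30 * 24 * 60
outages = [42, 7, 15]

availability = 1 - sum(outages) / period_minutes
target = 0.999   # hypothetical KPI target tied to a customer-facing CSF

status = "met" if availability >= target else "missed"
print(f"Availability: {availability:.4%} (target {target:.1%} {status})")
</syntaxhighlight>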
<br />
==Linkages to Other Systems Engineering Management Topics==<br />
SE measurement has linkages to other [[Acronyms|SEM]] topics. The following are a few key linkages adapted from Roedler and Jones (2005):<br />
*[[Planning]] – SE measurement provides the historical data and supports the estimation for, and feasibility analysis of, the plans for realistic planning. <br />
*[[Assessment and Control]] – SE measurement provides the objective information needed to assess progress and determine appropriate control actions. The use of leading indicators enables early assessment and control actions that identify risks and provide the insight needed to treat them before their impacts are realized.<br />
*[[Risk Management]] – SE risk management identifies the information needs that can impact project and organizational performance. SE measurement data helps to quantify risks and subsequently provides information about whether risks have been successfully managed.<br />
*[[Decision Management]] – SE Measurement results inform decision making by providing objective insight.<br />
<br />
==Practical Considerations==<br />
Key pitfalls and good practices related to systems engineering measurement are described in the next two sections.<br />
<br />
===Pitfalls===<br />
Some of the key pitfalls encountered in planning and performing SE Measurement are provided in Table 1 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Pitfall Name<br />
! Pitfall Description<br />
|-<br />
| Golden Measures<br />
|<br />
*Looking for the one measure or small set of measures that applies to all projects. <br />
*No one-size-fits-all measure or measurement set exists. <br />
*Each project has unique information needs (e.g., objectives, risks, and issues). <br />
*The one exception is that, in some cases with consistent product lines, processes, and information needs, a small core set of measures may be defined for use across an organization.<br />
|-<br />
|Single-pass Perspective<br />
|<br />
*Viewing measurement as a single-pass activity.<br />
*To be effective, measurement needs to be performed continuously, including the periodic identification and prioritization of information needs and associated measures. <br />
|-<br />
|Unknown Information Need<br />
|<br />
*Performing measurement activities without the understanding of why the measures are needed and what information they provide. <br />
*This can lead to wasted effort. <br />
|-<br />
|Inappropriate Usage<br />
|<br />
*Using measurement inappropriately, e.g., to measure the performance of individuals or to make interpretations without context information. <br />
*This can lead to bias in the results or incorrect interpretations. <br />
|}<br />
<br />
===Good Practices===<br />
Some good practices, gathered from the references, are provided in Table 2 (Developed for BKCASE):<br />
<br />
{| class="wikitable"<br />
|-<br />
! Good Practice Name<br />
! Good Practice Description<br />
|-<br />
| Periodic Review<br />
|<br />
*Regularly review each measure collected. <br />
|-<br />
|Action Driven<br />
|<br />
*Measurement by itself does not control or improve process performance. <br />
*Measurement results should be provided to decision makers for appropriate action. <br />
|-<br />
|Integration into Project Processes<br />
|<br />
*SE Measurement should be integrated into the project as part of the ongoing project business rhythm.<br />
*Data should be collected as processes are performed, not recreated as an afterthought. <br />
|-<br />
|Timely Information<br />
|<br />
*Information should be obtained early enough to allow necessary action to control or treat risks, adjust tactics and strategies, etc. <br />
*When such actions are not successful, measurement results need to help decision makers determine contingency actions or correct problems. <br />
|-<br />
|Relevance to Decision Makers<br />
|<br />
*Successful measurement requires the communication of meaningful information to the decision makers. <br />
*Results should be presented in the decision maker’s preferred format. <br />
*This allows accurate and expeditious interpretation of the results. <br />
|-<br />
|Data Availability<br />
|<br />
*Decisions can rarely wait for a complete or perfect set of data, so measurement information often needs to be derived from analysis of the best available data, complemented by real-time events and qualitative insight (including experience). <br />
|-<br />
|Historical Data<br />
|<br />
*Use historical data as the basis of plans, measure what is planned versus what is achieved, archive actual achieved results, and use archived data as a historical basis for the next planning effort. <br />
|-<br />
|Information Model<br />
|<br />
*The information model defined in ISO/IEC/IEEE (2007) provides a means to link the entities that are measured to the associated measures and to the identified information need, as well as to show how the measures are converted into indicators that provide insight to decision makers; a minimal sketch of this chain appears after the table. <br />
|}<br />
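<br />
The following sketch traces the chain the ISO/IEC/IEEE 15939 information model describes, from base measures through a derived measure to an indicator that addresses an information need; the measure names and the decision rule are hypothetical, not taken from the standard.<br />
<syntaxhighlight lang="python">
# Base measures captured from the measured entity (a defect-tracking system).
base_measures = {"defects_found": 18, "defects_closed": 12}

# Derived measure: combine base measures with a measurement function.
closure_ratio = base_measures["defects_closed"] / base_measures["defects_found"]

# Indicator: an analysis model (here a simple decision rule) interprets the
# derived measure against the information need "are defects burning down
# fast enough to release?".
indicator = "green" if closure_ratio >= 0.8 else "red"
print(f"Defect closure ratio {closure_ratio:.0%} -> {indicator}")
</syntaxhighlight>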
<br />
Additional information can be found in the ''[[Systems Engineering Measurement Primer]]'', section 4.2 (Frenz et al. 2010), and in section 5.7.1.5 of the INCOSE ''Systems Engineering Handbook'' (INCOSE 2010).<br />
<br />
==References== <br />
<br />
<br />
===Works Cited===<br />
Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''Systems Engineering Measurement Primer: A Basic Introduction to Measurement Concepts and Use for Systems Engineering''. Version 2.0. San Diego, CA: International Council on System Engineering (INCOSE). INCOSE‐TP‐2010‐005‐02. Available at: http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf. <br />
<br />
ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007. <br />
<br />
Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley. <br />
<br />
Murdoch, J. et al. 2006. ''Safety Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.<br />
<br />
Murdoch, J. et al. 2006. ''Security Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.<br />
<br />
NASA. 2007. ''[[NASA Systems Engineering Handbook]]''. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105. <br />
<br />
Park, R.E., W.B. Goethert, and W.A. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002. <br />
<br />
PSM. 2011. ''Practical Software and Systems Measurement (PSM) web site''. Accessed August 18, 2011. Available at: http://www.psmsc.com/.<br />
<br />
PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]]''. Version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com/PSMGuide.asp. <br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]]''. Version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03. <br />
<br />
Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]]''. Version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.<br />
<br />
Software Productivity Center, Inc. 2011. "Software Productivity Center" main web page. Accessed August 20, 2011. Available at: http://www.spc.ca/.<br />
<br />
Statz, J. et al. 2005. ''Measurement for Process Improvement''. Version 1.0. York, UK: Practical Software and Systems Measurement (PSM).<br />
<br />
Tufte, E. 2001. ''The Visual Display of Quantitative Information'', 2nd ed. Cheshire, CT, USA: Graphics Press.<br />
<br />
Wasson, C. 2005. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.<br />
<br />
===Primary References===<br />
<br />
Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''[[Systems Engineering Measurement Primer]]: A Basic Introduction to Measurement Concepts and Use for Systems Engineering.'' Version 2.0. San Diego, CA: International Council on System Engineering (INCOSE). INCOSE‐TP‐2010‐005‐02. Available at: http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf. <br />
<br />
ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007. <br />
<br />
PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]].'' Version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com.<br />
<br />
Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]].'' Version 2.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03. <br />
<br />
Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]]''. Version 1.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.<br />
<br />
===Additional References===<br />
<br />
Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley. <br />
<br />
Murdoch, J. et al. 2006. ''Safety Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.<br />
<br />
Murdoch, J. et al. 2006. ''Security Measurement''. Version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.<br />
<br />
NASA. 2007. ''NASA Systems Engineering Handbook''. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105. <br />
<br />
Park, R.E., W.B. Goethert, and W.A. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002. <br />
<br />
PSM. 2011. ''Practical Software and Systems Measurement (PSM) web site''. Accessed August 18, 2011. Available at: http://www.psmsc.com/.<br />
<br />
SEI. 2006. "Measurement and Analysis Process Area" in ''Capability Maturity Model Integration (CMMI) for Development'', version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU). <br />
<br />
Software Productivity Center, Inc. 2011. "Software Productivity Center" main web page. Accessed on August 20, 2011. Available at: http://www.spc.ca/.<br />
<br />
Statz, J. et al. 2005. ''Measurement for Process Improvement''. Version 1.0. York, UK: Practical Software and Systems Measurement (PSM).<br />
<br />
Tufte, E. 2001. ''The Visual Display of Quantitative Information'', 2nd ed. Cheshire, CT, USA: Graphics Press.<br />
<br />
Wasson, C. 2005. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.<br />
<br />
----<br />
<center>[[Risk Management|< Previous Article]] | [[Systems Engineering Management|Parent Article]] | [[Decision Management|Next Article >]]</center><br />
<br />
{{5comments}}<br />
<br />
[[Category: Part 3]][[Category:Topic]]<br />
[[Category:Systems Engineering Management]]<br />
{{DISQUS}}