Difference between pages "Planning" and "Measurement"

Planning is an important aspect of [[Systems Engineering Management|systems engineering management]] (SEM). {{Term|Systems Engineering (glossary)|Systems engineering}} (SE) planning is performed concurrently and collaboratively with project planning. It involves developing and integrating technical plans to achieve the technical project objectives within the resource constraints and {{Term|Risk (glossary)|risk}} thresholds. Planning involves the success-critical stakeholders to ensure that necessary tasks are defined at the right time in the {{Term|Life Cycle (glossary)|life cycle}} in order to manage acceptable risk levels, meet schedules, and avoid costly omissions.

[[Measurement (glossary)|Measurement]] and the accompanying analysis are fundamental elements of [[Systems Engineering (glossary)|systems engineering]] (SE) and technical management. SE measurement provides information relating to the products developed, services provided, and processes implemented to support effective management of the processes and to objectively evaluate product or service quality. Measurement supports realistic planning, provides insight into actual performance, and facilitates assessment of suitable actions (Roedler and Jones 2005, 1-65; Frenz et al. 2010).  
 
==SE Planning Process Overview==
 
SE planning provides the following elements:
 
* Definition of the {{Term|Project (glossary)|project}} from a technical perspective.
 
* Definition or tailoring of engineering processes, practices, methods, and supporting enabling environments to be used to develop products or services, as well as plans for transition and implementation of the products or services, as required by agreements.
 
* Definition of the technical organizational, personnel, and team functions and responsibilities, as well as all disciplines required during the project life cycle.
 
* Definition of the appropriate {{Term|Life Cycle Model (glossary)|life cycle model}} or approach for the {{Term|Product (glossary)|products}} or {{Term|Service (glossary)|services}}.
 
* Definition and timing of technical reviews, product or service assessments, and control mechanisms across the life cycle, including the success criteria such as {{Term|Cost (glossary)|cost}}, schedule, and technical performance at identified project milestones.
 
* Estimation of technical cost and schedule based on the effort needed to meet the requirements; this estimation becomes input to project cost and schedule planning.

* Determination of critical technologies, as well as the associated risks and actions needed to manage and transition these technologies.

* Identification of linkages to other project management efforts.

* Documentation of and commitment to the technical planning.
 
===Scope===
 
SE planning begins with analyzing the {{Term|Scope (glossary)|scope}} of technical work to be performed and gaining an understanding of the constraints, risks, and objectives that define and bound the solution space for the product or service. The planning includes estimating the size of the work products, establishing a schedule (or integrating the technical tasks into the project schedule), identifying risks, and negotiating commitments. Iteration of these planning tasks may be necessary to establish a balanced plan with respect to cost, schedule, technical performance, and quality. The planning continues to evolve with each successive life cycle phase of the project (NASA 2007, 1-360; SEI 1995, 12).
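
Parametric models such as COCOMO II and COSYSMO (see the Additional References) are often used to turn an estimated size into an effort estimate during this step. The sketch below is a minimal illustration of that idea only; the coefficients, the size measure, and the multiplier are invented for illustration and are not calibrated values from any published model.

<syntaxhighlight lang="python">
# Illustrative sketch of a parametric effort estimate of the kind used in SE
# planning (in the spirit of COCOMO/COSYSMO-style models). The coefficients
# and size measure are assumptions, not values from any published model.

def estimate_effort(size, a=2.5, b=1.1, effort_multiplier=1.0):
    """Return an estimated effort (person-months) for a given size measure.

    size: estimated size of the technical work (e.g., weighted requirements count)
    a, b: notional calibration constants (assumed, not calibrated)
    effort_multiplier: aggregate adjustment for cost drivers (risk, reuse, etc.)
    """
    return a * (size ** b) * effort_multiplier

if __name__ == "__main__":
    nominal = estimate_effort(size=120)
    pessimistic = estimate_effort(size=150, effort_multiplier=1.3)
    print(f"Nominal effort estimate:     {nominal:.1f} person-months")
    print(f"Pessimistic effort estimate: {pessimistic:.1f} person-months")
</syntaxhighlight>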
 
  
SE planning addresses all programmatic and technical elements of the project to ensure a comprehensive and integrated plan for all of the project's technical aspects. It should account for the full scope of technical activities, including [[System Definition|system development and definition]], [[Risk Management|risk management]], [[Quality Management|quality management]], [[Configuration Management|configuration management]], [[measurement]], [[Information Management|information management]], [[System Realization|production]], [[System Verification|verification and testing]], [[System Integration|integration]], [[System Validation|validation]], and [[System Deployment and Use|deployment]]. SE planning integrates all SE functions to ensure that plans, requirements, operational concepts, and architectures are consistent and feasible.

Appropriate measures and indicators are essential inputs to tradeoff analyses to balance cost, schedule, and technical objectives. Periodic analysis of the relationships between measurement results and review of the requirements and attributes of the system provides insights that help to identify issues early, when they can be resolved with less impact. Historical data, together with project or organizational context information, forms the basis for the predictive models and methods that should be used.
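
As a minimal illustration of using historical data as the basis for a predictive estimate, the sketch below fits a simple least-squares trend to invented past-project data; any real model would be calibrated to the organization's own projects and context information.

<syntaxhighlight lang="python">
# Minimal sketch, assuming hypothetical historical data: fit a simple
# least-squares trend to past projects so the result can feed a predictive
# estimate, as described above. All values are invented.
from statistics import mean

history = [  # (requirements count, actual SE effort in person-months)
    (80, 190), (95, 230), (110, 260), (130, 330), (150, 365),
]

xs = [x for x, _ in history]
ys = [y for _, y in history]
x_bar, y_bar = mean(xs), mean(ys)

slope = sum((x - x_bar) * (y - y_bar) for x, y in history) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

def predict_effort(requirements):
    """Predict SE effort for a new project from the fitted trend."""
    return intercept + slope * requirements

print(f"Predicted effort for 120 requirements: {predict_effort(120):.0f} person-months")
</syntaxhighlight>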
  
The scope of planning can vary from planning a specific task to developing a major technical plan. The integrated planning effort will determine what level of planning and accompanying documentation is appropriate for the project.
===Integration===

The integration of each plan with other higher-level, peer, or subordinate plans is an essential part of SE planning. For the technical effort, the {{Term|Systems Engineering Plan (SEP) (glossary)|systems engineering management plan}} (SEMP), also frequently referred to as the {{Term|Systems Engineering Plan (SEP) (glossary)|systems engineering plan}} (SEP), is the highest level technical plan. It is subordinate to the project plan and often has a number of subordinate technical plans providing detail on specific technical focus areas (INCOSE 2011, sec. 5.1.2.2; NASA 2007, appendix J).

In U.S. defense work, the terms SEP and SEMP are not interchangeable. The SEP is a high-level plan that is made before system acquisition and development begins; it is written by the government customer. The SEMP is the specific development plan written by the developer (or contractor). In this context, the intent and content of these documents are quite different. For example, a SEP will have an acquisition plan that would not be included in a SEMP. Figure 1 below shows the SEMP and integrated plans.

[[File:semp_and_integrated_plans.png|thumb|400px|center|'''Figure 1. SEMP and Integrated Plans.''' (SEBoK Original)]]

==Fundamental Concepts==

The discussion of measurement in this article is based on some fundamental concepts. Roedler et al. (2005, 1-65) state three key SE measurement concepts that are paraphrased here:

# '''SE measurement is a consistent but flexible process''' that is tailored to the unique information needs and characteristics of a particular project or organization and revised as information needs change.
# '''Decision makers must understand what is being measured.''' Key decision-makers must be able to connect ''what is being measured'' to ''what they need to know'' and ''what decisions they need to make'' as part of a closed-loop, feedback control process (Frenz et al. 2010).
# '''Measurement must be used to be effective.'''

==Measurement Process Overview==

The measurement process as presented here consists of four activities from Practical Software and Systems Measurement (PSM 2011), as described in ISO/IEC/IEEE 15939 and in McGarry et al. (2002):

# establish and sustain commitment
# plan measurement
# perform measurement
# evaluate measurement
  
Task planning identifies the specific work products, deliverables, and success criteria for systems engineering efforts in support of integrated planning and project objectives. The success criteria are defined in terms of cost, schedule, and technical performance at identified project milestones. Detailed task planning identifies specific resource requirements (e.g., skills, equipment, facilities, and funding) as a function of time and project milestones.
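
The sketch below shows one way such task-level planning information might be captured as structured data, with success criteria tied to a milestone and resource needs phased over time; the fields and values are hypothetical, not a prescribed format.

<syntaxhighlight lang="python">
# Minimal sketch, not a prescribed format: task planning data with success
# criteria (cost and technical performance) tied to a project milestone,
# and resource needs phased over time. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Milestone:
    name: str
    date: str
    cost_ceiling_keur: float        # success criterion: cost
    tech_criterion: str             # success criterion: technical performance

@dataclass
class Task:
    name: str
    deliverables: list[str]
    milestone: Milestone
    resources_by_quarter: dict[str, int]   # e.g., full-time staff per quarter

pdr = Milestone("Preliminary Design Review", "2025-03-31", 450.0,
                "all key requirements allocated to subsystems")

task = Task(
    name="Develop subsystem specifications",
    deliverables=["subsystem specs", "interface control documents"],
    milestone=pdr,
    resources_by_quarter={"2024-Q4": 4, "2025-Q1": 6},
)

print(f"{task.name}: due {task.milestone.date}, "
      f"peak staffing {max(task.resources_by_quarter.values())} FTE")
</syntaxhighlight>
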
This approach has been the basis for establishing a common process across the software and systems engineering communities. This measurement approach has been adopted by the Capability Maturity Model Integration (CMMI) measurement and analysis process area (SEI 2006, 10), as well as by international systems and software engineering standards (ISO/IEC/IEEE 15939; ISO/IEC/IEEE 15288, 1). The International Council on Systems Engineering (INCOSE) Measurement Working Group has also adopted this measurement approach for several of their measurement assets, such as the [[Systems Engineering Measurement Primer|INCOSE SE Measurement Primer]] (Frenz et al. 2010) and [[Technical Measurement Guide]] (Roedler and Jones 2005). This approach has provided a consistent treatment of measurement that allows the engineering community to communicate more effectively about measurement. The process is illustrated in Figure 1 from Roedler and Jones (2005) and McGarry et al. (2002).

[[File:Measurement_Process_Model-Figure_1.png|thumb|600px|center|'''Figure 1. Four Key Measurement Process Activities (PSM 2011).''' Reprinted with permission of Practical Software and Systems Measurement ([http://www.psmsc.com PSM]). All other rights are reserved by the copyright owner.]]

SE planning is accomplished by both the {{Term|Acquirer (glossary)|acquirer}} and {{Term|Supplier (glossary)|supplier}}, and the activities for SE planning are performed in the context of the respective enterprise. The activities establish relevant policies and procedures for managing and executing the project management and technical effort, identify the management and technical tasks along with their interdependencies, risks, and opportunities, and provide estimates of needed resources and budgets. Plans are updated and refined throughout the development process based on status updates and evolving project requirements (SEI 2007).
  
==Linkages to Other Systems Engineering Management Topics==
The project planning process is closely coupled with the [[Measurement|measurement]], [[Assessment and Control|assessment and control]], [[Decision Management|decision management]], and [[Risk Management|risk management]] processes.

===Establish and Sustain Commitment===
This activity focuses on establishing the resources, training, and tools to implement a measurement process and on ensuring that there is a management commitment to use the information that is produced. Refer to PSM (2011) and SPC (2011) for additional detail.

===Plan Measurement===
This activity focuses on defining measures that provide insight into project or organization [[Information Need (glossary)|information needs]]. This includes identifying what the decision-makers need to know and when they need to know it, relaying these information needs to those entities in a manner that can be measured, and identifying, prioritizing, selecting, and specifying [[Measure (glossary)|measures]] based on project and organization processes (Jones 2003, 15-19). This activity also identifies the reporting format, forums, and target audience for the information provided by the measures.

Here are a few widely used approaches to identify the information needs and derive associated measures, where each can be focused on identifying measures that are needed for SE management:

* The PSM approach, which uses a set of [[Information Category (glossary)|information categories]], [[Measurable Concept (glossary)|measurable concepts]], and candidate measures to aid the user in determining relevant information needs and the characteristics of those needs on which to focus (PSM 2011).
* The Goal-Question-Metric (GQM) approach, which identifies explicit measurement goals. Each goal is decomposed into several questions that help in the selection of measures that address the question and provide insight into the goal achievement (Park, Goethert, and Florac 1996); see the illustrative sketch following this list.
* Software Productivity Center’s (SPC's) 8-step Metrics Program, which also includes stating the goals and defining measures needed to gain insight for achieving the goals (SPC 2011).
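
As a small illustration of the GQM idea (not taken from the cited guidebook), the sketch below records one goal, its derived questions, and candidate measures as plain data so that each selected measure stays traceable to the goal it supports; the goal, questions, and measures are invented examples.

<syntaxhighlight lang="python">
# Minimal sketch of a Goal-Question-Metric decomposition held as plain data.
# The goal, questions, and measures below are invented for illustration.

gqm = {
    "goal": "Assess whether requirements definition is stable enough to proceed",
    "questions": [
        {
            "question": "How much are the requirements still changing?",
            "measures": ["requirements added per month",
                         "requirements changed per month",
                         "requirements deleted per month"],
        },
        {
            "question": "Are TBD/TBR items being closed as planned?",
            "measures": ["open TBD/TBR count", "TBD/TBR closure rate"],
        },
    ],
}

# Flatten the decomposition into a candidate measure list for planning.
for q in gqm["questions"]:
    for m in q["measures"]:
        print(f"{gqm['goal'][:40]}... | {q['question'][:35]}... | {m}")
</syntaxhighlight>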

The following are good sources for candidate measures that address information needs and measurable concepts/questions:

* PSM Web Site (PSM 2011)
* PSM Guide, Version 4.0, Chapters 3 and 5 (PSM 2000)
* SE Leading Indicators Guide, Version 2.0, Section 3 (Roedler et al. 2010)
* Technical Measurement Guide, Version 1.0, Section 10 (Roedler and Jones 2005, 1-65)
* Safety Measurement (PSM White Paper), Version 3.0, Section 3.4 (Murdoch 2006, 60)
* Security Measurement (PSM White Paper), Version 3.0, Section 7 (Murdoch 2006, 67)
* Measuring Systems Interoperability, Section 5 and Appendix C (Kasunic and Anderson 2004)
* Measurement for Process Improvement (PSM Technical Report), Version 1.0, Appendix E (Statz 2005)

The INCOSE ''SE Measurement Primer'' (Frenz et al. 2010) provides a list of attributes of a good measure with definitions for each [[Attribute (glossary)|attribute]]; these attributes include ''relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy.'' Evaluating candidate measures against these attributes can help assure the selection of more effective measures.

The details of each measure need to be unambiguously defined and documented. Templates for the specification of measures and indicators are available on the PSM website (2011) and in Goethert and Siviy (2004).
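
The sketch below shows one way such a specification might be captured as structured data; the field names paraphrase the kinds of items these templates typically cover and are not the literal fields of the PSM or SEI templates, and the example measure is invented.

<syntaxhighlight lang="python">
# Minimal sketch of a measure specification captured as structured data.
# Field names paraphrase items such templates typically cover (information
# need, data source, frequency, decision criteria); they are assumptions,
# not the literal fields of any published template.
from dataclasses import dataclass, field

@dataclass
class MeasureSpecification:
    name: str
    information_need: str
    base_measures: list[str]
    derivation: str                 # how base measures combine into the indicator
    data_source: str
    collection_frequency: str
    decision_criteria: str          # thresholds or conditions that trigger action
    audience: list[str] = field(default_factory=list)

requirements_volatility = MeasureSpecification(
    name="Requirements volatility",
    information_need="Is the technical baseline stable enough to start detailed design?",
    base_measures=["requirements added", "requirements changed", "total requirements"],
    derivation="(added + changed) / total requirements, per reporting period",
    data_source="requirements management tool export",
    collection_frequency="monthly",
    decision_criteria="investigate if volatility exceeds 5% in any period",
    audience=["chief systems engineer", "project manager"],
)

print(requirements_volatility)
</syntaxhighlight>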

===Perform Measurement===
This activity focuses on the collection and preparation of measurement data, measurement analysis, and the presentation of the results to inform decision makers. The preparation of the measurement data includes verification, normalization, and aggregation of the data, as applicable. Analysis includes estimation, feasibility analysis of plans, and performance analysis of actual data against plans.
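
A minimal sketch of that flow with invented numbers is shown below: team-level data are aggregated to the project level and actuals are compared against the plan; a real implementation would also verify and normalize the raw data before analysis.

<syntaxhighlight lang="python">
# Minimal sketch, with invented numbers: preparing collected data
# (aggregation across teams) and analyzing actuals against the plan.
# Verification and normalization of the raw data are omitted here.

plan = {"2024-Q1": 40, "2024-Q2": 90, "2024-Q3": 140}   # planned requirements verified (cumulative)

raw_actuals = {  # collected per team, per period
    "2024-Q1": {"team_a": 18, "team_b": 15},
    "2024-Q2": {"team_a": 40, "team_b": 38},
    "2024-Q3": {"team_a": 66, "team_b": 60},
}

def aggregate(per_team):
    """Aggregate team-level counts into a project-level value."""
    return sum(per_team.values())

for period, planned in plan.items():
    actual = aggregate(raw_actuals[period])
    variance = actual - planned
    status = "on plan" if variance >= 0 else "behind plan"
    print(f"{period}: planned={planned:4d} actual={actual:4d} variance={variance:+4d} ({status})")
</syntaxhighlight>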

The quality of the measurement results is dependent on the collection and preparation of valid, accurate, and unbiased data. Data verification, validation, preparation, and analysis techniques are discussed in PSM (2011) and SEI (2010). Per TL 9000, ''Quality Management System Guidance'', ''The analysis step should integrate quantitative measurement results and other qualitative project information, in order to provide managers the feedback needed for effective decision making'' (QuEST Forum 2012, 5-10). This provides richer information that gives the users the broader picture and puts the information in the appropriate context.

There is a significant body of guidance available on good ways to present quantitative information. Edward Tufte has several books focused on the visualization of information, including ''The Visual Display of Quantitative Information'' (Tufte 2001).

Other resources that contain further information pertaining to understanding and using measurement results include:

* PSM (2011)
* ISO/IEC/IEEE 15939, clauses 4.3.3 and 4.3.4
* Roedler and Jones (2005), sections 6.4, 7.2, and 7.3
 
===Evaluate Measurement===
This activity involves analyzing the measurement information in order to periodically evaluate and improve the measurement process and the specific measures. One objective is to ensure that the measures continue to align with the business goals and information needs, as well as provide useful insight. This activity should also evaluate the SE measurement activities, resources, and infrastructure to make sure they support the needs of the project and organization. Refer to PSM (2011) and ''Practical Software Measurement: Objective Information for Decision Makers'' (McGarry et al. 2002) for additional detail.

==Systems Engineering Leading Indicators==
Leading indicators are aimed at providing predictive insight that pertains to an information need. An SE leading indicator is ''a measure for evaluating the effectiveness of how a specific activity is applied on a project in a manner that provides information about impacts that are likely to affect the system performance objectives'' (Roedler et al. 2010). Leading indicators may be individual measures or collections of measures and associated analysis that provide future systems engineering performance insight throughout the life cycle of the system; they ''support the effective management of systems engineering by providing visibility into expected project performance and potential future states'' (Roedler et al. 2010).

As shown in Figure 2, a leading indicator is composed of characteristics, a condition, and a predicted behavior.  The characteristics and conditions are analyzed on a periodic or as-needed basis to predict behavior within a given confidence level and within an accepted time range into the future. More information is also provided by Roedler et al. (2010).
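
A minimal sketch of the basic idea is given below, with invented data: a measured characteristic (here, requirements volatility) is projected forward with a simple linear trend and checked against a condition (a threshold) over an accepted look-ahead window. Real leading indicators as described in Roedler et al. (2010) also carry confidence levels and richer context; this only illustrates the mechanics.

<syntaxhighlight lang="python">
# Minimal sketch, with invented data: project a measured characteristic
# forward with a linear trend and flag when the projection crosses a
# threshold within the look-ahead window.
# (statistics.linear_regression requires Python 3.10+.)
from statistics import linear_regression

months = [1, 2, 3, 4, 5, 6]
volatility_pct = [2.0, 2.4, 2.9, 3.1, 3.8, 4.2]   # observed characteristic

THRESHOLD = 5.0        # condition: volatility above 5% signals elevated risk
LOOK_AHEAD = 3         # accepted time range: months into the future

slope, intercept = linear_regression(months, volatility_pct)

for step in range(1, LOOK_AHEAD + 1):
    future_month = months[-1] + step
    projected = intercept + slope * future_month
    flag = "ALERT" if projected > THRESHOLD else "ok"
    print(f"month {future_month}: projected volatility {projected:.1f}% [{flag}]")
</syntaxhighlight>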

[[File:Composition_of_Leading_Indicator-Figure_2.png|thumb|500px|center|'''Figure 2. Composition of a Leading Indicator (Roedler et al. 2010).''' Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]). All other rights are reserved by the copyright owner.]]

==Technical Measurement==
Technical measurement is the set of measurement activities used to provide information about progress in the definition and development of the technical solution, ongoing assessment of the associated risks and issues, and the likelihood of meeting the critical objectives of the [[Acquirer (glossary)|acquirer]]. This insight helps an engineer make better decisions throughout the life cycle of a system and increase the probability of delivering a technical solution that meets both the specified requirements and the mission needs. The insight is also used in trade-off decisions when performance is not within the thresholds or goals.

Technical measurement includes [[Measure of Effectiveness (MoE) (glossary)|measures of effectiveness]] (MOEs), [[Measure of Performance (MoP) (glossary)|measures of performance]] (MOPs), and [[Technical Performance Measure (TPM) (glossary)|technical performance measures]] (TPMs) (Roedler and Jones 2005, 1-65). The relationships between these types of technical measures are shown in Figure 3 and explained in the reference for Figure 3. Using the measurement process described above, technical measurement can be planned early in the life cycle and then performed throughout the life cycle with increasing levels of fidelity as the technical solution is developed, facilitating predictive insight and preventive or corrective actions. More information about technical measurement can be found in the ''[[NASA Systems Engineering Handbook]]'', ''System Analysis, Design, Development: Concepts, Principles, and Practices'', and the ''[[Systems Engineering Leading Indicators Guide]]'' (NASA December 2007, 1-360, Section 6.7.2.2; Wasson 2006, Chapter 34; Roedler and Jones 2005).
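
A minimal sketch of tracking one hypothetical TPM against its planned profile and a not-to-exceed threshold over successive reviews follows; the values and the margin logic are illustrative only and are not taken from the cited guides.

<syntaxhighlight lang="python">
# Minimal sketch, with invented values: track a technical performance
# measure (TPM), here vehicle mass in kg, against its planned profile and
# a not-to-exceed threshold at successive reviews.
NOT_TO_EXCEED = 1200.0   # kg, threshold (assumed)

# review: (planned value from the TPM profile, current best estimate)
tpm_history = {
    "SRR": (1100.0, 1125.0),
    "PDR": (1120.0, 1150.0),
    "CDR": (1140.0, 1185.0),
}

for review, (planned, estimate) in tpm_history.items():
    margin = NOT_TO_EXCEED - estimate
    deviation = estimate - planned
    status = "within threshold" if margin >= 0 else "THRESHOLD BREACHED"
    print(f"{review}: estimate {estimate:.0f} kg "
          f"(plan {planned:.0f}, deviation {deviation:+.0f}, "
          f"margin {margin:.0f} kg) -> {status}")
</syntaxhighlight>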

[[File:Technical_Measures_Relationship-Figure_3.png|thumb|600px|center|'''Figure 3. Relationship of the Technical Measures (Roedler et al. 2010).''' Reprinted with permission of the International Council on Systems Engineering ([http://www.incose.org INCOSE]) and Practical Software and Systems Measurement ([http://www.psmsc.com PSM]). All other rights are reserved by the copyright owner.]]

==Service Measurement==
The same measurement activities can be applied for service measurement; however, the context and measures will be different. Service providers have a need to balance efficiency and effectiveness, which may be opposing objectives. Good service measures are outcome-based, focus on elements important to the customer (e.g., service availability, reliability, performance, etc.), and provide timely, forward-looking information.

For services, the terms critical success factors (CSF) and key performance indicators (KPI) are used often when discussing measurement.  CSFs are the key elements of the service or service infrastructure that are most important to achieve the business objectives.  KPIs are specific values or characteristics measured to assess achievement of those objectives.  
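
As a small, hypothetical illustration of a KPI supporting a CSF, the sketch below computes monthly service availability from outage records and compares it to a target; the service, figures, and target are invented.

<syntaxhighlight lang="python">
# Minimal, hypothetical sketch: compute a service KPI (availability) from
# outage records and check it against the target supporting a CSF such as
# "the service is available when customers need it".
HOURS_IN_MONTH = 30 * 24
AVAILABILITY_TARGET = 0.999          # 99.9 %

outages_hours = {"2024-05": [0.5, 1.2], "2024-06": [4.0], "2024-07": []}

for month, outages in outages_hours.items():
    downtime = sum(outages)
    availability = (HOURS_IN_MONTH - downtime) / HOURS_IN_MONTH
    met = "met" if availability >= AVAILABILITY_TARGET else "MISSED"
    print(f"{month}: availability {availability:.4%} "
          f"(target {AVAILABILITY_TARGET:.1%}) {met}")
</syntaxhighlight>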
  
The measurement process provides inputs for estimation models. Estimates and other products from planning are used in decision management. SE assessment and control processes use planning results for setting milestones and assessing progress. Risk management uses the planning cost models, schedule estimates, and uncertainty distributions to support quantitative risk analysis (as desired).
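
The sketch below gives a minimal illustration of how schedule estimates and uncertainty distributions can feed such a quantitative risk analysis: a small Monte Carlo simulation over invented triangular task-duration distributions, reporting the chance of exceeding a milestone date.

<syntaxhighlight lang="python">
# Minimal sketch, with invented distributions: Monte Carlo schedule risk
# analysis over three sequential tasks whose durations (in weeks) are
# modeled as triangular (optimistic, most likely, pessimistic).
import random

tasks = {  # name: (optimistic, most likely, pessimistic) duration in weeks
    "requirements definition": (4, 6, 10),
    "preliminary design":      (6, 8, 14),
    "detailed design":         (8, 12, 20),
}
MILESTONE_WEEKS = 30
TRIALS = 10_000

random.seed(1)
overruns = 0
for _ in range(TRIALS):
    total = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
    if total > MILESTONE_WEEKS:
        overruns += 1

print(f"Probability of exceeding the {MILESTONE_WEEKS}-week milestone: "
      f"{overruns / TRIALS:.1%}")
</syntaxhighlight>
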
More information about service measurement can be found in the ''Service Design'' and ''Continual Service Improvement'' volumes of BMP (2010, 1).  More information on service SE can be found in the [[Service Systems Engineering]] article.
  
Additionally, planning needs to use the outputs from assessment and control as well as risk management to ensure corrective actions have been accounted for in planning future activities. The planning may need to be updated based on results from technical reviews (from assessment and control) addressing issues pertaining to measurement, problems that were identified during the performance of risk management activities, or decisions made as a result of the decision management activities (INCOSE 2010, sec. 6.1).

==Linkages to Other Systems Engineering Management Topics==
SE measurement has linkages to other SEM topics. The following are a few key linkages adapted from Roedler and Jones (2005):

* [[Planning]] – SE measurement provides the historical data and supports the estimation for, and feasibility analysis of, the plans for realistic planning.
* [[Assessment and Control]] – SE measurement provides the objective information needed to perform the assessment and determination of appropriate control actions. The use of leading indicators allows for early assessment and control actions that identify risks and/or provide insight to allow early treatment of risks to minimize potential impacts.
* [[Risk Management]] – SE risk management identifies the information needs that can impact project and organizational performance. SE measurement data helps to quantify risks and subsequently provides information about whether risks have been successfully managed.
* [[Decision Management]] – SE measurement results inform decision making by providing objective insight.
  
 
==Practical Considerations==
Key pitfalls and good practices related to SE planning and SE measurement are described in the next two sections.

===Pitfalls===
Some of the key pitfalls encountered in planning and performing SE planning are listed in Table 1 (''Major Pitfalls with Planning''); key pitfalls encountered in planning and performing SE measurement are provided in Table 1 (''Measurement Pitfalls'').
  
{|
|+'''Table 1. Major Pitfalls with Planning.''' (SEBoK Original)
! Name
! Description
|-
| Incomplete and Rushed Planning
| Inadequate SE planning causes significant adverse impacts on all other engineering activities. Although one may be tempted to save time by rushing the planning, inadequate planning can create additional costs and interfere with the schedule due to planning omissions, lack of detail, lack of integration of efforts, infeasible cost and schedules, etc.
|-
| Inexperienced Staff
| Lack of highly experienced engineering staff members, especially in similar projects, will likely result in inadequate planning. Less experienced engineers are often assigned significant roles in the SE planning; however, they may not have the appropriate judgment to lay out realistic and achievable plans. It is essential to assign the SE planning tasks to those with a good amount of relevant experience.
|}

{|
|+'''Table 1. Measurement Pitfalls.''' (SEBoK Original)
! Name
! Description
|-
| Golden Measures
|
* Looking for the one measure or small set of measures that applies to all projects.
* No one-size-fits-all measure or measurement set exists.
* Each project has unique information needs (e.g., objectives, risks, and issues).
* The one exception is that, in some cases with consistent product lines, processes, and information needs, a small core set of measures may be defined for use across an organization.
|-
| Single-Pass Perspective
|
* Viewing measurement as a single-pass activity.
* To be effective, measurement needs to be performed continuously, including the periodic identification and prioritization of information needs and associated measures.
|-
| Unknown Information Need
|
* Performing measurement activities without an understanding of why the measures are needed and what information they provide.
* This can lead to wasted effort.
|-
| Inappropriate Usage
|
* Using measurement inappropriately, such as measuring the performance of individuals or making interpretations without context information.
* This can lead to bias in the results or incorrect interpretations.
|}
  
 
===Good Practices===
Some good practices for SE planning, gathered from the references, are provided in Table 2 (''Proven Practices with Planning''); good practices for SE measurement are provided in Table 2 (''Measurement Good Practices'').
  
{|
|+'''Table 2. Proven Practices with Planning.''' (SEBoK Original)
! Name
! Description
|-
| Use Multiple Disciplines
| Get technical resources from all disciplines involved in the planning process.
|-
| Early Conflict Resolution
| Resolve schedule and resource conflicts early.
|-
| Task Independence
| Tasks should be as independent as possible.
|-
| Define Interdependencies
| Define task interdependencies, using dependency networks or other approaches.
|-
| Risk Management
| Integrate risk management with the SE planning to identify areas that require special attention and/or trades.
|-
| Management Reserve
| The amount of management reserve should be based on the risk associated with the plan.
|-
| Use Historical Data
| Use historical data for estimates and adjust for differences in the project.
|-
| Consider Lead Times
| Identify lead times and ensure that you account for them in the planning (e.g., the development of analytical tools).
|-
| Update Plans
| Prepare to update plans as additional information becomes available or changes are needed.
|-
| Use IPDTs
| An integrated product development team (IPDT) (or {{Term|Integrated Product Team (IPT) (glossary)|integrated product team (IPT)}}) is often useful to ensure adequate communication across the necessary disciplines, timely integration of all design considerations, as well as integration, testing, and consideration of the full range of risks that need to be addressed. Although there are some issues that need to be managed with them, IPDTs tend to break down the communication and knowledge stovepipes that often exist.
|}

{|
|+'''Table 2. Measurement Good Practices.''' (SEBoK Original)
! Name
! Description
|-
| Periodic Review
|
* Regularly review each measure collected.
|-
| Action Driven
|
* Measurement by itself does not control or improve process performance.
* Measurement results should be provided to decision makers for appropriate action.
|-
| Integration into Project Processes
|
* SE measurement should be integrated into the project as part of the ongoing project business rhythm.
* Data should be collected as processes are performed, not recreated as an afterthought.
|-
| Timely Information
|
* Information should be obtained early enough to allow necessary action to control or treat risks, adjust tactics and strategies, etc.
* When such actions are not successful, measurement results need to help decision-makers determine contingency actions or correct problems.
|-
| Relevance to Decision Makers
|
* Successful measurement requires the communication of meaningful information to the decision-makers.
* Results should be presented in the decision-makers' preferred format, which allows accurate and expeditious interpretation of the results.
|-
| Data Availability
|
* Decisions can rarely wait for a complete or perfect set of data, so measurement information often needs to be derived from analysis of the best available data, complemented by real-time events and qualitative insight (including experience).
|-
| Historical Data
|
* Use historical data as the basis of plans, measure what is planned versus what is achieved, archive actual achieved results, and use archived data as a historical basis for the next planning effort.
|-
| Information Model
|
* The information model defined in ISO/IEC/IEEE 15939 (2007) provides a means to link the entities that are measured to the associated measures and to the identified information need, and also describes how the measures are converted into indicators that provide insight to decision-makers.
|}
  
Additional good practices can be found in the ''[[Systems Engineering Guidebook for Intelligent Transportation Systems (ITS)]]'', ''[[NASA Systems Engineering Handbook]]'', the ''[[INCOSE Systems Engineering Handbook]]'', and ''[[ISO/IEC/IEEE 16326|Systems and Software Engineering - Life Cycle Processes - Project Management]]'' (Caltrans and USDOT 2005, 278; NASA December 2007, 1-360, sec. 6.1; INCOSE 2011, sec. 5.1; ISO/IEC/IEEE 2009, Clause 6.1).

Additional information can be found in the ''[[Systems Engineering Measurement Primer]]'', Section 4.2 (Frenz et al. 2010), and in the INCOSE ''Systems Engineering Handbook'', Section 5.7.1.5 (INCOSE 2012).
  
 
==References==

===Works Cited===
Caltrans and USDOT. 2005. ''[[Systems Engineering Guidebook for Intelligent Transportation Systems (ITS)]],'' version 1.1. Sacramento, CA, USA: California Department of Transportation (Caltrans) Division of Research & Innovation/U.S. Department of Transportation (USDOT), SEG for ITS 1.1.

DAU. 2010. ''[[Defense Acquisition Guidebook (DAG)]].'' Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense.

Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''[[Systems Engineering Measurement Primer]]: A Basic Introduction to Measurement Concepts and Use for Systems Engineering,'' version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-005-02. Accessed April 13, 2015 at http://www.incose.org/ProductsPublications/techpublications/PrimerMeasurement.

INCOSE. 2012. ''[[INCOSE Systems Engineering Handbook]]: A Guide for System Life Cycle Processes and Activities,'' version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007.

ISO/IEC/IEEE. 2009. ''[[ISO/IEC/IEEE 16326|Systems and Software Engineering - Life Cycle Processes - Project Management]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), [[ISO/IEC/IEEE 16326]]:2009(E).

ISO/IEC/IEEE. 2015. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering - System Life Cycle Processes]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 15288:2015.

Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

McGarry, J., D. Card, C. Jones, B. Layman, E. Clark, J. Dean, F. Hall. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley.

NASA. 2007. ''[[NASA Systems Engineering Handbook]].'' Washington, DC, USA: National Aeronautics and Space Administration (NASA), NASA/SP-2007-6105.

Park, R.E., W.B. Goethert, and W.A. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-BH-002.

PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]],'' version 4.0c. Practical Software and Systems Measurement Support Center. Available at: http://www.psmsc.com/PSMGuide.asp.

PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.

PSM Safety & Security TWG. 2006. ''Safety Measurement,'' version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.

PSM Safety & Security TWG. 2006. ''Security Measurement,'' version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.

QuEST Forum. 2012. ''Quality Management System (QMS) Measurements Handbook,'' Release 5.0. Plano, TX, USA: QuEST Forum.

Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]],'' version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.

Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]],'' version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03.

SEI. 1995. ''[[A Systems Engineering Capability Maturity Model]],'' version 1.1. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie-Mellon University (CMU), CMU/SEI-95-MM-003.

SEI. 2007. ''[[Capability Maturity Model Integrated (CMMI) for Development]],'' version 1.2, measurement and analysis process area. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

SEI. 2010. "Measurement and Analysis Process Area" in ''Capability Maturity Model Integrated (CMMI) for Development'', version 1.3. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

Software Productivity Center, Inc. 2011. Software Productivity Center web site. Accessed August 20, 2011. Available at: http://www.spc.ca/.

Statz, J. et al. 2005. ''Measurement for Process Improvement,'' version 1.0. York, UK: Practical Software and Systems Measurement (PSM).

Tufte, E. 2001. ''The Visual Display of Quantitative Information,'' 2nd ed. Cheshire, CT, USA: Graphics Press.

Wasson, C. 2006. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.

===Primary References===
Caltrans and USDOT. 2005. ''[[Systems Engineering Guidebook for Intelligent Transportation Systems (ITS)]],'' version 1.1. Sacramento, CA, USA: California Department of Transportation (Caltrans) Division of Research & Innovation/U.S. Department of Transportation (USDOT), SEG for ITS 1.1.

DAU. 2010. ''[[Defense Acquisition Guidebook (DAG)]].'' Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense.

Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. ''[[Systems Engineering Measurement Primer]]: A Basic Introduction to Measurement Concepts and Use for Systems Engineering,'' version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-005-02.

INCOSE. 2012. ''[[INCOSE Systems Engineering Handbook]]: A Guide for System Life Cycle Processes and Activities,'' version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2007. ''[[ISO/IEC/IEEE 15939|Systems and Software Engineering - Measurement Process]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC/IEEE 15939]]:2007.

ISO/IEC/IEEE. 2009. ''[[ISO/IEC/IEEE 16326|Systems and Software Engineering - Life Cycle Processes - Project Management]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), [[ISO/IEC/IEEE 16326]]:2009(E).

ISO/IEC/IEEE. 2015. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering - System Life Cycle Processes]]''. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 15288:2015.

NASA. 2007. ''[[NASA Systems Engineering Handbook]].'' Washington, DC, USA: National Aeronautics and Space Administration (NASA), NASA/SP-2007-6105.

PSM. 2000. ''[[Practical Software and Systems Measurement (PSM) Guide]],'' version 4.0c. Practical Software and Systems Measurement Support Center. Available at: http://www.psmsc.com.

Roedler, G. and C. Jones. 2005. ''[[Technical Measurement Guide]],'' version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.

Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. ''[[Systems Engineering Leading Indicators Guide]],'' version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03.

SEI. 1995. ''[[A Systems Engineering Capability Maturity Model]],'' version 1.1. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie-Mellon University (CMU), CMU/SEI-95-MM-003.

SEI. 2007. ''[[Capability Maturity Model Integrated (CMMI) for Development]],'' version 1.2, measurement and analysis process area. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

===Additional References===
Boehm, B., C. Abts, A.W. Brown, S. Chulani, B.K. Clark, E. Horowitz, R. Madachy, D.J. Reifer, B. Steece. 2000. ''Software Cost Estimation with COCOMO II''. Englewood Cliffs, NJ, USA: Prentice Hall.

DeMarco, T. and T. Lister. 2003. ''Waltzing with Bears: Managing Risks on Software Projects.'' New York, NY, USA: Dorset House.

ISO/IEC/IEEE. 2009. ''Systems and Software Engineering - Life Cycle Processes - Project Management.'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 16326:2009(E).

Kasunic, M. and W. Anderson. 2004. ''Measuring Systems Interoperability: Challenges and Opportunities.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

McGarry, J. et al. 2002. ''Practical Software Measurement: Objective Information for Decision Makers''. Boston, MA, USA: Addison-Wesley.

NASA. 2007. ''[[NASA Systems Engineering Handbook]].'' Washington, DC, USA: National Aeronautics and Space Administration (NASA), NASA/SP-2007-6105.

Park, R.E., W.B. Goethert, and W.A. Florac. 1996. ''Goal-Driven Software Measurement – A Guidebook''. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-BH-002.

PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.

PSM Safety & Security TWG. 2006. ''Safety Measurement,'' version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.

PSM Safety & Security TWG. 2006. ''Security Measurement,'' version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.

SEI. 2010. "Measurement and Analysis Process Area" in ''Capability Maturity Model Integrated (CMMI) for Development'', version 1.3. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

Software Productivity Center, Inc. 2011. Software Productivity Center web site. Accessed August 20, 2011. Available at: http://www.spc.ca/.

Statz, J. et al. 2005. ''Measurement for Process Improvement,'' version 1.0. York, UK: Practical Software and Systems Measurement (PSM).

Tufte, E. 2001. ''The Visual Display of Quantitative Information,'' 2nd ed. Cheshire, CT, USA: Graphics Press.

Valerdi, R. 2008. ''The Constructive Systems Engineering Cost Model (COSYSMO): Quantifying the Costs of Systems Engineering Effort in Complex Systems''. Saarbrücken, Germany: VDM Verlag Dr. Muller.

Wasson, C. 2006. ''System Analysis, Design, Development: Concepts, Principles, and Practices''. Hoboken, NJ, USA: John Wiley and Sons.
 
 
----
<center>'''SEBoK v. 2.0, released 1 June 2019'''</center>

Revision as of 02:59, 19 October 2019

Measurement and the accompanying analysis are fundamental elements of systems engineering (SE) and technical management. SE measurement provides information relating to the products developed, services provided, and processes implemented to support effective management of the processes and to objectively evaluate product or service quality. Measurement supports realistic planning, provides insight into actual performance, and facilitates assessment of suitable actions (Roedler and Jones 2005, 1-65; Frenz et al. 2010).

Appropriate measures and indicators are essential inputs to tradeoff analyses to balance cost, schedule, and technical objectives. Periodic analysis of the relationships between measurement results and review of the requirements and attributes of the system provides insights that help to identify issues early, when they can be resolved with less impact. Historical data, together with project or organizational context information, forms the basis for the predictive models and methods that should be used.

Fundamental Concepts

The discussion of measurement in this article is based on some fundamental concepts. Roedler et al. (2005, 1-65) states three key SE measurement concepts that are paraphrased here:

  1. SE measurement is a consistent but flexible process that is tailored to the unique information needs and characteristics of a particular project or organization and revised as information needs change.
  2. Decision makers must understand what is being measured. Key decision-makers must be able to connect what is being measured to what they need to know and what decisions they need to make as part of a closed-loop, feedback control process (Frenz et al. 2010).
  3. Measurement must be used to be effective.

Measurement Process Overview

The measurement process as presented here consists of four activities from Practical Software and Systems Measurement (PSM) (2011) and described in (ISO/IEC/IEEE 15939; McGarry et al. 2002):

  1. establish and sustain commitment
  2. plan measurement
  3. perform measurement
  4. evaluate measurement

This approach has been the basis for establishing a common process across the software and systems engineering communities. This measurement approach has been adopted by the Capability Maturity Model Integration (CMMI) measurement and analysis process area (SEI 2006, 10), as well as by international systems and software engineering standards (ISO/IEC/IEEE 15939; ISO/IEC/IEEE 15288, 1). The International Council on Systems Engineering (INCOSE) Measurement Working Group has also adopted this measurement approach for several of their measurement assets, such as the INCOSE SE Measurement Primer (Frenz et al. 2010) and Technical Measurement Guide (Roedler and Jones 2005). This approach has provided a consistent treatment of measurement that allows the engineering community to communicate more effectively about measurement. The process is illustrated in Figure 1 from Roedler and Jones (2005) and McGarry et al. (2002).

Figure 1. Four Key Measurement Process Activities (PSM 2011). Reprinted with permission of Practical Software and Systems Measurement (PSM). All other rights are reserved by the copyright owner.

Establish and Sustain Commitment

This activity focuses on establishing the resources, training, and tools to implement a measurement process and ensure that there is a management commitment to use the information that is produced. Refer to PSM (August 18, 2011) and SPC (2011) for additional detail.

Plan Measurement

This activity focuses on defining measures that provide insight into project or organization information needs. This includes identifying what the decision-makers need to know and when they need to know it, relaying these information needs to those entities in a manner that can be measured, and identifying, prioritizing, selecting, and specifying measures based on project and organization processes (Jones 2003, 15-19). This activity also identifies the reporting format, forums, and target audience for the information provided by the measures.

Here are a few widely used approaches to identify the information needs and derive associated measures, where each can be focused on identifying measures that are needed for SE management:

  • The PSM approach, which uses a set of information categories, measurable concepts, and candidate measures to aid the user in determining relevant information needs and the characteristics of those needs on which to focus (PSM August 18, 2011).
  • The (GQM) approach, which identifies explicit measurement goals. Each goal is decomposed into several questions that help in the selection of measures that address the question and provide insight into the goal achievement (Park, Goethert, and Florac 1996).
  • Software Productivity Center’s (SPC's) 8-step Metrics Program, which also includes stating the goals and defining measures needed to gain insight for achieving the goals (SPC 2011).

The following are good sources for candidate measures that address information needs and measurable concepts/questions:

  • PSM Web Site (PSM 2011)
  • PSM Guide, Version 4.0, Chapters 3 and 5 (PSM 2000)
  • SE Leading Indicators Guide, Version 2.0, Section 3 (Roedler et al. 2010)
  • Technical Measurement Guide, Version 1.0, Section 10 (Roedler and Jones 2005, 1-65)
  • Safety Measurement (PSM White Paper), Version 3.0, Section 3.4 (Murdoch 2006, 60)
  • Security Measurement (PSM White Paper), Version 3.0, Section 7 (Murdoch 2006, 67)
  • Measuring Systems Interoperability, Section 5 and Appendix C (Kasunic and Anderson 2004)
  • Measurement for Process Improvement (PSM Technical Report), version 1.0, Appendix E (Statz 2005)

The INCOSE SE Measurement Primer (Frenz et al. 2010) provides a list of attributes of a good measure with definitions for each attribute; these attributes include relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy. Evaluating candidate measures against these attributes can help assure the selection of more effective measures.

The details of each measure need to be unambiguously defined and documented. Templates for the specification of measures and indicators are available on the PSM website (2011) and in Goethert and Siviy (2004).

Perform Measurement

This activity focuses on the collection and preparation of measurement data, measurement analysis, and the presentation of the results to inform decision makers. The preparation of the measurement data includes verification, normalization, and aggregation of the data, as applicable. Analysis includes estimation, feasibility analysis of plans, and performance analysis of actual data against plans.

The quality of the measurement results is dependent on the collection and preparation of valid, accurate, and unbiased data. Data verification, validation, preparation, and analysis techniques are discussed in PSM (2011) and SEI (2010). Per TL 9000, Quality Management System Guidance, The analysis step should integrate quantitative measurement results and other qualitative project information, in order to provide managers the feedback needed for effective decision making (QuEST Forum 2012, 5-10). This provides richer information that gives the users the broader picture and puts the information in the appropriate context.

There is a significant body of guidance available on good ways to present quantitative information. Edward Tufte has several books focused on the visualization of information, including The Visual Display of Quantitative Information (Tufte 2001).

Other resources that contain further information pertaining to understanding and using measurement results include

  • PSM (2011)
  • ISO/IEC/IEEE 15939, clauses 4.3.3 and 4.3.4
  • Roedler and Jones (2005), sections 6.4, 7.2, and 7.3

Evaluate Measurement

This activity involves the analysis of information that explains the periodic evaluation and improvement of the measurement process and specific measures. One objective is to ensure that the measures continue to align with the business goals and information needs, as well as provide useful insight. This activity should also evaluate the SE measurement activities, resources, and infrastructure to make sure it supports the needs of the project and organization. Refer to PSM (2011) and Practical Software Measurement: Objective Information for Decision Makers (McGarry et al. 2002) for additional detail.

Systems Engineering Leading Indicators

Leading indicators are aimed at providing predictive insight that pertains to an information need. A SE leading indicator is a measure for evaluating the effectiveness of a how a specific activity is applied on a project in a manner that provides information about impacts that are likely to affect the system performance objectives (Roedler et al. 2010). Leading indicators may be individual measures or collections of measures and associated analysis that provide future systems engineering performance insight throughout the life cycle of the system; they support the effective management of systems engineering by providing visibility into expected project performance and potential future states (Roedler et al. 2010).

As shown in Figure 2, a leading indicator is composed of characteristics, a condition, and a predicted behavior. The characteristics and conditions are analyzed on a periodic or as-needed basis to predict behavior within a given confidence level and within an accepted time range into the future. More information is also provided by Roedler et al. (2010).

Figure 2. Composition of a Leading Indicator (Roedler et al. 2010). Reprinted with permission of the International Council on Systems Engineering (INCOSE) and Practical Software and Systems Measurement (PSM). All other rights are reserved by the copyright owner.

Technical Measurement

Technical measurement is the set of measurement activities used to provide information about progress in the definition and development of the technical solution, ongoing assessment of the associated risks and issues, and the likelihood of meeting the critical objectives of the acquirer. This insight helps an engineer make better decisions throughout the life cycle of a system and increase the probability of delivering a technical solution that meets both the specified requirements and the mission needs. The insight is also used in trade-off decisions when performance is not within the thresholds or goals.

Technical measurement includes measures of effectiveness (MOEs), measures of performance (MOPs), and technical performance measures (TPMs) (Roedler and Jones 2005, 1-65). The relationships between these types of technical measures are shown in Figure 3 and explained in the reference for Figure 3. Using the measurement process described above, technical measurement can be planned early in the life cycle and then performed throughout the life cycle with increasing levels of fidelity as the technical solution is developed, facilitating predictive insight and preventive or corrective actions. More information about technical measurement can be found in the NASA Systems Engineering Handbook, System Analysis, Design, Development: Concepts, Principles, and Practices, and the Systems Engineering Leading Indicators Guide (NASA December 2007, 1-360, Section 6.7.2.2; Wasson 2006, Chapter 34; Roedler and Jones 2005).

Figure 3. Relationship of the Technical Measures (Roedler et al 2010). Reprinted with permission of the International Council on Systems Engineering (INCOSE) and Practical Software and Systems Measurement (PSM). All other rights are reserved by the copyright owner.

Service Measurement

The same measurement activities can be applied for service measurement; however, the context and measures will be different. Service providers have a need to balance efficiency and effectiveness, which may be opposing objectives. Good service measures are outcome-based, focus on elements important to the customer (e.g., service availability, reliability, performance, etc.), and provide timely, forward-looking information.

For services, the terms critical success factors (CSF) and key performance indicators (KPI) are used often when discussing measurement. CSFs are the key elements of the service or service infrastructure that are most important to achieve the business objectives. KPIs are specific values or characteristics measured to assess achievement of those objectives.

More information about service measurement can be found in the Service Design and Continual Service Improvement volumes of BMP (2010, 1). More information on service SE can be found in the Service Systems Engineering article.

Linkages to Other Systems Engineering Management Topics

SE measurement has linkages to other SEM topics. The following are a few key linkages adapted from Roedler and Jones (2005):

  • Planning – SE measurement provides the historical data and supports the estimation for, and feasibility analysis of, the plans for realistic planning.
  • Assessment and Control – SE measurement provides the objective information needed to perform the assessment and determination of appropriate control actions. The use of leading indicators allows for early assessment and control actions that identify risks and/or provide insight to allow early treatment of risks to minimize potential impacts.
  • Risk Management – SE risk management identifies the information needs that can impact project and organizational performance. SE measurement data helps to quantify risks and subsequently provides information about whether risks have been successfully managed.
  • Decision Management – SE Measurement results inform decision making by providing objective insight.

Practical Considerations

Key pitfalls and good practices related to SE measurement are described in the next two sections.

Pitfalls

Some of the key pitfalls encountered in planning and performing SE Measurement are provided in Table 1.

Table 1. Measurement Pitfalls. (SEBoK Original)
Name Description
Golden Measures
  • Looking for the one measure or small set of measures that applies to all projects.
  • No one-size-fits-all measure or measurement set exists.
  • Each project has unique information needs (e.g., objectives, risks, and issues).
  • The one exception is that, in some cases with consistent product lines, processes, and information needs, a small core set of measures may be defined for use across an organization.
Single-Pass Perspective
  • Viewing measurement as a single-pass activity.
  • To be effective, measurement needs to be performed continuously, including the periodic identification and prioritization of information needs and associated measures.
Unknown Information Need
  • Performing measurement activities without the understanding of why the measures are needed and what information they provide.
  • This can lead to wasted effort.
Inappropriate Usage
  • Using measurement inappropriately, such as measuring the performance of individuals or making interpretations without context information.
  • This can lead to bias in the results or incorrect interpretations.

Good Practices

Some good practices, gathered from the references, are provided in Table 2.

Table 2. Measurement Good Practices. (SEBoK Original)
Periodic Review
  • Regularly review each measure collected.
Action Driven
  • Measurement by itself does not control or improve process performance.
  • Measurement results should be provided to decision makers for appropriate action.
Integration into Project Processes
  • SE Measurement should be integrated into the project as part of the ongoing project business rhythm.
  • Data should be collected as processes are performed, not recreated as an afterthought.
Timely Information
  • Information should be obtained early enough to allow necessary action to control or treat risks, adjust tactics and strategies, etc.
  • When such actions are not successful, measurement results need to help decision-makers determine contingency actions or correct problems.
Relevance to Decision Makers
  • Successful measurement requires the communication of meaningful information to the decision-makers.
  • Results should be presented in the decision-makers' preferred format.
  • This allows accurate and expeditious interpretation of the results.
Data Availability
  • Decisions can rarely wait for a complete or perfect set of data, so measurement information often needs to be derived from analysis of the best available data, complemented by real-time events and qualitative insight (including experience).
Historical Data
  • Use historical data as the basis of plans, measure what is planned versus what is achieved, archive actual achieved results, and use archived data as a historical basis for the next planning effort.
Information Model
  • The information model defined in ISO/IEC/IEEE (2007) provides a means to link the entities that are measured to the associated measures and to the identified information need, and describes how the measures are converted into indicators that provide insight to decision-makers. A minimal illustrative sketch of such a chain follows this table.
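
To show how a 15939-style chain from information need to indicator might be represented, here is a minimal, illustrative data-structure sketch; the class names, measures, and values are assumptions for demonstration and do not reproduce the standard's full normative terminology.

```python
# Illustrative sketch of an ISO/IEC/IEEE 15939-style information model:
# an information need is satisfied by an indicator derived from base
# measures of measurable attributes. Names and values are assumed.

from dataclasses import dataclass

@dataclass
class BaseMeasure:
    attribute: str          # measurable attribute of the entity
    value: float

@dataclass
class Indicator:
    name: str
    information_need: str   # the question the indicator helps answer
    value: float
    interpretation: str

# Base measures collected from the measured entity (assumed values).
defects_found = BaseMeasure("defects found in verification", 42)
requirements_verified = BaseMeasure("requirements verified", 350)

# Derived measure used as an indicator serving the information need.
defect_density = defects_found.value / requirements_verified.value
indicator = Indicator(
    name="Verification defect density",
    information_need="Is product quality sufficient to proceed to validation?",
    value=defect_density,
    interpretation="Compare against the organizational baseline before deciding",
)

print(f"{indicator.name}: {indicator.value:.3f} defects/requirement")
print(f"Information need: {indicator.information_need}")
```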

Additional information can be found in the Systems Engineering Measurement Primer, Section 4.2 (Frenz et al. 2010), and INCOSE Systems Engineering Handbook, Section 5.7.1.5 (2012).

References

Works Cited

Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. Systems Engineering Measurement Primer: A Basic Introduction to Measurement Concepts and Use for Systems Engineering, version 2.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-005-02. Accessed April 13, 2015 at http://www.incose.org/ProductsPublications/techpublications/PrimerMeasurement

INCOSE. 2012. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2007. Systems and software engineering - Measurement process. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), ISO/IEC/IEEE 15939:2007.

ISO/IEC/IEEE. 2015. Systems and Software Engineering -- System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 15288:2015.

Kasunic, M. and W. Anderson. 2004. Measuring Systems Interoperability: Challenges and Opportunities. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

McGarry, J., D. Card, C. Jones, B. Layman, E. Clark, J. Dean, and F. Hall. 2002. Practical Software Measurement: Objective Information for Decision Makers. Boston, MA, USA: Addison-Wesley.

NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Park, R.E., W.B. Goethert, and W.A. Florac. 1996. Goal-Driven Software Measurement – A Guidebook. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002.

PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.

PSM. 2000. Practical Software and Systems Measurement (PSM) Guide, version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com/PSMGuide.asp.

PSM Safety & Security TWG. 2006. Safety Measurement, version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.

PSM Safety & Security TWG. 2006. Security Measurement, version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.

QuEST Forum. 2012. Quality Management System (QMS) Measurements Handbook, Release 5.0. Plano, TX, USA: QuEST Forum.

Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. Systems Engineering Leading Indicators Guide, version 2.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03.

Roedler, G. and C. Jones. 2005. Technical Measurement Guide, version 1.0. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.

SEI. 2010. "Measurement and Analysis Process Area" in Capability Maturity Model Integrated (CMMI) for Development, version 1.3. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

Software Productivity Center, Inc. 2011. Software Productivity Center website. Accessed August 20, 2011. Available at: http://www.spc.ca/

Statz, J. et al. 2005. Measurement for Process Improvement, version 1.0. York, UK: Practical Software and Systems Measurement (PSM).

Tufte, E. 2006. The Visual Display of Quantitative Information. Cheshire, CT, USA: Graphics Press.

Wasson, C. 2005. System Analysis, Design, Development: Concepts, Principles, and Practices. Hoboken, NJ, USA: John Wiley and Sons.

Primary References

Frenz, P., G. Roedler, D.J. Gantzer, P. Baxter. 2010. Systems Engineering Measurement Primer: A Basic Introduction to Measurement Concepts and Use for Systems Engineering, version 2.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-005-02. Accessed April 13, 2015 at http://www.incose.org/ProductsPublications/techpublications/PrimerMeasurement

ISO/IEC/IEEE. 2007. Systems and Software Engineering - Measurement Process. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), ISO/IEC/IEEE 15939:2007.

PSM. 2000. Practical Software and Systems Measurement (PSM) Guide, version 4.0c. Practical Software and System Measurement Support Center. Available at: http://www.psmsc.com.

Roedler, G., D. Rhodes, C. Jones, and H. Schimmoller. 2010. Systems Engineering Leading Indicators Guide, version 2.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2005-001-03.

Roedler, G. and C. Jones. 2005. Technical Measurement Guide, version 1.0. San Diego, CA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-020-01.

Additional References

Kasunic, M. and W. Anderson. 2004. Measuring Systems Interoperability: Challenges and Opportunities. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

McGarry, J. et al. 2002. Practical Software Measurement: Objective Information for Decision Makers. Boston, MA, USA: Addison-Wesley.

NASA. 2007. NASA Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Park, R.E., W.B. Goethert, and W.A. Florac. 1996. Goal-Driven Software Measurement – A Guidebook. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU), CMU/SEI-96-HB-002.

PSM. 2011. "Practical Software and Systems Measurement." Accessed August 18, 2011. Available at: http://www.psmsc.com/.

PSM Safety & Security TWG. 2006. Safety Measurement, version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SafetyWhitePaper_v3.0.pdf.

PSM Safety & Security TWG. 2006. Security Measurement, version 3.0. Practical Software and Systems Measurement. Available at: http://www.psmsc.com/Downloads/TechnologyPapers/SecurityWhitePaper_v3.0.pdf.

SEI. 2010. "Measurement and Analysis Process Area" in Capability Maturity Model Integrated (CMMI) for Development, version 1.3. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

Software Productivity Center, Inc. 2011. Software Productivity Center website. Accessed August 20, 2011. Available at: http://www.spc.ca/

Statz, J. et al. 2005. Measurement for Process Improvement, version 1.0. York, UK: Practical Software and Systems Measurement (PSM).

Tufte, E. 2006. The Visual Display of Quantitative Information. Cheshire, CT, USA: Graphics Press.

Wasson, C. 2005. System Analysis, Design, Development: Concepts, Principles, and Practices. Hoboken, NJ, USA: John Wiley and Sons.


SEBoK v. 2.0, released 1 June 2019