{{Term|Verification (glossary)|System Verification}} is a set of actions used to check the ''correctness'' of any element, such as a {{Term|System Element (glossary)|system element}}, a {{Term|System (glossary)|system}}, a document, a {{Term|Service (glossary)|service}}, a task, a {{Term|Requirement (glossary)|requirement}}, etc. These actions are planned and carried out throughout the {{Term|Life Cycle (glossary)|life cycle}} of the system. Verification is a generic term that needs to be instantiated within the context in which it occurs. As a process, verification cuts across every life cycle stage of the system. In particular, during the development cycle of the system, the verification process is performed in parallel with the {{Term|System Definition (glossary)|system definition}} and {{Term|System Realization (glossary)|system realization}} processes and applies to any activity and any product resulting from that activity. The activities of every life cycle process can work together with those of the verification process; for example, the {{Term|Integration (glossary)|integration}} process frequently uses the verification process. It is important to remember that verification, while separate from [[System Validation|validation]], is intended to be performed in conjunction with validation.
  
==Definition and Purpose==
{{Term|Verification (glossary)|Verification}} is the confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. A note added in ISO/IEC/IEEE 15288 extends this scope to a set of activities that compares a system or system element against the requirements, architecture and design characteristics, and other properties to be verified (ISO/IEC/IEEE 2015). This comparison may consider, but is not limited to, specified requirements, the design description, and the system itself.
The purpose of verification, as a generic action, is to identify the faults or defects introduced whenever inputs are transformed into outputs. Verification is used to provide information and evidence that the transformation was carried out according to the selected and appropriate methods, techniques, standards, or rules.
  
Verification is based on tangible evidence; i.e., it is based on information whose veracity can be demonstrated by factual results obtained from techniques such as inspection, measurement, testing, analysis, calculation, etc. Thus, the process of verifying a {{Term|System (glossary)|system}} ({{Term|Product (glossary)|product}}, {{Term|Service (glossary)|service}}, {{Term|Enterprise (glossary)|enterprise}}, or {{Term|System of Systems (SoS) (glossary)|system of systems}} (SoS)) consists of comparing the realized characteristics or properties of the product, service, or enterprise against its expected design properties.
  
==Principles and Concepts==
===Concept of Verification Action===
====Why Verify?====
Any human thought is susceptible to error, and the same is true of any engineering activity. Studies in human reliability have shown that even people trained to perform a specific operation make around 1-3 errors per hour in best-case scenarios. In any activity, or in the outcome of any activity, the search for potential errors should not be neglected, regardless of whether one thinks they will happen or believes they should not happen, because the consequences of errors can be extremely significant failures or threats.
  
A '''verification action''' is defined, and then performed, as shown in Figure 1.
[[File:Definition_and_usage_of_a_Verification_Action.png|thumb|400px|center|'''Figure 1. Definition and Usage of a Verification Action.''' (SEBoK Original)]]
The definition of a verification action applied to an engineering element includes the following:
* Identification of the element on which the verification action will be performed
* Identification of the reference that defines the expected result of the verification action (see the examples in Table 1)
The performance of a verification action includes the following:
* Obtaining a result by performing the verification action on the submitted element
* Comparing the obtained result with the expected result
* Deducing the degree of correctness of the element, as illustrated in the sketch below
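
To make these tasks concrete, here is a minimal Python sketch of a verification action and its performance. It is only an illustration of the definition above; the class, field, and function names are hypothetical rather than part of SEBoK or any standard.

<syntaxhighlight lang="python">
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class VerificationAction:
    """Hypothetical model of a verification action (names are illustrative)."""
    element_id: str                   # the element on which the action is performed
    reference: str                    # the reference that defines the expected result
    expected_result: Any              # what the reference says the result should be
    obtain_result: Callable[[], Any]  # how to obtain the actual result (e.g., a test)

def perform(action: VerificationAction) -> bool:
    """Obtain a result, compare it with the expected result, and deduce correctness."""
    obtained = action.obtain_result()
    is_correct = (obtained == action.expected_result)  # the comparison is binary
    status = "compliant" if is_correct else "non-compliant"
    print(f"{action.element_id}: expected {action.expected_result!r}, "
          f"obtained {obtained!r} -> {status}")
    return is_correct

# Example: verify that a power supply element delivers 12 V, as designed.
check_voltage = VerificationAction(
    element_id="PSU-01",
    reference="system design document, section 4.2",
    expected_result=12,
    obtain_result=lambda: 12,  # stand-in for a real measurement
)
perform(check_voltage)
</syntaxhighlight>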
  
====What to Verify?====
Any engineering element can be verified using a specific reference for comparison: stakeholder requirement, system requirement, function, system element, document, etc. Examples are provided in Table 1.
{|
|+'''Table 1. Examples of Verified Items.''' (SEBoK Original)
!Items
!Explanation for Verification
|-
|'''Document'''
|To verify a document is to check the application of drafting rules.
|-
|'''Stakeholder Requirement and System Requirement'''
|To verify a stakeholder requirement or a system requirement is to check the application of syntactic and grammatical rules, as well as the characteristics defined in the stakeholder requirements definition process and the system requirements definition process, such as necessity, implementation independence, unambiguity, consistency, completeness, singularity, feasibility, traceability, and verifiability.
|-
|'''Design'''
|To verify the design of a system is to check its logical and physical architecture elements against the characteristics of the outcomes of the design processes.
|-
|'''System'''
|To verify a system (product, service, or enterprise) is to check its realized characteristics or properties against its expected design characteristics.
|-
|'''Aggregate'''
|To verify an aggregate for integration is to check every interface and interaction between implemented elements.
|-
|'''Verification Procedure'''
|To verify a verification procedure is to check the application of a predefined template and drafting rules.
|}
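
Table 1 can be read as a mapping from item type to the kind of check applied. The hypothetical Python sketch below makes that reading explicit; the two checks are deliberately simplistic stand-ins for real drafting and requirement-writing rules.

<syntaxhighlight lang="python">
def verify_document(doc: str) -> bool:
    # Simplistic stand-in for "check the application of drafting rules".
    return doc.strip() != "" and doc[0].isupper()

def verify_requirement(req: str) -> bool:
    # Simplistic stand-in for requirement characteristics such as
    # "unambiguous" and "singular".
    banned = ("and/or", "etc.", "TBD")
    return req.endswith(".") and not any(term in req for term in banned)

# Dispatch table: each verified item type gets its own reference and rules.
VERIFIERS = {
    "document": verify_document,
    "requirement": verify_requirement,
}

print(VERIFIERS["requirement"]("The pump shall deliver 10 L/min."))  # True
print(VERIFIERS["requirement"]("Flow rate TBD."))                    # False
</syntaxhighlight>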
  
===Verification versus Validation===
The term ''verification'' is often associated with the term ''validation'' and understood as a single concept of ''V&V''. Validation is used to ensure that ''one is working the right problem'', whereas verification is used to ensure that ''one has solved the problem right'' (Martin 1997). Etymologically, ''verification'' comes from the Latin ''verus'' (truth) and ''facere'' (to make or perform); to verify is thus to prove that something is true or correct (a property, a characteristic, etc.). ''Validation'' comes from the Latin ''valere'' (to become strong) and shares its etymological root with the word ''value''; to validate is thus to prove that something has the right features to produce the expected effects. (Adapted from "V & V in Plain English" (Lake 1999).)
  
The main differences between the verification process and the validation process concern the references used to check the correctness of an element and the way the resulting correctness is judged acceptable:
* Within verification, the comparison between the expected result and the obtained result is generally binary, whereas within validation, the comparison may require a judgment of value about whether to accept the obtained result relative to a threshold or limit (see the sketch below).
* Verification relates more to a single element, whereas validation relates more to a set of elements and considers this set as a whole.
* Validation presupposes that verification actions have already been performed.
* The techniques used to define and perform verification actions and those used for validation actions are very similar.
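
The first difference, a binary comparison versus a judgment against a threshold, can be shown in a minimal sketch (the values are purely illustrative):

<syntaxhighlight lang="python">
def verify(obtained, expected) -> bool:
    """Verification: the comparison is binary; the element matches its reference or not."""
    return obtained == expected

def validate(obtained: float, threshold: float) -> bool:
    """Validation: accepting the result may involve judging it against a
    threshold or limit rather than requiring strict equality."""
    return obtained >= threshold

print(verify(obtained=12, expected=12))         # True: the element was built right
print(validate(obtained=0.93, threshold=0.90))  # True: good enough for the expected effect
</syntaxhighlight>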
  
===Integration, Verification, and Validation of the System===
There is sometimes a misconception that verification occurs after integration and before validation. In most cases, it is more appropriate to begin verification activities during development or {{Term|Implementation (glossary)|implementation}} and to continue them into [[System Deployment and Use|deployment and use]].
  
Once the system elements have been realized, they are integrated to form the complete system. Integration consists of assembling the system and performing the verification actions called for by the integration process. A final validation activity generally occurs once the system is integrated, but a certain number of validation actions are also performed in parallel with system integration in order to reduce the total number of verification and validation actions, while controlling the risks that would be introduced if some checks were excluded. Integration, verification, and validation are thus intimately intertwined, because the verification and validation strategy must be optimized together with the integration strategy.
  
==Process Approach==
===Purpose and Principle of the Approach===
The purpose of the verification process is to confirm that the system fulfills the specified design requirements. This process provides the information required to effect the remedial actions that correct non-conformances in the realized system or the processes that act on it; see [[ISO/IEC/IEEE 15288]] (ISO/IEC/IEEE 2015).
  
Each system element and the complete system itself should be compared against its own design references (specified requirements). As stated by Dennis Buede, ''verification is the matching of [configuration items], components, sub-systems, and the system to corresponding requirements to ensure that each has been built right'' (Buede 2009). This means that the verification process is instantiated as many times as necessary during the global development of the system. Because of the generic nature of a process, the verification process can be applied to any engineering element that has contributed to the definition and realization of the system elements and of the system itself.
Given the huge number of potential verification actions that a systematic approach would generate, the verification strategy must be optimized. The strategy balances what must be verified against constraints, such as time, cost, and the feasibility of testing, which naturally limit the number of verification actions, and against the risks one accepts when excluding particular verification actions.
  
Several approaches exist for defining the verification process. The International Council on Systems Engineering (INCOSE) identifies two main steps: planning and performing verification actions (INCOSE 2012). NASA takes a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce a report, and capture work products (NASA 2007, 102). Any approach may be used, provided that it is appropriate to the scope of the system and the constraints of the project, includes in some way the activities of the process listed below, and is appropriately coordinated with other activities.
  
'''Generic inputs''' are the baseline references of the submitted element. If the element is a system, the inputs are the logical and physical architecture elements as described in a system design document, the design descriptions of interfaces internal to the system, the requirements of interfaces external to the system, and, by extension, the system requirements.

'''Generic outputs''' are the verification plan (which includes the verification strategy and the selected verification actions), verification procedures, verification tools, the verified element or system, verification reports, issue/trouble reports, and change requests on the design.
  
===Activities of the Process===
To establish the verification strategy, drafted in a verification plan (an activity carried out concurrently with system definition activities), the following steps are necessary:
* Identify the verification scope by listing as many characteristics or properties as possible that should be checked. The number of potential verification actions can be extremely high.
* Identify the constraints that limit potential verification actions, according to their origin: technical feasibility, management constraints (cost, time, availability of verification means or qualified personnel), and contractual constraints that are critical to the mission.
* Define the appropriate verification techniques to be applied (inspection, analysis, simulation, peer review, testing, etc.) and the best project step at which to perform each verification action, given the constraints.
* Trade off what should be verified (the scope) against the constraints and limits, and deduce what can be verified; verification actions are selected according to the type of system, the objectives of the project, the acceptable risks, and the constraints (a simplified selection sketch follows this list).
* Optimize the verification strategy by defining the most appropriate verification technique for each verification action and the necessary verification means (tools, test benches, personnel, location, and facilities) for the selected technique.
* Schedule the execution of verification actions within the project steps or milestones and define the configuration of the elements submitted to verification actions (this mainly concerns testing of physical elements).
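
A minimal sketch of the tradeoff step, using a greedy risk-per-cost heuristic with invented actions and figures; a real strategy weighs many more factors, such as feasibility, contractual constraints, and scheduling.

<syntaxhighlight lang="python">
# Candidate verification actions: (name, cost in hours, risk accepted if excluded).
candidates = [
    ("inspect wiring harness",  4, 0.30),
    ("analyze thermal model",  16, 0.70),
    ("test power-on sequence",  8, 0.90),
    ("peer review interfaces",  2, 0.40),
]

budget_hours = 20

# Prefer actions that remove the most risk per hour spent.
ranked = sorted(candidates, key=lambda c: c[2] / c[1], reverse=True)

selected, spent = [], 0
for name, cost, risk in ranked:
    if spent + cost <= budget_hours:
        selected.append(name)
        spent += cost

excluded = [name for name, _, _ in candidates if name not in selected]
print("what can be verified:", selected)
print("risks accepted by exclusion:", excluded)
</syntaxhighlight>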
  
Performing verification actions includes the following tasks:
* Detail each verification action; in particular, note the expected results, the verification techniques to be applied, and the corresponding means required (equipment, resources, and qualified personnel).
* Acquire the verification means used during the system definition steps (qualified personnel, modeling tools, mock-ups, simulators, and facilities), and then those used during the integration step (qualified personnel, verification tools, measuring equipment, facilities, verification procedures, etc.).
* Carry out verification procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
* Capture and record the results obtained when performing the verification actions with the verification procedures and means.
 
  
The obtained results must be analyzed and compared to the expected results so that the status may be recorded as either ''compliant'' or ''non-compliant''. {{Term|Systems Engineering (glossary)|Systems engineering}} (SE) practitioners will likely need to generate verification reports, as well as potential issue/trouble reports, and change requests on design as necessary. 
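
A hypothetical sketch of this recording step, showing how a compliant/non-compliant status and its follow-up might be captured:

<syntaxhighlight lang="python">
def record_status(action_id: str, expected, obtained) -> dict:
    """Compare the obtained result with the expected result and record the status,
    flagging non-compliances for an issue report and a possible change request."""
    status = "compliant" if obtained == expected else "non-compliant"
    entry = {"action": action_id, "expected": expected,
             "obtained": obtained, "status": status}
    if status == "non-compliant":
        entry["follow_up"] = "raise issue/trouble report; assess change request on design"
    return entry

verification_report = [
    record_status("VA-001", expected=12, obtained=12),
    record_status("VA-002", expected="interface closed", obtained="interface open"),
]
for entry in verification_report:
    print(entry)
</syntaxhighlight>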
  
Controlling the process includes the following tasks:
* Update the verification plan according to the progress of the project; in particular, planned verification actions may need to be redefined because of unexpected events.
* Coordinate verification activities with the project manager: review the schedule and the acquisition of means, personnel, and resources. Coordinate with the designers on issue/trouble/non-conformance reports and with the configuration manager on the versions of physical elements, design baselines, etc.
  
===Artifacts and Ontology Elements===
This process may create several artifacts, such as:
* verification plans (containing the verification strategy)
* verification matrices (containing, for each verification action, the submitted element, applied technique, step of execution, system block concerned, expected result, obtained result, etc.; a sample row is sketched below)
* verification procedures (describing the verification actions to be performed, the verification tools needed, the verification configuration, resources, personnel, schedule, etc.)
* verification reports
* verification tools
* verified elements
* issue / non-conformance / trouble reports
* change requests to the design
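
As an illustration of the verification matrix mentioned above, the sketch below builds one matrix row with the listed fields and writes it out as CSV. The field names are illustrative, not a prescribed schema.

<syntaxhighlight lang="python">
import csv
import io

# One hypothetical row of a verification matrix.
matrix_row = {
    "verification_action": "VA-017",
    "submitted_element":   "PSU-01",
    "applied_technique":   "test",
    "execution_step":      "integration",
    "system_block":        "power subsystem",
    "expected_result":     "12 V +/- 0.2 V",
    "obtained_result":     "",  # filled in once the action has been performed
}

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=matrix_row.keys())
writer.writeheader()
writer.writerow(matrix_row)
print(buffer.getvalue())
</syntaxhighlight>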
  
This process utilizes the ontology elements displayed in Table 2 below.
  
{|
|+'''Table 2. Main Ontology Elements as Handled within Verification.''' (SEBoK Original)
!Element
!Definition
----
Attributes (examples)
|-
|'''Verification Action'''
|A verification action describes what must be verified (against which reference), on which element, the expected result, the verification technique to apply, and the level of decomposition at which it is performed.
----
Identifier, name, description
|-
|'''Verification Procedure'''
|A verification procedure groups a set of verification actions performed together (as a scenario of tests) in a given verification configuration.
----
Identifier, name, description, duration, unit of time
|-
|'''Verification Tool'''
|A verification tool is a device or physical tool used to perform verification procedures (test bench, simulator, cap/stub, launcher, etc.).
----
Identifier, name, description
|-
|'''Verification Configuration'''
|A verification configuration groups all of the physical elements (aggregates and verification tools) necessary to perform a verification procedure.
----
Identifier, name, description
|-
|'''Risk'''
|An event having a probability of occurrence and a severity of consequence for the system mission or for other characteristics (used for technical risks in engineering). A risk is the combination of a vulnerability and a danger or threat.
|-
|'''Rationale'''
|An argument that provides the justification for the selection of an engineering element.
----
Identifier, name, description (rationale, reasons for defining a verification action or a verification procedure, for using a verification tool, etc.)
|}
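
The relationships among these ontology elements (a procedure groups verification actions and is performed in a configuration, which in turn groups tools) can be sketched as linked data structures. The Python rendering below is hypothetical and keeps only a few of the attributes from Table 2.

<syntaxhighlight lang="python">
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VerificationTool:
    identifier: str
    name: str
    description: str = ""

@dataclass
class VerificationConfiguration:
    identifier: str
    name: str
    tools: List[VerificationTool] = field(default_factory=list)  # grouped physical elements

@dataclass
class VerificationProcedure:
    identifier: str
    name: str
    actions: List[str] = field(default_factory=list)  # verification actions performed together
    configuration: Optional[VerificationConfiguration] = None  # the given configuration

bench = VerificationTool("T-01", "test bench")
setup = VerificationConfiguration("C-01", "PSU test setup", tools=[bench])
procedure = VerificationProcedure("P-01", "power-up tests",
                                  actions=["VA-001", "VA-002"],
                                  configuration=setup)
print(procedure.name, "runs", procedure.actions, "in", procedure.configuration.name)
</syntaxhighlight>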
  
===Methods and Techniques===
There are several verification techniques to check that an element or a system conforms to its design references or its specified requirements. These techniques are almost the same as those used for [[System Validation|validation]], though the application of the techniques may differ slightly. In particular, the purposes are different; verification is used to detect faults/defects, whereas validation is used to provide evidence for the satisfaction of (system and/or stakeholder) requirements. Table 3 below provides descriptions of some techniques for verification.
{|
|+'''Table 3. Verification Techniques.''' (SEBoK Original)
!Verification Technique
!Description
|-
|'''Inspection'''
|Technique based on visual or dimensional examination of an element; the verification relies on the human senses or uses simple methods of measurement and handling. Inspection is generally non-destructive and typically includes the use of sight, hearing, smell, touch, and taste; simple physical manipulation; mechanical and electrical gauging; and measurement. No stimuli (tests) are necessary. The technique is used to check properties or characteristics best determined by observation (e.g. paint color, weight, documentation, listing of code, etc.).
|-
|'''Analysis'''
|Technique based on analytical evidence obtained without any intervention on the submitted element, using mathematical or probabilistic calculation, logical reasoning (including the theory of predicates), modeling, and/or simulation under defined conditions to show theoretical compliance. It is mainly used where testing under realistic conditions cannot be achieved or is not cost-effective.
|-
|'''Analogy or Similarity'''
|Technique based on evidence from elements similar to the submitted element or on experience feedback. It is absolutely necessary to show by prediction that the context is invariant and that the outcomes are transposable (models, investigations, experience feedback, etc.). Similarity can only be used if the submitted element is similar in design, manufacture, and use; if equivalent or more stringent verification actions were used for the similar element; and if the intended operational environment is identical to or less rigorous than that of the similar element.
|-
|'''Demonstration'''
|Technique used to demonstrate correct operation of the submitted element against operational and observable characteristics without using physical measurements (no or minimal instrumentation or test equipment). Demonstration is sometimes called 'field testing'. It generally consists of a set of tests selected by the supplier to show that the element's response to stimuli is suitable, or to show that operators can perform their assigned tasks when using the element. Observations are made and compared with predetermined/expected responses. Demonstration may be appropriate when requirements or specifications are given in statistical terms (e.g. mean time to repair, average power consumption, etc.).
|-
|'''Test'''
|Technique performed on the submitted element by which functional, measurable characteristics, operability, supportability, or performance capability is quantitatively verified when subjected to controlled conditions that are real or simulated. Testing often uses special test equipment or instrumentation to obtain accurate quantitative data to be analyzed.
|-
|'''Sampling'''
|Technique based on the verification of characteristics using samples. The number of samples, the tolerances, and other characteristics must be specified to be in agreement with experience feedback.
|}
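
Choosing among these techniques depends on the properties of the characteristic to be checked and on whether realistic testing is feasible. The helper below is a deliberately simplified, hypothetical decision rule distilled from Table 3; it omits analogy/similarity and sampling, which depend on the availability of similar elements or of samples.

<syntaxhighlight lang="python">
def suggest_technique(observable: bool, measurable: bool,
                      realistic_test_feasible: bool) -> str:
    """Suggest a verification technique from coarse properties of the characteristic."""
    if observable and not measurable:
        return "inspection"    # properties best determined by observation
    if measurable and realistic_test_feasible:
        return "test"          # quantitative data under controlled conditions
    if measurable:
        return "analysis"      # calculation/simulation when realistic testing is impractical
    return "demonstration"     # correct operation without physical measurement

print(suggest_technique(observable=True, measurable=False,
                        realistic_test_feasible=False))   # inspection
print(suggest_technique(observable=False, measurable=True,
                        realistic_test_feasible=False))   # analysis
</syntaxhighlight>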
  
==Practical Considerations==
Key pitfalls and good practices related to this topic are described in the next two sections.
  
===Pitfalls===
Some of the key pitfalls encountered in planning and performing System Verification are provided in Table 4.
  
{|
|+'''Table 4. Major Pitfalls with System Verification.''' (SEBoK Original)
!Pitfall
!Description
|-
|Confusion between verification and validation
|Confusion between verification and validation causes developers to take the wrong reference/baseline when defining verification and validation actions and/or to address the wrong level of granularity (detail level for verification, global level for validation).
|-
|No verification strategy
|Verification actions are overlooked because it is impossible to check every characteristic or property of all system elements, and of the system, in every combination of operational conditions and scenarios. A strategy (a justified selection of verification actions against risks) must be established.
|-
|Skipping verification to save time
|Skipping verification activities in order to save time.
|-
|Using only testing
|Using testing as the only verification technique. Testing can check products and services only once they are implemented. Consider other techniques earlier, during design; analysis and inspection are cost-effective and allow potential errors, faults, or failures to be discovered early.
|-
|Stopping verification when funding runs out
|Stopping the performance of verification actions when the budget and/or time are consumed. Prefer criteria such as coverage rates for ending the verification activity.
|}
  
===Proven Practices===
Some proven practices gathered from the references are provided in Table 5.
  
{|
|+'''Table 5. Proven Practices with System Verification.''' (SEBoK Original)
!Practice
!Description
|-
|Start verification early in development
|The earlier the characteristics of an element are verified in the project, the easier corrections are to make and the smaller the consequences for schedule and cost.
|-
|Define criteria for ending verification
|Carrying out verification actions without limits creates a risk of cost and schedule drift. Modifying and verifying in a non-stop cycle until a perfect system is reached is the best way to never deliver the system. It is therefore necessary to set limits on cost, time, and the maximum number of modification loops for each type of verification action, together with ending criteria (percentage of success, number of errors detected, coverage rate obtained, etc.).
|-
|Involve designers in verification
|Include the person responsible for verification in the design team, or include some designers in the verification team.
|}
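
The practice of defining ending criteria can be made concrete with a small sketch: stop verifying when a coverage target is met or the modification-loop limit is reached, rather than when the budget happens to run out. The criteria and thresholds below are illustrative assumptions.

<syntaxhighlight lang="python">
def should_stop(verified: int, total: int,
                loops_done: int, max_loops: int,
                target_coverage: float = 0.95) -> bool:
    """Ending criterion based on coverage rate and a modification-loop limit."""
    coverage = verified / total
    return coverage >= target_coverage or loops_done >= max_loops

# 96% of planned verification actions passed after 2 modification loops: stop.
print(should_stop(verified=96, total=100, loops_done=2, max_loops=5))  # True
</syntaxhighlight>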
  
 
==References==

===Works Cited===
Buede, D.M. 2009. ''The Engineering Design of Systems: Models and Methods.'' 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.  
  
INCOSE. 2012. ''INCOSE Systems Engineering Handbook,'' version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.  
  
ISO/IEC/IEEE. 2015. ''Systems and Software Engineering - System Life Cycle Processes.'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2015.
  
Lake, J. 1999. "V & V in Plain English." Paper presented at the International Council on Systems Engineering (INCOSE) 9th Annual International Symposium, Brighton, UK, 6-10 June 1999.

Martin, J.N. 1997. ''Systems Engineering Guidebook: A Process for Developing Systems and Products.'' Boca Raton, FL, USA: CRC Press.
  
NASA. 2007. ''Systems Engineering Handbook''. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
  
===Primary References===
INCOSE. 2012. ''[[INCOSE Systems Engineering Handbook]]: A Guide for System Life Cycle Processes and Activities'', version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.
  
ISO/IEC/IEEE. 2015. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering - System Life Cycle Processes]].'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/ Institute of Electrical and Electronics Engineers. [[ISO/IEC/IEEE 15288]]:2015.
  
NASA. 2007. ''[[NASA Systems Engineering Handbook|Systems Engineering Handbook]].'' Washington, D.C.: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
  
===Additional References===
Buede, D.M. 2009. ''The Engineering Design of Systems: Models and Methods,'' 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

DAU. 2010. ''Defense Acquisition Guidebook (DAG)''. Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.
  
ECSS. 2009. ''Systems Engineering General Requirements.'' Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), 6 March 2009. ECSS-E-ST-10C.  
  
MITRE. 2011. "Verification and Validation." ''Systems Engineering Guide.'' Accessed 11 March 2012. Available at: http://mitre.org/work/systems_engineering/guide/se_lifecycle_building_blocks/test_evaluation/verification_validation.html.
  
SAE International. 1996. ''Certification Considerations for Highly-Integrated or Complex Aircraft Systems.'' Warrendale, PA, USA: SAE International, ARP4754.
  
SEI. 2007. "Measurement and Analysis Process Area" in ''Capability Maturity Model Integrated (CMMI) for Development, version 1.2.'' Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).
 
 
----
<center>[[System Integration|< Previous Article]] | [[System Realization|Parent Article]] | [[System Validation|Next Article >]]</center>
  
<center>'''SEBoK v. 2.0, released 1 June 2019'''</center>
  
[[Category: Part 3]][[Category:Topic]]
[[Category:System Realization]]

Revision as of 20:41, 19 October 2019

System VerificationSystem Verification is a set of actions used to check the correctness of any element, such as a system elementsystem element, a systemsystem, a document, a serviceservice, a task, a requirementrequirement, etc. These types of actions are planned and carried out throughout the life cyclelife cycle of the system. Verification is a generic term that needs to be instantiated within the context it occurs. As a process, verification is a transverse activity to every life cycle stage of the system. In particular, during the development cycle of the system, the verification process is performed in parallel with the system definitionsystem definition and system realizationsystem realization processes and applies to any activity and any product resulting from the activity. The activities of every life cycle process and those of the verification process can work together. For example, the integrationintegration process frequently uses the verification process. It is important to remember that verification, while separate from validation, is intended to be performed in conjunction with validation.

Definition and Purpose

VerificationVerification is the confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. With a note added in ISO/IEC/IEEE 15288, the scope of verification includes a set of activities that compares a system or system element against the requirements, architecture and design characteristics, and other properties to be verified (ISO/IEC/IEEE 2015). This may include, but is not limited to, specified requirements, design description, and the system itself.

The purpose of verification, as a generic action, is to identify the faults/defects introduced at the time of any transformation of inputs into outputs. Verification is used to provide information and evidence that the transformation was made according to the selected and appropriate methods, techniques, standards, or rules.

Verification is based on tangible evidence; i.e., it is based on information whose veracity can be demonstrated by factual results obtained from techniques such as inspection, measurement, testing, analysis, calculation, etc. Thus, the process of verifying a systemsystem (productproduct, serviceservice, enterpriseenterprise, or system of systemssystem of systems (SoS)) consists of comparing the realized characteristics or properties of the product, service, or enterprise against its expected design properties.

Principles and Concepts

Concept of Verification Action

Why Verify?

In the context of human realization, any human thought is susceptible to error. This is also the case with any engineering activity. Studies in human reliability have shown that people trained to perform a specific operation make around 1-3 errors per hour in best case scenarios. In any activity, or resulting outcome of an activity, the search for potential errors should not be neglected, regardless of whether or not one thinks they will happen or that they should not happen; the consequences of errors can cause extremely significant failures or threats.

A verification action is defined, and then performed, as shown in Figure 1.

Figure 1. Definition and Usage of a Verification Action. (SEBoK Original)

The definition of a verification action applied to an engineering element includes the following:

  • Identification of the element on which the verification action will be performed
  • Identification of the reference to define the expected result of the verification action (see examples of reference in Table 1)

The performance of a verification action includes the following:

  • Obtaining a result by performing the verification action onto the submitted element
  • Comparing the obtained result with the expected result
  • Deducing the degree of correctness of the element

What to Verify?

Any engineering element can be verified using a specific reference for comparison: stakeholder requirement, system requirement, function, system element, document, etc. Examples are provided in Table 1.

Table 1. Examples of Verified Items. (SEBoK Original)
Items Explanation for Verification
Document To verify a document is to check the application of drafting rules.
Stakeholder Requirement and System Requirement To verify a stakeholder requirement or a system requirement is to check the application of syntactic and grammatical rules, characteristics defined in the stakeholder requirements definition process, and the system requirements definition process such as necessity, implementation free, unambiguous, consistent, complete, singular, feasible, traceable, and verifiable.
Design To verify the design of a system is to check its logical and physical architecture elements against the characteristics of the outcomes of the design processes.
System To verify a system (product, service, or enterprise) is to check its realized characteristics or properties against its expected design characteristics.
Aggregate To verify an aggregate for integration is to check every interface and interaction between implemented elements.
Verification Procedure To verify a verification procedure is to check the application of a predefined template and drafting rules.

Verification versus Validation

The term verification is often associated with the term validation and understood as a single concept of V&V. Validation is used to ensure that one is working the right problem, whereas verification is used to ensure that one has solved the problem right (Martin 1997). From an actual and etymological meaning, the term verification comes from the Latin verus, which means truth, and facere, which means to make/perform. Thus, verification means to prove that something is true or correct (a property, a characteristic, etc.). The term validation comes from the Latin valere, which means to become strong, and has the same etymological root as the word value. Thus, validation means to prove that something has the right features to produce the expected effects. (Adapted from "Verification and Validation in plain English" (Lake INCOSE 1999).)

The main differences between the verification process and the validation process concern the references used to check the correctness of an element, and the acceptability of the effective correctness.

  • Within verification, comparison between the expected result and the obtained result is generally binary, whereas within validation, the result of the comparison may require a judgment of value regarding whether or not to accept the obtained result compared to a threshold or limit.
  • Verification relates more to one element, whereas validation relates more to a set of elements and considers this set as a whole.
  • Validation presupposes that verification actions have already been performed.
  • The techniques used to define and perform the verification actions and those for validation actions are very similar.

Integration, Verification, and Validation of the System

There is sometimes a misconception that verification occurs after integration and before validation. In most cases, it is more appropriate to begin verification activities during development or implementationimplementation and to continue them into deployment and use.

Once the system elements have been realized, they are integrated to form the complete system. Integration consists of assembling and performing verification actions as stated in the integration process. A final validation activity generally occurs when the system is integrated, but a certain number of validation actions are also performed parallel to the system integration in order to reduce the number of verification actions and validation actions while controlling the risks that could be generated if some checks are excluded. Integration, verification, and validation are intimately processed together due to the necessity of optimizing the strategy of verification and validation, as well as the strategy of integration.

Process Approach

Purpose and Principle of the Approach

The purpose of the verification process is to confirm that the system fulfills the specified design requirements. This process provides the information required to effect the remedial actions that correct non-conformances in the realized system or the processes that act on it - see ISO/IEC/IEEE 15288 (ISO/IEC/IEEE 2015).

Each system element and the complete system itself should be compared against its own design references (specified requirements). As stated by Dennis Buede, verification is the matching of [configuration items], components, sub-systems, and the system to corresponding requirements to ensure that each has been built right (Buede 2009). This means that the verification process is instantiated as many times as necessary during the global development of the system. Because of the generic nature of a process, the verification process can be applied to any engineering element that has conducted to the definition and realization of the system elements and the system itself.

Facing the huge number of potential verification actions that may be generated by the normal approach, it is necessary to optimize the verification strategy. This strategy is based on the balance between what must be verified and constraints, such as time, cost, and feasibility of testing, which naturally limit the number of verification actions and the risks one accepts when excluding some verification actions.

Several approaches exist that may be used for defining the verification process. The International Council on Systems Engineering (INCOSE) dictates that two main steps are necessary for verification: planning and performing verification actions (INCOSE 2012). NASA has a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce a report, and capture work products (NASA December 2007, 1-360, p. 102). Any approach may be used, provided that it is appropriate to the scope of the system, the constraints of the project, includes the activities of the process listed below in some way, and is appropriately coordinated with other activities.

Generic inputs are the baseline references of the submitted element. If the element is a system, the inputs are the logical and physical architecture elements as described in a system design document, the design description of interfaces internal to the system and the requirements of interfaces external to the system, and, by extension, the system requirements. Generic outputs are the verification plan (which includes the verification strategy), the selected verification actions, verification procedures, verification tools, the verified element or system, verification reports, issue/trouble reports, and change requests on the design.

Activities of the Process

To establish the verification strategy, drafted in a verification plan (an activity carried out concurrently with system definition activities), the following steps are necessary:

  • Identify verification scope by listing as many characteristics or properties as possible that should be checked. The number of verification actions can be extremely high.
  • Identify constraints according to their origin (technical feasibility; management constraints such as cost, time, and the availability of verification means or qualified personnel; and contractual constraints that are critical to the mission) that limit potential verification actions.
  • Define the appropriate verification techniques to be applied, such as inspection, analysis, simulation, peer review, testing, etc., and the most suitable step of the project at which to perform each verification action, given the constraints.
  • Trade off what should be verified (the scope) against all constraints or limits to deduce what can be verified; select verification actions according to the type of system, the objectives of the project, the acceptable risks, and the constraints (a minimal selection sketch follows this list).
  • Optimize the verification strategy by defining the most appropriate verification technique for every verification action while defining necessary verification means (tools, test-benches, personnel, location, and facilities) according to the selected verification technique.
  • Schedule the execution of verification actions in the project steps or milestones and define the configuration of elements submitted to verification actions (this mainly involves testing on physical elements).
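
The tradeoff and optimization steps above can be pictured with a small amount of code. The following Python sketch is illustrative only: the action names, cost figures, and risk weights are hypothetical, and a real strategy would weigh schedule, feasibility, and contractual constraints rather than a single cost budget.

    from dataclasses import dataclass

    @dataclass
    class CandidateAction:
        """One potential verification action (all values are illustrative)."""
        name: str               # characteristic or property to check
        technique: str          # inspection, analysis, demonstration, test, ...
        cost: float             # estimated effort, e.g. person-days
        risk_if_skipped: float  # relative risk accepted if this check is excluded

    def select_actions(candidates, budget):
        """Greedy tradeoff: keep the actions that retire the most risk per
        unit of cost until the budget is consumed; the rest become
        explicitly accepted residual risk."""
        ranked = sorted(candidates, key=lambda a: a.risk_if_skipped / a.cost,
                        reverse=True)
        selected, spent = [], 0.0
        for action in ranked:
            if spent + action.cost <= budget:
                selected.append(action)
                spent += action.cost
        residual = sum(a.risk_if_skipped for a in candidates if a not in selected)
        return selected, residual

    # Example: choose which checks fit a 10 person-day budget.
    candidates = [
        CandidateAction("interface timing", "test", cost=5.0, risk_if_skipped=9.0),
        CandidateAction("paint color", "inspection", cost=0.5, risk_if_skipped=1.0),
        CandidateAction("thermal margins", "analysis", cost=4.0, risk_if_skipped=7.0),
        CandidateAction("full mission scenario", "test", cost=12.0, risk_if_skipped=8.0),
    ]
    chosen, residual = select_actions(candidates, budget=10.0)
    # Keeps the three affordable checks and accepts residual risk 8.0 for
    # the excluded full mission scenario.

The point of the sketch is not the greedy heuristic itself but that exclusions are recorded as accepted risk, which is what distinguishes a strategy from simply running out of time.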

Performing verification actions includes the following tasks:

  • Detail each verification action; in particular, note the expected results, the verification techniques to be applied, and the corresponding means required (equipment, resources, and qualified personnel).
  • Acquire the verification means used during system definition steps (qualified personnel, modeling tools, mock-ups, simulators, and facilities), and then those used during the integration step (qualified personnel, verification tools, measuring equipment, facilities, verification procedures, etc.).
  • Carry out verification procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
  • Capture and record the results obtained when performing verification actions using verification procedures and means.

The obtained results must be analyzed and compared to the expected results so that the status of each verification action may be recorded as either compliant or non-compliant. Systems engineering (SE) practitioners will likely need to generate verification reports, as well as issue/trouble reports and change requests on the design, as necessary.
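
As a sketch of this analysis step, the following Python fragment (with hypothetical field names and tolerances) compares obtained results against expected results and sorts verification actions into compliant and non-compliant lists; the non-compliant entries would feed issue/trouble reports and change requests.

    from dataclasses import dataclass

    @dataclass
    class ActionResult:
        """Recorded outcome of one verification action (illustrative fields)."""
        action: str       # which verification action was performed
        expected: float   # expected result from the design reference
        obtained: float   # result captured during execution
        tolerance: float  # acceptable deviation

        @property
        def compliant(self) -> bool:
            return abs(self.obtained - self.expected) <= self.tolerance

    def draft_verification_report(results):
        """Split results by status; non-compliant actions are candidates
        for issue/trouble reports and change requests on the design."""
        report = {"compliant": [], "non_compliant": []}
        for r in results:
            key = "compliant" if r.compliant else "non_compliant"
            report[key].append(r.action)
        return report

    results = [
        ActionResult("output voltage", expected=5.0, obtained=5.02, tolerance=0.05),
        ActionResult("startup time", expected=2.0, obtained=2.6, tolerance=0.2),
    ]
    print(draft_verification_report(results))
    # {'compliant': ['output voltage'], 'non_compliant': ['startup time']}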

Controlling the process includes the following tasks:

  • Update the verification plan according to the progress of the project; in particular, planned verification actions can be redefined because of unexpected events.
  • Coordinate verification activities with the project manager: review the schedule and the acquisition of means, personnel, and resources. Coordinate with designers for issues/trouble/non-conformance reports and with the configuration manager for versions of the physical elements, design baselines, etc.

Artifacts and Ontology Elements

This process may create several artifacts such as:

  • verification plans (contain the verification strategy)
  • verification matrices (contain the verification action, submitted element, applied technique, step of execution, system block concerned, expected result, obtained result, etc.)
  • verification procedures (describe verification actions to be performed, verification tools needed, the verification configuration, resources and personnel needed, the schedule, etc.)
  • verification reports
  • verification tools
  • verified elements
  • issue / non-conformance / trouble reports
  • change requests to the design

This process utilizes the ontology elements displayed in Table 2 below.

Table 2. Main Ontology Elements as Handled within Verification. (SEBoK Original)
Element | Definition | Attributes (examples)
Verification Action | A verification action describes what must be verified (with the design reference as the baseline), on which element, the expected result, the verification technique to apply, and the level of decomposition. | Identifier, name, description
Verification Procedure | A verification procedure groups a set of verification actions performed together (as a test scenario) in a given verification configuration. | Identifier, name, description, duration, unit of time
Verification Tool | A verification tool is a device or physical tool used to perform verification procedures (test bench, simulator, cap/stub, launcher, etc.). | Identifier, name, description
Verification Configuration | A verification configuration groups all the physical elements (aggregates and verification tools) necessary to perform a verification procedure. | Identifier, name, description
Risk | An event having a probability of occurrence and a degree of gravity in its consequences for the system mission or for other characteristics (used for technical risk in engineering). A risk is the combination of a vulnerability and a danger or threat. |
Rationale | An argument that provides the justification for the selection of an engineering element. | Identifier, name, description (the rationale or reasons for defining a verification action or a verification procedure, for using a verification tool, etc.)
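
To make the relationships among these elements concrete, the sketch below renders the main elements of Table 2 as Python data structures. This is one plausible encoding under assumed attribute types, not a normative schema; the table defines the ontology, not an implementation.

    from dataclasses import dataclass, field

    @dataclass
    class VerificationTool:
        identifier: str
        name: str
        description: str  # e.g. test bench, simulator, stub, launcher

    @dataclass
    class VerificationAction:
        identifier: str
        name: str
        description: str  # what must be verified and the expected result
        technique: str    # inspection, analysis, demonstration, test, ...
        level: str        # level of decomposition (system, system element, ...)

    @dataclass
    class VerificationConfiguration:
        """Groups the physical elements and tools a procedure needs."""
        identifier: str
        name: str
        tools: list = field(default_factory=list)

    @dataclass
    class VerificationProcedure:
        """Groups verification actions performed together, as a test
        scenario, in a given verification configuration."""
        identifier: str
        name: str
        duration_hours: float
        configuration: VerificationConfiguration
        actions: list = field(default_factory=list)

    # Hypothetical usage: one action run on a bench in one configuration.
    bench = VerificationTool("VT-01", "test bench", "bench with supply and DAQ")
    config = VerificationConfiguration("VC-01", "electrical checkout", tools=[bench])
    action = VerificationAction("VA-07", "output voltage check",
                                "verify 5 V output within tolerance",
                                technique="test", level="system element")
    procedure = VerificationProcedure("VP-03", "electrical test scenario",
                                      duration_hours=2.0,
                                      configuration=config, actions=[action])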

Methods and Techniques

There are several verification techniques to check that an element or a system conforms to its design references or its specified requirements. These techniques are almost the same as those used for validation, though the application of the techniques may differ slightly. In particular, the purposes are different; verification is used to detect faults/defects, whereas validation is used to provide evidence for the satisfaction of (system and/or stakeholder) requirements. Table 3 below provides descriptions of some techniques for verification.

Table 3. Verification Techniques. (SEBoK Original)
Verification Technique Description
Inspection Technique based on visual or dimensional examination of an element; the verification relies on the human senses or uses simple methods of measurement and handling. Inspection is generally non-destructive, and typically includes the use of sight, hearing, smell, touch, and taste, simple physical manipulation, mechanical and electrical gauging, and measurement. No stimuli (tests) are necessary. The technique is used to check properties or characteristics best determined by observation (e.g. paint color, weight, documentation, listing of code, etc.).
Analysis Technique based on analytical evidence obtained without any intervention on the submitted element using mathematical or probabilistic calculation, logical reasoning (including the theory of predicates), modeling and/or simulation under defined conditions to show theoretical compliance. Mainly used where testing to realistic conditions cannot be achieved or is not cost-effective.
Analogy or Similarity Technique based on evidence from elements similar to the submitted element or on experience feedback. It is absolutely necessary to show by prediction that the context is invariant and that the outcomes are transposable (models, investigations, experience feedback, etc.). Similarity can only be used if the submitted element is similar in design, manufacture, and use; if equivalent or more stringent verification actions were used for the similar element; and if the intended operational environment is identical to, or less rigorous than, that of the similar element.
Demonstration Technique used to demonstrate correct operation of the submitted element against operational and observable characteristics without using physical measurements (no or minimal instrumentation or test equipment). Demonstration is sometimes called 'field testing'. It generally consists of a set of tests selected by the supplier to show that the element's response to stimuli is suitable, or to show that operators can perform their assigned tasks when using the element. Observations are made and compared with predetermined/expected responses. Demonstration may be appropriate when requirements or specifications are given in statistical terms (e.g. mean time to repair, average power consumption, etc.).
Test Technique performed on the submitted element by which functional or measurable characteristics, operability, supportability, or performance capability is quantitatively verified under controlled conditions that are real or simulated. Testing often uses special test equipment or instrumentation to obtain accurate quantitative data for analysis (a minimal automated example follows this table).
Sampling Technique based on the verification of characteristics using samples. The number of samples, the tolerances, and other characteristics must be specified in agreement with experience feedback.
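
As a minimal illustration of the test technique, the sketch below expresses a single quantitative check in Python. The requirement (average power draw of at most 50 W) and the measurement function are hypothetical stand-ins; in practice the obtained value would come from instrumented test equipment under controlled, real or simulated conditions.

    def measure_average_power_watts():
        """Hypothetical stand-in for a reading taken from test
        instrumentation while the element runs a controlled scenario."""
        return 47.3

    def test_average_power_consumption():
        # Assumed specified requirement: average power draw <= 50 W.
        requirement_max_watts = 50.0
        obtained = measure_average_power_watts()
        assert obtained <= requirement_max_watts, (
            f"Non-compliant: {obtained} W exceeds {requirement_max_watts} W"
        )

    test_average_power_consumption()  # raises AssertionError if non-compliant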

Practical Considerations

Key pitfalls and good practices related to this topic are described in the next two sections.

Pitfalls

Some of the key pitfalls encountered in planning and performing System Verification are provided in Table 4.

Table 4. Major Pitfalls with System Verification (SEBoK Original)
Pitfall Description
Confusion between verification and validation Confusion between verification and validation causes developers to take the wrong reference/baseline to define verification and validation actions and/or to address the wrong level of granularity (detail level for verification, global level for validation).
No verification strategy Verification actions are overlooked or unbounded because it is impossible to check every characteristic or property of all system elements, and of the system, in every combination of operational conditions and scenarios. A strategy (a justified selection of verification actions against risks) must be established.
Skipping verification to save time Skipping verification activities to save time is a false economy; defects discovered later are more costly to correct (see Table 5).
Use only testing Using only testing as a verification technique means products and services can only be checked once they are implemented. Consider other techniques earlier, during design; analysis and inspection are cost-effective and allow potential errors, faults, or failures to be discovered early.
Stop verifications when funding is diminished Stopping the performance of verification actions when the budget and/or time are consumed is a pitfall. Prefer objective criteria, such as coverage rates, to end verification activity (a minimal sketch follows this table).
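
The sketch below shows one way such ending criteria could be expressed in Python; the threshold values are illustrative assumptions, since each project sets them according to its accepted risk. The same criteria support the practice of defining limits before verification starts (see Table 5).

    def verification_may_end(actions_executed, actions_planned,
                             open_nonconformances,
                             min_coverage_rate=0.95,
                             max_open_nonconformances=0):
        """Decide on objective criteria (coverage rate achieved, open
        non-conformance count) rather than on exhausted budget or
        schedule; the thresholds here are example values."""
        if actions_planned == 0:
            return False  # nothing planned yet, so no basis to stop
        coverage_rate = actions_executed / actions_planned
        return (coverage_rate >= min_coverage_rate
                and open_nonconformances <= max_open_nonconformances)

    # 96% of planned actions executed but one issue still open: keep going.
    print(verification_may_end(96, 100, open_nonconformances=1))  # False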

Proven Practices

Some proven practices gathered from the references are provided in Table 5.

Table 5. Proven Practices with System Verification. (SEBoK Original)
Practice Description
Start verifications early in the development The earlier the characteristics of an element are verified in the project, the easier corrections are to make and the smaller the consequences for schedule and cost.
Define criteria for ending verifications Carrying out verification actions without limits creates a risk of cost and schedule drift; modifying and verifying in an endless cycle until a perfect system is reached is the best way to never deliver the system. It is therefore necessary to set limits on cost and time, a maximum number of modification loops for each type of verification action, and ending criteria (percentage of success, number of errors detected, coverage rate obtained, etc.).
Involve designers in verification Include those responsible for verification in the design team, or include some designers in the verification team.

References

Works Cited

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods. 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

INCOSE. 2012. INCOSE Systems Engineering Handbook, version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2015. Systems and Software Engineering - System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2015.

Lake, J. 1999. "V & V in Plain English." International Council on Systems Engineering (INCOSE) 9th Annual International Symposium, Brighton, UK, 6-10 June 1999.

NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Primary References

INCOSE. 2012. INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2015. Systems and Software Engineering - System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2015.

NASA. 2007. Systems Engineering Handbook. Washington, D.C.: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Additional References

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods, 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

DAU. 2010. Defense Acquisition Guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.

ECSS. 2009. Systems Engineering General Requirements. Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), 6 March 2009. ECSS-E-ST-10C.

MITRE. 2011. "Verification and Validation." in Systems Engineering Guide. Accessed 11 March 2012.

SAE International. 1996. Certification Considerations for Highly-Integrated or Complex Aircraft Systems. Warrendale, PA, USA: SAE International, ARP4754.

SEI. 2007. "Measurement and Analysis Process Area" in Capability Maturity Model Integrated (CMMI) for Development, version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

