AI Online

Ai INNOVATION, SINCE 1895

Quigley Corner: Product Development Test, Inspection, Evaluation Master Plan Organized (TIEMPO)

TEST, INSPECTION AND EVALUATION

Experience tells us that product quality starts with the very first actions of a project and continues throughout development.  We should use all of the tools at our disposal to ensure the competency of the product. Relying solely on verification activities, perhaps only at the end of the project, is not the road to success.

Good planning of the product development iterations, specifically a phased (iterated) and incremental delivery of the feature content, makes it possible for an organization to employ test, inspection, and evaluation for competitive advantage.  We will show how this works in a product development endeavor; the concepts generalize readily to any industry.  To really improve (and prove) your product quality, a comprehensive approach is required, not a single last gasp at the end of the development cycle.

The Test, Inspection and Evaluation Master Plan Organized (TIEMPO) adds an extra method to the Test and Evaluation Master Plan [TEMP, specified in IEEE 1220 and MIL-STD-499B] to support product quality. TIEMPO expands the concept of a Test and Evaluation Master Plan by focusing on staged deliveries in which each product/process release is a superset of the previous release.  These staged deliveries fall under our configuration management control; in fact, we can consider these iterations our development baselines.  Each package is well defined; likewise, the test, inspection and evaluation actions are well defined for each iteration.  Ultimately, the planned product releases are coordinated with evaluation methods for each delivery. Under this plan, we can handle inspections as:

 

  • Iterative software package contents
  • Iterative hardware packages
  • Software (code) reviews
  • Design reviews
    • Simulation and models
    • Specifications and Drawings
    • Mechanical
    • Embedded product
    • Design Failure Mode and Effects Analysis (DFMEA)
    • Process Failure Mode and Effects Analysis (PFMEA)
    • Schematic reviews
    • Software Requirements Specification (SRS – IEEE 830-1993 and IEEE 830-1984) Reviews
    • Software Design Document (SDD, IEEE 1016-2009) Reviews
    • Systems Requirements Reviews (IEEE Std 15288.2-2014)
    • Functional Requirements Reviews
  • Prototype part inspections
  • Production line process (designed)
  • Production line reviews
    • Work instructions
    • Work stations
    • Tools and techniques
  • Project documentation
    • Schedule plans
    • Scope documentation
    • Budget plans
    • Risk management plans
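The coordination at the heart of TIEMPO, pairing each planned delivery with its test (T), inspection (I), and evaluation (E) activities, can be sketched in Python. The release names and activity lists below are purely hypothetical illustrations, not a prescribed format:

```python
# Hypothetical TIEMPO coordination: each planned release is paired with the
# test (T), inspection (I), and evaluation (E) activities for that delivery.
tiempo_plan = {
    "Iteration 1": {"T": ["compliance tests"],
                    "I": ["SRS review", "schematic review"],
                    "E": []},
    "Iteration 2": {"T": ["compliance tests", "extreme tests"],
                    "I": ["code review", "DFMEA update"],
                    "E": ["key-customer trial"]},
}

def activities(release: str) -> list:
    """Flatten the planned T, I, and E actions for one delivery."""
    plan = tiempo_plan[release]
    return plan["T"] + plan["I"] + plan["E"]

print(activities("Iteration 2"))
```

Holding the plan in one structure like this makes it easy to see, for any delivery, which safeguarding activities are scheduled against it.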

 

Philosophy of the Master Plan

 

At its core, TIEMPO assists in coordinating our product's functional growth with critiquing methods. Each package has a defined set of feature contents (associated with configuration management), to which our entire battery of quality safeguarding techniques will be deployed.  This approach, defined builds of moderate size under constant critique, has a very agile character, allowing for readily available reviews of pre-product artifacts as well as iterations of the product.  In addition, this approach helps us reduce risk by developing superset releases in which each prior subset remains relatively untouched.  Most defects will reside in the newest portion, since the previously developed part of the product or process is now a subset that has already been vetted.  Even if a previous iteration had defects, we will have had the opportunity between iterations to correct them before the next iteration adds the defined additional capability.  The frequent critique of the product allows quality growth and reliability growth, and gives us data from which we can assess product readiness (i.e., whether we should launch the product).
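The superset-release discipline can be made concrete with a minimal sketch; the baseline and feature names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Baseline:
    """A development baseline: one iteration's feature content under CM control."""
    name: str
    features: frozenset

def next_baseline(previous: Baseline, name: str, new_features: set) -> Baseline:
    """Each release is a superset of the previous release: features are only added."""
    return Baseline(name=name, features=previous.features | frozenset(new_features))

def is_superset_release(current: Baseline, previous: Baseline) -> bool:
    """TIEMPO check: the prior baseline must survive intact inside the new one."""
    return previous.features <= current.features

b1 = next_baseline(Baseline("B0", frozenset()), "B1", {"ignition", "lighting"})
b2 = next_baseline(b1, "B2", {"cruise control"})
assert is_superset_release(b2, b1)  # B2 contains everything B1 delivered
```

Because every new baseline is built by extension, a failed superset check immediately flags a release that silently dropped previously proven content.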

 

Benefits of the Master Plan

 

Experience suggests the following benefits arise from this approach:

  • Reduction of defects early in the development process (product artifacts) even before tangible development effort
  • Well planned functional growth in iterative software and hardware packages (including systems planning and growth)
  • Ability to prepare for test (known build content: developing test fixtures, test data, and test cases), inspection and evaluation activities based upon clearly identified packages
  • Linking test, inspection and evaluations to artifacts and design iterations
  • Reduced risk at product launch
  • Identification of all activities to safeguard the quality—even before material availability and testing can take place
  • Ease of stakeholder assessment, including customer access for review of product evolution and appraisal activities

 

An overview of one approach:

 

Below we show how the inspection, testing and evaluation pieces fit together.  This method is not restricted to phase-oriented product development; any incremental and iterative approach benefits from constant critique.  In fact, any article or artifact produced on the way to the final product is also fodder for a critique.  This includes entrepreneurial activities.

Test (Verification):

 

Test or verification consists of those activities typically associated with determining whether the product meets the specification or original design criteria.  If we employ an incremental and iterative approach, we will constantly compare our prototype parts against our specifications as the specification and product mature.  Though prototype parts may not represent production in terms of durability, they should be a reasonable facsimile of the shape and feature content of the final product.  We use these parts to reduce risk by not jumping from idea to final product without learning along the way.  We should learn something from this testing and use it to weigh the predicted future quality of the resulting product.  It is obvious how testing fits into TIEMPO; however, there are some non-obvious opportunities as well. For example, we can also apply the inspection technique to our test cases, analyzing whether we will indeed stress the product in a way valuable to our organization and project. The feedback from this inspection will allow us to refine the testing scope, test cases, or any proposed non-specification or exploratory testing.

Figure 1: Signal test result.

So, when do we start testing?

 

Many may believe that it is not possible to test a product without some measure of hardware or software samples available. Yet it is possible, if we have developed models and simulations that allow us to explore the product's possibilities in advance of the material or software. This requires accurate models. To ensure accuracy, we run comparisons between our model results and real-world results, determine the gap, and make the necessary adjustments to the models.  If the tools are sophisticated enough, we may even use them to develop our requirements.  When we do this, we reduce the risk and cost of the end design because we have already performed some evaluation of the design proposal.  As prototype parts become available, we will test these parts alone or in concert with our simulators. If we have staged our function packages via TIEMPO, as we should, we will test an incrementally improving product.
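The model-versus-measurement comparison described above might be sketched as follows; the voltage data and tolerance are hypothetical illustrations:

```python
def model_gap(model_results, measured_results):
    """Compare simulation output with bench measurements, point by point.
    Returns the worst absolute error, so we know how far to trust the model."""
    assert len(model_results) == len(measured_results)
    return max(abs(m - r) for m, r in zip(model_results, measured_results))

def model_is_adequate(model_results, measured_results, tolerance):
    """Accept the model only if every prediction lands within tolerance."""
    return model_gap(model_results, measured_results) <= tolerance

# Hypothetical data: predicted vs. measured supply voltage under load.
predicted = [12.0, 11.6, 11.1]
measured  = [12.1, 11.5, 11.3]
print(model_gap(predicted, measured))
print(model_is_adequate(predicted, measured, tolerance=0.25))
```

If the gap exceeds the tolerance, the model is adjusted and the comparison rerun before we lean on it for requirements work.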

 

 

Types of tests during development

 

When we get into the heavy lifting of product or service testing, we have a variety of methods in our arsenal.  At this stage we are trying to uncover any product maladies that could hurt the company or our customer.  A single approach to testing finds only some of the issues. To find defects both immediate and latent, we will use a combination of approaches such as:

  • Compliance testing (testing to specifications)
  • Extreme testing
  • Multi-stimuli or combinatorial testing
  • Stochastic (randomized exploratory)
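The multi-stimuli (combinatorial) approach in the list above can be illustrated with Python's itertools; the stimuli levels below are hypothetical:

```python
from itertools import product

# Hypothetical stimuli for a multi-stimuli (combinatorial) test campaign.
voltages     = [9.0, 12.0, 16.0]   # supply voltage corners
temperatures = [-40, 25, 85]       # ambient temperature, deg C
loads        = ["idle", "full"]    # electrical load state

# Exhaustive combination: every voltage x temperature x load pairing.
test_cases = list(product(voltages, temperatures, loads))
print(len(test_cases))  # 3 * 3 * 2 = 18 combinations
```

Exhaustive combination grows quickly with each added stimulus, which is why pairwise or other reduced combinatorial strategies are often used on larger stimulus sets.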

 

 

Figure 2: PCB inspection.

 

Inspections

 

Reviews are analogous to inspections.  The goal of reviews is to find problems in our effort as early as we can.  There may be plenty of assumptions and suppositions that are not documented or voiced in the creation of these products. There may be language and interpretation issues in the specifications that we can uncover through questioning, resulting in clarification and updates.  The act of reviewing can ferret out erroneous or deleterious requirements, allowing us to adjust. We can employ a variety of review techniques on our project and product.  We list a few below:

 

  • Concept reviews
  • Customer Walk throughs (use case reviews)
  • Product requirements reviews
  • Specification reviews
  • System design reviews
  • Software design reviews
  • Hardware design reviews
  • Bill Of Materials reviews
  • Project and Product Pricing
  • Test plan reviews
  • Test case reviews
  • Prototype inspections
  • Technical and Users Manuals
  • Failure Mode Effects Analysis

 

A method employed by the automotive industry can be applied to any industry: the Design Failure Mode and Effects Analysis (DFMEA) and the Process Failure Mode and Effects Analysis (PFMEA).  These tools represent a formal and well-structured review of the product and the production processes.  The method forces us to consider each failure mechanism and its impact.  If we have a historical record, we can take advantage of that record or even of previous FMEA exercises.  There are two advantages, the first of which is the prioritization of risk.  The risk is captured in a calculated number known as the Risk Priority Number (RPN), the product of:

 

  • Severity (ranked 1-10)
  • Probability (ranked 1-10)
  • Detectability (ranked 1-10)
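The RPN calculation and the resulting prioritization are straightforward to sketch; the failure modes and rankings below are hypothetical worksheet entries:

```python
def rpn(severity: int, probability: int, detectability: int) -> int:
    """Risk Priority Number: the product of the three FMEA rankings (each 1-10)."""
    for rank in (severity, probability, detectability):
        if not 1 <= rank <= 10:
            raise ValueError("FMEA rankings must be between 1 and 10")
    return severity * probability * detectability

# Hypothetical failure modes from a DFMEA worksheet.
failure_modes = [
    ("water intrusion at connector", rpn(8, 4, 6)),   # 192
    ("label fades in sunlight",      rpn(2, 5, 2)),   # 20
]
# Address the highest RPN first.
failure_modes.sort(key=lambda item: item[1], reverse=True)
print(failure_modes[0][0])  # -> "water intrusion at connector"
```

Sorting the worksheet by RPN gives the team its work queue: the water-intrusion concern, with the larger product, is attacked before the cosmetic one.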

 

Figure 3: DFMEA example.[1]

 

The larger the resulting RPN, the higher the risk priority, and we address those concerns first.  The second advantage fits with the testing portion of TIEMPO: the FMEA approach links testing to the identified areas of risk as well.  We may alter our approach, or we may choose to explore via testing as an assessment of our prediction.  For example, suppose we have a design that we believe may allow water intrusion into a critical area.  We may then elect to perform some sort of moisture exposure test to see if we are right about this event and the subsequent failure we predict.

Inspection caveats

 

By definition, an inspection is a form of quality containment, which means trapping potential escapes of defective products or processes. The function of inspection, then, is to capture substandard material and to stimulate meaningful remediation.  Inspection for items such as specification fiascoes, for example, prevents defects from rippling through subsequent development activities where correction is much more costly.  The containment consists of updating the specification and incrementing the revision number while capturing a "lesson learned."  Reviews take time, attention to detail, and analytical scrutiny of whatever is under critique; anything less will likely waste time and resources.

 

Evaluation (Validation)

 

Evaluations are sometimes colloquially referred to as live-fire events, not because the product will necessarily be subjected to ammunition, but because it is subjected to the slings and arrows of real life.  The product will be used as it would be by the customer, starting from fundamental and easy use and moving to more severe and random applications and product stresses, including environmental conditions and circumstances of use.  Evaluations are a form of validation, and validation answers the question "did we build the right thing," in contrast to verification, which answers "did we build the thing right."  These activities are opportunities to learn about the product application. Does it fulfill the customer's needs? If not, we take appropriate actions.  Evaluations require a reasonably mature prototype, one that can withstand the rigor of actual application.  Our development team will be the first to run this battery of exercises (scripted and contrived events), but if the product iteration is sufficiently durable, and the risk of error impact on the customer sufficiently low, we may even provide articles for some key customers to use and explore.  We will want formalized checkpoints to interview the customer and get their input on the product.

 

Product Development Phases

 

Product development processes are usually industry specific.  One hypothetical, generic model for such a launch process might look like the following:

 

  • CONCEPT
  • SYSTEM LEVEL
  • PRELIMINARY DESIGN
  • CRITICAL DESIGN
  • TEST READINESS
  • PRODUCTION READINESS
  • LAUNCH

 

For the process outlined above, we could expect to see a test (T), inspection (I) and evaluation (E) per phase as indicated in the chart below:

The design aspects will apply to process design just as much as to product or service design.

 

 

Conclusion

 

We should not rely on one method for driving product quality. Test, inspection and evaluation are instrumental to the successful launch of a new product.  We need not limit these techniques to the product; we can also employ them on a new process or even a developing service.  We should learn as we progress through the development.  If all goes well, we can expect a successful launch. TIEMPO helps us learn, proactively develop responses, and eventually predict the quality of the product as we approach launch.

 

We provide an outline for a TIEMPO document[2]:

  1. Systems Introduction

1.1.       Mission Description

1.2.       System Threat Assessment

1.3.       Min. Acceptable Operational Performance Requirements

1.4.       System Description

1.5.       Testing Objectives

1.6.       Inspection Objectives

1.7.       Evaluation Objectives

1.8.       Critical Technical Parameters

 

  2. Integrated Test, Inspection and Evaluation Program Summary

2.1.       Inspection Areas (documents, code, models, material)

2.2.       Inspection Schedule

2.3.       Integrated Test Program Schedule

2.4.       Management

 

  3. Developmental Test and Evaluation Outline

3.1.       Simulation

3.2.       Developmental Test and Evaluation Overview

3.3.       Component Test and Evaluation

3.4.       Subsystem Test and Evaluation

3.5.       Developmental Test and Evaluation to Date

3.6.       Future Developmental Test and Evaluation

3.7.       Live Use Test and Evaluation

 

  4. Inspection

4.1.       Inspection of models (model characteristics, model vs. real world)

4.2.       Inspection of Material (Physical) Parts

4.3.       Prototype Inspections

4.4.       Post Test Inspections

4.5.       Inspection Philosophy

4.6.       Inspection Documentation

4.7.       Inspection of Software

4.8.       Design Reviews

 

  5. Operational Test and Evaluation Outline

5.1.       Operational Test and Evaluation Overview

5.2.       Operational Test and Evaluation to Date

5.3.       Features / Function delivery

5.4.       Future Operational Test and Evaluation

 

  6. Test and Evaluation Resource Summary

6.1.       Test Articles

6.2.       Test Sites and Instrumentation

6.3.       Test Support

6.4.       Inspection (Requirements and Design Documentation) resource requirements

6.5.       Inspection (source code) resource requirements

6.6.       Inspection prototype resource requirements

6.7.       Threat Systems / Simulators

6.8.       Test Targets and Expendables

6.9.       Operational Force Test Support

6.10.    Simulations, Models and Test beds

6.11.    Special Requirements

6.12.    T&E Funding Requirements

6.13.    Manpower/Personnel Training

[1] Potential failure mode and effects analysis (FMEA): Reference manual (3rd ed.). (2008). An adaptation.

[2] Pries, Kim H. and Quigley, Jon M., Total Quality Management for Project Management, Boca Raton, FL: CRC Press, 2013.

Ai contributor: Jon M. Quigley

Jon has been in automotive product development and manufacturing for nearly 30 years.  He holds two master's degrees as well as two globally recognized certifications.  He has assigned seven patents to the companies at which he has worked.  He is a serial author and contributor to 15 books, a guest lecturer, a frequent guest on podcasts (such as SPaMcast), and a teacher on the topics about which he writes.  He has a long-standing membership in SAE International, including authoring a book and teaching classes.  Jon has won awards such as the Volvo-3P Technical Award in 2005, going on to win the 2006 Volvo Technology Award.

*Quigley’s Corner:

The automotive industry is a complex web of market, product development, product and supply chain management, and manufacturing. This column explores the development of new products and the variety of approaches to successfully create and launch quality products through to production.


Thu. July 2nd, 2020


AUTOMOTIVE INDUSTRIES

Founded in 1895, the world's first trade magazine covering the automotive industry.