7 Oct 01 - 12 Oct 08
Software Inspection Measurements and Derived Metrics
Preparation for Entering Data and Deriving Metrics
1. Enter your software inspections measurement data below.
2. Compute derived metrics and compare the results with those of the National Software Quality Experiment (NSQE).
3. To compare results to the full NSQE database, proceed to User Guidance for Entering Data.
4. To compare results to the NSQE database averages for a particular software process maturity level, select a level.
Software Process Maturity Level
User Guidance for Entering Data
1. Enter actual measurements from one or many inspection sessions.
2. You may also enter arbitrary experimental data to observe the behavior of the derived metrics.
3. Please note that neither the data you enter nor the derived metrics leave your machine.
Number of inspection sessions
Preparation effort in minutes (sum the prep time of all participants)
Meeting effort in minutes (number of participants times elapsed time)
Meeting time in minutes (elapsed time)
Major defects (those affecting execution)
Minor defects (those not affecting execution)
Size in lines of code (including comments)
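The measurements above can be combined into familiar derived metrics. The sketch below uses common inspection-literature formulas (defect density, inspection rate, preparation rate, effort per defect); the metric names and formulas are illustrative assumptions, not necessarily the exact NSQE definitions.

```python
def derived_metrics(prep_min, meeting_effort_min, meeting_time_min,
                    majors, minors, loc):
    """Compute illustrative derived metrics from inspection measurements.

    Formulas are common inspection-literature definitions and are
    assumptions here, not necessarily the exact NSQE definitions.
    """
    total_effort_hours = (prep_min + meeting_effort_min) / 60.0
    defects = majors + minors
    return {
        # Defect density: defects found per thousand lines of code
        "defects_per_kloc": defects / (loc / 1000.0),
        # Inspection rate: lines covered per hour of elapsed meeting time
        "loc_per_meeting_hour": loc / (meeting_time_min / 60.0),
        # Preparation rate: lines examined per staff-hour of preparation
        "loc_per_prep_hour": loc / (prep_min / 60.0),
        # Detection cost: total staff-hours expended per defect found
        "hours_per_defect": total_effort_hours / defects if defects else None,
    }

# Example: one session, 120 min total prep, 4 participants x 60 min meeting
metrics = derived_metrics(prep_min=120, meeting_effort_min=240,
                          meeting_time_min=60, majors=4, minors=16, loc=300)
```

Entering the same raw measurements in the form above should let you compare the tool's derived values against a hand computation like this one.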
User Guidance for Computing Derived Metrics
1. To compute a metric, click on the desired derived metric bar.
2. To diagnose a metric, click on the adjacent NSQE Diagnostic bar.
3. For a derived metric within the NSQE upper and lower control limit, the NSQE Diagnostic presents these limits.
4. For a derived metric outside the NSQE limits, the NSQE Diagnostic presents the limit exceeded.
5. If a software process maturity level was selected, the level average for the derived metric is presented on the far right.
6. To clear all results and start over, click on the Reset Form bar.
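Steps 3 and 4 above amount to a range check of each derived metric against NSQE control limits. A minimal sketch of that diagnostic logic follows; the limit values used in the example are hypothetical placeholders, not actual NSQE control limits.

```python
def diagnose(value, lower, upper):
    """Report whether a derived metric falls within control limits.

    Mirrors the NSQE Diagnostic behavior described above: in-range
    values report both limits; out-of-range values report the limit
    exceeded. The limits passed in are caller-supplied assumptions.
    """
    if value < lower:
        return f"below lower control limit ({lower})"
    if value > upper:
        return f"above upper control limit ({upper})"
    return f"within control limits ({lower}..{upper})"

# Hypothetical limits for a defects-per-KLOC metric, for illustration only.
result = diagnose(66.7, lower=20.0, upper=120.0)
```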
To examine the behavior of Return on Investment more closely, click on the Return on Investment Tool.
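As background for the tool, inspection return on investment is commonly modeled as net savings from early defect detection divided by detection cost. The sketch below uses that generic model; the rework-avoidance factor is a hypothetical assumption, and this is not necessarily the exact formula the Return on Investment Tool uses.

```python
def inspection_roi(majors, detection_cost_hours, avoided_hours_per_major=9.0):
    """Generic early-defect-detection ROI sketch.

    Assumes each major defect found in inspection avoids
    `avoided_hours_per_major` staff-hours of later rework
    (a hypothetical value, not an NSQE constant).
    ROI = (savings - detection cost) / detection cost.
    """
    savings = majors * avoided_hours_per_major
    net_savings = savings - detection_cost_hours
    return net_savings / detection_cost_hours

# Example: 4 major defects found at a total detection cost of 6 staff-hours
roi = inspection_roi(majors=4, detection_cost_hours=6.0)
```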
For additional information, click on the National Software Quality Experiment.