U.S. patent application number 13/417753 was filed with the patent office on 2012-03-12 and published on 2012-07-05 as publication number 20120173437 for a systems engineering process.
This patent application is currently assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. The invention is credited to Bruce G. Barker, Vincent A. Buscher, Juan P. Giraldo, Carrie L. LeBolt, Stephen L. McLellen, Sharon E. Murphy, Paul R. Popick, David P. Ricci, and Janet L. Spann.
United States Patent Application | 20120173437 |
Kind Code | A1 |
Barker; Bruce G.; et al. | July 5, 2012 |
SYSTEMS ENGINEERING PROCESS
Abstract
A method and system for implementing a project for a customer. A
computer readable Requirements Traceability and Verification Matrix
(RTVM) is generated and stored in a computer readable storage device.
A verification is implemented, which verifies: an acceptability of
business requirements of the project due to business requirements
review (BRR) exit criteria having been satisfied, an acceptability
of system requirements of the project due to system requirements
review (SRR) exit criteria having been satisfied, and an acceptability
of component requirements of the project due to component
requirements review (PDR) exit criteria having been satisfied. The
business requirements are decomposed into the system requirements.
The system requirements are decomposed into the component
requirements. In response to the verification, the RTVM is updated
with verification information relating to the business
requirements, verification information relating to the system
requirements, and verification information relating to the
component requirements.
Inventors: | Barker; Bruce G.; (Charlotte, NC); Buscher; Vincent A.; (Bethel, CT); Giraldo; Juan P.; (Alexandria, VA); LeBolt; Carrie L.; (Center Point, TX); McLellen; Stephen L.; (Clifton, VA); Murphy; Sharon E.; (Ballwin, MO); Popick; Paul R.; (Easton, MD); Ricci; David P.; (Ocean Springs, MS); Spann; Janet L.; (Herndon, VA) |
Assignee: | INTERNATIONAL BUSINESS MACHINES CORPORATION, Armonk, NY |
Family ID: | 35240542 |
Appl. No.: | 13/417753 |
Filed: | March 12, 2012 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
12557891 (parent of 13417753) | Sep 11, 2009 | |
10839583 (parent of 12557891) | May 5, 2004 | 7590552 |
Current U.S. Class: | 705/301 |
Current CPC Class: | G06Q 10/063 (20130101); G06Q 10/10 (20130101); G06Q 10/0637 (20130101); G06Q 10/06393 (20130101); G06Q 30/0201 (20130101); G06Q 10/06315 (20130101); G06Q 10/06313 (20130101); G06Q 10/103 (20130101); G06Q 10/06395 (20130101) |
Class at Publication: | 705/301 |
International Class: | G06Q 10/06 (20120101) |
Claims
1. A method for implementing a project for a customer, said method
comprising: a processor of a computer system generating a computer
readable Requirements Traceability and Verification Matrix (RTVM);
said processor storing the RTVM in a computer readable storage
device; said processor verifying an acceptability of business
requirements of the project due to business requirements review
(BRR) exit criteria having been satisfied, an acceptability of
system requirements of the project due to system requirements
review (SRR) exit criteria having been satisfied, an acceptability
of component requirements of the project due to component
requirements review (PDR) exit criteria having been satisfied,
wherein the business requirements are decomposed into the system
requirements and the system requirements are decomposed into the
component requirements; in response to said verifying, said
processor updating the RTVM with verification information relating
to the business requirements, verification information relating to
the system requirements, and verification information relating to
the component requirements.
2. The method of claim 1, wherein said verifying further comprises
verifying an acceptability of component designs and test plans due
to critical design review (CDR) exit criteria having been
satisfied, wherein the test plans are for testing the component
designs, wherein the component requirements are decomposed into the
component designs and the test plans, and wherein said updating
further comprises updating the RTVM with verification information
relating to the component designs and test plans.
3. The method of claim 2, said method comprising said processor
developing the business requirements, the system requirements, the
component requirements, and the component designs and test plans,
wherein said developing comprises said verifying.
4. The method of claim 1, wherein the RTVM depicts hierarchical
relationships between the business requirements and the system
requirements, and wherein the RTVM depicts hierarchical
relationships between the system requirements and the component
requirements.
5. The method of claim 1, wherein the business requirements, the
system requirements, and the component requirements are each identified
in the RTVM by a symbolic code which effectuates the hierarchical
relationships between the business requirements and the system
requirements and the hierarchical relationships between system
requirements and the component requirements.
6. The method of claim 1, wherein the verification information
relating to the business requirements, the system requirements, and
the component requirements include requirement status, affected
design documents, affected build components, customer acceptance
criteria, a test method, a test case, and a test result for at
least one test type.
7. The method of claim 6, wherein the test method is selected from
the group consisting of Analysis, Demonstration, Inspection,
Simulation/Modeling, and Test, and wherein the at least one test
type includes at least one of: unit test, integration test, system
test, systems integration test, usability test, acceptance test,
and operability test.
8. The method of claim 1, wherein the RTVM is encoded within a
spreadsheet, and wherein each business requirement, system
requirement, and component requirement is depicted in a unique row
of the spreadsheet.
9. A computer program product, comprising a computer readable
hardware storage medium having a computer readable program code
stored therein, said computer readable program code configured to
be executed by a processor of a computer system to implement a
method for implementing a project for a customer, said method
comprising: said processor generating a computer readable
Requirements Traceability and Verification Matrix (RTVM); said
processor storing the RTVM in a computer readable storage device;
said processor verifying an acceptability of business requirements
of the project due to business requirements review (BRR) exit
criteria having been satisfied, an acceptability of system
requirements of the project due to system requirements review (SRR)
exit criteria having been satisfied, an acceptability of component
requirements of the project due to component requirements review
(PDR) exit criteria having been satisfied, wherein the business
requirements are decomposed into the system requirements and the
system requirements are decomposed into the component requirements;
in response to said verifying, said processor updating the RTVM
with verification information relating to the business
requirements, verification information relating to the system
requirements, and verification information relating to the
component requirements.
10. The computer program product of claim 9, wherein said verifying
further comprises verifying an acceptability of component designs
and test plans due to critical design review (CDR) exit criteria
having been satisfied, wherein the test plans are for testing the
component designs, wherein the component requirements are
decomposed into the component designs and the test plans, and
wherein said updating further comprises updating the RTVM with
verification information relating to the component designs and test
plans.
11. The computer program product of claim 10, said method
comprising said processor developing the business requirements, the
system requirements, the component requirements, and the component
designs and test plans, wherein said developing comprises said
verifying.
12. The computer program product of claim 9, wherein the RTVM
depicts hierarchical relationships between the business
requirements and the system requirements, and wherein the RTVM
depicts hierarchical relationships between the system requirements
and the component requirements.
13. The computer program product of claim 9, wherein the business
requirements, the system requirements, and the component requirements
are each identified in the RTVM by a symbolic code which
effectuates the hierarchical relationships between the business
requirements and the system requirements and the hierarchical
relationships between system requirements and the component
requirements.
14. The computer program product of claim 9, wherein the
verification information relating to the business requirements, the
system requirements, and the component requirements include
requirement status, affected design documents, affected build
components, customer acceptance criteria, a test method, a test
case, and a test result for at least one test type.
15. A computer system comprising a processor and a computer
readable memory unit coupled to the processor, said memory unit
containing computer readable program code configured to be executed
by the processor to implement a method for implementing a project
for a customer, said method comprising: said processor generating a
computer readable Requirements Traceability and Verification Matrix
(RTVM); said processor storing the RTVM in a computer readable
storage device; said processor verifying an acceptability of
business requirements of the project due to business requirements
review (BRR) exit criteria having been satisfied, an acceptability
of system requirements of the project due to system requirements
review (SRR) exit criteria having been satisfied, an acceptability
of component requirements of the project due to component
requirements review (PDR) exit criteria having been satisfied,
wherein the business requirements are decomposed into the system
requirements and the system requirements are decomposed into the
component requirements; in response to said verifying, said
processor updating the RTVM with verification information relating
to the business requirements, verification information relating to
the system requirements, and verification information relating to
the component requirements.
16. The computer system of claim 15, wherein said verifying further
comprises verifying an acceptability of component designs and test
plans due to critical design review (CDR) exit criteria having been
satisfied, wherein the test plans are for testing the component
designs, wherein the component requirements are decomposed into the
component designs and the test plans, and wherein said updating
further comprises updating the RTVM with verification information
relating to the component designs and test plans.
17. The computer system of claim 16, said method comprising said
processor developing the business requirements, the system
requirements, the component requirements, and the component designs
and test plans, wherein said developing comprises said
verifying.
18. The computer system of claim 15, wherein the RTVM depicts
hierarchical relationships between the business requirements and
the system requirements, and wherein the RTVM depicts hierarchical
relationships between the system requirements and the component
requirements.
19. The computer system of claim 15, wherein the business
requirements, the system requirements, and the component requirements
are each identified in the RTVM by a symbolic code which
effectuates the hierarchical relationships between the business
requirements and the system requirements and the hierarchical
relationships between system requirements and the component
requirements.
20. The computer system of claim 15, wherein the verification
information relating to the business requirements, the system
requirements, and the component requirements include requirement
status, affected design documents, affected build components,
customer acceptance criteria, a test method, a test case, and a
test result for at least one test type.
Description
[0001] This application is a continuation application claiming
priority to Ser. No. 12/557,891, filed Sep. 11, 2009; which is a
divisional application claiming priority to Ser. No. 10/839,583,
filed May 5, 2004, now U.S. Pat. No. 7,590,552, issued Sep. 15,
2009.
BACKGROUND OF THE INVENTION
[0002] 1. Technical Field
[0003] The present invention relates to a Systems Engineering
process for managing and implementing a project for a customer.
[0004] 2. Related Art
[0005] Current Systems Engineering techniques utilize subjective
principles and practices that rely primarily upon the
practitioner's experience and judgment to define, analyze, and
manage requirements, architectures, and designs. An objective
methodology that consistently applies these principles and
practices does not exist, which is one of the primary reasons why
many commercial projects fail. Thus, there is a need for a
consistent, structured, and flexible Systems Engineering method for
defining, analyzing, and managing requirements, architectures, and
designs for commercial projects.
SUMMARY OF THE INVENTION
[0006] The present invention provides a method for implementing a
project for a customer, said method comprising the steps of:
[0007] developing business requirements of the project, including
reviewing the business requirements for acceptability in accordance
with business requirements review (BRR) exit criteria;
[0008] developing system requirements of the project after the step
of developing business requirements, including reviewing the system
requirements for acceptability in accordance with system
requirements review (SRR) exit criteria, said business requirements
being decomposed into the system requirements; and
[0009] developing component requirements of the project after the
step of developing system requirements, including reviewing the
component requirements for acceptability in accordance with
preliminary design review (PDR) exit criteria, said system
requirements being decomposed into the component requirements.
[0010] The present invention provides a method for implementing a
project for a customer, comprising the steps of:
[0011] developing business requirements (BR) of the project,
including reviewing the business requirements for acceptability in
accordance with BRR exit criteria;
[0012] developing system requirements (SR) of the project after the
step of developing business requirements, including reviewing the
system requirements for acceptability in accordance with SRR exit
criteria, said business requirements being decomposed into the
system requirements;
[0013] developing component requirements of the project after the
step of developing system requirements, including reviewing the
component requirements (CR) for acceptability in accordance with
component requirements review (PDR) exit criteria, said system
requirements being decomposed into the component requirements;
[0014] developing component designs compatible with the component
requirements; and
[0015] developing test plans for testing the component designs
after the step of developing component requirements, including
reviewing the component designs and test plans for acceptability in
accordance with critical design review (CDR) exit criteria, said
component requirements being decomposed into the component designs
and the test plans;
[0016] providing a Requirements Traceability and Verification
Matrix (RTVM) as input to each of the BRR, SRR, and PDR, said RTVM
depicting hierarchical relationships between the business
requirements and the system requirements, said RTVM further
depicting hierarchical relationships between the system
requirements and the component requirements;
[0017] updating the RTVM with verification information relating to
the business requirements, after the BRR exit criteria have been
determined to be satisfied and before the step of developing system
requirements has been initiated;
[0018] updating the RTVM with verification information relating to
the system requirements, after the SRR exit criteria have been
determined to be satisfied and before the step of developing
component requirements has been initiated;
[0019] updating the RTVM with verification information relating to
the component requirements, after the PDR exit criteria have been
determined to be satisfied; and
[0020] updating the RTVM with verification information relating to
the component designs and test plans, after the CDR exit criteria
have been determined to be satisfied.
[0021] The present invention provides a computer program product,
comprising:
[0022] a computer usable medium having a computer readable
Requirements Traceability and Verification Matrix (RTVM) embodied
therein, said RTVM being used in conjunction with a method for
implementing a project for a customer, said method comprising the
steps of:
[0023] developing business requirements of the project, including
reviewing the business requirements for acceptability in accordance
with business requirements review (BRR) exit criteria;
[0024] developing system requirements of the project after the step
of developing business requirements, including reviewing the system
requirements for acceptability in accordance with system
requirements review (SRR) exit criteria, said business requirements
being decomposed into the system requirements; and
[0025] developing component requirements of the project after the
step of developing system requirements, including reviewing the
component requirements for acceptability in accordance with
component requirements review (PDR) exit criteria, said system
requirements being decomposed into the component requirements,
[0026] said RTVM comprising verification information relating to
the business requirements if the BRR exit criteria have been
satisfied,
[0027] said RTVM comprising verification information relating to
the system requirements if the SRR exit criteria have been
satisfied,
[0028] said RTVM comprising verification information relating to
the component requirements if the PDR exit criteria have been
satisfied,
[0029] said RTVM depicting hierarchical relationships between the
business requirements and the system requirements,
[0030] said RTVM further depicting hierarchical relationships
between the system requirements and the component requirements.
[0031] The present invention advantageously provides a consistent,
structured, and flexible Systems Engineering method for defining,
analyzing, and managing requirements, architectures, and designs
for commercial projects.
BRIEF DESCRIPTION OF THE DRAWINGS
[0032] FIG. 1 is a flow chart depicting an initial step followed by
six sequential steps of a Systems Engineering (SE) process for
implementing a project for a customer, in accordance with
embodiments of the present invention.
[0033] FIGS. 2A-2G depict details associated with the first
sequential step of FIG. 1, namely the step of developing business
requirements for the project, in accordance with embodiments of the
present invention.
[0034] FIGS. 3A-3H depict details associated with the second
sequential step of FIG. 1, namely the step of developing system
requirements, in accordance with embodiments of the present
invention.
[0035] FIGS. 4A-4H depict details associated with the third
sequential step of FIG. 1, namely the step of developing component
requirements, in accordance with embodiments of the present
invention.
[0036] FIGS. 5A-5H depict details associated with the fourth
sequential step of FIG. 1, namely the step of developing and
testing components, in accordance with embodiments of the present
invention.
[0037] FIGS. 6A-6G depict details associated with the fifth
sequential step of FIG. 1, namely the step of testing the system,
in accordance with embodiments of the present invention.
[0038] FIGS. 7A-7G depict details associated with the sixth
sequential step of FIG. 1, namely the step of putting the system
into production, in accordance with embodiments of the present
invention.
[0039] FIGS. 8A-8H collectively depict a Requirements Traceability
and Verification Matrix (RTVM), in accordance with embodiments of
the present invention.
[0040] FIG. 9 illustrates a computer system used for generating a
RTVM, in accordance with embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Overview
[0041] FIG. 1 is a flow chart depicting sequential steps 100, 200,
300, 400, 500, 600, and 700 of a Systems Engineering (SE) process
for implementing a project or program (hereinafter, "project") for
a customer, in accordance with embodiments of the present
invention. The SE process ensures that customer "requirements" and
expectations for the project are effectively and efficiently
identified, integrated, and managed. Requirements define the needs
and objectives of the stakeholders of the project. A "stakeholder"
is anyone who has a stake in the outcome of the project and may
include, inter alia, customers, business process owners,
contractors, software developers, testers, and systems engineers
who are managing SE reviews for the project. The purpose of
requirements management is to manage the requirements of the
project's products and product components and to identify
inconsistencies between those requirements and the project plans
and work products.
[0042] Step 100 initiates the project.
[0043] Step 200 develops business requirements for the project. The
detailed implementation of step 200 is described infra in
conjunction with FIGS. 2A-2G and FIGS. 8A-8H. The business
requirements are the fundamental requirements for the project and
may initially be specified or proposed by the customer.
[0044] Each business requirement is decomposed into one or more
system requirements. Thus, the system requirements associated with
a given business requirement are hierarchically related to the given
business requirement. Step 300 develops the system requirements and
system architecture. The detailed implementation of step 300 is
described infra in conjunction with FIGS. 3A-3H and FIGS.
8A-8H.
[0045] Each system requirement may be decomposed into one or more
component requirements. When a system requirement is completely
allocated to a single component requirement, the component
requirement and the system requirement are the same, which is the
meaning of the terminology "System requirement with no component
requirement" for system requirements S3.1 and S4.1 in FIG. 8A
described infra. In conjunction with the decomposition of the
system requirement into component requirements, the system
architecture is defined in terms of components. Thus, the component
requirements associated with a given system requirement are
hierarchically related to the given system requirement. Step 400
develops the component requirements and component architecture. The
detailed implementation of step 400 is described infra in
conjunction with FIGS. 4A-4H and FIGS. 8A-8H.
[0046] The following example illustrates a project, associated
business requirements, associated system requirements generated by
the business requirements, and associated component requirements
generated by the system requirements. In this example, the project
is to design and build a car. A business requirement is that the
car is to be able to decelerate. This business requirement may
include the system requirement of having a method to decelerate the
car. The system requirement of having a method to decelerate the
car is decomposed into braking components (i.e., wheel braking
components, hydraulic control components, etc.). In this example,
the system architecture is defined in terms of components which
make up the car, and one of the components which make up the car is
a braking component. The braking component is defined in terms of
the parts that make up the braking component.
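The decomposition in this example may be represented as a small tree of requirements. The following Python sketch is purely illustrative and is not part of the disclosed process; the class and the symbolic identifiers (e.g., "B1", "S1.1") are assumptions patterned on the symbolic codes of the RTVM described infra:

    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        """A requirement together with the requirements it decomposes into."""
        req_id: str    # symbolic code, e.g. "B1" -> "S1.1" -> "C1.1.1"
        text: str
        children: list = field(default_factory=list)

        def decompose(self, child: "Requirement") -> "Requirement":
            self.children.append(child)
            return child

    # Business requirement: the car is to be able to decelerate.
    b1 = Requirement("B1", "The car is able to decelerate")
    # Decomposed into a system requirement: a method to decelerate the car.
    s1 = b1.decompose(Requirement("S1.1", "Provide a method to decelerate the car"))
    # Decomposed into component requirements: the braking components.
    s1.decompose(Requirement("C1.1.1", "Wheel braking components"))
    s1.decompose(Requirement("C1.1.2", "Hydraulic control components"))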
[0047] Step 500 develops and tests the components. The detailed
implementation of step 500 is described infra in conjunction with
FIGS. 5A-5H and FIGS. 8A-8H.
[0048] Step 600 tests and verifies that the system meets the
requirements by conducting component tests and system tests for
each level of requirements in the system/component hierarchy. The
system pertaining to the project is the set of all methods,
components, software, documentation, interfaces, schedules, and the
like, which support the implementation of the project. The detailed
implementation of step 600 is described infra in conjunction with
FIGS. 6A-6G and FIGS. 8A-8H.
[0049] Step 700 puts the system into production by verifying that
all tests were completed successfully, the production
infrastructure has been updated with the new system, the revised
manual procedures have been documented, and the users have been
trained in the new procedures. The detailed implementation of step
700 is described infra in conjunction with FIGS. 7A-7G.
[0050] Each of steps 200, 300, 400, 500, 600, and 700 utilizes a
review template and a review scorecard that define and support
specific, measurable, and repeatable procedures. The review template
provides standardization and replication of the procedures by
dictating the detailed content and specific sequence of required SE
reviews. A review scorecard quantitatively measures how well the
project need is being met by a step (i.e., one of steps 200, 300,
400, 500, 600, and 700) as the step is being implemented and
completed. The review scorecard provides a mechanism for assessing
accuracy, completeness, quality, and risk associated with the step.
Thus the SE process of the present invention provides a structured
guide to the work required to complete each step of the SE process
and a measurable standard with which to determine the readiness and
risk to proceed forward with the project.
[0051] FIGS. 8A-8H depict a Requirements Traceability and
Verification Matrix (RTVM) which provides cumulative traceability
across the business requirements, system requirements, system
architecture, design elements, and test methods to ensure that the
system is verified and validated in all of its facets. The RTVM
effectively tracks the hierarchical relationships among the
business requirements, the system requirements, the component
requirements, and the architectural components. The RTVM is input
to, and is utilized in, each of steps 200, 300, 400, 500, and
600.
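For illustration, a row of such an RTVM may be sketched as the following record. This is an assumption-based sketch, not the disclosed spreadsheet layout; the verification fields mirror those recited in claim 6, and the parent link carries the hierarchical traceability:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class RTVMRow:
        """One RTVM row: a requirement plus its verification information."""
        req_id: str               # e.g. "B1", "S1.1", "C1.1.1"
        parent_id: Optional[str]  # hierarchical link (None for a business requirement)
        status: str               # requirement status
        design_documents: list    # affected design documents
        build_components: list    # affected build components
        acceptance_criteria: str  # customer acceptance criteria
        test_method: str          # Analysis, Demonstration, Inspection,
                                  # Simulation/Modeling, or Test
        test_case: str
        test_results: dict        # test result per test type, e.g. {"unit test": "pass"}

    def trace_chain(rtvm: dict, req_id: str) -> list:
        """Walk parent links upward to recover a requirement's traceability chain."""
        chain = [req_id]
        while rtvm[req_id].parent_id is not None:
            req_id = rtvm[req_id].parent_id
            chain.append(req_id)
        return chain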
[0052] Definitionally, the scope of the word "criteria", as used
herein, including in the claims, encompasses both singular and plural
forms; i.e., a "criteria" may encompass one criterion or
alternatively may encompass more than one criterion. For example,
the scope of "review exit criteria" encompasses any of: one
criterion, two criteria, three criteria, etc.
[0053] The SE process steps 200, 300, 400, 500, 600, and 700, as
well as the RTVM, are next described in detail.
Develop Business Requirements (Step 200)
[0054] FIGS. 2A-2G depict details associated with step 200 of FIG.
1, namely the step of developing business requirements for the
project, in accordance with embodiments of the present
invention.
[0055] FIG. 2A illustrates three aspects of step 200 to be utilized
in developing the business requirements, namely a Business
Requirements Review (BRR) template 210, a BRR scorecard 220, and
the RTVM 800. The BRR template 210 has BRR aspects 212, which
include an establishment and review of ground rules, goals, BRR
entry criteria, and BRR exit criteria. The phrases "entry criteria"
and "entrance criteria" are equivalent. The ground rules are rules
to be followed during the conduct of the BRR and are intended to
provide the systems engineer conducting the review with a clear set
of instructions for what the content of the review should or should
not include. The BRR entry criteria denotes criteria to be
satisfied in order to conduct the BRR. The BRR exit criteria
denotes criteria to be satisfied in order to complete the BRR.
Satisfying the BRR entry criteria and BRR exit criteria requires an
objective standard, such as achieving a minimum score relating to
the extent to which the BRR entry criteria and BRR exit criteria
are satisfied. The present invention teaches use of a scorecard for
implementing an objective scoring standard in terms of a numerical
score, namely the BRR scorecard 220. The BRR scorecard 220 has
aspects 222, which include a detailed scorecard and an associated
spider chart (described infra in conjunction with FIGS. 2F and 2G,
respectively). Upon completion of the BRR, the RTVM 800 is updated
to record the changes from the BRR as described infra in
conjunction with FIGS. 8A-8H.
[0056] FIG. 2B depicts process steps 230-238 of the BRR. Step 230
initiates the BRR. Steps 231-237 follow step 230. Updating the RTVM
via step 238 is the final process step of the BRR as described
supra in conjunction with the RTVM 800 shown in FIG. 2A. Step 231
establishes the ground rules for conducting the BRR, as described
infra in conjunction with FIG. 2C. Step 232 establishes goals and
objectives of the BRR, as described infra in conjunction with FIG.
2D. Step 233 reviews the BRR entry criteria for conducting the BRR,
as described infra in conjunction with FIG. 2E. Step 234 presents
materials needed for conducting the BRR session, as described infra
in conjunction with FIG. 2E. Step 235 records defects and issues
which emerge during the conduct of steps 231-234 and 237. Step 236
initiates verification that the BRR exit criteria have been
satisfied, as described infra in conjunction with FIG. 2E. Step 237
determines objectively in terms of a quantitative metric whether
the BRR exit criteria have been satisfied. If step 237 determines
that the BRR exit criteria have been satisfied, then steps 231-237
are exited and the RTVM 800 is updated to record the changes from
the BRR as described infra in conjunction with FIGS. 8A-8H. If step
237 determines that the BRR exit criteria have not been satisfied,
then steps 231-237 are selectively re-executed iteratively until
step 237 determines that the BRR exit criteria have been satisfied.
Note that there is no required sequential order for executing steps
231-237, and the scope of the present invention includes execution
of steps 231-237 in any desired order, including the possibility of
concurrent performance of some of the steps. The defects and issues
recorded in step 235 may provide logical and intuitive guidelines
for executing steps 231-237 in an order that makes sense in light
of the identified defects and issues. For example, if a defect or
issue relates to a seemingly unavoidable violation of a ground rule
of step 231, then it may be appropriate to revisit step 231 next to
assess whether the violated ground rule should be eliminated or
modified. As another example, if the defect or issue relates to an
inconsistency between a BRR exit criteria and a particular goal,
then it may be appropriate to revisit steps 232 and 236 iteratively
until consistency is established between the BRR exit criteria and
the particular goal.
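The overall control flow of FIG. 2B may be sketched as the following loop. This is illustrative only: the review object and its methods are hypothetical placeholders, and, as noted above, steps 231-237 may actually be executed in any order, even concurrently:

    def conduct_brr(review, rtvm):
        """Iterate the BRR steps of FIG. 2B until the exit criteria are met."""
        while True:
            review.establish_ground_rules()          # step 231
            review.establish_goals_and_objectives()  # step 232
            review.review_entry_criteria()           # step 233
            review.present_materials()               # step 234
            review.record_defects_and_issues()       # step 235
            review.initiate_exit_verification()      # step 236
            if review.exit_criteria_satisfied():     # step 237 (quantitative metric)
                break
            # Otherwise selectively re-execute the steps, guided by the
            # defects and issues recorded in step 235.
        rtvm.update_from(review)                     # step 238: update the RTVM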
[0057] FIG. 2C describes a review of ground rules to be followed
during the conduct of the BRR (see step 231 of FIG. 2B). In FIG.
2C, the indicated main ground rule that "no solutions are allowed"
means that the purpose of the BRR is to perform a review to
determine that the business requirements have been established, and
not to develop project solutions that focus on how those business
requirements will be implemented or "solutioned". FIG. 2C also
identifies three other ground rules.
[0058] The first other ground rule in FIG. 2C is that the review of
issues and the addressing of concerns are relative to identifying and
reviewing gaps in a document that describes the business process
flows for the project. The business process flows to be reviewed
for gaps are "as is" end to end (hereinafter, "e2e") processes,
which include manual touch point desk-level procedures. "End to
end" means from beginning to end so as to ensure that all business
threads of the business process flow are shown. "Manual touch
point" refers to any step in a process such that the step is
accomplished by an individual (i.e., a person). A "desk-level
procedure" implements process steps and sub-steps which are applied
across an organization. Accordingly, a "manual touch point
desk-level procedure" is a procedure utilized by an individual to
implement process steps and sub-steps which are applied across an
organization.
[0059] The second other ground rule in FIG. 2C is to review and
approve business requirements that will drive the project. Note
from FIG. 2C that functional and non-functional requirements are to
be checked. A functional requirement pertains to a function that a
system must perform. A non-functional requirement pertains to how
well the function is to be performed. For example, a functional
requirement may be that a car must be able to move from a first
point to a second point, while a non-functional requirement may be
that the car is able to accelerate from 0 to 60 mph in 5
seconds.
[0060] The third other ground rule in FIG. 2C is that stakeholders,
customers, and business process owners are each represented during
the BRR sessions, and that the business requirements and BRR exit
criteria must ultimately be signed off by (i.e., approved by) said
representatives. Note that the stakeholders may include the
customers and business process owners, as well as contractors and
systems engineers.
[0061] FIG. 2D describes establishing goals and objectives of the
BRR (see step 232 of FIG. 2B). The systems engineer conducting the
BRR can select from the list of goals for any BRR held. These goals
are intended to provide a guide to the activities the systems
engineer will need to accomplish in order to complete the BRR. This
list of goals can be tailored to each project. It is a goal in FIG.
2D to convey a clear understanding of the business scope,
objectives, and requirements that pertain to this project, and
associated acceptance/success criteria, and identify "as is"
business processes that will be affected. As a result of
establishing goals and objectives of the BRR, the business
requirements and BRR exit criteria are approved and baselined
(i.e., established for use in steps 300, 400, 500, 600, and 700 of
FIG. 1). Note that the procedure of establishing goals and
objectives of the BRR does not produce documents for the BRR, but
rather facilitates a gathering and revision of existing documents
relating to the BRR. As indicated in FIG. 2D, the audience (i.e.,
participants) for establishing goals and objectives of the BRR
include: stakeholders, customers, systems engineers, and a Solution
Project Manager (SPM) who is responsible for the e2e solution. As
described in FIG. 2D, establishing goals and objectives of the BRR
includes to: review business requirements and acceptance criteria
with customers and stakeholders; establish traceability and set a
baseline; validate that "as is" process flows exist and are
complete; review project plan milestones; review known risks (i.e.,
risk mitigation) and related issues, dependencies, and defects; and
capture new risks, issues, dependencies, and defects.
[0062] FIG. 2E describes a review of BRR entry criteria for the BRR
(see step 233 of FIG. 2B), BRR presentation content (see step 234
of FIG. 2B), and BRR exit criteria for the BRR (see step 236 of
FIG. 2B). The BRR entry criteria, BRR presentation content, and BRR
exit criteria are intended as a guide for the systems engineer
conducting the BRR. The items listed in the Entry Criteria column
are the documents and activities that must be completed prior to
the start of the BRR. The items listed in the Presentation column
are the documents and work products that must be presented to the
stakeholders and team members attending the BRR. The items listed
in the Exit Criteria column are the activities and documents that must
be completed before the review is considered complete. In FIG. 2E,
the BRR entry criteria, BRR presentation content, and BRR exit
criteria each pertain to the BRR criteria categories of: business
objectives and scope, "as is" business process flows, business
requirements, and business criteria and associated metrics. The
detailed descriptions and explanations of the BRR presentation
content and the BRR entry criteria and BRR exit criteria within
each BRR criteria category are contained within FIG. 2E.
[0063] In FIG. 2E, the phrase "top sheet" means summary page. In
the "as is" business process flows entry criteria, "M3" means
Measured, Monitored and Managed. End to end M3 pertains to the
function of connecting monitors and probes hardware to the physical
system in places such that progress through critical business
process can be watched and measurements (such as time, throughput,
etc.) are taken such that the system can be verified to be
operating as designed. Thus, the "as is" business process flows
entry criteria adds requirements to the design of the system so
that the aforementioned function can be performed when the system
is made operational.
[0064] FIG. 2F depicts a detailed scorecard (see step 237 of FIG.
2B) for scoring performance relating to the BRR exit criteria in
the BRR exit criteria categories of FIG. 2E. The BRR criteria
categories in FIG. 2F (called "scorecard criteria") are: business
objectives and scope, "as is" business process flows, business
requirements, and business criteria and associated metrics. In
FIG. 2F, each BRR criteria category includes one or more criteria.
For example, the BRR criteria category of business objectives and
scope includes the criteria of: validated and verified business
scope and objectives; business case reviewed (top sheet, etc.); and
identified issues, risks, and dependencies. Each criteria in FIG.
2F is scored and each criteria is assigned a weight. The criteria
for each BRR criteria category in FIG. 2F reflects the criteria
within the corresponding BRR exit criteria category of FIG. 2E.
[0065] Although FIG. 2F shows a same weight for each criteria of a
given BRR criteria category, the criteria weights associated with
the given BRR criteria category may generally be variable. The
weight of a BRR criteria category is equal to the sum of the
weights of its associated criteria. The weights of the BRR criteria
categories denote the relative importance of the four BRR criteria
categories in FIG. 2F and balance the influence that each BRR
criteria category will have on the Overall Review Score. The
criteria weights denote the relative importance of the criteria and
balance the influence that each criteria has on the Overall Review
Score. The various criteria weights may be established in any of
steps 231-237 (see FIG. 2B) or may be preassigned prior to the
conduct of the BRR. The criteria weights to be entered in the
"Weighting Factor" column may be predetermined based on an initial
assessment of the relative importance of the BRR criteria and may
also be updated as new pertinent information becomes available
during the BRR.
[0066] In some embodiments, the score S entered into the "USE THIS
COLUMN TO ENTER SCORES" column for the criteria are as follows for
S in a range of 0 to 4:
[0067] S=0 (no data available);
[0068] 0<S≤1 (critical issue(s); e.g., the issue(s) may prevent sign off);
[0069] 1<S≤2 (major defect(s) with possible workaround(s));
[0070] 2<S≤3 (major defect(s) with known workaround(s));
[0071] 3<S<4 (minor or no defects); and
[0072] S=4 (not applicable).
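For illustration, this rubric maps directly to a small Python helper (a sketch; the function name and return strings are paraphrases of the ranges above):

    def interpret_score(s):
        """Interpret a criterion score S in the range 0 to 4 per the rubric above."""
        if s == 0:
            return "no data available"
        if s <= 1:
            return "critical issue(s); may prevent sign off"
        if s <= 2:
            return "major defect(s) with possible workaround(s)"
        if s <= 3:
            return "major defect(s) with known workaround(s)"
        if s < 4:
            return "minor or no defects"
        return "not applicable"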
[0073] The "BRR Score" column for each criteria contains the
product of the values in the "Weighting Factor" and "USE THIS
COLUMN TO ENTER SCORES" columns relating to each criteria. The "BRR
Score" for a given BRR criteria category is the sum of the "BRR
Scores" of the criteria pertaining to the given BRR criteria
category. The value in the "USE THIS COLUMN TO ENTER SCORES" column
for the BRR criteria category is the value in the "BRR Score"
column divided by the value in the "Weighting Factor" column of
the BRR criteria category. Thus the Overall Review Score is the sum
of the BRR Scores of the criteria or, equivalently, the sum of the
BRR Scores of the BRR criteria categories. The Overall Review Score
may be normalized to be in a range of 0 to 100. For this range of 0
to 100, the BRR Score of the criteria and BRR criteria categories
may be interpreted as a percent contribution to the Overall Review
Score. The Overall Review Score may be computed by software that is
stored on a computer readable medium and is executed by a processor
of a computer system.
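The scorecard arithmetic just described may be sketched in Python as follows (illustrative only; the patent contemplates a spreadsheet tool, the function and variable names here are assumptions, and handling of "not applicable" scores is omitted for brevity):

    def brr_scores(categories):
        """Compute the BRR Score per criteria category and the Overall Review Score.

        categories maps a BRR criteria category name to a list of
        (raw_score, weighting_factor) pairs, with raw scores in 0 to 4.
        """
        # BRR Score of a category = sum of (score * weight) over its criteria.
        category_scores = {
            name: sum(score * weight for score, weight in criteria)
            for name, criteria in categories.items()
        }
        # Normalize the grand total to an Overall Review Score in 0 to 100.
        max_total = 4 * sum(weight
                            for criteria in categories.values()
                            for _, weight in criteria)
        overall = 100 * sum(category_scores.values()) / max_total
        return category_scores, overall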
[0074] Various algorithms may be used to determine whether the BRR
process has been successfully completed. In a first exemplary
algorithm, the BRR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the BRR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the BRR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the BRR
process has not been successfully completed, then steps 231-237 are
selectively re-executed iteratively until the pertinent algorithm
determines that the BRR process has been successfully
completed.
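These exemplary algorithms may be expressed as simple predicates over the scores (a sketch assuming the category scores and Overall Review Score have been normalized to 0 to 100; the constant threshold of 85 is the example value from the text, although the text also permits criteria-dependent thresholds):

    def passes_first_algorithm(overall_score, threshold=85):
        # First algorithm: the Overall Review Score alone must meet the threshold.
        return overall_score >= threshold

    def passes_second_algorithm(category_scores, threshold=85):
        # Second algorithm: every criteria category score must meet the threshold.
        return all(s >= threshold for s in category_scores.values())

    def passes_third_algorithm(overall_score, category_scores, threshold=85):
        # Third algorithm: both of the preceding conditions must hold.
        return (passes_first_algorithm(overall_score, threshold)
                and passes_second_algorithm(category_scores, threshold))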
[0075] FIG. 2G is a spider chart for graphically representing the
BRR criteria category scores tabulated in the scorecard of FIG. 2F.
Each axis of FIG. 2G represents a BRR criteria category of FIG. 2F
such that points 2A, 2B, 2C, and 2D respectively represent the BRR
scores of the BRR criteria category of: business objectives and
scope, "as is" business process flows, business requirements, and
business criteria and associated metrics. The dashed polygons
identify the scores at the intersections between the dashed
polygons and the four axes. The points 2A, 2B, 2C, and 2D define a
polygon 2P that is useful for visualizing the BRR criteria category
scores relative to each other and also for visualizing the score of
each BRR criteria category in relation to the pertinent threshold
score (e.g., 85). Although not shown in FIG. 2G, a threshold score
to ultimately be satisfied by the BRR criteria category scores
and/or Overall Review Score could also be represented on FIG. 2G.
For example, if the threshold score for the Overall Review Score is
85, then a heavily bolded polygon having the value 85 (i.e.,
between the dashed polygons having values of 80 and 90), could be
superimposed onto FIG. 2G. The spider chart of FIG. 2G may be
generated by a software tool or algorithm (e.g., a graphics
plotting tool), wherein the software tool or algorithm is stored on
a computer readable medium and is executed by a processor of a
computer system.
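Such a spider chart may be generated with any standard plotting library; the following matplotlib sketch (one possible tool, not the one prescribed by the invention) plots example BRR criteria category scores together with a superimposed threshold polygon at 85:

    import numpy as np
    import matplotlib.pyplot as plt

    labels = ["business objectives and scope", '"as is" business process flows',
              "business requirements", "business criteria and associated metrics"]
    scores = [90, 80, 85, 70]  # example BRR criteria category scores
    angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()

    # Close each polygon by repeating its first point.
    scores_closed = scores + scores[:1]
    angles_closed = angles + angles[:1]
    threshold_closed = [85] * (len(labels) + 1)

    ax = plt.subplot(polar=True)
    ax.plot(angles_closed, scores_closed, marker="o", label="BRR category scores")
    ax.plot(angles_closed, threshold_closed, linestyle="--", label="threshold (85)")
    ax.set_xticks(angles)
    ax.set_xticklabels(labels, fontsize=8)
    ax.set_ylim(0, 100)
    ax.legend(loc="lower right", fontsize=8)
    plt.show()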
[0076] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the BRR entry
criteria categories of FIG. 2E in a manner analogous to the use of
the detailed scorecard and spider chart of FIGS. 2F and 2G,
respectively, in relation to scoring performance for the BRR exit
criteria categories of FIG. 2E.
[0077] If the pertinent algorithm determines that the BRR process
has been successfully completed, then the RTVM is next updated as
described infra in conjunction with FIGS. 8A-8H, followed by
execution of the Develop System Requirements step 300 of FIG.
1.
[0078] In accordance with the preceding discussion, the Develop
Business Requirements step 200 (see FIG. 1) of the SE process
addresses developing the business requirements for the project. The
BRR maps out clear, specific steps to develop, evaluate, and
finalize the business requirements, and then obtain stakeholder
(e.g., customer) agreement. The BRR template (see FIG. 2A), along with the
BRR scorecard (FIG. 2F), and the RTVM (see FIG. 2A), are used to
establish the business requirements. The BRR template maps out a
clear, specific set of steps to evaluate the business requirements.
The BRR scorecard quantitatively measures how well the business
requirements satisfy the business needs established by the
customer. The RTVM provides traceability from the business
requirements, systems requirements, design elements and test
methods to ensure that the system is verified and validated. The
RTVM links steps 200, 300, 400, 500, and 600 shown in FIG. 1 and is
described infra in detail in conjunction with FIGS. 8A-8H.
[0079] In accordance with the preceding discussion, the Develop
Business Requirements step 200 (see FIG. 1) develops the business
requirements for the project. The BRR template, along with the BRR
scorecard (FIG. 2F) and the RTVM are used to define the business
level solution requirements as well as the IT solution approach to
meet the business requirements. The BRR template maps out a clear,
specific set of steps to evaluate the business requirements. The
BRR scorecard (see FIG. 2F) quantitatively measures how well
the
requirements are being developed. The RTVM provides traceability
from the assigned requirements to the design elements and test
methods to ensure that all requirements can be validated.
[0080] The approved business requirements form a business
requirements baseline, which defines the boundaries of a project
and enables both the SE team and the customer to formally agree on
the scope of the project as well as the Information Technology (IT)
solution approach to meet the requirements. The present invention
provides a structured guide to the work required to complete the
BRR step of the SE process and a measurable standard with which to
determine the readiness and risk to proceed forward with the
project.
[0081] The BRR template provides standardization and replication of
the BRR review steps shown in FIG. 2B for any project using the SE
process by dictating the detailed content and specific sequence of
the BRR, resulting in a measurable improvement in quality, content
and consistency of all BRRs. A goal of the BRR is to ensure the
business scope, objectives, process flows, requirements and success
criteria for the project are established and agreed to by the SE
team, the customer, and other stakeholders. Successful completion
of the BRR reduces project risk by identifying defects and issues.
Successful completion of the BRR is achieved when a sufficient
number of defects and issues have been resolved to obtain a success
score (i.e., passing score) as described supra in conjunction with
FIG. 2F.
[0082] To achieve successful completion of the BRR, the present
invention lists clear, standardized objectives and ground rules for
the BRR to ensure the review goals are met. To achieve successful
completion of the BRR, the present invention also establishes a
standardized set of BRR entry criteria, BRR presentation content,
and BRR exit criteria that must be met in order to successfully
complete the review. The BRR entry criteria list the information to
be presented within the review. The BRR presentation material of
FIG. 2E further clarifies what information should be presented and
how it can be presented. The BRR exit criteria delineate the
requirements for accepting the technical information as complete.
If the level of detail required in the BRR review template is not
available, then the practitioner is not ready to hold a BRR.
[0083] To achieve successful completion of the BRR, the present
invention also requires completion of the BRR scorecard to provide
a measurable evaluation of the review's success.
[0084] When the BRR is conducted as described herein, a list of
defects and issues is recorded as well as a quantitative measure
(i.e., score) which evaluates the content and completeness of the
business requirements. The score is tied to the defects and issues
that have been recorded. Correction of the defects and resolution
of the issues are essential to the creation of the business
requirements baseline. As the defects are corrected and the issues
resolved, the BRR criteria are periodically reapplied to evaluate
the content and completeness of the requirements and a new
quantitative measure is developed. The quantitative measure is used
to identify and quantify risk. The business requirements are not
baselined until a minimum success score (i.e., minimum acceptable
success score; e.g., 85, 80-90, or 85-100 in the Overall Review
Score of FIG. 2F) has been achieved.
[0085] The BRR scorecard is a quantitative and consistent method to
measure the quality and completeness of the project's business
scope, objectives, process flows, requirements and success criteria
(i.e., acceptance criteria) according to a comprehensive set of
success criteria. The BRR scorecard provides a mechanism for
scoring the proposed or tentative business requirements baseline
for accuracy, completeness, quality, and risk. The BRR scorecard
includes the BRR criteria and weighting for the BRR criteria in a
software tool (e.g., a commercially available tool such as the
Microsoft Excel® spreadsheet tool) to quickly and uniformly
generate an Overall Review Score. The success criteria can be
easily tailored to address specific project complexity or scope
situations. Using a uniform (i.e., project independent) success
criteria allows teams to conduct analysis across all projects and
develop organizational lessons learned with corrective actions.
However, the scope of the present invention also includes
embodiments in which the success criteria are project
dependent.
[0086] The present invention measures the quality of the business
requirements baseline with respect to the prescribed BRR exit
criteria by providing: guidelines for performing the scoring; a
prescribed set of scoring criteria for each question (see FIG. 2F);
a prescribed set of questions (see FIG. 2F) for the BRR exit
criteria which specifically address the defined goals of the BRR
(See FIG. 2D); and a weighting factor (see FIG. 2F) for each
question to accurately reflect the significance of each criteria in
regard to the overall evaluation.
[0087] An output of the BRR scorecard may be an Overall Review
Score (see FIG. 2F) such as between 0 and 100 which: rates the
quality of the material presented at the BRR; specifies the issues
and defects that were discovered during the BRR; and states the
technical risk associated with the proposed business requirement
baseline presented at the BRR.
[0088] The Overall Review Score may be mapped to "Red" (critical),
"Yellow" (caution), "Green" (satisfactory) status for a summary
assessment of the overall condition of the business requirements of
the project. The BRR scorecard (see FIG. 2F) automatically
generates a "spider chart" (see FIG. 2G) as a graphical
representation of the scoring per criteria. The spider chart is a
visual reference to highlight the problem areas discovered during
the BRR and measured by the Overall Review Score for the BRR.
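One possible banding of the Overall Review Score into the Red/Yellow/Green statuses is sketched below; the band boundaries are illustrative assumptions, since the text does not prescribe them:

    def status_color(overall_review_score, green_at=85, yellow_at=70):
        """Map the Overall Review Score to a summary status (band edges assumed)."""
        if overall_review_score >= green_at:
            return "Green"   # satisfactory
        if overall_review_score >= yellow_at:
            return "Yellow"  # caution
        return "Red"         # critical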
[0089] When the BRR is completed and the related defects/issues
have been resolved, then the business requirements are used to
define the system architecture and requirements.
Develop System Requirements (Step 300)
[0090] FIGS. 3A-3H depict details associated with step 300 of FIG.
1, namely the step of developing system requirements, in accordance
with embodiments of the present invention. The system requirements
are hierarchically associated with the established business
requirements for the project, since each business requirement may
be decomposed into one or more system requirements.
[0091] FIG. 3A illustrates three aspects of step 300 to be utilized
in developing the system requirements, namely a System Requirements
Review (SRR) template 310, a SRR scorecard 320, and the RTVM 800.
The SRR template 310 has SRR aspects 312, which include an
establishment and review of ground rules, goals, SRR entry
criteria, and SRR exit criteria. The ground rules are rules to be
followed during the conduct of the SRR and are intended to provide
the systems engineer conducting the review with a clear set of
instructions for what the content of the review should or should
not include. The SRR entry criteria denotes criteria to be
satisfied in order to conduct the SRR. The SRR exit criteria
denotes criteria to be satisfied in order to complete the SRR.
Satisfying the SRR entry criteria and SRR exit criteria requires an
objective standard, such as achieving a minimum score relating to
the extent to which the SRR entry criteria and SRR exit criteria
are satisfied. The present invention uses a scorecard for
implementing an objective scoring standard in terms of a numerical
score, namely the SRR scorecard 320. The SRR scorecard 320 has
aspects 322, which include a detailed scorecard and an associated
spider chart (described infra in conjunction with FIGS. 3G and 3H,
respectively). Upon completion of the SRR, the RTVM 800 is updated
to record the changes from the SRR as described infra in
conjunction with FIGS. 8A-8H.
[0092] FIG. 3B depicts process steps 330-338 of the SRR. Step 330
initiates the SRR. Steps 331-338 follow step 330. Updating the RTVM
800 (see FIG. 3A) is the final process step of the SRR after steps
331-338 have been completely executed. Step 331 establishes goals
and objectives of the SRR, as described infra in conjunction with
FIG. 3C. Step 332 establishes the ground rules for conducting the
SRR, as described infra in conjunction with FIG. 3D. Step 333
reviews the SRR entry criteria for conducting the SRR, as described
infra in conjunction with FIGS. 3E-3F. Step 334 presents materials
needed for conducting the SRR session, as described infra in
conjunction with FIGS. 3E-3F. Step 335 records defects and issues
which emerge during the conduct of steps 331-334 and 336-338. Step
336 utilizes the RTVM 800 (see FIG. 3A) to review the requirements
traceability with respect to the business requirements and system
requirements. Step 337 initiates verification that the SRR exit
criteria have been satisfied, as described infra in conjunction
with FIGS. 3E-3F. Step 338 determines objectively in terms of a
quantitative metric whether the SRR exit criteria have been
satisfied. If step 338 determines that the SRR exit criteria have
been satisfied, then steps 331-338 are exited and the RTVM 800 is
updated to record the changes from the SRR as described infra in
conjunction with FIGS. 8A-8H. If step 338 determines that the SRR
exit criteria have not been satisfied, then steps 331-338 are
selectively re-executed iteratively until step 338 determines that
the SRR exit criteria have been satisfied. Note that there is no
required sequential order for executing steps 331-338, and the
scope of the present invention includes execution of steps 331-338
in any desired order, including the possibility of concurrent
performance of some of the steps. The defects and issues recorded
in step 335 may provide logical and intuitive guidelines for
executing steps 331-338 in an order that makes sense in light of
the identified defects and issues. For example, if the defect or
issue relates to an inconsistency between a system requirement and
a particular goal, then it may be appropriate to revisit steps 331
and 337 iteratively until consistency is established between the
system requirement and the particular goal. Once the defect is
reconciled, the re-scoring is done and step 338 redetermines
whether the SRR exit criteria have been satisfied.

FIG. 3C
describes establishing goals and objectives of the SRR (see step
331 of FIG. 3B). The systems engineer conducting the SRR can select
from the list of goals for any SRR held. These goals are intended
to provide a guide to the activities the systems engineer will need
to accomplish in order to complete the SRR. This list of goals can
be tailored to each project. It is a goal in FIG. 3C to convey a
clear understanding of the business/stakeholder needs, rationale
and priorities, and review the system level solution requirements
and the information technology (IT) solution approach, and obtain
customer concurrence on system requirements/architecture. As
described in FIG. 3C, the establishing of goals and objectives of
the SRR includes: review and approve documented system requirements
and architecture; establish traceability; establish the technical
baseline; identify technical risks; review mitigation plans;
identify dependencies; establish plans; and identify technical
performance measures.
[0093] FIG. 3D describes a review of ground rules to be followed
during the conduct of the SRR (see step 332 of FIG. 3B). The
indicated ground rules comprise: review documentation and address
the listed concerns; and key stakeholders and key business process
owners are present during the SRR and sign off on the solution
(system) requirements/architecture and the solution (system) scope.
The "key stakeholders" and "key business process owners" in the SRR
ground rules refer to representatives of each class of stakeholder
and each class of business process owner. In other words, while it
is desirable that as many stakeholders and business process owners
in each class be involved in implementing the ground rules (e.g.,
being present during the SRR), the actual ground rules require that
each stakeholder class be represented regardless of how many
members in each stakeholder class collectively fulfill the
representation requirement.
[0094] FIGS. 3E-3F collectively describe a review of SRR entry
criteria for the SRR (see step 333 of FIG. 3B), SRR presentation
content (see step 334 of FIG. 3B), and SRR exit criteria for the
SRR (see step 337 of FIG. 3B). The SRR entry criteria, SRR
presentation content, and SRR exit criteria are intended as a guide
for the systems engineer conducting the SRR. The items listed in
the Entry Criteria column are the documents and activities that
must be completed prior to the start of the SRR. The items listed
in the Presentation column are the documents and work products that
must be presented to the stakeholders and team members attending
the SRR. The items listed in the Exit Criteria column are the
activities and documents that must be completed before the review
is considered complete. In FIGS. 3E-3F, the SRR entry criteria, SRR
presentation content, and SRR exit criteria each pertain to the SRR
criteria categories of: business requirements and process
definition, system requirements definition, system level
architecture, acceptance criteria, and requirements traceability.
The detailed descriptions and explanations of the SRR presentation
content and the SRR entry criteria and SRR exit criteria within
each SRR criteria category are contained within FIGS. 3E-3F.
[0095] FIG. 3G depicts a detailed scorecard (see step 338 of FIG.
3B) for scoring performance relating to the SRR exit criteria in
the SRR exit criteria categories of FIGS. 3E and 3F. The SRR
criteria categories in FIG. 3G (called "scorecard criteria") are:
business requirements and process definition, system requirements
definition, system level architecture, requirements traceability,
and acceptance criteria. In FIG. 3G, each SRR criteria category
includes one or more criteria. For example, the SRR criteria
category of system level architecture includes the criteria of:
customer concurs that the system level architecture satisfies the
business requirements; dynamic and static architectures are defined
and complete; system level architecture matches system
requirements; `to be` system landscape is defined; and `to be`
business flows are defined. The criteria for each SRR criteria
category in FIG. 3G reflect the criteria within the corresponding
SRR exit criteria category of FIGS. 3E-3F. Each criteria in FIG. 3G
is scored and each criteria is assigned a weight. The scores in
FIG. 3G are analogous to the scores in FIG. 2F described supra. The
meaning and use of the weights and the interpretation of the
"Weighting Factor", "SRR Score", and "USE THIS COLUMN TO ENTER
SCORES" columns in FIG. 3G are analogous to the corresponding
columns of FIG. 2F described supra. Similarly, the Overall Review
Score in FIG. 3G is analogous to the corresponding Overall Review
Score in FIG. 2F described supra and may be computed by any of the
techniques described supra for computing the Overall Review Score
relating to the BRR.
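By way of illustration only, the weighted scoring just described may
be sketched as follows. The criterion names, weights, and entered
scores below are hypothetical placeholders (they are not taken from
FIG. 3G), and the Overall Review Score is assumed to be a
weight-normalized average on a 0-100 scale, which is one of the
techniques described supra for the BRR.

```python
# Minimal sketch of a weighted review scorecard. Criterion names,
# weights, and entered scores are hypothetical placeholders; the
# Overall Review Score is assumed to be the weight-normalized
# average of the criteria scores on a 0-100 scale.
srr_scorecard = {
    # criterion: (weighting factor, score entered by the reviewer)
    "system level architecture matches system requirements": (3, 85),
    "dynamic and static architectures defined and complete": (2, 75),
    "'to be' system landscape is defined": (1, 80),
    "'to be' business flows are defined": (1, 70),
}

def overall_review_score(scorecard):
    """Weight-normalized average of the criteria scores (0-100)."""
    total_weight = sum(weight for weight, _ in scorecard.values())
    weighted_sum = sum(weight * score for weight, score in scorecard.values())
    return weighted_sum / total_weight

print(f"Overall Review Score: {overall_review_score(srr_scorecard):.1f}")
```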
[0096] Various algorithms may be used to determine whether the SRR
process has been successfully completed. In a first exemplary
algorithm, the SRR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the SRR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the SRR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the SRR
process has not been successfully completed, then steps 331-338 are
selectively re-executed iteratively until the pertinent algorithm
determines that the SRR process has been successfully
completed.
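The three exemplary algorithms may be expressed compactly, as in the
following sketch. It assumes criteria category scores that have
already been aggregated onto a 0-100 scale; the category names,
scores, and threshold values shown are illustrative only.

```python
# Sketch of the three exemplary completion algorithms described
# above. Category names, scores, and thresholds are illustrative.
category_scores = {
    "business requirements and process definition": 88,
    "system requirements definition": 91,
    "system level architecture": 84,
    "requirements traceability": 90,
    "acceptance criteria": 86,
}
overall_score = 87.8

def algorithm_one(overall, threshold=85):
    # Success if the Overall Review Score meets the threshold.
    return overall >= threshold

def algorithm_two(scores, threshold=85):
    # Success if each criteria category score meets the threshold;
    # pass a dict for a criteria-category-dependent threshold.
    if isinstance(threshold, dict):
        return all(score >= threshold[cat] for cat, score in scores.items())
    return all(score >= threshold for score in scores.values())

def algorithm_three(scores, overall, cat_threshold=85, overall_threshold=85):
    # Success only if both the per-category and overall tests pass.
    return (algorithm_two(scores, cat_threshold)
            and algorithm_one(overall, overall_threshold))

print(algorithm_one(overall_score), algorithm_two(category_scores),
      algorithm_three(category_scores, overall_score))
```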
[0097] FIG. 3H is a spider chart for graphically representing the
SRR criteria category scores tabulated in the scorecard of FIG. 3G.
Each axis of FIG. 3H represents an SRR criteria category of FIG. 3G
such that points 3A, 3B, 3C, 3D, and 3E respectively represent the
SRR scores of the SRR criteria category of: business requirements
and process definition, system requirements definition, system
level architecture, requirements traceability, and acceptance
criteria. The dashed polygons identify the scores at the
intersections between the dashed polygons and the five axes. The
points 3A, 3B, 3C, 3D, and 3E define a polygon 3P that is useful
for visualizing the SRR criteria category scores relative to each
other and also for visualizing the score of each SRR criteria
category in relation to the pertinent threshold score (e.g., 85).
Although not shown in FIG. 3H, a threshold score to ultimately be
satisfied by the SRR criteria category scores and/or Overall Review
Score could also be represented on FIG. 3H. For example, if the
threshold score for the Overall Review Score is 85, then a heavily
bolded polygon having the value 85 (i.e., between the dashed
polygons having values of 80 and 90), could be superimposed onto
FIG. 3H. The spider chart of FIG. 3H may be generated by a software
tool or algorithm (e.g., a graphics plotting tool), wherein the
software tool or algorithm is stored on a computer readable medium
and is executed by a processor of a computer system.
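As one hypothetical illustration of such a software tool, the
following sketch plots criteria category scores on polar axes and
superimposes a dashed threshold polygon at a score of 85. The
category names and scores are placeholders, and matplotlib is
assumed as the graphics plotting tool.

```python
# Sketch of generating a spider chart of SRR criteria category
# scores on polar axes. Names and scores are placeholders.
import math
import matplotlib.pyplot as plt

categories = ["business reqs/process", "system reqs definition",
              "system level architecture", "reqs traceability",
              "acceptance criteria"]
scores = [88, 91, 84, 90, 86]

# One axis per category, evenly spaced; repeat the first point so
# the plotted polygon closes on itself (the analog of polygon 3P).
angles = [2 * math.pi * i / len(categories) for i in range(len(categories))]
angles.append(angles[0])
closed_scores = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, closed_scores, marker="o")
ax.fill(angles, closed_scores, alpha=0.25)
ax.plot(angles, [85] * len(angles), linestyle="--")  # threshold polygon
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories, fontsize=7)
ax.set_ylim(0, 100)
fig.savefig("srr_spider_chart.png")
```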
[0098] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the SRR entry
criteria of FIGS. 3E-3F in a manner analogous to the use of the
detailed scorecard and spider chart of FIGS. 3G and 3H,
respectively, in relation to scoring performance for the SRR exit
criteria of FIGS. 3E-3F.
[0099] If the pertinent algorithm determines that the SRR process
has been successfully completed, then the RTVM is next updated as
described infra in conjunction with FIGS. 8A-8H, followed by
execution of the Develop Component Requirements step 400 of FIG.
1.
[0100] In accordance with the preceding discussion, the Develop
System Requirements step 300 (see FIG. 1) of the SE process reviews
developing the system baseline. The SRR template (see FIG. 3A),
along with the SRR scorecard (see FIG. 3G), and the RTVM (see FIG.
3A) are used to define the system level requirements as well as the
IT solution approach to meet the system requirements. The SRR
template maps out a clear, specific set of steps to evaluate the
system requirements. The SRR scorecard quantitatively measures how
well the system requirements support the business requirements. The
RTVM provides traceability from the assigned requirements to the
design elements and test methods to ensure that all requirements
can be validated.
[0101] The present invention enables the SE team, the
customer, and other stakeholders to agree on the scope of the
system and formally baseline the technical scope of the project.
The present invention provides a structured guide to the work
required to complete the Develop System Requirements 300 step (see
FIG. 1) of the SE process and a measurable standard with which to
determine the readiness and risk to proceed forward with the
project.
[0102] The SRR review template (see FIG. 3A) provides
standardization and replication of the SRR review steps (see FIG.
3B) for any project using the SE process by dictating the detailed
content and specific sequence of the SRR, resulting in a measurable
improvement in quality, content, and consistency of all SRRs. The
overall goal of the SRR is to establish and formally baseline the
system requirements. Successful completion of the SRR will reduce
project risk by identifying defects and issues. Successful
completion of the SRR is achieved when a sufficient number of
defects and issues have been resolved to obtain a success score as
described supra in conjunction with FIG. 3G.
[0103] To achieve successful completion of the SRR, the present
invention lists clear, standardized ground rules and objectives for
the SRR to ensure the review goals are met.
[0104] To achieve successful completion of the SRR, the present
invention also establishes a standardized set of SRR entry
criteria, SRR presentation content, and SRR exit criteria (see
FIGS. 3E-3F) that must be met in order to successfully complete the
review. The SRR entry criteria list the information to be presented
within the review. The SRR presentation content further clarifies
what information should be presented and how it can be presented.
The SRR exit criteria delineate the requirements for accepting the
technical information as complete. If the level of detail required
in the SRR review template is not available, then the practitioner
is not ready to hold an SRR.
[0105] To achieve successful completion of the SRR, the present
invention also requires completion of the SRR scorecard (see FIG.
3G) to provide a measurable evaluation of the review's success.
[0106] When the SRR is conducted via the method outlined herein, a
list of defects and issues is recorded as well as a quantitative
measure (i.e., score) which evaluates the content and completeness
of the system and business requirements. The score is tied to the
defects and issues that have been recorded. Correction of the
defects and resolution of the issues are essential to the creation
of the system requirements baseline. As the defects are corrected
and the issues resolved, the SRR criteria are periodically reapplied
to evaluate the content and completeness of the requirements and a
new quantitative measure is developed. The quantitative measure is
used to identify and quantify risk. The system requirements are not
baselined until a minimum success score (i.e., minimum acceptable
success score; e.g., 85, 80-90, or 85-100 in the Overall Review
Score of FIG. 3G) has been achieved.
[0107] The SRR scorecard is a quantitative and consistent method to
measure the quality and completeness of the project's business and
system requirements, system level architecture, requirements
traceability, and acceptance criteria according to a comprehensive
set of criteria. It provides a mechanism for scoring the system
requirements baseline for accuracy, completeness, quality and risk.
The SRR scorecard contains the SRR criteria and weighting in a
software tool (e.g., a commercially available tool such as the
Microsoft Excel® spreadsheet tool) to quickly and uniformly
generate a review score. The success criteria can be easily
tailored to address specific project complexity or scope
situations. Using a uniform (i.e., project independent) criteria
allows teams to conduct analysis across all projects and develop
organization lessons learned with corrective actions. However, the
scope of the present invention also includes embodiments in which
the success criteria are project dependent.
[0108] The present invention measures the quality of the prescribed
SRR exit criteria by providing guidelines for performing the
scoring; a prescribed set of scoring criteria for each question
(see FIG. 3G); a prescribed set of questions/criteria (see FIG. 3G)
which specifically address the defined goals of the SRR (see FIG.
3C); and a weighting factor (see FIG. 3G) for each
question/criteria to accurately reflect the significance of each
element in regard to the overall evaluation.
[0109] An output of the SRR scorecard may be an Overall Review
Score (see FIG. 3G), such as a score between 0 and 100, which: rates the
quality of the material presented at the SRR; specifies the issues
and defects that were discovered during the SRR; and states the
technical risk associated with the system requirements baseline
that was presented at the SRR.
[0110] The Overall Review Score may be mapped to "Red" (critical),
"Yellow" (caution), "Green" (satisfactory) status for a summary
assessment of the overall condition of the system requirements of
the project. The SRR scorecard (see FIG. 3G) automatically
generates a "spider chart" (see FIG. 3H) as a graphical
representation of the scoring per criteria. The spider chart is a
visual reference to highlight the problem areas discovered during
the SRR and measured by the Overall Review Score for the SRR.
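A minimal sketch of such a status mapping follows. The cutoff
values are hypothetical, since the present discussion specifies the
Red/Yellow/Green categories but not their exact boundaries.

```python
# Sketch of mapping an Overall Review Score to a Red/Yellow/Green
# summary status. The cutoff values are hypothetical placeholders.
def review_status(overall_score, green_at=85, yellow_at=70):
    if overall_score >= green_at:
        return "Green"   # satisfactory
    if overall_score >= yellow_at:
        return "Yellow"  # caution
    return "Red"         # critical

print(review_status(87.8))  # -> "Green"
```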
[0111] When the SRR is completed and the related defects/issues
have been resolved, then the system requirements are used to define
the component architecture and requirements.
Develop Component Requirements (Step 400)
[0112] FIGS. 4A-4H depict details associated with step 400 of FIG.
1, namely the step of developing component requirements and
component architecture, in accordance with embodiments of the
present invention. The component requirements are hierarchically
associated with the established system requirements for the
project, since each system requirement may be decomposed into one
or more component requirements. The component requirements relate
to hardware elements, software modules, or processes required to
fulfill the business requirements. Developing component
requirements includes defining the architectural components.
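The hierarchical decomposition just described may be sketched as a
simple tree structure, as follows. The requirement identifiers and
texts are hypothetical placeholders, not content from the figures.

```python
# Sketch of the hierarchical decomposition of requirements: each
# system requirement may be decomposed into one or more component
# requirements. Identifiers and texts are placeholders.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    children: list = field(default_factory=list)  # decomposed requirements

    def decompose(self, child):
        """Record that this requirement is decomposed into `child`."""
        self.children.append(child)
        return child

business = Requirement("BR-1", "customers can place orders online")
system = business.decompose(
    Requirement("SR-1.1", "the system accepts web orders"))
system.decompose(Requirement("CR-1.1.1", "order-entry software module"))
system.decompose(Requirement("CR-1.1.2", "order database hardware element"))
```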
[0113] FIG. 4A illustrates three aspects of step 400 to be utilized
in developing the component requirements, namely a Preliminary
Design Review (PDR) template 410, a PDR scorecard 420, and the RTVM
800. The PDR template 410, PDR scorecard 420, and the RTVM 800
collectively define the component requirements. The PDR template
410 has PDR aspects 412, which include an establishment and review
of ground rules, goals, PDR entry criteria, and PDR exit criteria.
The ground rules are rules to be followed during the conduct of the
PDR and are intended to provide the systems engineer conducting the
review with a clear set of instructions for what the content of the
review should or should not include. The PDR entry criteria denote
criteria to be satisfied in order to conduct the PDR. The PDR exit
criteria denote criteria to be satisfied in order to complete the
PDR. Satisfying the PDR entry criteria and PDR exit criteria
requires an objective standard, such as achieving a minimum score
relating to the extent to which the PDR entry criteria and PDR exit
criteria are satisfied. The present invention uses a scorecard for
implementing an objective scoring standard in terms of a numerical
score, namely the PDR scorecard 420. The PDR scorecard 420 has
aspects 422, which include a detailed scorecard and an associated
spider chart (described infra in conjunction with FIGS. 4G and 4H,
respectively). Upon completion of the PDR, the RTVM 800 is updated
to record the changes from the PDR as described infra in
conjunction with FIGS. 8A-8H.
[0114] FIG. 4B depicts process steps 430-438 of the PDR. Step 430
initiates the PDR. Steps 431-438 follow step 430. Updating the RTVM
800 (see FIG. 4A) is the final process step of the PDR after steps
431-438 have been completely executed. Step 431 establishes goals
and objectives of the PDR, as described infra in conjunction with
FIG. 4C. Step 432 establishes the ground rules for conducting the
PDR, as described infra in conjunction with FIG. 4D. Step 433
reviews the PDR entry criteria for conducting the PDR, as described
infra in conjunction with FIGS. 4E-4F. Step 434 presents materials
needed for conducting the PDR session, as described infra in
conjunction with FIGS. 4E-4F. Step 435 records defects and issues
which emerge during the conduct of steps 431-434 and 436-438. Step
436 utilizes the RTVM 800 (see FIG. 4A) to review the requirements
traceability with respect to the business requirements, system
requirements, and component requirements. Step 437 initiates
verification that the PDR exit criteria have been satisfied, as
described infra in conjunction with FIGS. 4E-4F. Step 438
determines objectively in terms of a quantitative metric whether
the PDR exit criteria have been satisfied. If step 438 determines
that the PDR exit criteria have been satisfied, then steps 431-438
are exited and the RTVM 800 is updated to record the changes from
the PDR as described infra in conjunction with FIGS. 8A-8H. If step
438 determines that the PDR exit criteria have not been satisfied,
then steps 431-438 are selectively re-executed iteratively until
step 438 determines that the PDR exit criteria have been satisfied.
Note that there is no required sequential order for executing steps
431-438, and the scope of the present invention includes execution
of steps 431-438 in any desired order, including the possibility of
concurrent performance of some of the steps. The defects and issues
recorded in step 435 may provide logical and intuitive guidelines
for executing steps 431-438 in an order that makes sense in light
of the identified defects and issues. For example, if the defect or
issue relates to an inconsistency between a PDR component
architecture and a particular goal, then it is appropriate to
revisit steps 431 and 437 iteratively until consistency is
established between the component architecture and the particular
goal which will lead in turn to satisfying the PDR exit
criteria.
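The selective re-execution loop just described may be sketched as
follows. The step functions, scoring function, and iteration limit
are hypothetical stand-ins for steps 431-438 and the pertinent
completion algorithm.

```python
# Sketch of the selective re-execution loop: run the review steps,
# record the defects and issues they surface, score the exit
# criteria, and repeat until the pertinent algorithm reports
# success. All callables are hypothetical stand-ins.
def conduct_review(steps, score_exit_criteria, has_passed, max_rounds=10):
    defects = []
    for _ in range(max_rounds):
        for step in steps:
            defects.extend(step())   # each step may surface defects/issues
        if has_passed(score_exit_criteria()):  # quantitative check (step 438)
            return True, defects
        # otherwise the recorded defects guide which steps to revisit
    return False, defects
```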
[0115] FIG. 4C describes establishing goals and objectives of the
PDR (see step 431 of FIG. 4B). The systems engineer conducting the
PDR can select from the list of goals for any PDR held. These goals
are intended to provide a guide to the activities the systems
engineer will need to accomplish in order to complete the PDR. This
list of goals can be tailored to each project. It is a goal in FIG.
4C to present the high level design and the component level
baseline, to include the allocated requirements and the
architecture baseline. As described in FIG. 4C, the establishing of
goals and objectives of the PDR includes: review and approve the
component level architecture baseline; establish traceability;
establish a technical baseline for the component requirements;
identify technical risks; review mitigation plans; identify
dependencies; identify hardware and software performance monitor
tools in support of the e2e monitoring, measurement, and management
requirements; review technical performance measures; and review
test architecture.
[0116] FIG. 4D describes a review of ground rules to be followed
during the conduct of the PDR (see step 432 of FIG. 4B). The
indicated ground rules comprise: review documentation and address
the listed concerns; and key stakeholders and key business process
owners are present during the PDR and sign off on the component
requirements/architecture and the solution (component
architecture). The "key stakeholders" and "key business process
owners" in the PDR ground rules refer to representatives of each
class of stakeholder and each class of business process owner. In
other words, while it is desirable that as many stakeholders and
business process owners in each class be involved in implementing
the ground rules (e.g., being present during the PDR), the actual
ground rules require that each stakeholder class be represented
regardless of how many members in each stakeholder class
collectively fulfill the representation requirement.
[0117] FIGS. 4E-4F collectively describe a review of PDR entry
criteria for the PDR (see step 433 of FIG. 4B), PDR presentation
content (see step 434 of FIG. 4B), and PDR exit criteria for the
PDR (see step 437 of FIG. 4B). The PDR entry criteria, PDR
presentation content, and PDR exit criteria are intended as a guide
for the systems engineer conducting the PDR. The items listed in
the Entry Criteria column are the documents and activities that
must be completed prior to the start of the PDR. The items listed
in the Presentation column are the documents and work products that
must be presented to the stakeholders and team members attending
the PDR. The items listed in the Exit Criteria column are the
activities and documents that must be completed before the review
is considered complete. In FIGS. 4E-4F, the PDR entry criteria, PDR
presentation content, and PDR exit criteria each pertain to the PDR
criteria categories of: static architecture definition; dynamic
architecture definition; architecture element definition/component
requirements; and test architecture definition. The detailed
descriptions and explanations of the PDR presentation content and
the PDR entry criteria and PDR exit criteria within each PDR
criteria category are contained within FIGS. 4E-4F.
[0118] Note that the terms "logical architecture" and "physical
architecture" in FIG. 4D are defined as follows. Logical
architecture defines the functions that the system must perform.
Physical architecture defines resources for every function
identified in the logical architecture.
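Under the definitions just given, the logical-to-physical mapping
may be sketched as follows; the function and resource names are
illustrative only.

```python
# Sketch of the logical/physical architecture relationship defined
# above: every function in the logical architecture is allocated a
# resource in the physical architecture. Names are illustrative.
logical_architecture = ["accept web orders", "validate payment",
                        "fulfill order"]

physical_architecture = {
    "accept web orders": "order-entry web server",
    "validate payment": "payment gateway service",
    "fulfill order": "fulfillment subsystem",
}

# The defining property: each logical function has a physical resource.
assert all(fn in physical_architecture for fn in logical_architecture)
```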
[0119] FIG. 4G depicts a detailed scorecard (see step 438 of FIG.
4B) for scoring performance relating to the PDR exit criteria in
the PDR exit criteria categories of FIGS. 4E-4F. The PDR criteria
categories in FIG. 4G (called "scorecard criteria") are: static
architecture definition; dynamic architecture definition;
architecture element/component requirements; and test architecture
definition. In FIG. 4G, each PDR criteria category includes one or
more criteria. For example, the PDR criteria category of dynamic
architecture includes the criteria of: completeness in the use case
diagrams, completeness of the interaction diagrams, and
completeness of data architecture/model. The criteria
for each PDR criteria category in FIG. 4G reflect the criteria
within the corresponding PDR exit criteria category of FIGS. 4E-4F.
Each criteria in FIG. 4G is scored and each criteria is assigned a
weight. The scores in FIG. 4G are analogous to the scores in FIG.
2F described supra. The meaning and use of the weights and the
interpretation of the "Weighting Factor", "PDR Score", and "USE
THIS COLUMN TO ENTER SCORES" columns in FIG. 4G are analogous to
the corresponding columns of FIG. 2F described supra. Similarly,
the Overall Review Score in FIG. 4G is analogous to the
corresponding Overall Review Score in FIG. 2F described supra and
may be computed by any of the techniques described supra for
computing the Overall Review Score relating to the BRR.
[0120] Various algorithms may be used to determine whether the PDR
process has been successfully completed. In a first exemplary
algorithm, the PDR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the PDR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the PDR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the PDR
process has not been successfully completed, then steps 431-438 are
selectively re-executed iteratively until the pertinent algorithm
determines that the PDR process has been successfully
completed.
[0121] FIG. 4H is a spider chart for graphically representing the
PDR criteria category scores tabulated in the scorecard of FIG. 4G.
Each axis of FIG. 4H represents a PDR criteria category of FIG. 4G
such that points 4A, 4B, 4C, and 4D respectively represent the PDR
scores of the PDR criteria category of: static architecture
definition; dynamic architecture definition; architecture
element/component requirements; and test architecture definition.
The dashed polygons identify the scores at the intersections
between the dashed polygons and the four axes. The points 4A, 4B,
4C, and 4D define a polygon 4P that is useful for visualizing the
PDR criteria category scores relative to each other and also for
visualizing the score of each PDR criteria category in relation to
the pertinent threshold score (e.g., 85). Although not shown in
FIG. 4H, a threshold score to ultimately be satisfied by the PDR
criteria category scores and/or Overall Review Score could also be
represented on FIG. 4H. For example, if the threshold score for the
Overall Review Score is 85, then a heavily bolded polygon having
the value 85 (i.e., between the dashed polygons having values of 80
and 90), could be superimposed onto FIG. 4H. The spider chart of
FIG. 4H may be generated by a software tool or algorithm (e.g., a
graphics plotting tool), wherein the software tool or algorithm is
stored on a computer readable medium and is executed by a processor
of a computer system.
[0122] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the PDR entry
criteria of FIGS. 4E-4F in a manner analogous to the use of the
detailed scorecard and spider chart of FIGS. 4G and 4H,
respectively, in relation to scoring performance for the PDR exit
criteria of FIGS. 4E-4F.
[0123] If the pertinent algorithm determines that the PDR process
has been successfully completed, then the RTVM is next updated as
described infra in conjunction with FIGS. 8A-8H, followed by
execution of the Develop and Test Components step 500
of FIG. 1.
[0124] In accordance with the preceding discussion, the Develop
Component Requirements step 400 (see FIG. 1) of the SE process
reviews development of the component requirements and component
architecture which are the natural decomposition of the system
requirements and system architecture into more specific detail for
logical and/or physical hardware elements, software modules, or
processes required to develop the system. The PDR template, along
with the PDR scorecard and the RTVM are used to define the
component requirements as well as the IT solution approach to meet
the requirements. The PDR template maps out a clear, specific set
of steps to evaluate the component requirements. The PDR scorecard
quantitatively measures how well the component requirements satisfy
the system and business requirements. The RTVM provides
traceability from the assigned requirements to the design elements
and test methods to ensure that all requirements can be validated.
The present invention enables both the SE team and the customer to
agree on the scope of the system and formally baseline the
technical scope of the project. The present invention provides a structured
guide to the work required to complete this step of the SE process
and a measurable standard with which to determine the readiness and
risk to proceed forward with the project.
[0125] The PDR template provides standardization and replication of
the PDR review steps for any project using the SE process by
dictating the detailed content and specific sequence of the PDR,
resulting in a measurable improvement in quality, content and
consistency of all PDRs. The overall goal of the PDR is to
establish and formally baseline the component requirements and
architecture. Successful completion of the PDR will reduce project
risk by removing defects and resolving issues. Successful
completion of the PDR is achieved when a sufficient number of
defects and issues have been resolved to obtain a success score as
described supra in conjunction with FIG. 4G. To achieve successful
completion of the PDR, the present invention lists clear,
standardized objectives and ground rules for the PDR to ensure the
review goals are met.
[0126] To achieve successful completion of the PDR, the present
invention also establishes a standardized set of PDR entry
criteria, PDR presentation content, and PDR exit criteria that must
be met in order to successfully complete the review. The PDR entry
criteria list the information to be presented within the review.
The PDR presentation content further clarifies what information
should be presented and how it can be presented. The PDR exit
criteria delineate the requirements for accepting the technical
information as complete. If the level of detail required in the PDR
review template is not available, then the practitioner is not
ready to hold a PDR.
[0127] To achieve successful completion of the PDR, the present
invention also requires completion of the PDR scorecard to provide
a measurable evaluation of the review's success.
[0128] When the PDR is conducted, a list of defects and issues is
recorded as well as a quantitative measure (i.e., score) which
evaluates the content and completeness of the component
requirements and component architecture. The score is tied to the
defects and issues that have been recorded. Correction of the
defects and resolution of the issues are essential to the creation
of the component requirements and architecture baseline. As the
defects are corrected and the issues resolved, the PDR criteria are
periodically reapplied to evaluate the content and completeness of
the requirements and architecture to develop a new quantitative
measure. The quantitative measure is used to identify and quantify
risk. The component requirements and architecture are not baselined
until a minimum success score (i.e., minimum acceptable success
score; e.g., 85, 80-90, or 85-100 in the Overall Review Score of
FIG. 4G) has been achieved.
[0129] The PDR scorecard is a simple, quantitative and consistent
method to measure the quality and completeness of the project's
static and dynamic architecture, component requirements and test
architecture definition according to a comprehensive set of
criteria. The PDR scorecard provides a mechanism for scoring the
component requirements and architecture baseline for accuracy,
completeness, quality and risk. The PDR scorecard contains the PDR
criteria and weighting in a software tool (e.g., a commercially
available tool such as the Microsoft Excel® spreadsheet tool)
to quickly and uniformly generate a review score. The criteria can
be easily tailored to address specific project complexity or scope
situations. Using a uniform (i.e., project independent) criteria
allows teams to conduct analysis across all projects and develop
organization lessons learned with corrective actions. However, the
scope of the present invention also includes embodiments in which
the success criteria are project dependent.
[0130] The present invention measures the quality of the prescribed
PDR exit criteria by providing: guidelines for performing the
scoring; a prescribed set of scoring criteria for each
question/criteria; a prescribed set of questions/criteria which
specifically address the defined goals of the PDR; and a weighting
factor for each question/criteria to accurately reflect the
significance of each element in regard to the overall
evaluation.
[0131] An output of the PDR scorecard may be an Overall Review
Score, such as a score between 0 and 100, which: rates the quality of the
material presented at the PDR, specifies the issues and defects
that were discovered during the PDR, and states the technical risk
associated with the system requirements baseline that was presented
at the PDR.
[0132] The Overall Review Score may be mapped to "Red" (critical),
"Yellow" (caution), "Green" (satisfactory) status for a summary
assessment of the overall condition of the project. The PDR
scorecard (see FIG. 4G) automatically generates a "spider chart"
(See FIG. 4H) as a graphical representation of the scoring per
criteria. The spider chart is a visual reference to highlight the
problem areas discovered during the PDR and measured by the Overall
Review Score for the PDR.
Develop and Test Components (Step 500)
[0133] FIGS. 5A-5H depict details associated with step 500 of FIG.
1, namely the step of developing and testing components, in
accordance with embodiments of the present invention. This step
includes verifying the component design and creating a design
baseline, which includes developing component designs compatible
with the component requirements and developing test plans for
testing the component designs. The component requirements are
decomposed into the component designs and the test plans.
Developing the component designs and the test plans include
reviewing the component designs and test plans for acceptability in
accordance with Critical Design Review (CDR) exit criteria, as will
be discussed infra.
[0134] FIG. 5A illustrates three aspects of step 500 to be utilized
in developing and testing components which verify the component
requirements as baselined in the PDR, so as to ensure that the
component design satisfies the component requirements. The three
aspects are: a CDR template 510, a CDR scorecard 520, and the RTVM
800. The CDR template 510 has CDR aspects 512, which include an
establishment and review of ground rules, goals, CDR entry
criteria, and CDR exit criteria. The ground rules are rules to be
followed during the conduct of the CDR and are intended to provide
the systems engineer conducting the review with a clear set of
instructions for what the content of the review should or should
not include. The CDR entry criteria denote criteria to be
satisfied in order to conduct the CDR. The CDR exit criteria
denote criteria to be satisfied in order to complete the CDR.
Satisfying the CDR entry criteria and CDR exit criteria requires an
objective standard, such as achieving a minimum score relating to
the extent to which the CDR entry criteria and CDR exit criteria
are satisfied. The present invention teaches use of a scorecard for
implementing an objective scoring standard in terms of a numerical
score, namely the CDR scorecard 520. The CDR scorecard 520 has
aspects 522, which include a detailed scorecard and an associated
spider chart (described infra in conjunction with FIGS. 5G and 5H,
respectively). Upon completion of the CDR, the RTVM 800 is updated
to record the changes from the CDR as described infra in
conjunction with FIGS. 8A-8H.
[0135] FIG. 5B depicts process steps 530-538 of the CDR. Step 530
initiates the CDR. Steps 531-538 follow step 530. Updating the RTVM
800 (see FIG. 5A) is the final process step of the CDR after steps
531-538 have been completely executed and includes updating the
RTVM 800 with verification information relating to the component
designs and associated test plans. Step 531 establishes goals and
objectives of the CDR, as described infra in conjunction with FIG.
5C. Step 532 establishes the ground rules for conducting the CDR,
as described infra in conjunction with FIG. 5D. Step 533 reviews
the CDR entry criteria for conducting the CDR, as described infra
in conjunction with FIGS. 5E-5F. Step 534 presents materials needed
for conducting the CDR session, as described infra in conjunction
with FIGS. 5E-5F. Step 535 records defects and issues which emerge
during the conduct of steps 531-534 and 536-538. Step 536 utilizes
the RTVM 800 (see FIG. 5A) to review the requirements traceability
with respect to the business requirements, system requirements,
component requirements, component architecture, component design,
and developing and testing components in relation to the component
design. Step 537 initiates verification that the CDR exit criteria
have been satisfied, as described infra in conjunction with FIGS.
5E-5F. Step 538 determines objectively in terms of a quantitative
metric whether the CDR exit criteria have been satisfied. If step
538 determines that the CDR exit criteria have been satisfied, then
steps 531-538 are exited and the RTVM 800 is updated to record the
changes from the CDR as described infra in conjunction with FIGS.
8A-8H. If step 538 determines that the CDR exit criteria have not
been satisfied, then steps 531-538 are selectively re-executed
iteratively until step 538 determines that the CDR exit criteria
have been satisfied. Note that there is no required sequential
order for executing steps 531-538, and the scope of the present
invention includes execution of steps 531-538 in any desired order,
including the possibility of concurrent performance of some of the
steps. The defects and issues recorded in step 535 may provide logical
and intuitive guidelines for executing steps 531-538 in an order
that makes sense in light of the identified defects and issues. For
example, if the defect or issue relates to an inconsistency between
a component design and a particular goal, then it may be
appropriate to revisit steps 531 and 537 iteratively until
consistency is established between the component design and the
particular goal, which in turn leads to satisfying the CDR exit
criteria.
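The final process step of updating the RTVM 800 with verification
information may be sketched as follows. The column names are
illustrative placeholders; the actual matrix layout is defined in
FIGS. 8A-8H.

```python
# Sketch of recording verification information in the RTVM after a
# review completes. Column names are illustrative placeholders.
rtvm = []  # one row per requirement

def record_verification(matrix, req_id, design_element, test_method, status):
    matrix.append({
        "requirement": req_id,
        "design_element": design_element,   # traceability to the design
        "test_method": test_method,         # how the requirement is verified
        "verification_status": status,      # e.g., outcome recorded at the CDR
    })

record_verification(rtvm, "CR-1.1.1", "order-entry module design v0.9",
                    "component test plan TP-12", "design verified at CDR")
```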
[0136] FIG. 5C describes establishing goals and objectives of the
CDR (see step 531 of FIG. 5B). The systems engineer conducting the
CDR can select from the list of goals for any CDR held. These goals
are intended to provide a guide to the activities the systems
engineer will need to accomplish in order to complete the CDR. This
list of goals can be tailored to each project. It is a goal in FIG.
5C to present and review component designs and component e2e test
plans, to include production infrastructure capacities, with a
focus on documentation relating to a delivered solution. As
described in FIG. 5C, the establishing of goals and objectives of
the CDR includes: review and approve component designs; review
and approve component detailed test plans; establish traceability
between component designs and end-to-end functionality and the
system level acceptance criteria; establish the design baseline;
identify technical risks; review mitigation plans to offset risk;
identify dependencies; identify technical performance measures; and
verify production infrastructure capacity supports system
requirements.
[0137] FIG. 5D describes a review of ground rules to be followed
during the conduct of the CDR (see step 532 of FIG. 5B). The
indicated ground rules comprise: review documentation and address
the listed concerns; and key members of the software development,
test, production, and SDC/hardware teams are present during the CDR
and approve the design baseline. The "key" members of the software
development team are defined as representatives of the software
development team.
[0138] FIGS. 5E-5F collectively describe a review of CDR entry
criteria for the CDR (see step 533 of FIG. 5B), CDR presentation
content (see step 534 of FIG. 5B), and CDR exit criteria for the
CDR (see step 537 of FIG. 5B). The CDR entry criteria, CDR
presentation content, and CDR exit criteria are intended as a guide
for the systems engineer conducting the CDR. The items listed in
the Entry Criteria column are the documents and activities that
must be completed prior to the start of the CDR. The items listed
in the Presentation column are the documents and work products that
must be presented to the stakeholders and team members attending
the CDR. The items listed in the Exit Criteria column are the
activities and documents that must be completed before the review
is considered complete. In FIGS. 5E-5F, the CDR entry criteria, CDR
presentation content, and CDR exit criteria each pertain to the CDR
criteria categories of: system and components requirement review;
physical component design and test review; service delivery center;
system testing; data load (test); and data load (production). The
detailed descriptions and explanations of the CDR presentation
content and the CDR entry criteria and CDR exit criteria within
each CDR criteria category are contained within FIGS. 5E-5F.
[0139] FIG. 5G depicts a detailed scorecard (see step 538 of FIG.
5B) for scoring performance relating to the CDR exit criteria in
the CDR exit criteria categories of FIGS. 5E-5F. The CDR criteria
categories in FIG. 5G (called "scorecard criteria") are: system
and components requirement review; component design and test; service
delivery center/operations and delivery organization; system
testing; data load (test); and data load (production). In FIG. 5G,
each CDR criteria category includes one or more criteria. For
example, the CDR criteria category of system testing includes the
criteria of: test plans traceable to system/component requirements
and acceptance criteria; and facilitate the testing of end-to-end
functionality and solution delivery. The criteria for each CDR
criteria category in FIG. 5G reflect the criteria within the
corresponding CDR exit criteria category of FIGS. 5E-5F. Each
criteria in FIG. 5G is scored and each criteria is assigned a
weight. The scores in FIG. 5G are analogous to the scores in FIG.
2F described supra. The meaning and use of the weights and the
interpretation of the "Weighting Factor", "CDR Score", and "USE
THIS COLUMN TO ENTER SCORES" columns in FIG. 5G are analogous to
the corresponding columns of FIG. 2F described supra. Similarly,
the Overall Review Score in FIG. 5G is analogous to the
corresponding Overall Review Score in FIG. 2F described supra and
may be computed by any of the techniques described supra for
computing the Overall Review Score relating to the BRR.
[0140] Various algorithms may be used to determine whether the CDR
process has been successfully completed. In a first exemplary
algorithm, the CDR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the CDR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the CDR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the CDR
process has not been successfully completed, then steps 531-538 are
selectively re-executed iteratively until the pertinent algorithm
determines that the CDR process has been successfully
completed.
[0141] FIG. 5H is a spider chart for graphically representing the
CDR criteria category scores tabulated in the scorecard of FIG. 5G.
Each axis of FIG. 5H represents a CDR criteria category of FIG. 5G
such that points 5A, 5B, 5C, 5D, 5E, and 5F respectively represent
the CDR scores of the CDR criteria category of: system and
components requirement review; component design and test review;
service delivery center/operations and delivery organization;
system testing; data load (test); and data load (production). The
dashed polygons identify the scores at the intersections between
the dashed polygons and the six axes. The points 5A, 5B, 5C, 5D,
5E, and 5F define a polygon 5P that is useful for visualizing the
CDR criteria category scores relative to each other and also for
visualizing the score of each CDR criteria category in relation to
the pertinent threshold score (e.g., 85). Although not shown in
FIG. 5H, a threshold score to ultimately be satisfied by the CDR
criteria category scores and/or Overall Review Score could also be
represented on FIG. 5H. For example, if the threshold score for the
Overall Review Score is 85, then a heavily bolded polygon having
the value 85 (i.e., between the dashed polygons having values of 80
and 90), could be superimposed onto FIG. 5H. The spider chart of
FIG. 5H may be generated by a software tool or algorithm (e.g., a
graphics plotting tool), wherein the software tool or algorithm is
stored on a computer readable medium and is executed by a processor
of a computer system.
[0142] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the CDR entry
criteria of FIGS. 5E-5F in a manner analogous to the use of the
detailed scorecard and spider chart of FIGS. 5G and 5H,
respectively, in relation to scoring performance for the CDR exit
criteria of FIGS. 5E-5F.
[0143] If the pertinent algorithm determines that the CDR process
has been successfully completed, then the RTVM is next updated as
described infra in conjunction with FIGS. 8A-8H, followed by
execution of the Test System step 600 of FIG. 1.
[0144] In accordance with the preceding discussion, the Develop and
Test Components step 500 (see FIG. 1) of the SE process reviews
developing and testing a component solution which satisfies the
component requirements as baselined in the PDR. The Critical Design
Review (CDR) template, along with the CDR scorecard, and the RTVM
are used to define the component solution and test plan to meet the
component requirements. The CDR template maps out clear, specific
steps to evaluate the component design. The CDR Scorecard
quantitatively measures how well the component design meets the
component requirements. The RTVM provides traceability from the
assigned requirements to the design elements and test methods to
ensure that all requirements can be validated. The present
invention enables both the SE team and the customer to agree on the
scope of the system and formally baseline the technical scope of
the project. The present invention provides a structured guide to
the work required to complete the Develop and Test Components step
500 in accordance with a measurable standard with which to
determine the readiness and risk to proceed forward with the
project.
[0145] The CDR template provides standardization and replication of
the CDR review steps for any project using the SE process by
dictating the detailed content and specific sequence of the CDR.
This results in a measurable improvement in quality, content and
consistency of all CDRs. A goal of the CDR is to establish and
formally baseline the component design and test plan. Successful
completion of the CDR will reduce project risk by identifying
defects and issues. Successful completion of the CDR is achieved
when a sufficient number of defects and issues have been resolved
to obtain a success score as described supra in conjunction with
FIG. 5G.
[0146] To achieve successful completion of the CDR, the present
invention lists clear, standardized objectives and ground rules for
the CDR to ensure the review goals are met.
[0147] To achieve successful completion of the CDR, the present
invention also establishes a standardized set of CDR entry
criteria, CDR presentation content, and CDR exit criteria that must
be met in order to successfully complete the review. The CDR entry
criteria list the information to be presented within the review.
The CDR presentation material further clarifies what information
should be presented and how it can be presented. The CDR exit
criteria delineate the requirements for accepting the technical
information as complete. If the level of detail required in the
template is not available, then the practitioner is not ready to
hold a CDR.
[0148] To achieve successful completion of the CDR, the present
invention also requires completion of the CDR scorecard to provide
a measurable evaluation of the review's success.
[0149] When the CDR is conducted as described herein, a list of
defects and issues is recorded as well as a quantitative measure
(i.e., score) which evaluates the content and completeness of the
component design. The score is tied to the defects and issues that
have been recorded. Correction of the defects and resolution of the
issues are essential to the creation of the component design. As
the defects are corrected and the issues resolved, the CDR criteria
are periodically reapplied to evaluate the content and completeness
of the design and a new quantitative measure is developed. The
quantitative measure is used to identify and quantify risk. The
component design is not baselined until a minimum success score
(i.e., minimum acceptable success score; e.g., 85, 80-90, or
85-100 in the Overall Review Score of FIG. 5G) has been
achieved.
[0150] The CDR scorecard is a quantitative and consistent method to
measure the quality and completeness of the project's system and
component requirements, component design, and test and production
plans according to a comprehensive set of criteria. The CDR
scorecard provides a mechanism for scoring the component baseline
for accuracy, completeness, quality and risk.
[0151] The CDR scorecard contains the CDR criteria and weighting in
a software tool (e.g., a commercially available tool such as the
Microsoft Excel® spreadsheet tool) to quickly and uniformly
generate a review score. The success criteria can be easily
tailored to address specific project complexity or scope
situations. Using a uniform (i.e., project independent) criteria
allows teams to conduct analysis across all projects and develop
organization lessons learned with corrective actions. However, the
scope of the present invention also includes embodiments in which
the success criteria are project dependent.
[0152] The present invention measures the quality of the prescribed
CDR exit criteria by providing: guidelines for performing the
scoring; a prescribed set of scoring criteria for each question; a
prescribed set of questions which specifically address the defined
goals of the CDR; and a weighting factor for each question to
accurately reflect the significance of each element in regard to
the overall evaluation.
[0153] An output of the CDR scorecard may be an Overall Review
Score (see FIG. 5G), such as a score between 0 and 100, which: rates the
quality of the material presented at the CDR; specifies the issues
and defects that were discovered during the CDR; and states the
technical risk associated with the component requirements baseline
that was presented at the CDR.
[0154] The Overall Review Score may be mapped to "Red" (critical),
"Yellow" (caution), "Green" (satisfactory) status for a summary
assessment of the overall condition of the project. The CDR
scorecard automatically generates a "spider chart" (see FIG. 5H) as
a graphical representation of the scoring per criteria. The spider
chart is a visual reference to highlight the problem areas
discovered during the CDR and measured by the Overall Review Score
for the CDR.
Test System (Step 600)
[0155] FIGS. 6A-6G depict details associated with step 600 of FIG.
1, namely the step of testing the system, in accordance with
embodiments of the present invention.
[0156] FIG. 6A illustrates three aspects of step 600 to be utilized
in testing the system to verify the component
requirements as baselined in the PDR, namely a Test Readiness
Review (TRR) template 610, a TRR scorecard 620, and the RTVM 800.
The TRR template 610 has TRR aspects 612, which include goals, TRR
entry criteria, and TRR exit criteria. The TRR entry criteria
denote criteria to be satisfied in order to conduct the TRR. The
TRR exit criteria denote criteria to be satisfied in order to
complete the TRR. Satisfying the TRR entry criteria and TRR exit
criteria requires an objective standard, such as achieving a
minimum score relating to the extent to which the TRR entry
criteria and TRR exit criteria are satisfied. The present invention
teaches use of a scorecard for implementing an objective scoring
standard in terms of a numerical score, namely the TRR scorecard
620. The TRR scorecard 620 has aspects 622, which include a
detailed scorecard and an associated spider chart (described infra
in conjunction with FIGS. 6F and 6G, respectively). Upon completion
of the TRR, the RTVM 800 is updated to record the results of the
TRR as described infra in conjunction with FIGS. 8A-8H.
[0157] FIG. 6B depicts process steps 630-637 of the TRR. Step 630
initiates the TRR. Steps 631-637 follow step 630. Updating the RTVM
800 (see FIG. 6A) is the final process step of the TRR after steps
631-637 have been completely executed. Step 631 establishes goals
and objectives of the TRR, as described infra in conjunction with
FIG. 6C. Step 632 reviews the TRR entry criteria for conducting the
TRR, as described infra in conjunction with FIGS. 6D-6E. Step 633
presents materials needed for conducting the TRR session, as
described infra in conjunction with FIGS. 6D-6E. Step 634 records
defects and issues which emerge during the conduct of steps 631-633
and 635-637. Step 635 utilizes the RTVM 800 (see FIG. 6A) to review
the requirements traceability with respect to the business
requirements, system requirements, component requirements,
developing and testing components, and testing the system. Step 636
initiates verification that the TRR exit criteria have been
satisfied, as described infra in conjunction with FIGS. 6D-6E. Step
637 determines objectively in terms of a quantitative metric
whether the TRR exit criteria have been satisfied. If step 637
determines that the TRR exit criteria have been satisfied, then
steps 631-637 are exited and the RTVM 800 is updated to record the
results of the TRR as described infra in conjunction with FIGS.
8A-8H. If step 637 determines that the TRR exit criteria have not
been satisfied, then steps 631-637 are selectively re-executed
iteratively until step 637 determines that the TRR exit criteria
have been satisfied. Note that there is no required sequential
order for executing steps 631-637 and the scope of the present
invention includes execution of steps 631-637 in any desired order,
including the possibility of concurrent performance of some of the
steps. The defects and issues recorded in step 634 may provide
logical and intuitive guidelines for executing steps 631-637 in an
order that makes sense in light of the identified defects and
issues. For example, if the defect or issue relates to an
inconsistency between a TRR exit criteria and a particular goal,
then it may be appropriate to revisit steps 631 and 636 iteratively
until consistency is established between the TRR exit criteria and
the particular goal.
[0158] FIG. 6C describes establishing goals and objectives of the
TRR (see step 631 of FIG. 6B). The systems engineer conducting the
TRR can select from the list of goals for any TRR held. These goals
are intended to provide a guide to the activities the systems
engineer will need to accomplish in order to complete the TRR. This
list of goals can be tailored to each project. The goals in FIG. 6C
include: create a test baseline; verify that test entry criteria
have been met for each application and document exceptions; verify
test environment and data readiness; verify test team readiness;
and obtain customer and solution project manager approval for test
execution readiness.
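The goal of verifying test entry criteria for each application while
documenting exceptions may be sketched as follows. The application
names and entry criteria are hypothetical placeholders.

```python
# Sketch of verifying that test entry criteria have been met for
# each application and documenting exceptions. Names are placeholders.
applications = {
    "order-entry": {"test data loaded": True, "environment ready": True},
    "billing": {"test data loaded": False, "environment ready": True},
}

documented_exceptions = []
for app, entry_criteria in applications.items():
    unmet = [name for name, met in entry_criteria.items() if not met]
    if unmet:
        documented_exceptions.append((app, unmet))  # documented exception

print("Documented exceptions:", documented_exceptions)
```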
[0159] FIGS. 6D-6E collectively describe a review of TRR entry
criteria for the TRR (see step 632 of FIG. 6B), TRR presentation
content (see step 633 of FIG. 6B), and review of TRR exit criteria
for the TRR (see step 636 of FIG. 6B). The TRR entry criteria, TRR
presentation content, and TRR exit criteria are intended as a guide
for the systems engineer conducting the TRR. The items listed in
the Entry Criteria column are the documents and activities that
must be completed prior to the start of the TRR. The items listed
in the Presentation column are the documents and work products that
must be presented to the stakeholders and team members attending
the TRR. The items listed in the Exit Criteria column are the
activities and documents that must be completed before the review
is considered complete. In FIGS. 6D-6E, the TRR entry criteria, TRR
presentation content, and TRR exit criteria each pertain to the TRR
criteria categories of: test strategy; test requirements
verification matrix; application readiness; test environment
readiness; and test team readiness. The detailed descriptions and
explanations of the TRR presentation content and the TRR entry
criteria and TRR exit criteria within each TRR criteria category
are contained within FIGS. 6D-6E.
[0160] FIG. 6F depicts a detailed scorecard (see step 637 of FIG.
6B) for scoring performance relating to the TRR exit criteria in
the TRR exit criteria categories of FIGS. 6D-6E. The TRR criteria
categories in FIG. 6F (called "scorecard criteria") are: test
strategy; test requirements verification matrix; application
readiness; test environment readiness; and test team readiness. In
FIG. 6F, each TRR criteria category includes one or more criteria.
For example, the TRR criteria category of test strategy includes
the criteria of: test schedule; problem, defects, turnaround times
by severity; and documented technical risks, dependencies, and
migration plans. The criteria for each TRR criteria category in
FIG. 6F reflect the criteria within the corresponding TRR exit
criteria category of FIGS. 6D-6E. Each criteria in FIG. 6F is
scored and each criteria is assigned a weight. The scores in FIG.
6F are analogous to the scores in FIG. 2F described supra. The
meaning and use of the weights and the interpretation of the
"Weighting Factor", "TRR Score", and "USE THIS COLUMN TO ENTER
SCORES" columns in FIG. 6F are analogous to the corresponding
columns of FIG. 2F described supra. Similarly, the Overall Review
Score in FIG. 6F is analogous to the corresponding Overall Review
Score in FIG. 2F described supra and may be computed by any of the
techniques described supra for computing the Overall Review Score
relating to the BRR.
[0161] Various algorithms may be used to determine whether the TRR
process has been successfully completed. In a first exemplary
algorithm, the TRR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the TRR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the TRR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the TRR
process has not been successfully completed, then steps 631-637 are
selectively re-executed iteratively until the pertinent algorithm
determines that the TRR process has been successfully
completed.
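By way of illustration, the following minimal Python sketch shows how
the weighted Overall Review Score and the third exemplary algorithm
might be implemented; the criterion names, weighting factors, scores,
and function names below are illustrative assumptions, not values or
code taken from the figures.

```python
# Illustrative sketch only: criterion names, weighting factors, scores,
# and function names are assumptions, not values from the figures.

def overall_review_score(criteria):
    """Weighted average (0-100) over (criterion, weighting factor, score) triples."""
    total_weight = sum(weight for _, weight, _ in criteria)
    return sum(weight * score for _, weight, score in criteria) / total_weight

def review_complete(categories, overall_threshold=85, category_threshold=85):
    """Third exemplary algorithm: every criteria category score and the
    Overall Review Score must be no less than their thresholds."""
    all_criteria = [c for criteria in categories.values() for c in criteria]
    if overall_review_score(all_criteria) < overall_threshold:
        return False
    return all(overall_review_score(criteria) >= category_threshold
               for criteria in categories.values())

# Hypothetical TRR scorecard fragment: (criterion, weighting factor, entered score)
scorecard = {
    "test strategy": [("test schedule", 2, 90),
                      ("defect turnaround times by severity", 1, 80)],
    "test team readiness": [("test team staffing complete", 1, 85)],
}
print(review_complete(scorecard))  # True: category scores 86.7 and 85, overall 86.25
```

The first and second exemplary algorithms follow from the same
functions by testing only the overall score or only the per-category
scores, respectively.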
[0162] FIG. 6G is a spider chart for graphically representing the
TRR criteria category scores tabulated in the scorecard of FIG. 6F.
Each axis of FIG. 6G represents a TRR criteria category of FIG. 6F
such that points 6A, 6B, 6C, 6D, and 6E respectively represent the
TRR scores of the TRR criteria category of: test strategy; test
requirements verification matrix; application readiness; test
environment readiness; and test team readiness. The dashed polygons
identify the scores at the intersections between the dashed
polygons and the five axes. The points 6A, 6B, 6C, 6D, and 6E
define a polygon 6P that is useful for visualizing the TRR criteria
category scores relative to each other and also for visualizing the
score of each TRR criteria category in relation to the pertinent
threshold score (e.g., 85). Although not shown in FIG. 6G, a
threshold score to ultimately be satisfied by the TRR criteria
category scores and/or Overall Review Score could also be
represented on FIG. 6G. For example, if the threshold score for the
Overall Review Score is 85, then a heavily bolded polygon having
the value 85 (i.e., between the dashed polygons having values of 80
and 90), could be superimposed onto FIG. 6G. The spider chart of
FIG. 6G may be generated by a software tool or algorithm (e.g., a
graphics plotting tool), wherein the software tool or algorithm is
stored on a computer readable medium and is executed by a processor
of a computer system.
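The following hedged sketch shows one way such a spider chart might be
produced with a graphics plotting tool (here matplotlib, chosen as an
assumed example); the five category scores and the bolded 85 threshold
polygon are illustrative placeholders.

```python
# Assumed spider-chart generation; scores and threshold are illustrative.
import numpy as np
import matplotlib.pyplot as plt

categories = ["test strategy", "test req. verification matrix",
              "application readiness", "test environment readiness",
              "test team readiness"]
scores = [87, 90, 82, 88, 85]                      # placeholder TRR category scores

angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
angles += angles[:1]                               # repeat first point to close polygon
scores_closed = scores + scores[:1]
threshold = [85] * (len(categories) + 1)           # bolded threshold polygon at 85

ax = plt.subplot(polar=True)
ax.plot(angles, scores_closed, linewidth=2)        # analogue of polygon 6P
ax.plot(angles, threshold, linewidth=3, linestyle="--")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories, fontsize=8)
ax.set_ylim(0, 100)
plt.savefig("trr_spider_chart.png")
```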
[0163] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the TRR entry
criteria of FIGS. 6D-6E in a manner analogous to the use of the
detailed scorecard and spider chart of FIGS. 6F and 6G,
respectively, in relation to scoring performance for the TRR exit
criteria of FIGS. 6D-6E.
[0164] If the pertinent algorithm determines that the TRR process
has been successfully completed, then the RTVM is next updated as
described infra in conjunction with FIGS. 8A-8H, followed by
execution of the Put System into Production step 700 of FIG. 1.
[0165] In accordance with the preceding discussion, the Test System
step 600 (see FIG. 1) of the SE process reviews testing the system.
The Test Readiness Review (TRR) template, along with the TRR
scorecard, and the RTVM are used to establish and agree on the test
plan for the system. The TRR template maps out a clear, specific
set of steps to evaluate the system test plan. The TRR Scorecard
quantitatively measures how well the test plan defines the methods,
test articles, procedures and environment that will be used to
verify that the IT solution meets the requirements. The RTVM
provides traceability from the assigned requirements to the design
elements and test methods to ensure that all requirements can be
validated. The present invention enables both the SE team and the
customer to agree on the total testing of the system prior to
starting any tests. The present invention provides a structured
guide to the work required to complete this step of the SE process
and a measurable standard with which to determine the readiness and
risk to proceed forward with the project.
[0166] The TRR template provides standardization and replication of
the TRR review steps for any project using the SE process by
dictating the detailed content and specific sequence of the TRR.
This results in a measurable improvement in quality, content and
consistency of all TRRs. A goal of the TRR is to establish the
system test plan. Successful completion of the TRR will reduce
project risk by identifying defects and issues. Successful
completion of the TRR is achieved when a sufficient number of
defects and issues have been resolved to obtain a success score as
described supra in conjunction with FIG. 6F.
[0167] To achieve successful completion of the TRR, the present
invention lists clear, standardized objectives for the TRR to
ensure the review goals are met.
[0168] To achieve successful completion of the TRR, the present
invention also establishes a standardized set of TRR entry
criteria, TRR presentation content and TRR exit criteria that must
be met in order to successfully complete the review. The TRR entry
criteria list the information to be presented within the review.
The TRR presentation material further clarifies what information
should be presented and how it can be presented. The TRR exit
criteria delineate the requirements for accepting the technical
information as complete. If the level of detail required in the
template is not available, then the practitioner is not ready to
hold a TRR.
[0169] To achieve successful completion of the TRR, the present
invention also requires completion of the TRR scorecard to provide
a measurable evaluation of the review's success.
[0170] When the TRR is conducted via the method described herein, a
list of defects and issues is recorded as well as a quantitative
measure (score) which evaluates the content and completeness of the
test plan. The score is tied to the defects and issues that have
been recorded. Correction of the defects and resolution of the
issues are essential to the creation of the test plan. As the
defects are corrected and the issues resolved, the TRR criteria are
periodically reapplied to evaluate the content and completeness of
the requirements and a new quantitative measure is developed. The
quantitative measure is used to identify and quantify risk. The
test plan is not baselined until a minimum success score (i.e.,
minimum acceptable success score; e.g., 85, 80-90, or 85-100 in the
Overall Review Score of FIG. 6F) has been achieved.
[0171] The TRR scorecard is a quantitative and consistent method to
measure the quality and completeness of the project's applications,
test environment, test strategy and test requirements according to
a comprehensive set of criteria. It provides a mechanism for
scoring the test plan for accuracy, completeness, quality and risk.
The TRR scorecard contains the TRR criteria and weighting in a
software tool (e.g., a commercially available tool such as the
Microsoft Excel.RTM. spreadsheet tool) to quickly and uniformly
generate a review score. The criteria can be easily tailored to
address specific project complexity or scope situations. Using
uniform (i.e., project independent) criteria allows teams to
conduct analysis across all projects and develop organization
lessons learned with corrective actions. However, the scope of the
present invention also includes embodiments in which the success
criteria are project dependent.
[0172] The present invention measures the quality of the prescribed
TRR exit criteria by providing: guidelines for performing the
scoring; a prescribed set of scoring criteria for each question; a
prescribed set of questions which specifically address the defined
goals of the TRR; and a weighting factor for each question to
accurately reflect the significance of each element in regard to
the overall evaluation.
[0173] An output of the TRR scorecard is an Overall Review Score
(see FIG. 6F) such as between 0 and 100 which: rates the quality of
the material presented at the review; specifies the issues and
defects that were discovered during the review; and states the
technical risk associated with the test plan presented at the
review.
[0174] The Overall Review Score may be mapped to "Red" (critical),
"Yellow" (caution), "Green" (satisfactory) status for a summary
assessment of the overall condition of the project. The TRR
scorecard automatically generates a "spider chart" (see FIG. 6G) as
a graphical representation of the scoring per criteria. The spider
chart is a visual reference to highlight the problem areas
discovered during the TRR and measured by the Overall Review Score
for the TRR.
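A minimal sketch of such a status mapping follows; the band boundaries
are assumptions chosen for illustration, since the discussion herein
fixes only the example threshold of 85.

```python
# Assumed Red/Yellow/Green banding of the Overall Review Score;
# the boundaries (85 and 70) are illustrative, not prescribed values.
def review_status(overall_score, green_at=85, yellow_at=70):
    if overall_score >= green_at:
        return "Green"    # satisfactory
    if overall_score >= yellow_at:
        return "Yellow"   # caution
    return "Red"          # critical

print(review_status(92))  # Green
print(review_status(74))  # Yellow
```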
Put System into Production (Step 700)
[0175] FIGS. 7A-7G depict details associated with step 700 of FIG.
1, namely the step of putting the system into production, in
accordance with embodiments of the present invention.
[0176] FIG. 7A illustrates two aspects of step 700 to be utilized
in putting the system into production, namely a Production
Readiness Review (PRR) template 710 and a PRR scorecard 720. The
PRR template 710 has PRR aspects 712, which include goals, PRR
entry criteria, and PRR exit criteria. The PRR entry criteria
denote criteria to be satisfied in order to conduct the PRR. The
PRR exit criteria denote criteria to be satisfied in order to
complete the PRR. Satisfying the PRR entry criteria and PRR exit
criteria requires an objective standard, such as achieving a
minimum score relating to the extent to which the PRR entry
criteria and PRR exit criteria are satisfied. The present invention
teaches use of a scorecard for implementing an objective scoring
standard in terms of a numerical score, namely the PRR scorecard
720. The PRR scorecard 720 has aspects 722, which include a
detailed scorecard and an associated spider chart (described infra
in conjunction with FIGS. 7F and 7G, respectively).
[0177] FIG. 7B depicts process steps 730-737 of the PRR. Step 730
initiates the PRR. Steps 731-737 follow step 730. Step 731
establishes goals and objectives of the PRR, as described infra in
conjunction with FIG. 7C. Step 732 reviews the PRR entry criteria
for conducting the PRR, as described infra in conjunction with
FIGS. 7D-7E. Step 733 presents materials needed for conducting the
PRR session, as described infra in conjunction with FIGS. 7D-7E.
Step 734 records defects and issues which emerge during the conduct
of steps 731-733 and 735-736. Step 735 initiates verification that
the PRR exit criteria have been satisfied, as described infra in
conjunction with FIGS. 7D-7E. Step 736 determines objectively in
terms of a quantitative metric whether the PRR exit criteria have
been satisfied. If step 736 determines that the PRR exit criteria
have been satisfied, then steps 731-736 are exited and step 737 is
next executed. If step 736 determines that the PRR exit criteria
have not been satisfied, then steps 731-736 are selectively
re-executed iteratively until step 736 determines that the PRR exit
criteria have been satisfied. Note that there is no required
sequential order for executing steps 731-736 and the scope of the
present invention includes execution of steps 731-736 in any
desired order, including the possibility of concurrent performance
of some of the steps. The defects and issues recorded in step 734
may provide logical and intuitive guidelines for executing steps
731-736 in an order that makes sense in light of the identified
defects and issues. For example, if the defect or issue relates to
an inconsistency between a PRR exit criteria and a particular goal,
then it may be appropriate to revisit steps 731 and 735 iteratively
until consistency is established between the PRR exit criteria and
the particular goal. Step 737 makes a Go/No Go decision as to
whether to put the system into production.
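The iterative character of steps 731-737 can be sketched as a simple
loop, under stated assumptions: conduct_review, fix_defects, and
go_no_go below are hypothetical stand-ins for the review activities,
defect resolution, and the step 737 decision, and the threshold of 85
is one of the example values given herein.

```python
# Sketch of the iterative PRR flow of FIG. 7B; all callables are
# hypothetical stand-ins, and the threshold is an example value.
def run_prr(conduct_review, fix_defects, go_no_go, threshold=85):
    """Steps 731-736 iterate until the PRR exit criteria score meets
    the threshold; step 737 then makes the Go/No Go decision."""
    while True:
        score, defects = conduct_review()   # steps 731-735, in any desired order
        if score >= threshold:              # step 736: quantitative exit check
            break
        fix_defects(defects)                # selectively re-execute steps 731-736
    return go_no_go(score, defects)         # step 737: "Go" or "No Go"
```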
[0178] FIG. 7C describes establishing goals and objectives of the
PRR (see step 731 of FIG. 7B). The systems engineer conducting the
PRR can select from the list of goals for any PRR held. These goals
are intended to provide a guide to the activities the systems
engineer will need to accomplish in order to complete the PRR. This
list of goals can be tailored to each project. The goals in FIG. 7C
include to: verify that the release is ready to be installed into
production by reviewing release content, test results (functional
tests, performance tests, pre-production tests), expected system
availability/performance/response times; and decide whether the
release is ready for production based on the completeness of the
material and the number of open defects resulting from the
review.
[0179] FIGS. 7D-7E collectively describe a review of PRR entry
criteria for the PRR (see step 732 of FIG. 7B), PRR presentation
content (see step 733 of FIG. 7B), and review of PRR exit criteria
for the PRR (see step 735 of FIG. 7B). The PRR entry criteria, PRR
presentation content, and PRR exit criteria are intended as a guide
for the systems engineer conducting the PRR. The items listed in
the Entry Criteria column are the documents and activities that
must be completed prior to the start of the PRR. The items listed
in the Presentation column are the documents and work products that
must be presented to the stakeholders and team members attending
the PRR. The items listed in the Exit Criteria column are the
activities and documents that must be completed before the review
is considered complete. In FIGS. 7D-7E, the PRR entry criteria, PRR
presentation content, and PRR exit criteria each pertain to the PRR
criteria categories of: requirements/architecture; test; move to
production; production readiness; and project risks to production.
The detailed descriptions and explanations of the PRR presentation
content and the PRR entry criteria and PRR exit criteria within
each PRR criteria category are contained within FIGS. 7D-7E.
[0180] FIG. 7F depicts a detailed scorecard (see step 736 of FIG.
7B) for scoring performance relating to the PRR exit criteria in
the PRR exit criteria categories of FIGS. 7D-7E. The PRR criteria
categories in FIG. 7F (called "scorecard criteria") are:
requirements/architecture; test; move to production; production
readiness; and project risks to production. In FIG. 7F, each PRR
criteria category includes one or more criteria. For example, the
PRR criteria category of "move to production" includes the criteria
of: major move to production milestones defined; move to production
application and contacts defined; time line defined; move to
production deliverables defined; and Governance needs met. The
criteria for each PRR criteria category in FIG. 7F reflect the
criteria within the PRR exit criteria categories of FIGS. 7D-7E.
Each criteria in FIG. 7F is scored and each criteria is assigned a
weight. The scores in FIG. 7F are analogous to the scores in FIG.
2F described supra. The meaning and use of the weights and the
interpretation of the "Weighting Factor", "PRR Score", and "USE
THIS COLUMN TO ENTER SCORES" columns in FIG. 7F are analogous to
the corresponding columns of FIG. 2F described supra. Similarly,
the Overall Review Score in FIG. 7F is analogous to the
corresponding Overall Review Score in FIG. 2F described supra and
may be computed by any of the techniques described supra for
computing the Overall Review Score relating to the BRR.
[0181] Various algorithms may be used to determine whether the PRR
process has been successfully completed. In a first exemplary
algorithm, the PRR process has been successfully completed if the
Overall Review Score is no less than a predetermined threshold
score (e.g., 85, within a range of 85 to 100, etc.). In a second
algorithm, the PRR process has been successfully completed if each
criteria category score satisfies a given threshold score (e.g.,
85, within a range of 80 to 90, etc.), and the given threshold
score may be constant or criteria dependent or criteria category
dependent. In a third algorithm, the PRR process has been
successfully completed if each criteria category score satisfies a
given threshold score and if the Overall Review Score is no less
than a predetermined threshold score. Additional algorithms may
impose scoring thresholds on some or all of the criteria and/or
criteria categories. If the algorithm determines that the PRR
process has not been successfully completed, then steps 731-736 are
selectively re-executed iteratively until the pertinent algorithm
determines that the PRR process has been successfully
completed.
[0182] FIG. 7G is a spider chart for graphically representing the
PRR criteria category scores tabulated in the scorecard of FIG. 7F.
Each axis of FIG. 7G represents a PRR criteria category of FIG. 7F
such that points 7A, 7B, 7C, 7D, and 7E respectively represent the
PRR scores of the PRR criteria category of:
requirements/architecture; test; move to production; production
readiness; and project risks to production. The dashed polygons
identify the scores at the intersections between the dashed
polygons and the five axes. The points 7A, 7B, 7C, 7D, and 7E
define a polygon 7P that is useful for visualizing the PRR criteria
category scores relative to each other and also for visualizing the
score of each PRR criteria category in relation to the pertinent
threshold score (e.g., 85). Although not shown in FIG. 7G, a
threshold score to ultimately be satisfied by the PRR criteria
category scores and/or Overall Review Score could also be
represented on FIG. 7G. For example, if the threshold score for the
Overall Review Score is 85, then a heavily bolded polygon having
the value 85 (i.e., between the dashed polygons having values of 80
and 90), could be superimposed onto FIG. 7G. The spider chart of
FIG. 7G may be generated by a software tool or algorithm (e.g., a
graphics plotting tool), wherein the software tool or algorithm is
stored on a computer readable medium and is executed by a processor
of a computer system.
[0183] Note that a detailed scorecard and a spider chart could be
utilized for scoring performance in relation to the PRR entry
criteria of FIGS. 7D-7E in a manner analogous to the use of the
detailed scorecard and spider chart of FIGS. 7F and 7G,
respectively, in relation to scoring performance for the PRR exit
criteria of FIGS. 7D-7E.
[0184] If the pertinent algorithm determines that the PRR process
has been successfully completed, then the Go/No Go decision step
737 of FIG. 7B is next executed to complete the SE review process
for the project.
[0185] In accordance with the preceding discussion, the Put System
Into Production step 700 (see FIG. 1) of the SE process reviews
putting the system into production. The Production Readiness Review
(PRR) template and the PRR scorecard are used to establish and
agree on the production (also known as operational) plan for the
system. The PRR template maps out a clear, specific set of steps to
evaluate the production plan. The PRR scorecard quantitatively
measures how well the production plan defines the methods,
production components, procedures and environment to install the IT
solution into production and verify completeness of the production
installation. The present invention enables both the SE team and
the customer to agree on the adequacy of the production plan prior
to starting any installation. This agreement is the formal basis to
place the technical scope of the project into operation. The
present invention provides a structured guide to the work required
to complete this step of the SE process and a measurable standard
with which to determine the readiness and risk to proceed forward
with the project.
[0186] The PRR template provides standardization and replication of
the PRR review steps for any project using the SE process by
dictating the detailed content and specific sequence of the PRR.
This results in a measurable improvement in quality, content and
consistency of all PRRs. A goal of the PRR is to establish the plan
to put the system into production. Successful completion of the PRR
will reduce project risk by identifying defects and issues.
Successful completion of the PRR is achieved when a sufficient
number of defects and issues have been resolved to obtain a success
score as described supra in conjunction with FIG. 7F.
[0187] To achieve successful completion of the PRR, the present
invention lists clear, standardized objectives for the PRR to
ensure the review goals are met.
[0188] To achieve successful completion of the PRR, the present
invention also establishes a standardized set of PRR entry
criteria, PRR presentation content, and PRR exit criteria that must
be met in order to successfully complete the review. The PRR entry
criteria list the information to be presented within the review.
The PRR presentation content further clarifies what information
should be presented and how it can be presented. The PRR exit
criteria delineate the requirements for accepting the technical
information as complete. If the level of detail required in the
template is not available, then the practitioner is not ready to
hold a PRR.
[0189] To achieve successful completion of the PRR, the present
invention requires completion of the PRR scorecard to provide a
measurable evaluation of the review's success.
[0190] When the PRR is conducted as described herein, a list of
defects and issues is recorded as well as a quantitative measure
(i.e., score) which evaluates the content and completeness of the
production plan. The score is tied to the defects and issues that
have been recorded. Correction of the defects and resolution of the
issues are essential to the creation of the production plan. As the
defects are corrected and the issues resolved, the PRR criteria are
periodically reapplied to evaluate the content and completeness of
the requirements and a new quantitative measure is developed. The
quantitative measure is used to identify and quantify risk. The
production plan is not baselined until a minimum success score
(i.e., minimum acceptable score; e.g., 85, 80-90, or 85-100 in the
Overall Review Score of FIG. 7F) has been achieved.
[0191] The PRR Scorecard is a quantitative and consistent method to
measure the quality and completeness of the project's production
plan according to a comprehensive set of criteria. The PRR
Scorecard provides a mechanism for scoring the production plan for
accuracy, completeness, quality, and risk. The PRR scorecard
contains the PRR criteria and weighting in a software tool (e.g., a
commercially available tool such as the Microsoft Excel.RTM.
spreadsheet tool) to quickly and uniformly generate a review score.
The criteria can be easily tailored to address specific project
complexity or scope situations. Using uniform (i.e., project
independent) criteria allows teams to conduct analysis across all
projects and develop organization lessons learned with corrective
actions. However, the scope of the present invention also includes
embodiments in which the success criteria are project dependent.
[0192] The present invention measures the quality of the prescribed
PRR exit criteria by providing: guidelines for performing the
scoring; a prescribed set of scoring criteria for each question; a
prescribed set of questions which specifically address the defined
goals of the PRR; and a weighting factor for each question to
accurately reflect the significance of each element in regard to
the overall evaluation.
[0193] An output of the PRR scorecard is an Overall Review Score
such as between 0 and 100 which: rates the quality of the material
presented at the review; specifies the issues and defects that were
discovered during the review; and states the technical risk
associated with the production plan presented at the review.
[0194] The Overall Review Score may also be mapped to "Red"
(critical), "Yellow" (caution), "Green" (satisfactory) status for a
summary assessment of the overall condition of the project. The PRR
Scorecard automatically generates a "spider chart" as a graphical
representation of the scoring per criteria. The chart is a visual
reference to highlight the problem areas discovered during the PRR
and measured by the Overall Review Score for the PRR.
Requirements Traceability and Verification Matrix (RTVM)
[0195] FIGS. 8A-8H depict a Requirements Traceability and
Verification Matrix (RTVM) 800 which provides cumulative
traceability from the business requirements, systems requirements,
component requirements, design elements and test methods to ensure
that the system is verified and validated in all of its facets, in
accordance with embodiments of the present invention. FIGS. 8A, 8B,
8C, 8D, 8E, 8F, 8G, and 8H respectively depict portion 800A, 800B,
800C, 800D, 800E, 800F, 800G, and 800H of the RTVM 800. The RTVM
800 effectively tracks the hierarchical relationships among the
business requirements, the system requirements, and the component
requirements. The RTVM is utilized in each of steps 200, 300, 400,
500, and 600 of FIG. 1.
[0196] FIGS. 8A-8H include a block 810 which is common to each of
FIGS. 8A-8H. FIGS. 8A-8H also include Figure-dependent blocks
820A-820H. FIGS. 8A, 8B, 8C, 8D, 8E, 8F, 8G, and 8H respectively
include blocks 820A, 820B, 820C, 820D, 820E, 820F, 820G, and 820H.
While FIGS. 8A-8H depict portions 800A-800H of the RTVM 800 on
separate and
distinct Figures, the scope of the present invention includes
representing portions 800A-800H as a single continuous spreadsheet
such that the common block 810 appears only once as the leftmost
portion of the spreadsheet, and blocks 820A, 820B, 820C, 820D,
820E, 820F, 820G, and 820H appear on the spreadsheet sequentially
from left to right as well as to the right of common block 810. In
the preceding single spreadsheet layout, duplicate printing of the
common block 810 is avoided.
[0197] The common block 810 of FIGS. 8A-8H includes a Requirements
field which contains business requirements, system requirements,
and component requirements. In FIGS. 8A-8H, the common block 810
depicts: business requirements of R1, R2, R3, and R4; system
requirements of S1.1, S1.2, S2.1, S2.2, S3.1, and S4.1; and
component requirements of C1.1.1, C1.1.2, C1.2.1, C1.2.2, C1.2.3,
C2.1.1, and C2.2.1. Generally, the business requirements, system
requirements, and component requirements are hierarchically linked
such that system requirement Si.j is linked to business requirement
Ri, and component requirement Ci.j.k is linked to system requirement
Si.j, wherein i, j, and k are positive integers.
Each row of the common block 810 represents a business requirement,
a system requirement, or a component requirement. For example, row
831 represents business requirement R1, row 832 represents system
requirement S1.1 which is hierarchically linked to business
requirement R1, and row 833 represents component requirement C1.1.1
which is hierarchically linked to system requirement S1.1.
[0198] In FIG. 8A, block 820A has fields Requirement Status, Design
Documents, Build Components, and Customer Acceptance Criteria. The
Requirement Status field identifies the concurrent status of the
requirement and may have, inter alia, one of the following values:
design complete, build complete, test verified, withdrawn, etc. The
Design Documents field identifies the section(s) in design or other
documents where the requirement has been addressed (e.g.,
External/Internal, Macro/Micro, etc.). The Build Components field
identifies the module(s), program(s) or other component(s) (e.g.,
software component, documentation, training materials, etc.) where
the requirement is delivered; the highest level that is descriptive
is used. The Customer Acceptance Criteria field identifies the
criteria by which the customer will accept that the requirement has
been met.
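One assumed, minimal representation of RTVM rows combining the common
block 810 Requirements field with the block 820A fields is sketched
below; the requirement identifiers follow the indexing scheme above,
but every field value is an illustrative placeholder rather than data
from FIGS. 8A-8H.

```python
# Assumed row layout for the RTVM; values are illustrative only.
rtvm_rows = [
    {"Requirement": "R1",     "Requirement Status": "test verified",
     "Design Documents": "-", "Build Components": "-",
     "Customer Acceptance Criteria": "customer sign-off on R1"},
    {"Requirement": "S1.1",   "Requirement Status": "build complete",
     "Design Documents": "Macro design, section 3.1",
     "Build Components": "order-entry module",
     "Customer Acceptance Criteria": "orders accepted end to end"},
    {"Requirement": "C1.1.1", "Requirement Status": "design complete",
     "Design Documents": "Micro design, section 3.1.2",
     "Build Components": "order-validation program",
     "Customer Acceptance Criteria": "invalid orders rejected"},
]
```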
[0199] FIGS. 8B-8H each have the fields Test Method, Test Case, and
Test Result. The Test Method may be Analysis, Demonstration,
Inspection, Simulation/Modeling, or Test. The Test Method of
Analysis denotes a systematic appraisal of a requirement and its
derivations to definitively demonstrate the validity of a
requirement, design or test. The Test Method of Demonstration
denotes a presentation of the physical realization of a requirement
in active use in real or simulated conditions, which would apply to
the validation of Human Factors aspects, maintainability and
removal routes. The Test Method of Inspection denotes a visual
review of documentation, materials or mechanical features
associated with the product. The Test Method of Simulation/Modeling
denotes a representation of the design (either physically by a
mockup, or by means of a computer-generated simulation which can be
validated as representative) through which performance
characteristics of the design (or elements of it) can be accurately
assessed. The Test Method of Test denotes a repeatable test with
defined pre-test conditions and quantifiable pass/fail criteria.
Tests may be conducted or repeated at different stages of the
design and integration process to verify required operation. The
Test Case field includes an identifier of the pertinent test case
for the requirement. The Test Result field includes an identifier
of the test result for the Test Case.
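The five Test Method values and the per-phase (Test Method, Test Case,
Test Result) triple of blocks 820B-820H can be sketched as follows;
the type and identifier names are assumptions for illustration.

```python
# Assumed modeling of the Test Method values and one RTVM test triple.
from dataclasses import dataclass
from enum import Enum

class TestMethod(Enum):
    ANALYSIS = "Analysis"
    DEMONSTRATION = "Demonstration"
    INSPECTION = "Inspection"
    SIMULATION_MODELING = "Simulation/Modeling"
    TEST = "Test"

@dataclass
class TestEntry:
    method: TestMethod   # how the requirement is verified
    test_case: str       # identifier of the pertinent test case
    test_result: str     # identifier of the recorded test result

unit_test_entry = TestEntry(TestMethod.TEST, "UT-C1.1.1-01", "PASS")
```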
[0200] FIG. 8B shows in block 820B the fields Test Method, Test
Case, and Test Result of the RTVM that track the Unit Test
information (testing the separate units that make up the system).
For each requirement (when applicable), the Test Method (Analysis,
Demonstration, Inspection, Simulation/Modeling, Test), Test Case
and Test Result will be recorded in the corresponding column of the
matrix.
[0201] FIG. 8C shows in block 820C the fields Test Method, Test
Case, and Test Result of the RTVM that track the Integration Test
information (integrating the units into a single working system).
For each requirement (when applicable), the Test Method (Analysis,
Demonstration, Inspection, Simulation/Modeling, Test), Test Case
and Test Result will be recorded in the corresponding column of the
matrix.
[0202] FIG. 8D shows in block 820D the fields Test Method, Test
Case, and Test Result of the RTVM that track the System Test
information (testing the integrated units as a single working
system). For each requirement (when applicable), the Test Method
(Analysis, Demonstration, Inspection, Simulation/Modeling, Test),
Test Case and Test Result will be recorded in the corresponding
column of the matrix.
[0203] FIG. 8E shows in block 820E the fields Test Method, Test
Case, and Test Result of the RTVM that track the System Integration
Test information (integrating the system into the existing production
environment). For each requirement (when applicable), the Test
Method (Analysis, Demonstration, Inspection, Simulation/Modeling,
Test), Test Case and Test Result will be recorded in the
corresponding column of the matrix.
[0204] FIG. 8F shows in block 820F the fields Test Method, Test
Case, and Test Result of the RTVM that track the Usability Test
information (how well the new system meets its performance
requirements). For each requirement (when applicable), the Test
Method (Analysis, Demonstration, Inspection, Simulation/Modeling,
Test), Test Case and Test Result will be recorded in the
corresponding column of the matrix.
[0205] FIG. 8G shows in block 820G the fields Test Method, Test
Case, and Test Result of the RTVM that track the Acceptance Test
information (testing user acceptance of the system). For each
requirement (when applicable), the Test Method (Analysis,
Demonstration, Inspection, Simulation/Modeling, Test), Test Case
and Test Result will be recorded in the corresponding column of the
matrix.
[0206] FIG. 8H shows in block 820H the fields Test Method, Test
Case, and Test Result of the RTVM that track the Operability Test
information (testing operability of the system). For each
requirement (when applicable), the Test Method (Analysis,
Demonstration, Inspection, Simulation/Modeling, Test), Test Case
and Test Result will be recorded in the corresponding column of the
matrix.
[0207] The RTVM is a technical management tool that is continually
maintained throughout the project lifecycle. The ability to
organize and trace the numerous requirements generated during
system development projects is critical to the project's success. A
software tool (e.g., a commercially available tool such as the
Microsoft Excel.RTM. spreadsheet tool) may be utilized to show both
requirements traceability and verification. Requirements
traceability is the ability to describe and follow a requirement
through the project lifecycle. The RTVM shows requirements traceability
from Business Requirements through System Requirements, Component
Requirements, Design Documents, Build Components to Acceptance
Criteria. The RTVM shows how the individual requirements will be
verified via the test method, the test type, and the acceptance
criteria.
[0208] The RTVM is created after the business requirements have
been baselined. Then the data in each column is defined and/or
modified during the various phases of the project lifecycle.
[0209] When developing system requirements, the system requirements
must be traced to business requirements, which provides a map to
illustrate how the business requirements will be implemented and
from where the system requirements were derived. The requirements
traceability is implemented by the (i,j,k) indexing of the business
requirement (Ri), system requirements (Si.j) and component
requirements (Ci.j.k), as described supra. Acceptance criteria and
test types/methods are shown in the RTVM to indicate how the system
requirements will be verified.
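The traceability encoded by this indexing can be illustrated with a
hypothetical helper that derives a requirement's parent by dropping
the last index; the function name and identifier format are
assumptions consistent with the Ri, Si.j, Ci.j.k scheme described
supra.

```python
# Hypothetical helper: the (i,j,k) indexing encodes traceability, so
# dropping the trailing index yields the parent requirement.
def parent_requirement(req_id):
    prefix, indices = req_id[0], req_id[1:].split(".")
    if prefix == "C" and len(indices) == 3:   # Ci.j.k -> Si.j
        return "S" + ".".join(indices[:2])
    if prefix == "S" and len(indices) == 2:   # Si.j -> Ri
        return "R" + indices[0]
    return None                               # business requirement Ri has no parent

assert parent_requirement("C1.2.3") == "S1.2"
assert parent_requirement("S1.2") == "R1"
assert parent_requirement("R1") is None
```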
[0210] When developing component requirements, the component
requirements are traced to the system requirements. This provides a
map of how the system requirements will be implemented and which
business requirements are supported. The map shows how the component
requirements were derived from the system requirements and the
business requirements. Acceptance criteria and test types/methods
are shown to indicate how the component requirements will be
verified.
[0211] During system design and build, the system's design
documents and build components need to be traced to their
corresponding system and component requirements to demonstrate that
all business processes and system requirements are met within the
proposed solution.
[0212] During system testing, each system and component requirement
is tested to verify its correctness and completeness. The RTVM maps
each requirement to the required test method, test type and
acceptance criteria.
[0213] FIG. 9 illustrates a computer system 90 used for generating
the RTVM and/or generating the spider charts and/or computing the
Overall Review Score, described supra, in accordance with
embodiments of the present invention. The computer system 90
comprises a processor 91, an input device 92 coupled to the
processor 91, an output device 93 coupled to the processor 91, and
memory devices 94 and 95 each coupled to the processor 91. The
input device 92 may be, inter alia, a keyboard, a mouse, etc. The
output device 93 may be, inter alia, a printer, a plotter, a
computer screen, a magnetic tape, a removable hard disk, a floppy
disk, etc. The memory devices 94 and 95 may be, inter alia, a hard
disk, a floppy disk, a magnetic tape, an optical storage such as a
compact disc (CD) or a digital video disc (DVD), a dynamic random
access memory (DRAM), a read-only memory (ROM), etc. The memory
device 95 includes a computer code 97. The computer code 97
includes a commercially available or customized software tool or
algorithm for generating the RTVM (e.g., a commercially available
tool such as the Microsoft Excel.RTM. spreadsheet tool) described
supra, and/or a software tool or algorithm for generating the
spider charts (e.g., a graphics plotting tool) described supra,
and/or computing the Overall Review Score described supra. The
processor 91 executes the computer code 97. The memory device 94
includes input data 96. The input data 96 includes input required
by the computer code 97. The output device 93 displays output from
the computer code 97. Either or both memory devices 94 and 95 (or
one or more additional memory devices not shown in FIG. 9) may be
used as a computer usable medium (or a computer readable medium or
a program storage device) having a computer readable program code
embodied therein and/or having other data stored therein, wherein
the computer readable program code comprises the computer code 97.
Generally, a computer program product (or, alternatively, an
article of manufacture) of the computer system 90 may comprise said
computer usable medium (or said program storage device).
[0214] While FIG. 9 shows the computer system 90 as a particular
configuration of hardware and software, any configuration of
hardware and software, as would be known to a person of ordinary
skill in the art, may be utilized for the purposes stated supra in
conjunction with the particular computer system 90 of FIG. 9. For
example, the memory devices 94 and 95 may be portions of a single
memory device rather than separate memory devices.
[0215] While embodiments of the present invention have been
described herein for purposes of illustration, many modifications
and changes will become apparent to those skilled in the art.
Accordingly, the appended claims are intended to encompass all such
modifications and changes as fall within the true spirit and scope
of this invention.
* * * * *