A very powerful quality control testing tool is a checklist. It is powerful because it reduces each observation to one of two extremes: a condition is either present or it is not. It can be used for fact gathering during problem identification, cause analysis, or for checking progress during implementation of a solution.
Observed results or conditions are recorded by entering, or not entering, check marks opposite items on a list. Information gathered in this way is limited to simple yes/no answers, but because each entry is counted, the data can be tallied and analyzed afterward.
The following requirements phase defect checklist is used to verify the functional needs and specifications for the system. A check in the Missing column means that the item was not included. A check in the Wrong column means that the item was incorrectly used. A check in the Extra column means that the item has been discovered but was not originally identified. The Total column totals the number of missing, wrong, and extra items.
The following logical design phase defect checklist is used to verify the logical design of the system. A check in the Missing column means that the item was not included. A check in the Wrong column means that the item was incorrectly used. A check in the Extra column means that the item has been discovered but was not originally identified. The Total column totals the number of missing, wrong, and extra items.
The following physical design phase defect checklist is used to verify the physical design of the system. A check in the Missing column means that the item was not included. A check in the Wrong column means that the item was incorrectly used. A check in the Extra column means that the item has been discovered but was not originally identified. The Total column totals the number of missing, wrong, and extra items.
The following program unit design phase defect checklist is used to verify the unit design of the system. A check in the Missing column means that the item was not included. A check in the Wrong column means that the item was incorrectly used. A check in the Extra column means that the item has been discovered but was not originally identified. The Total column totals the number of missing, wrong, and extra items.
The following coding phase defect checklist is used to verify the conversion of the design specifications into executable code. A check in the Missing column means that the item was not included. A check in the Wrong column means that the item was incorrectly used. A check in the Extra column means that the item has been discovered but was not originally identified. The Total column totals the number of missing, wrong, and extra items.
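The tallying step behind these defect checklists can be sketched in a few lines. The checklist items and marks below are invented for illustration; the point is only that counting the Missing, Wrong, and Extra check marks yields the Total column.

```python
from collections import Counter

# Hypothetical checklist entries: (item, mark), where mark is one of
# "missing", "wrong", or "extra". The item names are invented.
entries = [
    ("login screen", "missing"),
    ("audit trail", "wrong"),
    ("export report", "extra"),
    ("password rules", "missing"),
]

def tally(entries):
    """Count the check marks per column; Total sums all three columns."""
    counts = Counter(mark for _, mark in entries)
    counts["total"] = counts["missing"] + counts["wrong"] + counts["extra"]
    return counts

counts = tally(entries)
```

With the sample entries above, `counts` holds two missing, one wrong, and one extra item, for a total of four defects.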
The following field test is limited to a specific field or data element and is intended to validate that all of the processing related to that specific field is performed correctly.
The following record test validates that records can be created, entered, processed, stored, and output correctly.
The following file test verifies that all needed files are included in the system being tested, that they are properly documented in the operating infrastructure, and that the files connect properly with the software components that need data from those files.
The following error test identifies errors in data elements, data element relationships, record and file relationships, as well as logical processing conditions.
The following use test checks the end user’s ability to use the system; it verifies both that the user understands the system output and that the output leads the user to a correct action.
The following search test verifies the locations of records, fields, and other variables, and helps validate that the search logic is correct.
The following match/merge test ensures that all the combinations of merging and matching are adequately addressed. The test typically involves two or more files: for example, an input transaction matched against one or more master files, or an input transaction matched against an internal table.
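As a sketch of the logic such a test exercises, the classic two-file match walks a sorted master file and a sorted transaction file in step, classifying every record as matched, master-only, or transaction-only. The keys and record contents below are invented for illustration.

```python
# Both files are assumed sorted by key; records are (key, payload) pairs.
master = [(1, "A"), (2, "B"), (4, "D")]
transactions = [(1, "update"), (3, "update"), (4, "delete")]

def match_merge(master, transactions):
    """Classify each record: matched pairs, master records with no
    transaction, and transactions with no master record."""
    matched, master_only, txn_only = [], [], []
    i = j = 0
    while i < len(master) and j < len(transactions):
        mk, tk = master[i][0], transactions[j][0]
        if mk == tk:
            matched.append((master[i], transactions[j]))
            i += 1
            j += 1
        elif mk < tk:
            master_only.append(master[i])
            i += 1
        else:
            txn_only.append(transactions[j])
            j += 1
    # Whatever remains in either file is, by definition, unmatched.
    master_only.extend(master[i:])
    txn_only.extend(transactions[j:])
    return matched, master_only, txn_only

matched, master_only, txn_only = match_merge(master, transactions)
```

A match/merge test should exercise every branch of this logic: equal keys, a master record with no transaction, a transaction with no master record, and leftovers at the end of either file.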
The following stress test validates the performance of software that is subjected to a large volume of transactions.
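A minimal stress-test harness simply drives a large transaction volume through the system and compares the elapsed time against a performance budget. The `process_transaction` stand-in and the volume and budget figures below are assumptions for illustration, not a real workload.

```python
import time

def process_transaction(txn):
    # Stand-in for the system under test; invented for illustration.
    return txn * 2

def stress(volume, budget_seconds):
    """Drive `volume` transactions through the system and check that
    all complete within the stated performance budget."""
    start = time.perf_counter()
    results = [process_transaction(t) for t in range(volume)]
    elapsed = time.perf_counter() - start
    return len(results) == volume and elapsed <= budget_seconds

ok = stress(volume=100_000, budget_seconds=5.0)
```

In practice the volume and budget come from the system's stated performance requirements, and the harness would also record throughput and error rates rather than a single pass/fail flag.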
The following attributes test involves verifying the quality and productivity attributes of the system being tested, such as the ease with which changes can be introduced into the software.
The following states test verifies special conditions relating to both the operating and functional environments that may occur.
The following procedures test exercises the software to verify the operating, terminal, and communications procedures.
The following control test validates the ability of internal controls to support accurate, complete, timely, and authorized processing. These controls are usually validated by auditors assessing the adequacy of control, which is typically dictated by law.
The following control flow test validates the control flow of transactions through the system under test. It determines whether records can be lost or misprocessed during processing.
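One common way to detect lost or misprocessed records is to reconcile record counts taken at each stage of processing: every input record must end up either accepted (and therefore in the output) or rejected (and therefore in an error file). The stage names and counts below are invented for illustration.

```python
def reconcile(input_count, accepted, rejected, output_count):
    """Return a list of control-flow discrepancies; an empty list
    means every record is accounted for."""
    issues = []
    if accepted + rejected != input_count:
        issues.append("records lost between input and edit")
    if output_count != accepted:
        issues.append("records lost between edit and output")
    return issues

balanced = reconcile(100, 97, 3, 97)    # all records accounted for
unbalanced = reconcile(100, 97, 2, 97)  # one record unaccounted for
```

Real systems carry such control totals (record counts, hash totals, amount totals) from stage to stage for exactly this purpose.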
Finding the tool that is appropriate for a project can be difficult. Several questions need to be answered before selecting a tool. The following testing tool selection checklist lists the questions that can help the QA team evaluate and select an automated testing tool.
This checklist is used to verify that the information required at the beginning of the project is available. The QA testing manager will assess the impact of every negative response and document it as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The impact analysis checklist is used to help analyze the impacts of changes to the system. The test manager will assess the impact of each negative response and document it as an issue to the concerned parties. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The purpose of the environment readiness checklist is to verify the readiness of the environment for testing before starting test execution. The test manager will assess the impact of each negative response and document it as an issue to the concerned parties. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The project completion checklist is used to confirm that all the key activities have been completed for the project. The following is a sample that is also included in the CD at the back of the book.
The unit testing checklist is used to verify that unit testing has been thorough and comprehensive. The following is a sample that is also included in the CD at the back of the book.
The ambiguity review checklist is used to assist in the review of a functional specification for structural ambiguity (not to be confused with content reviews). The QA project manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The architecture review checklist is used to review the architecture for completeness and clarity. The test manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The data design review checklist is used to review the logical and physical design for clarity and completeness. The QA project manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail, depending on the severity. The following is a sample that is also included in the CD at the back of the book.
The functional specification review checklist is used to review a functional specification for content completeness and clarity (not to be confused with ambiguity reviews). The QA project manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The prototype review checklist is used to review a prototype for content completeness and clarity. The test manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The requirements review checklist is used to verify that the testing project requirements are comprehensive and complete. The test manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.
The technical design review checklist is used to review the technical design for clarity and completeness. The QA project manager will assess and document every negative response as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail, depending on the severity. The following is a sample that is also included in the CD at the back of the book.
This checklist is used to ensure that test cases have been prepared in accordance with the specifications. The test manager will assess the impact of every negative response and document it as an issue to the concerned parties for resolution. This can be accomplished through weekly status reports or e-mail. The following is a sample that is also included in the CD at the back of the book.