A
Acceptance testing, 257
approvals, obtaining, 258
complete acceptance test cases, 256–257
complete acceptance test planning, 253–256
documenting acceptance defects, 259
environment, establishing, 256
new acceptance tests, executing, 259
plan, reviewing/approving, 257–258
regression test, acceptance fixes, 258–259
schedule, finalizing, 255
system-level test cases, identification of, 257
tools, installing, 256
Ad hoc testing, 68–71, 318–319
advantages, disadvantages of, 71
Agenda, defining, 156
Agile testing
acceptance testing, 451
continuous improvement, 444
formal requirements, agile user stories, contrasted, 371–372
information gathering, 445
preparing for next spiral, 449
Quick Test Professional, 374
spiral test results, summarizing/reporting, 452
system testing, 450
test case design, 447
test development, 448
Test-Driven Development, 374
test execution/evaluation, 448
test planning, 446
user story, defining, 372
Ambiguity review checklist, 536–537
Application GUI components, identification of, 202
Application-specific function libraries, 336
Approval procedures, defining, 187–188
Approvals, obtaining, 194, 206–208
Architecture review checklist, 538
Attributes testing checklist, 514–516
Automated testing, 399–400, 410–439, 492
acceptance test, 437
acquisition plan, developing, 432, 434
candidate review, conducting, 433
candidate tools, identification of, 433
candidates, scoring, 433
determine whether goals have been met, 439
evaluation methodology, 431–439
evaluation plan, creation of, 436
implement modifications, 438
operating environment, tools in, 438–439
orientation, conducting, 437–438
procure testing tool, 436
proposals, solicitation of, 435
receive tool, 437
request for proposal, generation of, 434–435
requirements, reviewing, 434
selection activities for formal procurement, conducting, 434–435
selection activities for informal procurement, conducting, 432–434
selection criteria, defining, 432–433
selection of tool, 434
set tool objectives, 432
technical evaluation, performing, 435
technical requirements document, creating, 434
test requirements, defining, 431
tool in operating environment, 438–439
tool manager’s plan, creation of, 436–437
tool selection, 434
tool source, selection of, 435
training plan, creation of, 437
training tool users, 438
write evaluation report, 439
B
Baldrige, Malcolm, 34
Baldrige performance excellence criteria, 35
Black-box testing, 39–40, 558–559
extra program logic, 559
Bottom-up testing, 559
Boundary value testing, 559–561
field ranges, 560
GUI, 561
nonnumeric input data, 560
nonnumeric output data, 561
number of outputs, 561
numeric input data, 560
numeric output data, 560
output range of values, 560
Branch/condition coverage testing, 562–563
Branch coverage testing, 561–562
BTO. See Business technology optimization
Burnout tracking, 228–231, 264–266
Business technology optimization, 6
C
Capability maturity model, 29–33
Capability Maturity Model for Software, 34
Cause-effect graphing, 563–567
causes, 565
decision table, 566
methodology, 564
Change request form, 457
Change request procedures, establishing, 184–185
requirements phase defect checklist, 493–494
China, emergence of software companies in, 387
Clarification request, template, 484–485
Client/server challenge, 140–141
Client/server spiral testing, psychology of, 141–146
integration of QA, development, 143–144
iterative/spiral development methodology, 144–146
new school of thought, 141–142
tester/developer perceptions, 142–143
CMM. See Capability maturity model
Coding phase defect checklist, 499–502
Commercial vendor tool descriptions, 410
Compliance testing, 364–365, 375–376
Computer risk analysis, 163
Computer Society of the Institute of Electrical and Electronics Engineers, guidelines for writing software requirements specifications, 371–372
Configuration build procedures, defining, 186
Continuous competency development, 382
Continuous improvement phased approach, 88
Continuous improvement spiral testing, 151–154
Continuous quality improvement, 280–281
Control flow testing checklist, 523
Control testing checklist, 518–523
CRUD testing, 568
Cultural differences, 394
D
Data design review checklist, 539–540
Data-driven framework, 338
Data generation strategies, 401–408
cutting-edge test case generator, requirements-based, 404–408
data based on database, generating, 403–404
sampling from production, 401–402
starting from scratch, 402
Database testing
attributes
definition, 584
vs. relationships, 586
columns, 574
compound primary keys, 575
CRUD testing, 569
customer address table, 589
customer/salesperson table, 584
data modeling essentials, 571–573
database integrity testing, 568–571
definition, 594
dependency constraints, 597–600
desk table, 578
domain integrity, 570
employee/project table, 581–582
employee table, 578–579, 581–582, 587, 598–599
entities vs. relationships, 583–584
entity classes, 577
entity integrity, 568
entity subtypes, 594
line-item table, 597
many-to-many, 580
model, defining, 572
model creation, rationale, 572–573
model refinement, 593
model use, in database design, 599
multiple relationships, 582–583
order, 574
order table, 592–593, 595, 597
primary key integrity, 569
problems with unnormalized entities, 587–588
product/model/contract table, 593
purchase agreement table, 583
referential integrity, 570–571, 594–597
referential integrity test cases, 571
relational design, 600
relationships, definition, 577–584
rows, 574
sample table, 574
telephone line table, 580
user-defined integrity, 570
Decision tables, 601
Defect management process, 301–307
defect category, 303
defect discovery, classification, 301–302
defect meetings, 305
defect reporting, 304
quality control, 301
Defect recording/tracking procedures, establishing, 182–183
Defect report, template, 470–472, 490
Defect severity status, 228, 264
Defects, method of finding, 266–267
Defining metric objectives, test planning, 188–193
Defining system/acceptance tests, 203–206
Deming, W. Edwards
fourteen quality principles of, 77–83
adoption of new philosophy, 78
awarding business on price tag alone, 79
barriers between staff areas, breaking down, 81
education, 82
institute leadership, 80
mass inspection, ceasing dependence on, 78–79
numerical codes, elimination of, 81–82
pride of workmanship, removing barriers to, 82
production, service, improving, 79
slogans, exhortations, targets for workforce, elimination of, 81
transformation, taking action to accomplish, 82–83
Dependencies, defining, 177–178
Design, reviewing/approving, 206–208
Design and execution maturity, 328–329
Desk checking, 601
Development acceptance, obtaining, 25–26
Development methodology overview, 139–154
client/server challenge, 140–141
continuous improvement spiral testing, 151–154
joint application designs, role of, 146
limitations of life-cycle development, 139–140
prototypes, methodology for developing, 148–151
demonstrating prototype to users, 150
production system, developing, 151
prototype, developing, 148–149
prototypes to management, demonstrating, 149
specifications, revising and finalizing, 150–151
psychology of client/server spiral testing, 141–146
integration of QA, development, 143–144
iterative/spiral development methodology, 144–146
new school of thought, 141–142
tester/developer perceptions, 142–143
Division of responsibilities, 316–317
Dynamic testing of code, static testing contrasted, 131–136
E
Effort estimation maturity, 328
Elements of software configuration management, 20
Emerging specialized areas in testing, 321–396
Environment readiness checklist, 529–530
Equivalence partitioning, 601–604
equivalence class partitioning, test cases using, 603–604
field ranges, 602
income/tax test cases, 602
income vs. tax percentage, 602
nonnumeric input data, 603
nonnumeric output data, 603
number of items, 603
number of outputs, 603
numeric input data, 602
numeric output data, 602
output range of values, 602
sets of values, 602
tables or arrays, 603
Error handling, defining, 337
Error testing checklist, 506–508
Establish transparency, 394
Estimating test work effort, 292–293
Evaluation of automated testing tools, 431–439
acquisition plan, developing, 432, 434
candidate review, conducting, 433
candidate tools, identification of, 433
candidates, scoring, 433
determine whether goals have been met, 439
evaluation plan, creation of, 436
implement modifications, 438
orientation, conducting, 437–438
perform acceptance test, 437
procure testing tool, 436
proposals, solicitation of, 435
receive tool, 437
request for proposal, generation of, 434–435
requirements, reviewing, 434
selection activities for formal procurement, conducting, 434–435
selection activities for informal procurement, conducting, 432–434
selection criteria, defining, 432–433
set tool objectives, 432
technical evaluation, performing, 435
technical requirements document, creating, 434
test requirements, defining, 431
tool in operating environment, 438–439
tool manager’s plan, creation of, 436–437
tool selection, 434
tool source, selection of, 435
training plan, creation of, 437
training tool users, 438
write evaluation report, 439
Evolution of automated testing tools, 8–11
Exception testing
test case/error exception test matrix, 605
Exploratory testing
advantages, disadvantages of, 73
art of, 72
Extreme programming, 8
F
Factors limiting testing tools, 429–430
Field testing checklist, 502–503
Final test report, publishing, 273–276
Final test summary report, template, 491–492
First computers, development of, 7
Folder structure, defining, 335–336
Formal requirements, agile user stories, contrasted, 371–372
FORTRAN, first 3GL programming language, 7
Foundation for Malcolm Baldrige National Quality Award, 34–37
Free-form testing, 605
Function, defects by, 264
Function/test matrix
building, 200
template, 464
Function tests, designing, 195–200
Functional specification review checklist, 540–546
Functional test requirements, refining, 195–199
Functions tested and not tested, 267–268
G
Goals of usability testing, 359–364
accessibility testing, 361–364
guidelines for usability testing, 361
GUI-based functional test matrix, template, 465
GUI component test matrix, template, 464
GUI design, guidelines, 200–201
H
High-level business requirements, identification of, 161–162
High-level functional requirements, defining, 170
High-level project activities, identification of, 292
Histograms
response time histogram, 606
response time of 100 samples, 606
History of software testing, 3–11
business technology optimization, 6
development of first computers, 7
evolution of automated testing tools, 8–11
extreme programming, 8
FORTRAN, first 3GL programming language, 7
historical software testing and development parallels, 6–8
popular scripting techniques, 11
static capture/replay tools with scripting language, 10
static capture/replay tools without scripting language, 10
testing principles, 5
variable capture/replay tools, 10–11
I
Identifying high-level project activities, 292
Impact analysis checklist, 527–528
India, emergence of software companies in, 387
Individual finding, defects by, 267
Industry best processes, 381
Information gathering, 155–165
agenda, defining, 156
high-level business requirements, identification of, 161–162
interview
confirming findings, 165
preparing for, 156
summarizing, 165
participants, identification of, 156
project, understanding, 158–159
project development methodology, understanding, 161
project objectives, understanding, 159–160
project plans, understanding, 160–161
project status, understanding, 160
risk analysis, performing, 162–165
summarize findings, 165
Instinct, 163
Integrated Product Development Capability Maturity Model, 34
Integrated testing, development, 309–313
development methodology, modifying, 312
incorporate defect recording, 313
integrated team, 313
quality control, 309
select integration points, 311–312
tasks to integrate, identification of, 310–311
test methodology training, 312–313
test steps, tasks, customizing, 311
test team, organizing, 310
Integration testing
completeness of integration test conditions, evaluation of, 124–125
integration test conditions, creation of, 124
interfaces for completeness, reconciliation of, 124
methodology for, 123
unit interfaces, identifying, 123–124
Interim report, publishing, 220–221
International Organization for Standardization, 29
Interviews
confirming findings, 165
preparing for, 156
summarizing, 165
IPD-CMM. See Integrated Product Development Capability Maturity Model
ISO. See International Organization for Standardization
ISO9000, 29
Iterative/spiral development methodology, 144–146
J
JADs. See Joint application designs
Joint application designs, 40, 608
role of, 146
K
Knowledge acquisition process, 345–346
L
Life-cycle testing, psychology of, 89
Limitations of life-cycle development, 139–140
Load testing, 344
Logical design phase defect checklist, 494–495
M
Maintenance process, defining, 337
Malcolm Baldrige National Quality Award, 34–37
Management acceptance, obtaining, 25
Management overhead, 394
Manual/automated GUI/function tests, scripting, 209–210
Manual/automated new spiral tests, executing, 219
Manual/automated system fragment tests, scripting, 210
Manual/automated test types, identification of, 171
Manual vs. automated testing, 41
Match/merge checklist, 511–512
Methodology checklist, 109–110
Methodology development, 139–154
client/server challenge, 140–141
continuous improvement spiral testing, 151–154
joint application designs, role of, 146
limitations of life-cycle development, 139–140
prototypes, methodology for developing, 148–151
demonstrating prototype to users, 150
production system, developing, 151
prototype, developing, 148–149
prototypes to management, demonstrating, 149
specifications, revising and finalizing, 150–151
psychology of client/server spiral testing, 141–146
integration of QA, development, 143–144
iterative/spiral development methodology, 144–146
new school of thought, 141–142
tester/developer perceptions, 142–143
Metric points, defining, 189–193
Minutes of meeting, template, 476–478
Modern software testing tools, 397–440
N
National Institute of Standards and Technology, 34–37
New school of thought, 141–142
NIST. See National Institute of Standards and Technology
Nonexistent, poor requirements, 68–73
Nonexistent requirements, 68–73
Nonfunctional testing, 343–365
goals of usability testing, 359–364
accessibility testing, 361–364
guidelines for usability testing, 361
knowledge acquisition process, 345–346
load testing, 344
performance deliverables, 350–351
performance monitoring, 344
security testing
file integrity checkers, 356–357
log reviews, 356
scope of security testing, identifying, 352
test case generation, execution, 353
virus detectors, 357
vulnerability scanning, 354–355
stress testing, 344
volume testing, 344
Numerical method for evaluating requirement quality, 54–55
O
OAT. See Orthogonal array testing
On-site/offshore model, 383–395
economic trade-offs, determining, 384
application management, 389
detailed design, 388
knowledge transfer, 388
milestone-based transfer, 388–389
steady state, 389
methodology, benefits of, 392–394
outsourcing methodology, 385–388
selection criteria, determining, 385
Open-source freeware vendor tools, 410
Organizational architecture, 315
Organizational relationships, 317
Orthogonal array testing, 608–610
parameter combinations, 610
parameter combinations with total enumeration, 609
Outsourcing methodology, 385–388
Overview of testing techniques, 39–50
gray-box testing, 41
joint application designs, 40
manual vs. automated testing, 41
static vs. dynamic testing, 41–42
taxonomy of software testing techniques, 42–50
white-box testing, 40
P
Parallels in development, software testing, 6–8
Participants, identification of, 156
PDCA. See Plan, do, check, act
People Capability Maturity Model, 33
Performance deliverables, 350–351
Performance monitoring, 344
Physical design phase defect checklist, 496–498
Placeware, 388
Plan, do, check, act, 84, 137–276
continuous improvement through, 83–84
test schedule, template, 481
Plan for review process, 105
Popular scripting techniques, 11
Positive and negative testing, 611–612
Potential acceptance tests, identification of, 206
Potential system tests, identification of, 203–205
Preparation for next spiral, 223–231
acceptance tests, updating, 225
function/GUI tests, updating, 223–225
metric graphics, publishing, 227–231
procedures, reassessing, 225–227
publishing interim test report, 227–231
system fragment tests, updating, 225
team, procedures, and test environment, reassessing, 225–227
test control procedures, reviewing, 226–227
test environment, updating, 227
test team, evaluating, 225–226
Prevention vs. detection, 14–15
Prior defect history testing, 612
Procedures testing checklist, 517
Process evaluation methodology, 324–330
analyzing information, 325–326
analyzing test maturity, 326–330
documenting findings, 330
gathering information, 325–326
identify key elements, 324–325
presenting findings, 330
Process for creating test cases from good requirements, 55–64
requirements, reviewing, 55–58
test case descriptions and objectives, writing, 62
test plan, writing, 58
test suite, identifying, 58–59
Product quality and project quality, 279–280
Product scope and project scope, 283–284
Program unit design phase defect checklist, 498–499
Project, understanding, 158–159
Project charter, 284
Project completion checklist, 530–532
Project development methodology, understanding, 161
Project framework, 281–283, 318–319
components of, 280
continuous quality improvement, 280–281
executing, monitoring, and controlling phases, 282–283
implement phase, 283
planning phase, 282
where no quality infrastructure exists, 317–318
Project goal, integration of QA, development, 143–144
Project information gathering checklist, 525–527
Project issue resolution procedures, defining, 186–187
Project management framework, 279–290
benefits of, 290
product quality and project quality, 279–280
product scope and project scope, 283–284
project charter, 284
project framework, components of, 280
project framework and continuous quality improvement, 280–281
project framework phases, 281–283
executing, monitoring, and controlling phases, 282–283
implement phase, 283
planning phase, 282
project manager in quality management, role of, 285–286
scope statement, 285
scoping project to ensure product quality, 283
summarizing/reporting tests results, 279
business knowledge, updating, 289
communicate issues as they arise, 288–289
improve process, 289
knowledge base creation, 289
new testing technologies and tools, learning, 289
requesting help from others, 288
test manager in quality management role, 286–288
analyzing requirements, 286
analyzing test results, 288
duplication and repetition, avoiding, 287
quality, 288
test data, defining, 287
validation of test environment, 287–288
Project management methodology, 277–320
Project manager in quality management, role of, 285–286
Project objectives, understanding, 159–160
Project plans, understanding, 160–161
Project quality management, 291–299
effort estimation, model project, 294–296
estimating test work effort, 292–293
identifying high-level project activities, 292
project quality management processes, 291
quality planning, 292
Project status, understanding, 160
Project status report, template, 486–488
Prototypes
demonstrating prototype to users, 150
demonstrating to management, 149
methodology for developing, 148–151
production system, developing, 151
specifications, revising, finalizing, 150–151
Prototyping
application prototyping, 615
cyclic models, 613
data-driven prototyping, 616
early-stage prototyping, 617
evolutionary and throwaway, 615
fourth-generation languages and prototyping, 614
iterative development accounting, 614
prototype systems development, 615–616
replacement of traditional life cycle, 616
user software engineering, 617
Psychology of client/server spiral testing, 141–146
integration of QA, development, 143–144
iterative/spiral development methodology, 144–146
new school of thought, 141–142
tester/developer perceptions, 142–143
Psychology of life-cycle testing, 89
Public Company Accounting Reform and Investor Protection Act of 2002. See Sarbanes-Oxley Act
Publishing final test report, 273–276
Publishing interim report, 220–221, 227–231
Publishing metric graphics, 227–231
Q
QTP. See Quick Test Professional
Quality assurance components, 17–18
Quality assurance framework, 13–37
Baldrige, Malcolm, 34
Baldrige performance excellence criteria, 35
capability maturity model, 29–33
Capability Maturity Model for Software, 34
Foundation for Malcolm Baldrige National Quality Award, 34–37
Integrated Product Development Capability Maturity Model, 34
International Organization for Standardization, 29
ISO9000, 29
Malcolm Baldrige National Quality Award, 34–37
National Institute of Standards and Technology, 34–37
People Capability Maturity Model, 33
prevention vs. detection, 14–15
quality assurance components, 17–18
software configuration management, 19–23
software quality assurance, 16–17
software quality assurance plan, 23–25
software quality assurance plan, developing/implementing, 23–26
Systems Engineering Capability Model, 34
total quality management program, 29
verification vs. validation, 15–16
Quality planning, 292
Quality principles of Deming, 77–83
Quality standards, 26–37, 296–299
Quality through continuous improvement process, 75–84
continuous improvement through plan, do, check, act process, 83–84
Deming, W. Edwards
fourteen quality principles of, 77–83
plan, do, check, act circles, 84
statistical methods, role of, 76–77
cause-and-effect diagram, 76
control chart, 77
flowchart, 76
histogram, 77
run chart, 77
scatter diagram, 77
Quick Test Professional, 374
R
Random testing, 618
Range testing, 618
Record testing checklist, 503–505
Recovery functions, defining, 337
Regression testing
establishing strategy, 172–174
manual/automated spiral fixes, 217–219
maturity, 329
range testing test cases, 619
Reporting procedures, establishing, 187
Reporting tests results, 261–276
approvals, obtaining, 273
final test report, publishing, 273–276
findings/recommendations, developing, 269–272
metric graphics, analyzing/creating, 263–269
perform data reduction, 261–262
project framework, 279
project overview, preparing, 263
remaining defects to matrix, posting, 262
review, scheduling/conducting, 272–273
test activities, summarizing, 263
test defects by test number, consolidating, 261–262
tests executed/resolved, ensuring, 261
Requirement changes, identification of, 221
Requirement quality factors, 52–54
modifiable, 53
necessary, 53
nonredundant, 53
within scope, 54
terse, 54
testable, 54
traceable, 54
Requirements, transforming to testable test cases, 51–73
nonexistent, poor requirements, 68–73
numerical method for evaluating requirement quality, 54–55
process for creating test cases from good requirements, 55–64
requirements, reviewing, 55–58
test case descriptions and objectives, writing, 62
test plan, writing, 58
test suite, identifying, 58–59
requirement quality factors, 52–54
modifiable, 53
necessary, 53
nonredundant, 53
within scope, 54
terse, 54
testable, 54
traceable, 54
software requirements as basis of testing, 51–52
transforming use cases to test cases, 64–68
summary, 68
test data, generating, 68
use case diagram, drawing, 64
use case scenarios, identifying, 66
Requirements definition maturity, 326–327
Requirements review checklist, 547–551
Requirements specification, 455–456
Requirements traceability matrix, template, 461–462
Retest matrix, template, 474–475
Reusing generic functions, application-specific function libraries, 336
Review, scheduling
conducting, 194
Review agenda, developing, 106
Review report, creating, 106
Reviewing test planning, 194
Risk analysis, performing, 162–165
Risk-based testing, 620
Robustness, modularize scripts/test data to increase, 11, 336
Root cause analysis, 266
S
Sahi, 374
Sample run chart, 621
Sandwich testing, 621
Schedule review, 105
Scope statement, 285
Scoping project to ensure product quality, 283
Screen data mapping, template, 485
Scripting guidelines and review checklists, developing, 336–337
Search test checklist, 510–511
SECM. See Systems Engineering Capability Model
Security design strategy, 242–243
Security testing
file integrity checkers, 356–357
log reviews, 356
scope of security testing, identifying, 352
test case generation, execution, 353
virus detectors, 357
vulnerability scanning, 354–355
SEI-CMM. See Software Engineering Institute-Capability Maturity Model
Selenium, 374
Service Oriented Architecture testing, 367–369
SOA testing. See Service Oriented Architecture testing
Software Engineering Institute-Capability Maturity Model, 29–33
Software licensing, 394
Software quality assurance plan, 16–17, 23–25, 453–454
executing, 26
implementation planning, 26
software quality assurance plan, developing/implementing, 23–26
Software requirements as basis of testing, 51–52
Software testing as continuous improvement process, 89–92
Software testing techniques, 557–628
Software testing trends, 399–408
automated capture/replay testing tools, 399–400
data generation strategies, 401–408
cutting-edge test case generator, requirements-based, 404–408
data based on database, generating, 403–404
sampling from production, 401–402
starting from scratch, 402
necessary and sufficient conditions, 400–401
test case builder tools, 400
Southeast Asia, emergence of software companies in, 387
Spiral software testing methodology, 137–276
Spiral test defects, documenting, 219
Spiral testing methodology, 443–452
acceptance testing, 451
continuous improvement, 444
information gathering, 445
preparing for next spiral, 449
spiral test results, summarizing/reporting, 452
system testing, 450
test case design, 447
test development, 448
test execution/evaluation, 448
test planning, 446
Spiral testing summary report, template, 476
SQA. See Software quality assurance plan
State transition testing, 622–623
Statement coverage testing, 622
States testing checklist, 516–517
Static capture/replay tools
with scripting language, 10
without scripting language, 10
Static testing
dynamic testing, contrasted, 41–42
integration testing, 134
system testing, 134
testing coding with technical reviews, 131–132
unit testing, 133
data model, process model, and linkage, 115–117
refining system/acceptance test plan, 118–119
testing logical design with technical reviews, 117–118
completeness of integration test conditions, 124–125
integration test cases, creating, 122–123
interfaces for completeness, reconciliation of, 124
methodology for, 123
technical reviews, testing physical design with, 121–122
test conditions, creation of, 124
unit interfaces, identifying, 123–124
creating unit test cases, 128–129
with technical reviews, 127–128
with ambiguity reviews, 108–109
requirements traceability matrix, 110–111
system/acceptance test plan, building, 111–113
with technical reviews, 109
Statistical methods, role of, 76–77
cause-and-effect diagram, 76
control chart, 77
flowchart, 76
histogram, 77
run chart, 77
scatter diagram, 77
Statistical profile testing, 623
Stress testing, 344
Structured walkthroughs, 101–102, 623–625
state transition table, 624
Summarizing/reporting tests results, 261–276
approvals, obtaining, 273
final test report, publishing, 273–276
findings/recommendations, developing, 269–272
metric graphics, analyzing/creating, 263–269
perform data reduction, 261–262
project overview, preparing, 263
remaining defects to matrix, posting, 262
review, scheduling/conducting, 272–273
test activities, summarizing, 263
test defects by test number, consolidating, 261–262
tests executed/resolved, ensuring, 261
Syntax testing, 625
System/acceptance test plan, template, 460–461
System fragment tests, designing, 205–206
System summary report, template, 469–470
System testing
backup tests, designing/scripting, 247
compatibility tests, designing/scripting, 244–245
complete system test cases, 239–250
complete system test plan, 233–239
conversion tests, designing/scripting, 245–246
defect types, 268
environment, establishing, 238–239
installation tests, designing/scripting, 248–249
new system tests, executing, 251
other system test types, designing/scripting, 249–250
performance tests, design/script, 239–240
probe, 241
recovery tests, designing/scripting, 248
regression test, system fixes, 251
schedule, finalizing, 235
documentation tests, design/script, 246–247
security tests, designing/scripting, 242–243
stress tests, designing/scripting, 243–244
system defects, documenting, 251–252
test drivers, 241
tools, installing, 239
usability tests, designing/scripting, 246
volume tests, designing/scripting, 243
Systems Engineering Capability Model, 34
T
Taxonomy, software testing tools, 409–430
commercial vendor tool descriptions, 410
factors limiting testing tools, 429–430
open-source freeware vendor tools, 410
testing tool selection checklist, 409–410
Taxonomy of software testing techniques, 42–50
TDD. See Test-Driven Development
Technical design review checklist, 552–554
Technical reviews
as continuous improvement process, 96–100
motivation, 101
Test approvals, template, 478
Test automation assessment, 323–342
identifying applications to automate, 332
identifying best test automation tool, 332–333
test script maintenance, 334
test scripting, 333
Test automation framework, 334–342, 382
automation framework, basic features, 335–337
keyword-driven framework, 11, 339–341, 635
standard automation frameworks, 337–339
Test automation maturity, 329–330
Test automation standard frameworks, 337–339
Test automation strategy, template, 492
Test automation tool identification, 332–333
Test case builder tools, 400
application GUI components, identification of, 202
defining system/acceptance tests, 203–206
design, reviewing/approving, 206–208
function/test matrix, building, 200
function tests, designing, 195–200
functional test requirements, refining, 195–199
potential acceptance tests, identification of, 206
potential system tests, identification of, 203–205
review, scheduling/preparing for, 206
system fragment tests, designing, 205–206
Test case execution status, 227–228
Test case log, template, 466–467
Test case preparation review checklist, 554–555
Test case template, 466
Test condition vs. test case, template, 486
Test coverage through traceability, 213–216
Test defect details report, template, 489
Test deliverables, defining, 174–175
Test development, 209–212, 346–350
manual/automated GUI/function tests, scripting, 209–210
manual/automated system fragment tests, scripting, 210
test scripts, developing, 209–210
Test-Driven Development, 374
Test environment, establishing, 177
Test execution/evaluation, 217–221
manual/automated new spiral tests, executing, 219
publishing interim report, 220–221
regression test, manual/automated spiral fixes, 217–219
requirement changes, identification of, 221
spiral test defects, documenting, 219
test schedule, refining, 220–221
Test execution plan, template, 479
Test execution tracking manager, template, 490
Test exit criteria, identification of, 171–172
Test log summary report, template, 468
Test management constraints, 315–320
division of responsibilities, 316–317
organizational architecture, 315
organizational relationships, 317
project framework where no quality infrastructure exists, 317–318
traceability/validation matrix, 319
traits of well-established quality organization, 315–316
Test manager in quality management role, 286–288
analyzing requirements, 286
analyzing test results, 288
business knowledge, updating, 289
communicate issues as they arise, 288–289
duplication and repetition, avoiding, 287
improve process, 289
knowledge base creation, 289
new testing technologies and tools, learning, 289
quality, 288
requesting help from others, 288
test data, defining, 287
validation of test environment, 287–288
Test planning, 167–194, 293–294
approval procedures, defining, 187–188
change request procedures, establishing, 184–185
configuration build procedures, defining, 186
defect recording/tracking procedures, establishing, 182–183
defining metric objectives, 188–193
dependencies, defining, 177–178
high-level functional requirements, defining, 170
manual/automated test types, identification of, 171
metric points, defining, 189–193
project issue resolution procedures, defining, 186–187
regression test strategy, establishing, 172–174
reporting procedures, establishing, 187
review, scheduling/conducting, 194
schedule test, 95
specifications development, 95
test deliverables, defining, 174–175
test environment
defining, 95
establishing, 177
test exit criteria, identification of, 171–172
test objectives, defining, 93
test schedule, creating, 178
test team, organizing, 175–176
test tools, selection of, 178–182
version control procedures, establishing, 185–186
Test process assessment, 323–324
Test project milestones, template, 480
Test schedule, creating, 178
Test scripts, developing, 209–210
Test strategy, template, 481–484
Test team, organizing, 175–176
Test templates
clarification request, 484–485
final test summary report, 491–492
function/test matrix, 464
GUI-based functional test matrix, 465
GUI component test matrix, 464
plan, do, check, act test schedule, 481
project status report, 486–488
requirements traceability matrix, 461–462
screen data mapping, 485
spiral testing summary report, 476
system/acceptance test plan, 460–461
system summary report, 469–470
test approvals, 478
test automation strategy, 492
test case, 466
test condition vs. test case, 486
test defect details report, 489
test execution plan, 479
test execution tracking manager, 490
test log summary report, 468
test project milestones, 480
Test tools, selection of, 178–182
Tester, defects by, 264
Tester/developer perceptions, 142–143
Testing center of excellence, 377–382
continuous competency development, 382
industry best processes, 381
test automation framework, 382
testing metrics, 381
Testing metrics, 381
Testing physical design with technical reviews, 121–122
Testing principles, 5
Testing tool selection checklist, 409–410, 524–525
Thread testing, 626
Total quality management program, 29
TQM program. See Total quality management program
Traceability, test coverage through, 213–216
Traceability/validation matrix, 319
Traits of well-established quality organization, 315–316
Transforming requirements to testable test cases, 51–73
Transforming use cases to test cases, 64–68
U
Unit test plan, template, 459–460
Unit testing checklist, 532–535
Use cases, transforming to test cases, 64–68
summary, 68
test data, generating, 68
use case diagram, drawing, 64
use case scenarios, identifying, 66
V
Variable capture/replay tools, 10–11
Verification vs. validation, 15–16
Version control procedures, establishing, 185–186
Volume testing, 344
W
Waterfall development methodology, 87–88
Waterfall testing review, 85–136
Watir, 374
Web applications, functional testing tools, 374
Weighting risk attributes, 164–165
X
XP. See Extreme programming