Test Plan for ABC Video Order Processing

This section shows the elements of a software test plan and how one company designs testing to validate that specification, design, and coding mesh with the functional and non-functional requirements of the system. The development of a test plan takes the programming language into consideration. Selecting a language can mean weighing older languages such as COBOL, first used in 1960. Suppose a company has legacy business systems and a project application is well known to the company (meaning that it is a system with precedent) with stable requirements. In that case, COBOL might be a good choice. Indeed, more than 60 years after COBOL was first used, a scan of job openings still turns up openings for COBOL programmers.

As you review this test plan, think about a plan to verify and validate the goals, design (organization and flow), and content of this course, CS302. Recall that at a high enough level of abstraction, the SDLC can be applied to other types of development, including course development.

Test Strategy

Subsystem or Integration Testing

Guidelines for Integration Testing 

The purpose of integration testing is to verify that groups of interacting modules that comprise an execute unit perform in a stable, predictable, and accurate manner that is consistent with all related program and systems design specifications. 

Integration tests are considered distinct from unit tests: as unit tests complete successfully, integration testing for the tested units can begin. The two primary goals of integration testing are compatibility and intermodule processing accuracy.


System Prompt | User Action | Explanation
Menu | Press mouse, move to Rent/Return, and release | Select Rent/Return from menu
Rent/Return screen, cursor at request field | Scan customer bar code 1234567 | Dummy bar code
Error Message 1: Illegal Customer or Video Code, Type Request | Enter: 1234567 | Dummy bar code
Customer Data Entry Screen with message: Illegal Customer 10, enter new customer | <cr> | Carriage return entered to end Create Customer process
Rent/Return screen, cursor at request field | Scan customer bar code 2221234 | Legal customer 10. System should return customer and rental information for M. A. Jones, Video 12312312, Copy 3, Terminator 2, Rental date 1/23/94, not returned.
Cursor at request field | Scan 123123123 | Cursor moves to rented video line
Cursor at return date field | Enter yesterday's date | Error message: Return date must be today's date.
Cursor at return date field | Enter today's date | Late fee computed and displayed ... should be $4.00.
Cursor at request field | Scan new tape 123412345 | New tape entered and displayed. Video #12341234, Copy 5, Mary Poppins, Rental date 1/25/94, Charge $2.00.
Cursor at request field | Press <cr> | System computes and displays Total Amount Due ... should be $6.00.
Cursor at Total Amount Paid field | Enter <cr> | Error Message: Amount paid must be numeric and equal or greater than Total Amount Due.
Cursor at Total Amount Paid field | Enter 10 <cr> | System computes and displays Change Due ... should be $4.00. Cash drawer should open.
Cursor at request field | Enter <cr> | Error Message: You must enter P or F5 to request print.
Cursor at request field | Enter P <cr> | System prints transaction

Go to SQL Query and verify Open Rental and Copy contents

Open Rental tuple for Video 123123123 contents should be:       

     22212341231231230123940200012594040000000000000

Open Rental tuple for Video 123412345 should be:       

     22212341234123450125940200000000000000000000000

Copy tuple for Video 12312312, Copy 3 should be:

     12312312311019200103

Copy tuple for Video 12341234, Copy 5 should be:

     12341234511319010000

Verify the contents of the receipt.

 

FIGURE 17-19 ABC Video Unit Test Example-Rent/Return
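The fixed-width tuple layouts shown in the figure are not documented in the script itself; the field boundaries below (customer ID, video-plus-copy ID, rental date, charge in cents, return date, late fee in cents) are an assumption inferred from the expected values. Under that assumption, a small sketch can verify the file contents against the script's predicted amounts:

```python
# Assumed Open Rental layout, inferred from the expected tuples above:
# customer(7) video+copy(9) rent date MMDDYY(6) charge cents(4)
# return date MMDDYY(6) late fee cents(4) trailing padding
FIELDS = [("customer", 7), ("video_copy", 9), ("rent_date", 6),
          ("charge", 4), ("return_date", 6), ("late_fee", 4)]

def parse_open_rental(tuple_str):
    """Slice a fixed-width Open Rental tuple into named fields."""
    rec, pos = {}, 0
    for name, width in FIELDS:
        rec[name] = tuple_str[pos:pos + width]
        pos += width
    return rec

rec = parse_open_rental("22212341231231230123940200012594040000000000000")
assert rec["customer"] == "2221234"       # M. A. Jones
assert rec["video_copy"] == "123123123"   # Video 12312312, Copy 3
assert int(rec["late_fee"]) == 400        # $4.00 late fee from the script

new_rec = parse_open_rental("22212341234123450125940200000000000000000000000")
total_due = int(rec["late_fee"]) + int(new_rec["charge"])  # cents
assert total_due == 600                   # Total Amount Due: $6.00
print("Open Rental tuples match the script's predictions")
```

The $6.00 total in the script is consistent with this layout: the $4.00 late fee on the returned tape plus the $2.00 charge on the new rental.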



Compatibility relates to calling modules in an operational environment. The test first verifies that all modules are called correctly and that, even in the presence of errors, they do not cause abends. Intermodule tests check that data transfers between modules operate as intended within constraints of CPU time, memory, and response time. Data transfers tested include sorted and extracted data provided by utility programs, as well as data provided by other application modules.

Test cases developed for integration testing should be sufficiently exhaustive to test all possible interactions and may include a subset of unit test cases as well as special test cases used only in this test. The integration test does not test logic paths within the modules as the unit test does. Instead, it tests interactions between modules only. Thus, a black-box strategy works well in integration testing. 

If modules are called in a sequence, checking of inputs and outputs to each module simplifies the identification of computational and data transfer errors. Special care must be taken to identify the source of errors, not just the location of bad data. Frequently, in complex applications, errors may not be apparent until several modules have touched the data and the true source of problems can be difficult to locate. Representative integration test errors are listed in Table 17-7.
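The boundary-checking idea above can be sketched as a simple harness that validates data as it passes between modules; the module names and checks here are illustrative stand-ins, not part of the ABC design:

```python
def rent_module(data):        # stand-in for a unit-tested module
    data["late_fee"] = 400    # cents
    return data

def total_module(data):       # stand-in for a downstream module
    data["total_due"] = data["late_fee"] + data.get("charge", 0)
    return data

def run_pipeline(data, stages):
    """Call each module in sequence, validating the data it hands on.

    Because output is checked at every module boundary, a bad value is
    reported at the module that produced it, not several modules later.
    """
    for module, check in stages:
        data = module(data)
        assert check(data), f"bad output from {module.__name__}: {data}"
    return data

result = run_pipeline(
    {"charge": 200},
    [(rent_module, lambda d: d["late_fee"] >= 0),
     (total_module, lambda d: d["total_due"] == d["late_fee"] + d["charge"])],
)
print(result["total_due"])   # 600
```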

Integration testing can begin as soon as two or more modules are successfully unit tested. When to end integration tests is more subjective. When exceptions are detected, the results of all other test processing become suspect. Depending on the severity and criticality of the errors to overall process integrity, all previous levels of testing might be re-executed to re-verify processing. Changes in one module may cause tests of other modules to become invalid. Therefore, integration tests should be considered successful only when the entire group of modules in an execute unit are run individually and collectively without error. Integration test curves usually start low, increase and peak, then decrease (see Figure 17-20). If there is pressure to terminate integration testing before all errors are found, the rule of thumb is to continue testing until fewer errors are found on several successive test runs.
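The rule of thumb above, continuing until fewer errors are found on several successive runs, can be sketched as a stopping check; the window of three runs is an assumption, not a figure from the text:

```python
def can_stop(errors_per_run, window=3):
    """Decide whether integration testing may stop.

    Returns True when the error count has been non-increasing over the
    last `window` transitions and the latest run is below the peak,
    i.e., the curve is on the declining side of Figure 17-20.
    """
    if len(errors_per_run) < window + 1:
        return False
    recent = errors_per_run[-(window + 1):]
    declining = all(b <= a for a, b in zip(recent, recent[1:]))
    return declining and recent[-1] < max(errors_per_run)

# A typical curve: starts low, peaks, then declines.
print(can_stop([2, 5, 9, 7, 4, 3]))   # True
print(can_stop([2, 5, 9, 12]))        # False: still climbing
```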


TABLE 17-7 Sample Integration Test Errors

Intermodule communication
Called module cannot be invoked
Calling module does not invoke all expected modules
Message passed to module contains extraneous information
Message passed to module does not contain correct information
Message passed contains wrong (or inconsistent) data type
Return of processing from called module is to the wrong place
Module has no return
Multiple entry points in a single module
Multiple exit points in a single module
Process errors
Input errors not properly disposed
Abend on bad data instead of graceful degradation
Output does not match predicted results
Processing of called module produces unexpected results that do not match predictions
Time constrained process is over the limit
Module causes time-out in some other part of the application


ABC Video Integration Test 

Because of the redesign of execution units for more efficient SQL processing, integration testing can be concurrent with unit code and test work, and should integrate and test the unit functions as they are complete. The application control structure for screen processing and for calling modules is the focus of the test.


FIGURE 17-20 Integration Test Errors Found Over Test Shots


Black-box, top-down testing is used for the integration test. Because SQL does not pass data as input, we predict the sets that SQL will generate during SELECT processing. The output sets are then passed to the control code and used for screen processing, both of which have been unit tested and should work. To verify the unit tests at the integration level, we should: 

1. Ensure that the screen control structure works and that execute units are invoked as intended. 

2. Ensure that screens contain expected data from SELECT processing. 

3. Ensure that files contain all updates and created records as expected. 

4. Ensure that printed output contains expected information in the correct format.
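Step 2 above, verifying that screens contain the sets predicted for SELECT processing, can be sketched with an in-memory database; here sqlite3 stands in for ABC's SQL engine, and the table and column names are illustrative:

```python
import sqlite3

# Illustrative stand-in for ABC's Open Rental relation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE open_rental "
             "(customer TEXT, video_copy TEXT, late_fee INTEGER)")
conn.execute("INSERT INTO open_rental VALUES ('2221234', '123123123', 400)")
conn.execute("INSERT INTO open_rental VALUES ('7654321', '999999991', 0)")

# Predict the set the SELECT should generate for customer 2221234 ...
predicted = {("2221234", "123123123", 400)}

# ... then compare it with what the execute unit actually retrieves.
actual = set(conn.execute(
    "SELECT customer, video_copy, late_fee FROM open_rental "
    "WHERE customer = ?", ("2221234",)))
assert actual == predicted, f"SELECT produced {actual}, expected {predicted}"
print("SELECT output matches the predicted set")
```

Comparing whole result sets, rather than spot-checking individual fields, catches both missing and extraneous rows in one step.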

First, we want to define equivalent sets of processes and, for each set, equivalent sets of data inputs. For instance, the high-level processes from IE analysis constitute approximately equivalent sets. These were translated into modules during design and, with the exception of integrating data access and use across modules, have not changed. These processes include Rent/Return, Customer Maintenance, Video Maintenance, and Other processing. If the personnel are available, four people could be assigned to develop one script each for these equivalent sets of processing. Since we named Rent/Return as the highest priority for development, its test should be developed first. The others can follow in any order, although the start-up and shutdown scripts should be developed soon after Rent/Return to allow many tests of the entire interface.

First, we test screen process control, then individual screens. Since security and access control are embedded in the screen access structure, this test should be white box and exercise every possible access path, including invalid ones. Each type of access rights and screen processing should be tested. For the individual screens, spelling, positioning, color, highlighting, message placement, consistency of design, and accuracy of information are all validated (see Figure 17-21).

The integration test example in Figure 17-22 is the script for testing the start-up procedure and security access control for the application. This script would be repeated for each valid and invalid user, including the other clerks and the accountant. The start-up should work only for Vic, the temporary test account, and the chief clerk. The account numbers that work should not be documented in the test script. Rather, a note should refer the reader to the person responsible for maintaining passwords.

1. Define equivalent sets of processes and data inputs. 

2. Define the priorities of equivalent sets for testing. 

3. Develop test scripts for Rent/Return, Other processing, Customer Maintenance, and Video Maintenance. 

4. For each of the above scripts, the testing will proceed as follows: 

a. Test screen control, including security of access to the Rent/Return application. 

b. Evaluate accuracy of spelling, format, and consistency of each individual screen. 

c. Test access rights and screen access controls. 

d. Test information retrieval and display. 

e. For each transaction, test processing sequence, dialogue, error messages, and error processing. 

f. Review all reports and file contents for accuracy of processing, consistency, format, and spelling.

FIGURE 17-21 ABC Integration Test Plan

System Prompt | User Action | Explanation
C:> | StRent<cr> | StRent is Exec to start up the Rental/Return Processing application
Enter password: | <cr> | Error
Password must be alphanumeric and six characters. |  |
Enter Password: | 123456<cr> | Error-illegal password
Password illegal, try again. |  |
Enter Password: | Abcdefg | Error-illegal password
Three illegal attempts at password. System shutdown |  |
C:> | StRent<cr> | Error-3 illegal attempts requires special start-up. Illegal start-up attempt
System begins to beep continuously until stopped by system administrator. No further prompts. |  |

Single User Sign-on

C:> | StRent<cr> | StRent is Exec to start up the Rental/Return Processing application
Enter Password: | <cr> | Error
Password illegal, try again. |  |
Enter Password: | VAC5283 | Temporary legal entry
User Sign-on menu |  |
Enter Initials: | <cr> | Error
You must enter your initials. |  |
Enter initials: | VAV | Error
Initials not authorized, try again. |  |
Enter initials: | VAC | Legal entry (VAC is Vic)
Main Menu with all Options |  | Begin Main Menu Test


FIGURE 17-22 ABC Video Integration Test Script
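The sign-on behavior exercised in Figure 17-22 can be sketched as a three-strikes check. The lockout rule and messages come from the script; the valid-password store is illustrative, since the script itself directs readers to the password custodian rather than documenting real values:

```python
def sign_on(attempts, valid_passwords):
    """Return the sign-on outcome for a sequence of password attempts.

    Mirrors the script above: each bad password draws "try again," and a
    third failure shuts the system down until the administrator restarts it.
    """
    failures = 0
    for attempt in attempts:
        if attempt in valid_passwords:
            return "signed on"
        failures += 1
        if failures == 3:
            return "system shutdown"
    return "awaiting password"

valid = {"VAC5283"}   # illustrative; real passwords live with the custodian
print(sign_on(["", "123456", "Abcdefg"], valid))   # system shutdown
print(sign_on(["", "VAC5283"], valid))             # signed on
```

A test script for this logic should cover every path: immediate success, recovery after one or two failures, and the full lockout.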


In the integration portion of the test, multiuser processing might take place, but it is not necessarily fully tested at this point. File contents are verified after each transaction is entered to ensure that file updates and additions are correct. If the integration test is approached as iteratively adding modules for testing, the final run-through of the test script should include all functions of the application, including start-up, shutdown, generation and printing of all reports, queries on all files, all file maintenance, and all transaction types. At least several days and one monthly cycle of processing should be simulated for ABC's test to ensure that end-of-day and end-of-month processing work.

Next, we discuss system testing and continue the example from ABC with a functional test that is equally appropriate at the integration, system, or QA levels.