Checklist for conducting Unit Tests

- Is the number of input parameters equal to the number of arguments?
- Do parameter and argument attributes match?
- Do parameter and argument unit systems match?
- Is the number of arguments transmitted to called modules equal to the number of parameters?
- Are the attributes of arguments transmitted to called modules equal to the attributes of the parameters?
- Is the unit system of arguments transmitted to called modules equal to the unit system of the parameters?
- Are the number of attributes and the order of arguments to built-in functions correct?
- Are there any references to parameters not associated with the current point of entry?
- Have input-only arguments been altered? (See the sketch after this list.)
- Are global variable definitions consistent across modules?
- Are constraints passed as arguments?
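
As a minimal sketch of how one of these checks can be automated, assume a hypothetical routine normalize_scores whose argument is input-only; the unittest below verifies that the call does not alter it:

```python
import unittest

def normalize_scores(scores):
    """Hypothetical routine under test: returns a normalized copy
    and must NOT modify its input-only argument."""
    total = sum(scores)
    return [s / total for s in scores]

class InterfaceTests(unittest.TestCase):
    def test_input_only_argument_not_altered(self):
        scores = [2.0, 3.0, 5.0]
        snapshot = list(scores)      # copy taken before the call
        normalize_scores(scores)
        # Checklist item: input-only arguments must be unchanged.
        self.assertEqual(scores, snapshot)

if __name__ == "__main__":
    unittest.main()
```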

When a module performs external I/O, additional interface tests must be conducted:

- File attributes correct?
- OPEN/CLOSE statements correct?
- Format specification matches I/O statement?
- Buffer size matches record size?
- Files opened before use?
- End-of-file conditions handled?
- I/O errors handled? (See the sketch after this list.)
- Any textual errors in output information?
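
As a hedged illustration of the end-of-file and I/O-error items, here is a minimal sketch assuming a hypothetical routine read_record; the tests exercise both conditions:

```python
import os
import tempfile
import unittest

def read_record(path):
    """Hypothetical routine under test: returns the first line of a
    file, or None at end-of-file; lets OSError propagate on failure."""
    with open(path) as f:            # the file is opened before use
        line = f.readline()
        return line.rstrip("\n") if line else None

class IOInterfaceTests(unittest.TestCase):
    def test_end_of_file_handled(self):
        fd, path = tempfile.mkstemp()
        os.close(fd)                 # an empty file simulates immediate EOF
        try:
            self.assertIsNone(read_record(path))
        finally:
            os.remove(path)

    def test_io_errors_handled(self):
        # A missing file must surface as an error, not a silent success.
        with self.assertRaises(OSError):
            read_record("no_such_file.dat")

if __name__ == "__main__":
    unittest.main()
```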

The local data structure for a module is a common source of errors.

Test cases should be designed to uncover errors in the following categories:

- improper or inconsistent typing
- erroneous initialization or default values (a sketch follows this list)
- incorrect (misspelled or truncated) variable names
- inconsistent data types
- underflow, overflow, and addressing exceptions
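
To make the "erroneous initialization or default values" category concrete, here is a minimal sketch assuming a hypothetical append_log function: Python's shared mutable default argument is a classic instance of this defect, and the test below exposes it (it fails against the buggy code):

```python
import unittest

def append_log(entry, log=[]):       # defect: the default list is shared
    """Hypothetical function with an initialization error: the same
    list object is reused as the default across all calls."""
    log.append(entry)
    return log

class LocalDataTests(unittest.TestCase):
    def test_default_log_is_fresh_per_call(self):
        append_log("a")
        result = append_log("b")
        # Each call should start from an empty log; the shared default
        # makes result == ["a", "b"], so this assertion fails and
        # uncovers the erroneous default value.
        self.assertEqual(result, ["b"])

if __name__ == "__main__":
    unittest.main()
```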

From a strategic point of view, the following questions should be addressed:

- Has the component interface been fully tested?
- Have local data structures been exercised at their boundaries?
- Has the cyclomatic complexity of the module been determined?
- Have all independent basis paths been tested? (A basis-path sketch follows this list.)
- Have all loops been tested appropriately?
- Have data flow paths been tested?
- Have all error-handling paths been tested?
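
As a sketch of the cyclomatic-complexity and basis-path items, assume a hypothetical function classify with two decision points, giving a cyclomatic complexity of 3; three test cases then cover a set of independent basis paths:

```python
import unittest

def classify(value, limit):
    """Hypothetical function: two decisions => cyclomatic complexity 3."""
    if value < 0:                    # decision 1
        return "negative"
    if value > limit:                # decision 2
        return "over limit"
    return "in range"

class BasisPathTests(unittest.TestCase):
    # One test per independent basis path.
    def test_path_negative(self):
        self.assertEqual(classify(-1, 10), "negative")

    def test_path_over_limit(self):
        self.assertEqual(classify(11, 10), "over limit")

    def test_path_in_range(self):
        self.assertEqual(classify(5, 10), "in range")

if __name__ == "__main__":
    unittest.main()
```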
