Thursday, 27 March 2014

Test Plan

Test Plan
“A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning.”
The IEEE 829 standard specifies the following test plan outline:

1.    Test Plan Identifier

2.    Introduction

3.    Test Items

4.    Features to be Tested

5.    Features Not to Be Tested

6.    Approach

7.    Item Pass/Fail Criteria

8.    Suspension Criteria and Resumption Requirements

9.    Test Deliverables

10.  Testing Tasks

11.  Environmental Needs

12.  Responsibilities

13.  Staffing and Training Needs

14.  Schedule

15.  Risks and Contingencies

16.  Approvals


1) Test Plan Identifier
§  A unique identifier

2) Introduction

§  Summary of the items and features to be tested
§  Need for and history of each item (optional)
§  References to related documents such as project authorization, project plan, QA plan, configuration management plan, relevant policies, relevant standards
§  References to lower level test plans

3) Test Items

§  Test items and their version
§  Characteristics of their transmittal media
§  References to related documents such as requirements specification, design specification, users guide, operations guide, installation guide
§  References to bug reports related to test items
§  Items which are specifically not going to be tested (optional)

4) Features to be Tested

§  All software features and combinations of features to be tested
§  References to test-design specifications associated with each feature and combination of features

5) Features Not to Be Tested

§  All features and significant combinations of features which will not be tested
§  The reasons these features won’t be tested

6) Approach

§  Overall approach to testing
§  For each major group of features or combinations of features, specify the approach
§  Specify major activities, techniques, and tools which are to be used to test the groups
§  Specify a minimum degree of comprehensiveness required
§  Identify which techniques will be used to judge comprehensiveness
§  Specify any additional completion criteria
§  Specify techniques which are to be used to trace requirements
§  Identify significant constraints on testing, such as test-item availability, testing-resource availability, and deadline

7) Item Pass/Fail Criteria

§  Specify the criteria to be used to determine whether each test item has passed or failed testing

8) Suspension Criteria and Resumption Requirements

§  Specify criteria to be used to suspend the testing activity
§  Specify testing activities which must be redone when testing is resumed

9) Test Deliverables

§  Identify the deliverable documents: test plan, test design specifications, test case specifications, test procedure specifications, test item transmittal reports, test logs, test incident reports, test summary reports
§  Identify test input and output data
§  Identify test tools (optional)

10) Testing Tasks

§  Identify tasks necessary to prepare for and perform testing
§  Identify all task interdependencies
§  Identify any special skills required

11) Environmental Needs

§  Specify necessary and desired properties of the test environment: physical characteristics of the facilities including hardware, communications and system software, the mode of usage (i.e., stand-alone), and any other software or supplies needed
§  Specify the level of security required
§  Identify special test tools needed
§  Identify any other testing needs
§  Identify the source for all needs which are not currently available

12) Responsibilities

§  Identify groups responsible for managing, designing, preparing, executing, witnessing, checking and resolving
§  Identify groups responsible for providing the test items identified in the Test Items section
§  Identify groups responsible for providing the environmental needs identified in the Environmental Needs section

13) Staffing and Training Needs

§  Specify staffing needs by skill level
§  Identify training options for providing necessary skills

14) Schedule

§  Specify test milestones
§  Specify all item transmittal events
§  Estimate time required to do each testing task
§  Schedule all testing tasks and test milestones
§  For each testing resource, specify its periods of use

15) Risks and Contingencies

§  Identify the high-risk assumptions of the test plan
§  Specify contingency plans for each

16) Approvals

§  Specify the names and titles of all persons who must approve the plan
§  Provide space for signatures and dates

Test case Format in Excel sheet
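
A minimal sketch, assuming commonly used column names, of how a test case layout for an Excel sheet might be generated; the columns, file name, and example row below are illustrative assumptions, not taken from any specific sheet.

# Minimal sketch (assumed column names): writing a typical test case
# layout to a CSV file that can be opened in Excel.
import csv

columns = ["Test Case ID", "Test Scenario", "Test Steps", "Test Data",
           "Expected Result", "Actual Result", "Status", "Priority"]

example_row = ["TC_Login_01",
               "Verify login with valid credentials",
               "1. Open login page 2. Enter username and password 3. Click Login",
               "user1 / pass123",
               "Home page is displayed",
               "",            # Actual Result: filled in during execution
               "",            # Status: Pass/Fail after execution
               "P0"]

with open("test_cases.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    writer.writerow(example_row)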





Equivalence Partitioning


Equivalence Partitioning:-

Equivalence partitioning is a black-box testing method that divides the input domain of a program into classes of data from which test cases can be derived. Test case design for equivalence partitioning is based on an evaluation of equivalence classes for an input condition. An equivalence class represents a set of valid or invalid states for input conditions. Typically, an input condition is either a specific numeric value, a range of values, a set of related values, or a boolean condition. Equivalence classes may be defined according to the following guidelines:
  1. If an input condition specifies a range, one valid and two invalid equivalence classes are defined.
  2. If an input condition requires a specific value, one valid and two invalid equivalence classes are defined.
  3. If an input condition specifies a member of a set, one valid and one invalid equivalence class are defined.
  4. If an input condition is boolean, one valid and one invalid class are defined.
As an example, consider data maintained as part of an automated banking application.
The user can access the bank using a personal computer, provide a six-digit password, and follow with a series of typed commands that trigger various banking functions. During the log-on sequence, the software supplied for the banking application accepts data in the form:
  • area code—blank or three-digit number
  • prefix—three-digit number not beginning with 0 or 1
  • suffix—four-digit number
  • password—six digit alphanumeric string
  • commands—check, deposit, bill pay, and the like
The input conditions associated with each data element for the banking application can be specified as follows:
  • area code: Input condition, Boolean—the area code may or may not be present. Input condition, range—values defined between 200 and 999, with specific exceptions.
  • prefix: Input condition, range—specified value >200.
  • suffix: Input condition, value—four-digit length.
  • password: Input condition, Boolean—a password may or may not be present. Input condition, value—six-character string.
  • command: Input condition, set—containing the commands noted previously.
Applying the guidelines for the derivation of equivalence classes, test cases for each input domain data item can be developed and executed. Test cases are selected so that the largest number of attributes of an equivalence class are exercised at once.
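
As a minimal sketch in Python, the classes above can be turned into concrete test values; validate_logon is a hypothetical stand-in for the banking application's log-on check, and the representative values are assumed for illustration.

# Minimal sketch: deriving test values from equivalence classes for the
# banking log-on fields described above (hypothetical validator).

def validate_logon(area_code, prefix, suffix, password):
    """Return True when every log-on field falls in a valid equivalence class."""
    if area_code and not (area_code.isdigit() and len(area_code) == 3
                          and 200 <= int(area_code) <= 999):
        return False                      # area code: blank or 200-999
    if not (prefix.isdigit() and len(prefix) == 3 and prefix[0] not in "01"):
        return False                      # prefix: 3 digits, not starting with 0 or 1
    if not (suffix.isdigit() and len(suffix) == 4):
        return False                      # suffix: four-digit number
    if not (len(password) == 6 and password.isalnum()):
        return False                      # password: six-character alphanumeric string
    return True

# One representative value per equivalence class for the area code
# (a range gives one valid and two invalid classes, plus the Boolean
#  "may be blank" condition).
area_code_classes = {
    "valid: blank":       "",
    "valid: 200-999":     "408",
    "invalid: below 200": "150",
    "invalid: above 999": "1000",
}

for label, value in area_code_classes.items():
    accepted = validate_logon(value, "555", "1234", "abc123")
    print(f"{label:20} area_code={value!r:8} accepted={accepted}")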

Boundary Value Analysis (BVA)


Boundary Value Analysis:-

Boundary value analysis is a black-box testing technique used to test the size, length, or range of an input object. There are six values to check for a given size, length, or range:
  • Min
  • Max
  • Min+1
  • Max+1
  • Min-1
  • Max-1
These values are used to test the input data size, length, or range.

Example for boundary value analysis:-

Consider test data preparation for a username object where the customer requirement is a minimum size of 4 and a maximum size of 16. Applying boundary value analysis to this data:

min = 4 - should be accepted
max = 16 - should be accepted
min+1 = 5 - should be accepted
max+1 = 17 - should be rejected
min-1 = 3 - should be rejected
max-1 = 15 - should be accepted
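
A minimal sketch in Python of these boundary checks; is_valid_username is a hypothetical validator for the 4-16 character requirement above.

# Minimal sketch: boundary value analysis for a username whose length
# must be between 4 and 16 characters (hypothetical validator).

MIN_LEN, MAX_LEN = 4, 16

def is_valid_username(username: str) -> bool:
    """Accept usernames whose length is within the required range."""
    return MIN_LEN <= len(username) <= MAX_LEN

# Boundary values: min, max, min+1, max+1, min-1, max-1
boundary_cases = {
    "min   (4)":  ("a" * 4,  True),
    "max   (16)": ("a" * 16, True),
    "min+1 (5)":  ("a" * 5,  True),
    "max+1 (17)": ("a" * 17, False),
    "min-1 (3)":  ("a" * 3,  False),
    "max-1 (15)": ("a" * 15, True),
}

for label, (value, expected) in boundary_cases.items():
    actual = is_valid_username(value)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{label}: expected={expected}, actual={actual} -> {status}")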

Priority,Severity


Priority:-

This field describes the importance of a test case. In general, test engineers divide test cases into the following types:
  • Functional test cases - High priority (P0)
  • Non-functional test cases - Medium priority (P1)
  • Cosmetic test cases (GUI and usability test cases) - Low priority (P2)

Severity:-

This field describes the seriousness of a defect with respect to functionality and further testing. Test engineers divide defect severity into the following types:

High severity:-

A defect due to which we are not able to proceed with further testing.

Medium severity:-

A defect that does not block further testing, but as per the customer requirement it must be resolved.

Low severity:-

A defect that does not block further testing; it may or may not be resolved.

Error, Defect and Bug


Error, Defect and Bug:-


A mistake in the code is called an error. Due to errors in coding, test engineers find mismatches in the application, which are called defects. A defect that is accepted by the development team to be resolved is called a bug.


Integration Testing


Integration testing:-

After unit testing is completed for each and every module, programmers integrate all the modules into a system (software) based on the HLD (high-level design) document. After the modules are integrated, programmers apply integration testing to find out whether all the modules are correctly integrated, from the base module to the end module, as per the HLD document.
Sometimes it is not possible to implement the coding for all the modules and integrate them within the time specified for system testing. In such cases, programmers follow the integration approaches below to integrate the modules and perform testing:
  • Top down approach
  • Bottom up approach
  • Hybrid approach
  • System approach

Top down approach:-

In this integration, some sub-modules are still under code construction. In place of such a sub-module, programmers use a dummy module called a "STUB", only for integration purposes.

Bottom up approach:-

In this integration, the main module is still under construction. In place of that module, programmers use a temporary module called a "DRIVER", only for integration purposes.

Hybrid approach:-

In this approach, programmers use both a "STUB" (for an unfinished sub-module) and a "DRIVER" (for an unfinished main module); this is called the hybrid approach.

System approach:-

When coding is completed for all the modules at the same time, programmers follow this type of integration approach.
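
A minimal sketch in Python of how a STUB and a DRIVER might be used during integration; the module and function names below are hypothetical and chosen only to illustrate the idea.

# Minimal sketch (hypothetical module names): a STUB replaces an unfinished
# sub-module in the top-down approach, and a DRIVER stands in for an
# unfinished main module in the bottom-up approach.

# --- Completed low-level sub-module (used in the bottom-up example) -----
def calculate_interest(balance, rate):
    """Completed low-level module."""
    return balance * rate

# --- Top-down: main module is ready, a sub-module is not ----------------
def fetch_balance_stub(account_id):
    """STUB: dummy stand-in for the unfinished 'fetch balance' sub-module."""
    return 1000.0                     # canned value, enough for integration

def monthly_statement(account_id, fetch_balance=fetch_balance_stub):
    """Main module; calls the (stubbed) sub-module so it can be tested now."""
    balance = fetch_balance(account_id)
    return balance + calculate_interest(balance, 0.01)

# --- Bottom-up: sub-module is ready, the main module is not -------------
def interest_driver():
    """DRIVER: temporary caller that exercises the completed sub-module
    because the real main module is still under construction."""
    assert calculate_interest(1000.0, 0.01) == 10.0
    print("driver: calculate_interest integrated correctly")

if __name__ == "__main__":
    print("top-down with stub:", monthly_statement("ACC-1"))
    interest_driver()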
 
