PROJECT INFORMATION
Application: Online Banking Account
Document Owner: X (the primary contact for questions regarding this document)
Author: S P, Sr. QA Analyst
Project Name: Online Account Management
Phone: (999) 854-3015
Email: Sp@CFD.com
======================
Privacy Information: This document may contain sensitive information related to projects within C F B. This information should not be given to any person other than those involved in the Online Account Management Application, and it may not be used for purposes other than those prescribed by the project guidelines. Any unauthorized use of this document could result in severe legal repercussions.
Copyright © 2005
Revision History of the Test Plan
INTRODUCTION
APPLICATION OVERVIEW
The project involved developing the B’s module for an online mortgage application, which allows customers to apply for a mortgage, check the status of their application, and use a mortgage calculator and other mortgage tools. It gives customers interactive tools to help them decide whether to buy or rent a house, whether to refinance, and how interest rates and property values affect their options, and it presents graphical and tabular analyses based on their inputs. It provides users with secure access and helps increase the efficiency and quality of the loan decisions taken by the company.
PURPOSE AND SCOPE OF THE QA TEST PLAN
The purpose of this test plan is to describe the scope, objectives, focus, and approach for testing the application that gives customers online access to their accounts. This document serves as a guideline for the testing methodology, whether manual or automated, and for generating test cases, documenting bugs, and reporting them. The test plan defines the approach to be adopted by the QA team, with the responsibilities of each group outlined in the chapters to come.
ROLES / RESPONSIBILITIES OVERVIEW
The chart below lists the tasks associated with this project, outlining the roles and responsibilities of each participant. One or more participants are responsible for completing each task, and each task has exactly one participant who is held accountable. Some participants are consulted before a task is completed, and some are informed after it is completed. Assignments will be made by the Project Manager and carried out by the QA Manager.
RESPONSIBILITIES
PROJECT LEAD CONTACTS
QUALITY ASSURANCE ENGINEERS
DOCUMENTATION
Listed below are the documents pertinent to the project, along with the contact information for the person in charge of each document.
TEST ENVIRONMENT
HARDWARE
The Quality Assurance group will have control of one application server and client interfaces separate from any used by non-test members of the project team. The QA group will be provided with six workstations, i.e., six computers running Windows 2000. The hardware requirements are listed below:
· Operating System: Windows 2000
· CPU: 2200 MHz (Pentium 4)
· RAM: 512 MB
· Disk Storage: 40 GB
· CD / DVD Drive: 56X Max
· Multimedia: 32-bit sound card with speakers
· Video Adaptor: 32 MB AGP graphics
· Monitor: 17” color SVGA
· Removable Media: 1.44 MB 3.5” floppy, Zip 100 drive
· Network Access: 10Base-T Ethernet or V.90 modem
· NIC: 3Com Ethernet 10/100
SOFTWARE
The Quality Assurance group will use the following software to test the project:
· LoadRunner 7.5
· WinRunner 7.0
· TestDirector 6.02
· QuickTest Professional (QTP) 6.5
DATABASE
The Quality Assurance group will have control of one database, separate from any used by non-test members of the project team. The database will be named Mortgage and will be accessible only to authorized QA personnel.
OPERATING SYSTEM
BROWSERS
TESTING STRATEGY / APPROACH
Functional Testing
The purpose of functional testing is to reveal defects related to the functionality of the project’s components and to help ensure that the components conform to the functional requirements. It determines whether the window or application functions correctly based on its specifications and the relevant standard documentation. Both positive and negative testing will be performed to verify that the application responds correctly. GUI testing will be performed using WinRunner 7.0.
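The plan specifies WinRunner 7.0 for the GUI-level functional tests; as a tool-neutral illustration of the positive/negative approach, here is a minimal Python sketch. The validate_transfer_amount rule and its $10,000 limit are hypothetical stand-ins for one of the application’s input validations, not documented requirements.

    # Hypothetical input-validation rule: transfer amounts must be
    # positive and no larger than $10,000 (an assumed limit).
    def validate_transfer_amount(amount):
        return 0 < amount <= 10_000

    # Positive tests: valid inputs must be accepted.
    for amount in (1, 500, 10_000):
        assert validate_transfer_amount(amount), f"expected pass: {amount}"

    # Negative tests: invalid inputs must be rejected.
    for amount in (0, -25, 10_001):
        assert not validate_transfer_amount(amount), f"expected fail: {amount}"

    print("all positive and negative checks passed")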
Regression Testing
The same application is retested after fixes or modifications are made to the application or its environment. It is difficult to determine how much retesting is needed, especially near the end of the development cycle. Automated tools like WinRunner can be especially useful for this type of testing.
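WinRunner automates this by replaying recorded scripts against each new build; the Python sketch below only illustrates the underlying idea of re-running a fixed suite after a change and flagging regressions. The suite contents and results are hypothetical.

    # Re-run the same suite after a fix and flag regressions: cases that
    # passed in the previous run but fail now. Data is illustrative.
    previous_run = {"login": True, "apply": True, "calculator": True}
    current_run  = {"login": True, "apply": False, "calculator": True}

    regressions = [name for name, passed in previous_run.items()
                   if passed and not current_run[name]]
    print("regressions:", regressions or "none")  # -> ['apply']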
Acceptance Testing
The purpose of acceptance testing is to provide a minimal set of tests that ensure conformance to milestone acceptance criteria. The approach to acceptance testing is outlined below:
· The minimal functional requirement of each query is outlined.
· The minimal functionality of GUI controls (objects) is specified.
· Scripts in WinRunner are to be generated to cover the minimal functional and performance requirements.
Stress Testing
The purpose of stress testing is to reveal defects that arise in the system under extreme or non-normal operating conditions, such as low system resources, low network bandwidth, and long-duration operation.
Security Testing
This test confirms that unauthorized access to the application is denied. The testing will be done using the concepts of data-driven testing: the Database Administrator will provide a view containing usernames and passwords, and tests will be generated to check which functionalities are available to each user depending on the level of permission granted by the application.
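A minimal data-driven sketch of that permission check follows. The credential rows and the permission-to-functionality mapping are hypothetical placeholders for the DBA-provided view and the application’s actual access rules.

    # Data-driven permission check: log in as each user and verify that
    # only the functionality allowed for that permission level appears.
    CREDENTIALS = [
        # (username, password, permission_level) -- illustrative rows
        ("viewer1", "pw1", "read_only"),
        ("teller1", "pw2", "standard"),
        ("admin1",  "pw3", "admin"),
    ]

    ALLOWED = {
        "read_only": {"view_balance"},
        "standard":  {"view_balance", "transfer_funds"},
        "admin":     {"view_balance", "transfer_funds", "close_account"},
    }

    def available_functions(level):
        """Stand-in for driving the UI and recording the enabled features."""
        return ALLOWED[level]

    for user, _pw, level in CREDENTIALS:
        observed = available_functions(level)
        status = "PASS" if observed == ALLOWED[level] else "FAIL"
        # Unauthorized functionality must never appear for lower levels.
        assert "close_account" not in observed or level == "admin"
        print(f"{user} ({level}): {status}")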
Back-End Testing
This testing ensures that data does not get corrupted when moving from the front end to the back end or vice versa. We will use WinRunner to perform this test, though it can also be done manually.
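The sketch below shows the shape of such an integrity check, using an in-memory SQLite table as a stand-in for the Mortgage database. The table and column names are assumptions for illustration.

    # Front-end-to-back-end integrity check against a stand-in database.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE applications (applicant TEXT, amount REAL)")

    # Values "entered at the front end" during the test.
    submitted = [("A. Smith", 250000.0), ("B. Jones", 180000.0)]
    conn.executemany("INSERT INTO applications VALUES (?, ?)", submitted)

    # Verify the back end stored exactly what was submitted.
    stored = conn.execute(
        "SELECT applicant, amount FROM applications ORDER BY applicant"
    ).fetchall()
    assert stored == sorted(submitted), f"data corrupted in transit: {stored}"
    print("back-end data matches front-end input")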
Usability Testing
This testing evaluates user friendliness. The tests will be performed by users who are not software programmers or testers. The test criteria are listed later in this document.
Performance Testing
Performance testing is performed to ensure that the system provides acceptable response times, which should not exceed 5 seconds. The outline for performance testing is given below (a sketch of the response-time check follows the list):
· The performance measurements to be carried out are memory usage, processor utilization, and execution/transaction time.
· Performance testing is to be done using LoadRunner.
· LoadRunner’s online monitoring will be used for extensive analysis.
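LoadRunner measures these figures with its online monitors; the Python sketch below only illustrates the 5-second pass/fail rule stated above. submit_transaction is a hypothetical stand-in for one online-account transaction.

    # Response-time check against the 5-second limit stated in this plan.
    import time

    MAX_RESPONSE_SECONDS = 5.0

    def submit_transaction():
        """Stand-in for one online-account transaction."""
        time.sleep(0.05)  # simulated server work

    start = time.perf_counter()
    submit_transaction()
    elapsed = time.perf_counter() - start
    assert elapsed <= MAX_RESPONSE_SECONDS, f"too slow: {elapsed:.2f}s"
    print(f"transaction completed in {elapsed:.3f}s (limit {MAX_RESPONSE_SECONDS}s)")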
QUALITY ASSURANCE TEST PROCESS
COVERAGE
The QA testing effort is designed to test the completeness, consistency, and accuracy of the product and application being tested. The QA Test Plan ensures not only that the program functions as specified but also that all internal and interface logic is tested.
REQUIREMENTS COVERAGE
This section identifies all program requirements that are covered in the test cases. The following types are included:
· All input editing, validation, and processing
· All outputs
· All decision points in the program
· Each entry and exit point
· All error messages
· All limits or constraints
· Data flow
TEST CASE IDENTIFIER NAMING CONVENTION(S)
Each test case will have a unique identifier.
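The plan requires only uniqueness; the convention sketched below (an area code, a test-type code, and a zero-padded sequence number) is one hypothetical scheme, not something this plan mandates.

    # Hypothetical convention: <AREA>-<TYPE>-<sequence>, e.g. OAM-FUNC-001.
    import itertools

    def id_generator(area="OAM", test_type="FUNC"):
        for n in itertools.count(1):
            yield f"{area}-{test_type}-{n:03d}"

    ids = id_generator()
    print(next(ids))  # OAM-FUNC-001
    print(next(ids))  # OAM-FUNC-002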
TEST DATA REQUIREMENTS
The data for the testing will come from the users carrying out the testing and from the computers and software being tested.
Name of the Test Data: Mortgage
TEST DATA GENERATION
Test data from users and from the hardware/software will provide the following (a generation sketch follows the list):
· Nominal and expected input data values, to test the effect of normal data
· Default and null data, to test default processing
· Critical values, to validate calculations
· Maximum and minimum values, to test range processing
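As an illustration, the sketch below generates those four classes of data for a single numeric field. The field and its 1-40 range (say, a loan term in years) are hypothetical, not documented requirements.

    # Generate the four classes of test data for one numeric field.
    FIELD_MIN, FIELD_MAX = 1, 40  # assumed valid range

    test_data = {
        "nominal":  [15, 30],                       # normal, expected values
        "default":  [None],                         # null/default processing
        "critical": [FIELD_MIN, FIELD_MAX],         # values feeding calculations
        "range":    [FIELD_MIN - 1, FIELD_MAX + 1], # just outside the limits
    }

    for category, values in test_data.items():
        print(f"{category:>8}: {values}")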
TEST REPORTING
TEST REPORTING DOCUMENTS
Test reporting is done using TestDirector and includes documents such as the Test Matrix, Test Cases, the Test Defect Log, Test Incident/Bug Reports, and a Change Control Log.
TEST DATA RESULTS AND RETENTION
All test results will be logged to the Test Matrix, Test Cases, Test Defect Logs, Test Incident/Bug Reports, and the Change Control Log when and where appropriate.
RELEASE NOTES
Release Notes may be provided to the Configuration Management Representative or Lead for the project, and may include the following for each module/program turnover:
· Date of scheduled iteration or final release
· Code freeze date
· Versions of software, libraries, hardware, database, etc. used
· QA test results
· Completed matrix
· Build (make file) instructions
· Installation/setup instructions
· Dependencies (other executables, database, browser, test drivers, etc.)
· Scheduled functionality for this iteration or release
· New versions of shared programs
· Change requests
· Open problems/bug reports
· Known limitations
QA TEST ENTRANCE / EXIT CRITERIA
ENTRANCE CRITERIA
· The entrance criteria specified by the system test controller (QA Manager) should be fulfilled before system testing can commence. In the event that any criterion has not been achieved, system testing may still commence if the Business Team and the Test Controller are in full agreement that the risk is manageable.
· All developed code must be unit tested. Unit and link testing must be completed and signed off by the development team.
· The Business Analyst and the Test Controller must sign off the system test plans.
· All human resources must be assigned and in place.
· All test hardware and environments must be in place and free for system test use.
· The acceptance tests must be completed with a pass rate of not less than 80% (see the sketch after this list).
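As a small worked example of the last criterion, the sketch below computes an acceptance-test pass rate and applies the 80% gate. The result list is illustrative data.

    # The 80% acceptance-test pass-rate gate from the entrance criteria.
    results = ["pass", "pass", "fail", "pass", "pass"]  # hypothetical run

    pass_rate = results.count("pass") / len(results)
    print(f"acceptance pass rate: {pass_rate:.0%}")

    if pass_rate >= 0.80:
        print("entrance criterion met: system test may commence")
    else:
        print("criterion not met: Business Team and Test Controller must agree the risk is manageable")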
EXIT CRITERIA
· The exit criteria detailed below must be achieved before the Phase 1 software can be recommended for promotion to Operations Acceptance status. Furthermore, it is recommended that there be a minimum of two days’ effort in final integration testing after the final fix/change has been retested.
· All high-priority errors from system test must be fixed and tested.
· If any medium- or low-priority errors are outstanding, the Business Analyst and the Business Expert must sign off the implementation risk as acceptable.
· The Test Controller and the Business Analyst must sign off the project integration test.
· The Business Expert must sign off the business acceptance test.
TEST EVALUATION
PASS / FAIL CRITERIA
In order for QA to begin testing, the project must have reached an established phase, or beta, so that one or more complete test cases can be executed.
The pass/fail criterion is based on the test cases. Each row within a test case is marked as passed or failed. If any row fails, the entire test case fails; a test case passes only when all of its rows pass.
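That row-level rule reduces to a one-line predicate, sketched below: a test case passes only if every row in it passes.

    # A test case passes only if every row in it passes.
    def test_case_passes(row_results):
        return all(row_results)

    assert test_case_passes([True, True, True])       # all rows pass -> case passes
    assert not test_case_passes([True, False, True])  # one row fails -> case fails
    print("pass/fail aggregation behaves as specified")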
RESULTS ANALYSIS
Analysis will be performed on the pass-to-fail ratio across all test cases, as well as on the Test Matrix. The QA Lead in charge of the project will be responsible for the final analysis.
QA TEST PLAN CHANGE LOG
Use the chart below to document the QA Test Plan’s original version and all modifications across all iterations, for change control and traceability.
TEST CASE DESIGN
VALIDATION MATRIX
The validation matrix gives the status of all test cases. It indicates whether each bug is open, fixed, or deferred, and provides the essential information about bugs and test cases, such as the test case author, the tester, and the developer to whom each bug has been assigned for fixing. The matrix also gives the severity level of each bug. For complete information about a bug, including the tester’s remarks, consult the corresponding test case. The matrix is outlined below:
Online Account Management Application Validation Matrix
P = Passed
F = Failed
Status: O = Open, F = Fixed, D = Deferred
Severity: U = Urgent, H = High, M = Medium, L = Low
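The sketch below shows one matrix row as a record using the status and severity codes defined above. The field names are assumptions based on the description of the matrix; in practice the matrix is maintained in TestDirector.

    # One validation-matrix row, using the legend above.
    matrix_row = {
        "test_case_id": "OAM-FUNC-001",  # hypothetical identifier
        "author": "tester_a",            # hypothetical names
        "tester": "tester_a",
        "assigned_developer": "dev_b",
        "result": "F",                   # P = Passed, F = Failed
        "bug_status": "O",               # O = Open, F = Fixed, D = Deferred
        "severity": "H",                 # U/H/M/L
    }

    open_and_serious = (matrix_row["bug_status"] == "O"
                        and matrix_row["severity"] in ("U", "H"))
    print("open high-priority defect" if open_and_serious else "no blocking defect")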
QA TEST CASES
Area of Project: Test Case Identifier sequence
Requirements Satisfied:
TEST CASE DESCRIPTION:
INPUT: User provides input
OUTPUT:
CONTROL DATA: This data describes the type of computers used and the way they are set up for the testing.
ALL EXECUTIONS PASSED? Yes / No {Check the appropriate box}
TEST INFORMATION: {Fill in the following information}
Tester Name | Version | Date | Problems Logged | Corrected
Testing comments / Recommendations
Test case author Date
Modification Dates
QA TEST MATRIX
The QA test matrix is the tabulated form of the test result data, accessible to the project’s senior stakeholders, such as the QA Manager and the Project Lead. The matrix is updated on a daily basis and shows the performance of the developers and testers.
GUI and Usability Checklist:
The GUI checklist has to cover the following points:
TESTING DELIVERABLES
SCHEDULE
The table below gives the deadlines for the various tasks associated with the testing of the Online Account Management Application.
Task | Start Date | End Date
Test Environment Setup | 03/27/04 | 04/05/04
WinRunner Installation | 04/06/04 | 04/08/04
LoadRunner Installation | 04/09/04 | 05/01/04
Test Case Generation | 05/01/04 | 06/26/04
Functional Testing (Phase 1) | 06/27/04 | 07/05/04
Regression Testing (Phase 1) | 07/06/04 | 07/23/04
Acceptance Testing (Phase 1) | 07/24/04 | 08/03/04
Stress Testing (Phase 1) | 08/04/04 | 08/16/04
Security Testing (Phase 1) | 08/17/04 | 09/03/04
Back-End Testing (Phase 1) | 09/04/04 | 09/26/04
Functional Testing (Phase 2) | 09/27/04 | 10/18/04
Regression Testing (Phase 2) | 10/19/04 | 11/04/04
Acceptance Testing (Phase 2) | 11/05/04 | 12/01/04
Stress Testing (Phase 2) | 12/02/04 | 12/14/04
Security Testing (Phase 2) | 12/15/04 | 01/05/05
Back-End Testing (Phase 2) | 01/06/05 | 01/30/05
Test Release Notes First Draft | 02/01/05 | 03/01/05
Test Release Notes Final Draft | 03/02/05 | 03/29/05
Test Release Notes Approval | 03/30/05 | 04/25/05
==========================================================================
APPENDIX
GUI Testing
GUI testing verifies the appearance of the GUI objects in the window or application. We will check that the GUI objects do not overlap with each other and that their properties are maintained. In the present Online Account Management application, we check the properties of the various web objects.
Functionality Testing
Functionality testing determines whether the window or application functions correctly based on its specifications and the relevant standard documentation. Both positive and negative testing will be performed to verify that the application responds correctly. The GUI functionality is to be checked using WinRunner 7.0.
Acceptance Testing
The purpose of acceptance testing is to provide a minimal set of tests that ensure conformance to milestone acceptance criteria. The approach to acceptance testing is outlined below –
· The minimal functional requirement of each query is outlined.
· The minimal functionality of GUI controls (objects) is specified.
· Scripts in WinRunner are to be generated to cover the minimal functional and performance requirements.
System Testing
This testing is to be carried out once the integration of the front end and the back end is complete. Using WinRunner, different values will be entered from the front end and the database will be checked to verify the correctness of the data. The functionality of the system as a whole will be tested, and the correctness of the data read from the database will be the prime focus of this test.
Regression Testing
We retest the same application after fixes or modifications are made to the application, its components, or the environment. It is difficult to determine how much retesting is needed, especially near the end of the development cycle. Automated tools like WinRunner are especially useful for this testing methodology. The approach to regression testing is outlined below:
· Regression testing is to be done using WinRunner 7.0.
· GUI testing should be regression based.
· Data-driven testing is to be done using WinRunner 7.0 to check the back-end performance of the SQL database.
Backend Testing
This sort of testing ensures that the data does not get corrupted in the process of being transferred from the front end to the back end or vice versa. We will be using WinRunner for this testing, although significant parts of it may be performed manually.
Security Testing
This test confirms that the software allows only authorized users to access the system. This testing will be done using the concepts of data-driven testing: the database administrator provides a table of usernames and passwords, and the generated tests check which functionalities are available to each user depending on the level of permission granted by the application.