Sunday, November 23, 2008

What to analyze in test results and how to make recommendations?

It depends on what sort of testing we have conducted, what types of reports we have for a particular test result, what kinds of statistical charts those reports produce, and more.

For example, the scalability index could show the average transactions-per-second (TPS) throughput. A transaction distribution chart could be useful for auditing a test and validating its results by identifying unusually long or short response times or trends.

So, in a typical load testing scenario, we have to analyze transactions per second (TPS) as each test operates; analyze the range of response times for the duration of each test use case; analyze the timing values for each transaction on each test node (often captured in a set of XML-formatted files); analyze CPU, network, and memory utilization on each test node for each test use case; and analyze the results across multiple test scenarios.
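As a minimal sketch of the first of those steps, suppose the per-transaction timings have already been loaded into a results table (the table and column names below are hypothetical, and the date arithmetic is Oracle-style DATE arithmetic); average TPS and the response-time range per test use case could then be pulled with a query like this:

  -- Hypothetical table load_test_results: one row per completed transaction
  -- (test_case, transaction_name, start_time DATE, response_ms NUMBER).
  SELECT test_case,
         COUNT(*) /
           NULLIF((MAX(start_time) - MIN(start_time)) * 86400, 0) AS avg_tps,
         MIN(response_ms)                                         AS fastest_ms,
         MAX(response_ms)                                         AS slowest_ms,
         AVG(response_ms)                                         AS avg_ms
  FROM   load_test_results
  GROUP BY test_case
  ORDER BY test_case;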

How to design a test / test case / test plan?

Before designing any test, test case, or test plan, we first have to think about the core point: how can we maximize the quality of the software? Within the limitations of human resources, budget, time, risk, market, and future requirements, everything we do has to revolve around 'quality' and follow that central theme.

The testing strategy to be undertaken might be driven by different factors, chiefly resource-, time-, and cost-oriented ones. Depending on the nature of the development effort and the requirements of the project, the strategy can range from flexible to strict. We need to analyze the risks and know how we will handle overall test management, error management, and the execution of the overall test cases.

Based on the size of the project, we have to decide whether to go with manual or automated testing. We should not go for automation unless it is really required, but we can never avoid manual testing entirely. If we choose automated testing, we also need to think about environment setup, technical resource requirements, keeping the tool up to date, and the costs incurred by these activities. It all depends on how we formulate the strategy.

Thursday, November 20, 2008

Software Development Life Cycle

Software Development Life Cycle (SDLC) - the process by which an application or software product is developed is called the software development life cycle.

Phases of SDLC -
  1. Project Initiation (Clients identify their business requirements and provide them to the software company)
  2. Requirement Analysis and Gathering (Business Analysts gather the requirements from clients to form the requirements document and Use Case document)
  3. System Design (The application design and layout are developed and documented by the design team)
  4. System Development (The application will be coded and developed by the team of developers, who also perform unit tests after completing development)
  5. System Testing (The application will be tested for functionality, navigation, security, and performance by the test team)
  6. User Acceptance Testing (The application will be tested by business users to make sure the system is acceptable and meets business requirements)
  7. System Implementation/Production (The application will be deployed in the production environment)
  8. System Maintenance (Bug fixes, Change Requests, or New Requirements)
(Reference: http://www.learningdom.com)

Wednesday, October 15, 2008

What should we know as an ETL Tester?

First of all, we should know what ETL stands for: Extract, Transform, and Load. When data is entered into the system at the front end, it goes into the database after batch processing. The data is then passed to the ETL system. Following the business transformation rules, the ETL developer/tester runs jobs in the ETL tool; these jobs are written in the ETL vendor's proprietary language or in Unix, and they are created as the business requires. Once we run the jobs, the data from the source files gets moved to the target. We might also need a lot of basic and advanced SQL to map and verify that data and to check whether it was transformed in accordance with the business transformation rules. Normally, we have to map data from the source file to a stage table, from the stage table to the development database table, and from the development database table to the production database table.
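As a minimal sketch of that source-to-target verification (the table names below are hypothetical), row counts can be reconciled first, and then individual values checked against a transformation rule:

  -- Hypothetical tables: stage_customer (source side), dw_customer (target side).

  -- 1. Reconcile row counts between stage and target.
  SELECT COUNT(*) AS stage_rows  FROM stage_customer;
  SELECT COUNT(*) AS target_rows FROM dw_customer;

  -- 2. Spot-check a transformation rule, e.g. customer names are
  --    expected to be loaded in upper case in the target table.
  SELECT s.customer_id, s.customer_name AS source_name, t.customer_name AS target_name
  FROM   stage_customer s
  JOIN   dw_customer    t ON t.customer_id = s.customer_id
  WHERE  t.customer_name <> UPPER(s.customer_name);

If the second query returns any rows, those records were not transformed according to the rule.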

  • Tools: Informatica, BusinessObjects Data Integrator, or other ETL tools on the market. Most ETL tools are proprietary.
  • Should know how to test mappings
  • Should be familiar with at least one programming or scripting language. As ETL languages are proprietary, it helps to know some language in advance. Keep in mind that each ETL tool's proprietary language has an interface to call SQL, and most ETL tools have interfaces to the bulk loading/unloading utilities associated with the major databases.
  • If we are on a Unix platform, we should have knowledge of UNIX shell scripting
  • We should have knowledge of SQL, which can be used to extract data in most cases
  • Strong database knowledge (Oracle, SQL Server, Sybase, or whichever one is in use)
  • Knowledge of working with large data sets
  • Know how to analyse data quality
  • Generally, an ETL tester is involved in validating the source database, the target database, data extraction, data transformation, and data loading
  • In addition, ETL testers create UNIX shell scripts to access data and move it from production to development
  • Further, they test the different transformations (rules) that move data to the staging area, the ODS, and finally the target database using Informatica or Data Integrator Designer
  • Should know how and where to use DDL and DML statements (see the sketch after this list)
  • Should be able to test ETL code
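As a minimal sketch of the DDL and DML side of that work (the table and columns below are hypothetical), a tester might create a scratch stage table and then insert, update, and delete test rows in it:

  -- DDL: create a hypothetical stage table to hold extracted rows.
  CREATE TABLE stage_orders (
      order_id     INTEGER        NOT NULL,
      customer_id  INTEGER        NOT NULL,
      order_total  DECIMAL(12,2),
      load_date    DATE
  );

  -- DML: load a test row, adjust it, then remove it.
  INSERT INTO stage_orders (order_id, customer_id, order_total, load_date)
  VALUES (1001, 42, 150.75, CURRENT_DATE);

  UPDATE stage_orders
  SET    order_total = 155.00
  WHERE  order_id = 1001;

  DELETE FROM stage_orders
  WHERE  order_id = 1001;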

Although ETL tools are common in the market today, many data warehouse projects (especially in the past) did not use an external ETL tool. In those cases, a load utility is used to extract the data from flat files and populate the tables. So, in the absence of an ETL tool, the job of transforming and loading the staging and production tables can still be done with Transact-SQL, PL/SQL, or DB2 SQL.
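As a minimal sketch of that hand-written approach (the table names and transformation rules below are hypothetical), a single INSERT ... SELECT statement can transform and load staging data into a production table:

  -- Hypothetical: move cleaned rows from staging into production,
  -- applying the transformation rules directly in SQL.
  INSERT INTO prod_customer (customer_id, customer_name, country_code, load_date)
  SELECT customer_id,
         UPPER(TRIM(customer_name)),       -- rule: trimmed, upper-case names
         COALESCE(country_code, 'US'),     -- rule: default a missing country
         CURRENT_DATE
  FROM   stage_customer
  WHERE  customer_id IS NOT NULL;          -- rule: reject rows with no key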

Thursday, October 2, 2008

Unix Platform

"vi is a visual editor (default) in Unix. Because of this we will be able to see the document we are editing. We can edit files really quickly on vi editor, as it is extremely economical with the keystrokes. Due to its different modes for inserting and issuing commands, it is much faster than most non-mode based editors." (ref:roxanne.org). So, vi editor is basically used to create a new file or edit the existing file.

To open a vi editor, first we have to go to the Unix shell (black screen).

Commands:
vi filename (press Enter) - opens the vi editor with the given filename; we can see the filename at the bottom of the vi editor screen. When we first open a new file, we can observe that the blank lines start with a tilde sign (~).
At the bottom of the screen, type :q! (press Enter) - to quit and come out of the vi editor without saving changes
Before typing in the vi editor, we should know that there are 2 modes in vi: Insert Mode and Command Mode. In Insert Mode, typed input is treated as text and goes into the file that is open. In Command Mode, whatever we type is interpreted as a command.
When we first open a file, it is in Command Mode. To switch to Insert Mode we type 'i' (insert) or 'a' (append); to go back to Command Mode we hit 'Esc'.
To save (:w) anything we typed, we first have to go back to Command Mode (press 'Esc'). To save and exit at the same time, type ':wq' and press Enter.
To ADD a NEW LINE - type 'o', and the cursor moves to the beginning of a new line below the current one. Uppercase 'O' adds a new line above the line where the cursor is blinking.
To navigate - 'j' to go down, 'k' to go up, 'l' to go right, 'h' to go left
'Shift ^' will take your cursor to the beginning of the line. [We Should Be in the Command Mode]
x - will delete the character under the cursor [We Should Be in the Command Mode]
dw - will delete the word from the cursor onward (Shift + d, i.e. 'D', deletes to the end of the line) [We Should Be in the Command Mode]
dd - will delete the entire line [We Should Be in the Command Mode]
u - will undo the last change [We Should Be in the Command Mode]
2dd - will delete 2 lines [We Should Be in the Command Mode]
In vi, yanking is a kind of copy and paste. To do so, we have to be in Command Mode. To copy lines and paste them, we put the cursor on the first line we want to copy and type the number of lines to yank followed by 'yy'; e.g., if we want to copy 3 lines starting from the line where the cursor is, we type 3yy, go to the line where we want to paste them, and type 'p'.

To insert a file - :r (filename)
To search forward - / e.g., if the word 'box' appears a number of times in the file and we want to search for it, we type forward slash /box at the bottom of the screen
To search backward - ? e.g. ?box
To repeat the last search (find the next occurrence) - n
To repeat the last search in the opposite direction - N
To undo the last change - u
To undo the changes in current line - U
To reload the current document without saving any changes - :e!
To put back the nth last deletion - "np (e.g., "1p for the most recent deletion, "2p for the one before)
To go to the end of the file (say when we are at the beginning of the file) - type colon ':' then '$' and press Enter (or simply press 'G')
To go to the end of the line - just type $
To get back to the beginning of line - just type caret '^'
To replace a word in a line with another word - e.g., say the word 'budda' in the 3rd line of the file has to be replaced by the word 'budhi'; then type => :3s/budda/budhi and press ENTER (colon, line number, s for substitute, with forward slashes around the word to be replaced and its replacement). If we want to do the same thing for the whole file rather than a particular line, we use % instead of the line number and add /g at the end so that it acts globally. So, the command would be :%s/budda/budhi/g and ENTER

Thursday, September 18, 2008

For a Quality Analyst working as a DATA ANALYST/MODELER, it is good to know the following...

  1. Conduct business and data analysis, and implement physical designs and ETL mappings for Data Warehouse and Business Intelligence solutions,
  2. Familiarity with Oracle database technologies,
  3. SQL and PL/SQL skills,
  4. Able to participate in analysis sessions with the user community to capture data requirements clearly, completely and correctly, and represent them in a formal and visual way through the data models,
  5. Understand Business Intelligence solutions,
  6. Strong knowledge of relational and multi-dimensional database architectures, database lifecycle design and application development,
  7. Data analysis/profiling and reverse engineering of data,
  8. Developing detailed mapping documents that will provide the reporting and data warehouse team with source to target data mapping,
  9. Create test strategy and test cases that will ensure the seamless implementation of the data warehouse,
  10. Conduct string and system testing according to developed test scenarios,
  11. Work with business users to define data requirements,
  12. Ability to develop, integrate, manage and promote physical data models and data model standards that source data from multiple origins (internal data sources, external data sources, third party data, etc.),
  13. Ability to conduct data analysis and data profiling,
  14. Read / parse PL/SQL procedures and packages,
  15. Ability to develop detailed mapping documents that will provide the data warehouse reporting and ETL teams with source to target data mapping which will include logical names, physical names, data types, corporate meta-data definitions, and translation rules,
  16. Ability to define, develop, and implement joint data testing strategies with the business and IT QA,
  17. Ability to define, develop and implement test cases, based on performed data analysis and data mappings,
  18. Ability to execute test cases to ensure accurate and appropriate use of data during system testing, integration testing, and UAT (see the sketch after this list).
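As a minimal sketch of turning a source-to-target mapping into an executable test case (the tables, columns, and mapping rule below are hypothetical; MINUS is Oracle syntax, EXCEPT in most other databases), a set-difference query in each direction flags rows that do not match the mapping:

  -- Source rows (with the mapping applied) that are missing or different in the target.
  SELECT customer_id, UPPER(customer_name) AS customer_name
  FROM   src_customer
  MINUS
  SELECT customer_id, customer_name
  FROM   dw_customer;

  -- Target rows that have no mapped counterpart in the source.
  SELECT customer_id, customer_name
  FROM   dw_customer
  MINUS
  SELECT customer_id, UPPER(customer_name) AS customer_name
  FROM   src_customer;

Both queries returning zero rows is the expected (pass) result.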

Wednesday, August 6, 2008

Testing Terms and Terminology

(Source: Anonymous)
A

Acceptance Testing
Testing conducted to enable a user/customer to determine whether to accept a software product. Normally performed to validate that the software meets a set of agreed acceptance criteria.

Accessibility Testing
Verifying that a product is accessible to people with disabilities (visual, hearing, cognitive, etc.).

Ad Hoc Testing

Similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it. It is also sometimes called monkey testing.

Agile Testing

Testing practice for projects using agile methodologies, treating development as the customer of testing and emphasizing a test-first design paradigm.

Alpha Testing
Testing of an application when development is nearing completion; minor design changes may still be made as a result of such testing. Typically done by end-users or others, not by programmers or testers.

Application Binary Interface (ABI)
A specification defining requirements for portability of applications in binary forms across different system platforms and environments.

Application Programming Interface (API)
A formalized set of software calls and routines that can be referenced by an application program in order to access supporting system or network services.

Automated Software Quality (ASQ)
The use of software tools, such as automated testing tools, to improve software quality.

Automated Testing
Testing employing software tools that execute tests without manual intervention. Can be applied in GUI, performance, API, etc. testing.
The use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions.

B

Backus-Naur Form
A metalanguage used to formally describe the syntax of a language.

Basic Block
A sequence of one or more consecutive, executable statements containing no branches.

Basis Path Testing
A white box test case design technique that uses the algorithmic flow of the program to design tests.

Basis Set
The set of tests derived using basis path testing.

Baseline
The point at which some deliverable produced during the software engineering process is put under formal change control.

Beta Testing
Testing when development and testing are essentially completed and final bugs and problems need to be found before final release. Typically done by end-users or others, not by programmers or testers.

Binary Portability Testing
Testing an executable application for portability across system platforms and environments, usually for conformance to an ABI specification.

Black Box Testing: Functional Testing
Testing based on an analysis of the specification of a piece of software without reference to its internal workings. The goal is to test how well the component conforms to the published requirements for the component.

  • Testing the features and operational behavior of a product to ensure they correspond to its specifications.
  • Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions.

Bottom Up Testing
An approach to integration testing where the lowest level components are tested first, then used to facilitate the testing of higher-level components. The process is repeated until the component at the top of the hierarchy is tested.

For further terms and terminology - Click Here (Thankful to ISTQB)

SQL

SQL (Structured Query Language) is simply a language used to access databases. By accessing a database we can also manipulate the data (information) stored there.

There are many database systems on the market, such as Oracle, DB2, Informix, Microsoft SQL Server, Sybase, MS Access, and others.

SQL (Structured Query Language) is the common language we use to access and manipulate data in all of these database systems.
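As a minimal sketch (the employees table and its columns are hypothetical), the same basic statements work in much the same way across these systems:

  -- Read: employees in a given department, newest hires first.
  SELECT employee_id, first_name, last_name, hire_date
  FROM   employees
  WHERE  department = 'QA'
  ORDER BY hire_date DESC;

  -- Write: correct a misspelled last name.
  UPDATE employees
  SET    last_name = 'Shrestha'
  WHERE  employee_id = 1001;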

Tuesday, August 5, 2008

Test Director Overview

Application Testing is a complex process. Test Director helps organize and manage all phases of the testing process, which encompass the following steps:


  • Organize and manage testing requirements [SPECIFYING REQUIREMENTS]; where we define testing scope, create requirements, detail requirements, and analyze requirements.
  • Planning tests; where we define the test strategy, define test subjects, define tests, create requirement coverage, design test steps, automate tests, and analyze the test plan.
  • Executing tests [RUNNING TESTS]; where we create test sets, schedule runs, run tests, and analyze test results.
  • Tracking Defects, where we find defects, review new defects, repair open defects, test new build, and analyze defect data.

Q. How do you create tests from requirements in TD?

Once you have created the requirements tree, you use the requirements as a basis for defining the tests in your test plan tree and running them in test sets.

There are 2 ways: (1) Convert requirement to tests, and (2) Generate a Test from requirements.

(1) Tools > Convert to Test > Convert All/ Convert Selected

(2) Right Click any requirement on Tree View ==> Generate Test.

Q. What are Test Director Projects?

TD projects are repositories that contain requirements, tests, test sets, test runs, defects, project documentation, and customization information.

Q. What database applications are used to store and manage TD information?

MS Access, Sybase, MS SQL, Oracle

Q. What are the 4 modules of Test Director?

  1. Requirements
  2. Test Plan
  3. Test Lab
  4. Defects

Q. How do you clear 'History' in TD?

  1. Click TOOLS in the upper right corner --> Clear History
  2. Select the entity whose history you want to delete
  3. Select the field whose history you want to delete
  4. Up to '...date', select a date.

SCHEDULING, REQUIREMENT, TEST PLANNING

  1. How do you schedule a test in Test Director?
  2. How do you specify requirements?
  3. How do you plan the tests?
  4. How do you run the tests?
  5. How do you do defect tracking?
  6. How do you filter records in TD?
  7. What information you put in requirements?
  8. How do you view requirement history?
  9. How do you view associated defects for a test requirement?
  10. How do you attach or open WR script from TD?
  11. How do you E-mail test?
  12. What is Test Management Process in Test Director?
  13. What is a Test Grid?
  14. What data column is displayed by Test Grid?
  15. What does Test Planning Involve?
  16. What is a Test Plan Tree?
  17. How do you associate defect with a Test?
  18. What is the relationship of requirement and tests?
  19. How do you design test steps in Test Plan module?
  20. How do you generate an automated test template?
  21. How do you add system test to a test plan tree?

TEST EXECUTION

  1. What are the stages of Test Execution?
  2. What are the elements of Execution Flow Diagram?
  3. How do you set the Test Set on Failure Rules?
  4. How do you add a Test Set in Test Lab Module?
  5. What is scheduling test runs?
  6. How do you run an automated test in TD?
  7. What considerations should be considered when running tests on remote host?

DEFECT TRACKING

  1. How do you track Defects?
  2. How do you add defects?
  3. What information do you include in Defect Reporting/ Tracking?
  4. What are the status of defects?
  5. How do you find matching defects?
  6. How do you update defects?

TEST DIRECTOR ANALYSIS

  1. What reports are available in Requirement Module?
  2. What reports are available in Test Plan Module?
  3. What reports are available in TEST Lab Module?
  4. What reports are available in Defects Module?
  5. How do you create a Report?
  6. Tell about generating graphs in TD.
  7. What are the different types of Graph in Requirement Module?
  8. What are the different types of Graph in Test Plan Module?
  9. What are the different types of Graph in Test Lab Module?
  10. What are the different types of Graph in Defects Module?
  11. What is Project Documents in TD?
  12. How do you launch Document Generator?

Answers


How do you schedule a test in Test Director?

Go to TEST LAB --> Execution Flow --> Right Click on any Test --> Test Run Schedule --> Time Dependency --> Run at any time /or/ Run at a specified time [Date, e.g., 5/8/2005]


How do you specify requirements?

Defining Testing Scope ==> Create Requirements ==> Detail Requirements ==> Analyze Requirements


How do you plan the tests?

Create a Test Plan based on Testing Requirements

Define Testing Strategy (How? What?) ==> Define Test Subjects ==> Define Tests ==> Create Requirement Coverage (link each test with a requirement) ==> Design Test Steps ==> Automate Tests ==> Analyze Test Plan


How do you run the tests?

Create Test Sets ==> Schedule Runs ==> Run Tests Manually or Automatically ==> Analyze Test Results


How do you do defect tracking?

Add Defects ==> Review New Defects ==> Repair Open Defects ==> Test New Build ==> Analyze Defect Data


How do you filter records in TD?

  1. Click Set Filter/Sort button => Filter dialog box opens and displays filter tab
  2. Set a filter condition for a specific column
  3. Define the filter condition
  4. Ok [To add cross filters click Advanced Link]


What information you put in requirements?

Attachment, Author, Cover Status (Pass, Fail), Creation time/date, Modified, Name (Requirement Name), Priority, Product, Requirement ID, Reviewed, Type (hardware/software)


How do you view requirement history?

  1. Select a Requirement in the tree
  2. Click the History Tab


How do you view associated defects for a test requirement?

Select View >>> Associated Defect

OR, Right click a requirement >>> Associated Defect


How do you attach or open WR script from TD?

For attaching -> Go to TEST PLAN --> Attach ==> Open the script and the script gets attached. [You can attach file, URL, snapshot, and system info]

For opening WR script ==> From tree ==> Go to expand and find a WR script ==> copy from ==> browse to the script and open

Now, you can run the script from TD ==> by LAUNCH button


How do you E-mail test?

Right click any test on TREE VIEW ==> MAIL TESTS in TEST PLAN Module


What is Test Management Process in Test Director?

SPECIFY REQUIREMENTS (Analyze your application and determine your testing requirements) ==> PLAN TESTS (Create a test plan based on testing requirements) ==> EXECUTE TESTS (Create test sets and perform test runs) ==> TRACK DEFECTS (Report defects detected in your application and track how repairs are progressing)


What is a Test Grid?

A Test Grid displays all the tests in a TD Project. Each row displays a separate test record. Each column represents a separate data item.

To view the test grid, choose VIEW--> TEST GRID

Test Grid contains

  1. Test Grid Toolbar (Copy, Paste, Set Filter, Sort, Select Column, Find etc.)
  2. Grid Filter
  3. Description tab
  4. History tab


What data column is displayed by Test Grid?

Creation Date, Description, Designer, Estimated Development Time, Execution Status, Modified, Path, Status, Steps, Subject, Template, Test Name, Type.


What does Test Planning Involve? (What can you do in Test Plan Module?)

  1. Creating a Test Plan Tree
  2. Adding tests to a test plan tree
  3. Viewing the test plan tree
  4. Associating defects with a test
  5. Mailing test
  6. Finding tests in the test tree
  7. Sorting
  8. Modifying


What is a Test Plan Tree?

The typical application is too large to test as a whole. The Test Plan module enables you to divide your application according to functionality. You divide your application into units, or subjects, by creating a plan tree.


How do you associate defect with a Test?

View >>> Associate Defects

OR, right click the test and choose Associated Defects


What is the relationship of requirements and tests?

It could be:

  1. 1 to 1
  2. 1 to many
  3. many to 1

You can map in both directions: Requirement ==> Tests; OR, Test ==> Requirements


How do you design test steps in Test Plan module?

You add steps to a test using Design Step Editor. To create a test step;

  1. Choose a test and click DESIGN STEPS tab
  2. Click NEW STEP button or Right Click DESIGN STEP TAB and choose NEW STEP
  3. Type a description and expected results
  4. To insert a PARAMETER, click INSERT PARAMETER button
  5. Click CLOSE to close Design Step Editor


How do you generate an automated test template?

  1. In the test plan tree, choose the manual test that you want to automate
  2. Click the Design STEPS tab and click the GENERATE SCRIPT button
  3. Choose an automated test type to convert to: WR-automated, LR-scenario, or QuickTest test.
  4. Click the TEST SCRIPT tab to view the test template
  5. Click LAUNCH button


How do you add system test to a test plan tree?

  1. Choose Subject Folder in the Test Plan Tree
  2. New Test Button OR Planning > NEW TEST
  3. In the Test Type: Select SYSTEM TEST
  4. In the TEST NAME Box, type a name of the test and OK

How does it look? Screen Shot View! (Source:Prakash Nepal's www.qaquestions.com)

Ref -HP/Mercury Interactive

Friday, July 25, 2008

How to facilitate all environmental changes to test environments

This includes build migrations, server bounces, database structure changes, and configuration changes.