Software Testing-Manual Testing
MANUAL TESTING
Mainly, in Manual Testing the following documents are required:
i) Test Policy -- QC
ii) Test Strategy -- Company Level
iii) Test Methodology -- QA
iv) Test Plan -- TL
v) Test Case
vi) Test Procedure
vii) Test Script -- Tester
viii) Test Log
ix) Defect Report
x) Test Summary Report -- TL
I) Test Policy: This is a company level document and will be developed by QC people (at the management level). This document defines the "Testing Objectives" of the organization.
Example: a small-scale company's test policy defines:
- Testing Definition:
Verification + Validation
- Testing Process:
Proper planning before testing
- Testing Standard:
Defect per 280 LOC / Defect per 10 functional points
- Testing Measurements:
QAM (Quality Assessment Measurements), TMM ( Test Management Measurements),
PCM (Process Capability Measurements)
II) Test Strategy: It is also a company level document and is developed by QA people. It defines the testing approach followed by the testing team.
Components in Test Strategy:
- Scope and Objective:
About the testing need and its purpose
- Business Issues:
Budget control for testing in terms of time and cost
100% -- Project Cost
64% for Development & Maintenance, 36% for Testing
- Test Approach:
It defines mapping between development stages and testing issues
- Test Matrix (TM)/ Test Responsibilities Matrix (TRM)
- Test Deliverables:
Required documents to prepare during testing of a project
Ex: Test Methodology, Test Plan, Test case etc.,
- Roles & Responsibilities:
Names of the jobs in the testing team and their responsibilities
- Communication and Status Reporting:
Required negotiations between two consecutive jobs in the testing team
- Automation Testing Tools:
Need for automation in the organization's project testing
- Testing Measurements and Metrics:
QAM, TMM, & PCM
- Defect Reporting and Tracking:
Required negotiations between testing team and development team
- Risks & Mitigations:
Possible risks and the mitigations to solve them (a risk indicates a possible future failure)
- Change and Configurations Management:
How to handle change requests coming from customers during testing and maintenance
- Training Plan :
Need of training for testers before the start of every project's testing
III) Test Factors: To define quality software, the quality analyst defines 15 testing issues. A test factor (or test issue) is a testing issue applied to the software to achieve quality.
The test factors are:
- Authorization:
Whether the user is valid or not for the application
- Access Control:
Authorized to access specific services
- Audit Trail:
Meta data about user operations
- Continuity of processing:
Inter process communication (IPC) during execution.
- Correctness:
Meets customer requirements in terms of inputs and outputs
- Coupling :
Co-existence with other existing S/W
- Ease of Use :
User friendliness of screens
- Ease of Operation:
Installation, uninstallation, dumping, exporting etc.
- File Integrity:
Creation of internal files (ex: backup)
- Reliability:
Recover from abnormal situation
- Portable :
Runs on different platforms
- Performance :
speed of processing
- Service Levels :
order of services
- Methodology:
Follows standards
- Maintainable :
Long time serviceable to customers
IV)Test Factors VS Black Box Testing Techniques
- Authorization :
Security Testing, Functional or requirement testing
- Access Control :
Security testing; if there is no separate team, then functional or requirement testing
- Audit Trail :
Functionality or requirements, error handling testing
- Correctness :
Functionality or requirements testing
- Continuity of processing :
execution testing , operation testing (white Box)
- Coupling :
Intersystems testing
- Ease of use :
Usability testing
- Ease of operate :
Installation testing
- File Integrity:
Recovery testing, error handling testing
- Reliability:
Recovery testing, stress testing
- Portable:
Compatibility testing, configuration testing
- Performance:
Load & Stress ,Storage & Data volume testing
- Service Level :
Functionality or requirements testing
- Maintainable:
Compliance Testing
- Methodology:
Compliance Testing
V) Test Methodology: It is a project level document and is developed by QA or the PM. It is a refined form of the Test Strategy. To prepare the test methodology, QA or the PM follows the steps below.
- Step 1:
Determine the project type, such as Traditional, Outsourcing or Maintenance (depending on the project type, QA decreases the number of columns in the TRM)

- Step 2:
Determine application requirements (depending on the application requirements, QA decreases the number of rows in the TRM)
- Step 3:
Determine tactical risks (depending on the risks, QA decreases the number of factors in the selected list)
- Step 4:
Determine the scope of the application (depending on expected future enhancements, QA adds some of the deleted factors back to the TRM)
- Step 5:
Finalize the TRM for the current project
- Step 6:
Prepare the system test plan (defining the schedule for the finalized approach) -- the Test Lead will do this
- Step 7:
Prepare module test plans if required
VI)Testing Process
VII) PET Process: (Process, Experts, Tools and Technology) This testing process was developed by HCL and approved by the Quality Analyst Forum of India. It is a refinement of the V-Model to define the testing process along with the development stages.
VIII) Test Planning: After completion of test initiation, the TL of the testing team concentrates on test planning to define "What to test", "How to test", "When to test" and "Who will test". The test plan author follows the below work bench (process) to prepare the test plan document.
1. Team formation: In general, the test planning process starts with testing team formation. In this step the test plan author depends on the below factors:
a) Availability of testers
b) Test duration
c) Availability of test environment resources
Case Study: Test duration
C/S, Web, ERP -- 3 to 5 months of functional & system testing
Team size (developers : testers) -- 3:1
2. Identify tactical risks: After completion of team formation, the test plan author studies possible risks raised during testing of that project.
Ex Risk 1: Lack of knowledge on domain
Risk 2: Lack of budget (time)
Risk 3: Lack of resources
Risk 4: Delay in delivery
Risk 5: Lack of development process rigor (seriousness of dev team)
Risk 6: Lack of test data (sometimes test engineers conduct ad hoc testing)
Risk 7: Lack of communication
3) Prepare Test Plan: After completion of team formation and risk analysis, the test plan author prepares the test plan document in IEEE format.
FORMAT:
1) Test Plan ID: Unique number
2) Introduction: About project and test team
3) Test Items: Modules/Features/services/functions
4) Features to be tested: Modules for which test cases will be prepared
5) Features not to be tested: which ones and why not?
6) Approach: Required list of testing techniques (depends on TRM)
7) Feature Pass or Fail Criteria: When is a feature pass, when is a feature fail?
8) Suspension Criteria: Possible abnormal situations raised during the above features' testing
9) Testing Tasks (Prerequisites): Necessary tasks to do before the start of every feature's testing
10) Test Deliverables: Required test documents to prepare during testing
11) Test Environment: Required H/W and S/W including testing tools
12) Staff and Training Needs: Names of selected test engineers
13) Responsibilities: Work allocation
14) Schedule: Dates & Time
15) Risks and Mitigations :
16) Approvals : Signatures of test plan author, QA or PM
IX) Review Test Plan: After completion of test plan preparation, the test plan author reviews the document for completeness and correctness. In this review, the responsible person conducts coverage analysis. Topics in the test plan review are based on:
a) BRS & SRS based coverage
b) Risks based coverage
c) TRM based coverage
X) Test Design: After test plan finalization, the selected test engineers attend the required training sessions to understand the business logic. This training is provided by a business analyst, functional lead or business consultant. After completing the required training sessions, the test engineers prepare test cases for their responsible modules.
There are 3 methods to prepare core level test cases (UI,
Functionality, Input domain, error handling, and manual support
testing). They are
1) Business Logic based test case design (80%)
2) Input domain based test case design (15%)
3) User Interface based test case design (5%)
1 Business Logic: In general, functionality and error handling based test cases are prepared by test engineers depending on the use cases in the SRS. A use case describes how a user uses specific functionality in the application. A test case describes a test condition to apply on the application to validate it. To prepare this type of test cases from use cases, we can follow the below approach.
Step 1: Collect the responsible usecases
Step 2: Select a usecase and its dependencies
2.1 Identify the entry condition (base state)
2.2 Identify the inputs required (test data)
2.3 Identify the exit condition (end state)
2.4 Identify the output and outcome (expected)
2.5 Study the normal flow (call states)
2.6 Identify alternative flows and exceptions
Step 3: Prepare test cases depending on the above study
Step 4: Review the test cases for completeness and correctness
Usecase 1: From a usecase and data model, a login process allows userid and password. Userid takes alphanumeric characters in lowercase, 4 to 16 characters long. Password allows alphabets in lowercase, 4 to 8 characters long.
Test case 1: Successful entry of userid
BVA (Size):
  Min (4)  -- Pass
  Max (16) -- Pass
  Max-1    -- Pass
  Min+1    -- Pass
  Max+1    -- Fail
  Min-1    -- Fail
ECP (Type):
  Valid:   a-z, 0-9
  Invalid: A-Z, special characters and blank spaces
Min-1-- Fail
Test Case 2: Successful entry of password
BVA (Size):
  Min (4) -- Pass
  Max (8) -- Pass
  Max-1   -- Pass
  Min+1   -- Pass
  Max+1   -- Fail
  Min-1   -- Fail
ECP (Type):
  Valid:   a-z
  Invalid: A-Z, 0-9, special characters and blank spaces
Test Case 3: Successful log in
Userid    Password    Criteria
Valid     Valid       Pass
Valid     Invalid     Fail
Invalid   Invalid     Fail
Valid     Blank       Fail
Blank     Valid       Fail
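The BVA and ECP rules above can be expressed as a small validation sketch. The Python code below is an illustration only (the helper names are invented, not part of the notes); it derives the pass/fail boundary lengths for the userid and checks a few equivalence classes:

```python
import re

# Rules from the use case: userid is lowercase alphanumeric,
# 4-16 characters; password is lowercase alphabets, 4-8 characters.
def valid_userid(s):
    return re.fullmatch(r"[a-z0-9]{4,16}", s) is not None

def valid_password(s):
    return re.fullmatch(r"[a-z]{4,8}", s) is not None

def bva_cases(minimum, maximum):
    """Boundary Value Analysis: min, min+1, max-1, max should pass;
    min-1 and max+1 should fail (here, lengths of a test string)."""
    return {
        "pass": [minimum, minimum + 1, maximum - 1, maximum],
        "fail": [minimum - 1, maximum + 1],
    }

# Exercise the userid size boundaries with a lowercase test string.
for n in bva_cases(4, 16)["pass"]:
    assert valid_userid("a" * n)
for n in bva_cases(4, 16)["fail"]:
    assert not valid_userid("a" * n)

# ECP: invalid classes for userid -- uppercase, blank spaces.
assert not valid_userid("ABCD")   # A-Z is invalid
assert not valid_userid("ab cd")  # blank space is invalid
assert valid_userid("user1")      # a-z plus 0-9 is valid
```

The same `bva_cases` helper applies to the password field with `bva_cases(4, 8)`.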
Usecase 2: In an insurance application, a user can apply for different types of insurance. When the user selects type B insurance, the system asks for the age to be entered. The age value should be greater than 18 years and less than 60.
Test case 1: Successful selection of type B Insurance
Test Case 2 : Successful focus to age when you select type B
Test Case 3: Successful entry of age value
BVA (Size):
  Min (19) -- Pass
  Max (59) -- Pass
  Max-1    -- Pass
  Min+1    -- Pass
  Max+1    -- Fail
  Min-1    -- Fail
ECP (Type):
  Valid:   0-9
  Invalid: A-Z, a-z, special characters and blank spaces
Usecase 3: In a shopping application, a customer can try to create a purchase order. The application takes item no & qty. Item no allows alphanumeric values 4-6 characters long, and quantity allows up to 10 items to purchase. After filling item no & qty, the system returns the price of one item & the total amount.
Test Case 1: Successful item no
BVA (Size):
  Min (4) -- Pass
  Max (6) -- Pass
  Max-1   -- Pass
  Min+1   -- Pass
  Max+1   -- Fail
  Min-1   -- Fail
ECP (Type):
  Valid:   0-9, a-z, A-Z
  Invalid: special characters and blank spaces
Test Case 2: Successful selection of Qty
BVA (Size):
  Min (1)  -- Pass
  Max (10) -- Pass
  Max-1    -- Pass
  Min+1    -- Pass
  Max+1    -- Fail
  Min-1    -- Fail
ECP (Type):
  Valid:   0-9
  Invalid: A-Z, a-z, special characters and blank spaces
Test case 3: Successful calculation
Total = price * qty
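The quantity limit and the total calculation in Test case 3 can be checked with a short sketch. The Python code below is illustrative (function names and the unit price are assumptions, not from the notes):

```python
# Quantity is limited to 1..10 items (from the use case);
# total = price of one item * quantity (Test case 3).
def valid_qty(qty):
    return isinstance(qty, int) and 1 <= qty <= 10

def order_total(unit_price, qty):
    if not valid_qty(qty):
        raise ValueError("quantity out of range")
    return unit_price * qty

# BVA on quantity: boundaries pass, their outside neighbours fail.
assert valid_qty(1) and valid_qty(10)
assert not valid_qty(0) and not valid_qty(11)
# Calculation check with an assumed unit price of 25.50.
assert order_total(25.50, 4) == 102.0
```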
Usecase 4: In a banking application, the user can dial the bank using his personal computer. In this process the user uses a 6-digit password & the below fields.
Area code -- 3-digit number, allows blank
Prefix -- 3-digit number, not starting with 0 or 1
Suffix -- 6 characters, alphanumeric values
Commands -- Deposit, balance enquiry, mini statement, bill pay
Test Case 1: Successful entry of password
BVA (Size):
  Min (6) -- Pass
  Max (6) -- Pass
  Max-1   -- Fail
  Min+1   -- Fail
  Max+1   -- Fail
  Min-1   -- Fail
ECP (Type):
  Valid:   0-9
  Invalid: A-Z, a-z, special characters and blank spaces
Test Case 2: Successful area code
BVA (Size):
  Min (3) -- Pass
  Max (3) -- Pass
  Max-1   -- Fail
  Min+1   -- Fail
  Max+1   -- Fail
  Min-1   -- Fail
ECP (Type):
  Valid:   0-9, blank
  Invalid: A-Z, a-z, special characters
Test Case 3: Successful prefix
BVA (Range):
  Min (200) -- Pass
  Max (999) -- Pass
  Max-1     -- Pass
  Min+1     -- Pass
  Max+1     -- Fail
  Min-1     -- Fail
ECP (Type):
  Valid:   0-9
  Invalid: A-Z, a-z, special characters and blank spaces
Test Case 4: Successful suffix
BVA (Size):
  Min (6) -- Pass
  Max (6) -- Pass
  Max-1   -- Fail
  Min+1   -- Fail
  Max+1   -- Fail
  Min-1   -- Fail
ECP (Type):
  Valid:   0-9, A-Z, a-z
  Invalid: special characters and blank spaces
Test Case 5: Successful commands such as deposit, balance enquiry etc.
Test Case 6: Successful dialing with valid values
Test Case 7: Unsuccessful dialing without filling all field values (except area code)
Test Case 8: Successful dialing without filling the area code
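The field rules in Usecase 4 can be expressed as a validation sketch. The Python code below is an illustration (helper names invented); note how the "not starting with 0 or 1" rule turns the prefix into the range 200-999:

```python
import re

# Field rules from the use case: password -- exactly 6 digits;
# area code -- 3 digits or blank; prefix -- 3 digits, not
# starting with 0 or 1; suffix -- exactly 6 alphanumerics.
def valid_pwd(s):
    return re.fullmatch(r"[0-9]{6}", s) is not None

def valid_area_code(s):
    return s == "" or re.fullmatch(r"[0-9]{3}", s) is not None

def valid_prefix(s):
    return re.fullmatch(r"[2-9][0-9]{2}", s) is not None

def valid_suffix(s):
    return re.fullmatch(r"[0-9A-Za-z]{6}", s) is not None

# Prefix boundaries: 200 and 999 pass, 199 and 1000 fail.
assert valid_prefix("200") and valid_prefix("999")
assert not valid_prefix("199") and not valid_prefix("1000")
# Area code may be left blank (Test Case 8).
assert valid_area_code("") and valid_area_code("040")
```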
Test Case Format: During test design, test engineers prepare test case documents in IEEE format.
1) Test Case ID: Unique name or number
2) Test Case Name: Name of the test condition
3) Feature to be tested: Module or feature or service or component
4) Test Suite ID: Batch name, in which this case is a member
5) Priority: Importance of test case
P0 -- Basic functionality
P1 -- General functionality (ex: I/P domain, compatibility, error handling, intersystem testing etc.)
P2 -- Cosmetic functionality (ex: UI testing)
6) Test Environment: Required H/W and S/W including testing tools
7) Test Effort (person/hr): Time to execute this case, ex: 20 min
8) Test Duration: Date & Time
9) Test Setup: Necessary tasks to do before starting this case's execution
10) Test Procedure *: Step by step procedure from base state to end state
Columns: Step No | Action | I/P Required | Expected | Result | Defect ID | Comments
(the first four columns are filled during test design, the last three during test execution)
11) Test Case Pass or Fail Criteria: When is this case a pass, when is it a fail?
Note: In general, test engineers prepare test case documents with the step-by-step procedure only (i.e. the 10th field only)
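The IEEE-style fields above can be modelled as a simple record. The sketch below uses a Python dataclass; the field names follow the format above, while the sample values are purely illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    # Core IEEE-style test case fields from the format above.
    case_id: str
    name: str
    feature: str
    suite_id: str
    priority: str          # e.g. "P0", "P1", "P2"
    effort_minutes: int
    procedure: list = field(default_factory=list)  # step-by-step rows

tc = TestCase(
    case_id="TC_LOGIN_01",
    name="Successful entry of userid",
    feature="Login",
    suite_id="TS_LOGIN",
    priority="P0",
    effort_minutes=20,
    procedure=[
        # (action, input required, expected)
        ("Log on to site", "valid UID and PWD", "Inbox page appears"),
    ],
)
assert tc.priority == "P0" and len(tc.procedure) == 1
```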
Ex 1: Prepare test case document for successful mail reply.
Step No | Action | I/P Required | Expected
1 | Log on to site | Valid UID and PWD | Inbox page appears
2 | Click inbox link | -- | Mail box appears
3 | Select received mail subject | -- | Mail message appears
4 | Click reply | -- | Compose window appears with To: received mail id, Sub: Re:..., CC: off, BCC: off, Message: received message with comments
5 | Enter new message and click send | -- | Acknowledgement from web server
2 Input Domain Based Test Cases: Use cases describe functionality in terms of inputs, flow and output, but they are not responsible for defining the size and type of input objects. For this reason, test engineers also read the LLDs (data model or ER diagrams). To study the data model, a test engineer follows the below approach.
Step 1: Collect the data models of responsible modules
Step 2: Study the data model to understand every input attribute in terms of size, type and constraints
Step 3: Identify critical attributes, which participate in data retrievals and data manipulations.
Ex: Critical -- AC no, Balance; Non-critical -- Account Name, Address
Step 4: Identify non-critical attributes, which are just input/output type.
Step 5: Prepare data matrices for every input attribute in terms of BVA & ECP:
Input Attribute | ECP (Valid | Invalid) | BVA (Min | Max)
Ex 1: From a usecase, a bank application provides a fixed deposit form. From the data model, that form consists of the below fields:
Customer Name: Alphabets in lower case, with '_' allowed in the middle
Amount: 1500 to 100000
Tenor: up to 12 months
Interest: Numeric with decimal
From this usecase, if the tenor is greater than 10 months, the system allows only an interest greater than 10%. Prepare a test case document from the above scenario.
Test Case 1: Successful entry of customer name
Input Attribute: Customer Name
ECP Valid:   a-z, '_' in the middle
ECP Invalid: A-Z, 0-9, blank, special characters, initial '_', ending '_'
BVA: Min 1, Max 256
Test Case 2: Successful entry of amount
Input Attribute: Amount
ECP Valid:   0-9
ECP Invalid: A-Z, a-z, blank, special characters
BVA: Min 1500, Max 100000
Test Case 3: Successful entry of tenor
Input Attribute: Tenor
ECP Valid:   0-9
ECP Invalid: A-Z, a-z, blank, special characters
BVA: Min 1, Max 12
Test Case 4: Successful entry of interest
Input Attribute: Interest
ECP Valid:   0-9 with decimal
ECP Invalid: A-Z, a-z, blank, special characters
BVA: Min 1, Max 100
Test Case 5: Successful fixed deposit with all valid values
Step No | Action | I/P Required | Expected
1 | Log on to bank server | Valid UID | Menu options appear
2 | Select fixed deposit form | -- | Fixed deposit form appears
3 | Fill fields and click OK | Valid or invalid name, amount, tenor and interest | Ack from bank server (valid) or error message (invalid)
Test Case 6: Unsuccessful operation due to tenor greater than 10 months and interest less than 10%
Step No | Action | I/P Required | Expected
1 | Log on to bank server | Valid UID | Menu options appear
2 | Select fixed deposit form | -- | Fixed deposit form appears
3 | Fill fields and click OK | Valid name and amount, tenor > 10 months, interest < 10% | Error message
Test Case 7: Unsuccessful operation due to not filling all field values
Step No | Action | I/P Required | Expected
1 | Log on to bank server | Valid UID | Menu options appear
2 | Select fixed deposit form | -- | Fixed deposit form appears
3 | Fill fields and click OK | Valid name, amount, tenor and interest, but some fields left blank | Error message
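The business rule behind Test Case 6 (tenor > 10 months requires interest > 10%) can be sketched as a validation function. The code below is an illustration only; the function name and the acceptance logic for blank fields are assumptions:

```python
# Business rule from the use case: amount 1500..100000, tenor
# 1..12 months; if tenor > 10 months, the interest entered must
# also be greater than 10% (otherwise an error message appears).
def fd_accepted(amount, tenor, interest):
    if not (1500 <= amount <= 100000):
        return False
    if not (1 <= tenor <= 12):
        return False
    if tenor > 10 and interest <= 10:
        return False  # Test Case 6 scenario -- error message
    return True

assert fd_accepted(2000, 10, 8)      # rule not triggered (tenor <= 10)
assert fd_accepted(2000, 11, 10.5)   # tenor > 10 with interest > 10%
assert not fd_accepted(2000, 11, 9)  # Test Case 6: rejected
assert not fd_accepted(1000, 5, 8)   # amount below the minimum
```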
3 User Interface based test case design: To conduct user interface testing, test engineers prepare UI test cases depending on the organization's user interface rules, global UI conventions (Microsoft 6 rules) and the interests of customer-site people.
Example:
1) Spelling check
2) Graphic check
3) Meaningful error messages
4) Meaningful help documents (manual support testing)
5) Accuracy of data displayed
Ex: Amount, DOB (dd/mm/yy format)
6) Accuracy of data in the database as a result of user input
(Form -> Table -> Report)
7) Accuracy of data in the database as a result of external factors
Ex: Imported files
Test case selection review: After completion of all possible test case writing, the test lead and test engineers concentrate on test case selection review for completeness and correctness. In this review, the test lead applies coverage analysis on those cases:
a) BR based
b) Usecase based
c) Data model based
d) UI based
e) TRM based
At the end of this review, the TL creates the Requirements Traceability Matrix. This is the mapping between the BRS and the prepared test cases. It is also known as the Requirements Validation Matrix (RVM).
Business Requirement | Source (UC, Data Model) | Test Cases
(each business requirement is mapped to its source document and to the test cases that cover it)
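The mapping the RTM records can be sketched as a dictionary from business requirements to sources and test case IDs; a requirement with no mapped cases signals a coverage gap. All names below are hypothetical:

```python
# Requirements Traceability Matrix: business requirement ->
# source document and the test cases derived from it.
rtm = {
    "BR-01 login":         {"source": "UC-1", "cases": ["TC1", "TC2", "TC3"]},
    "BR-02 fixed deposit": {"source": "UC-2, data model", "cases": ["TC5", "TC6"]},
    "BR-03 reports":       {"source": "UC-3", "cases": []},
}

# Coverage analysis: list requirements with no mapped test cases.
uncovered = [br for br, row in rtm.items() if not row["cases"]]
assert uncovered == ["BR-03 reports"]
```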
XI) Test Execution: After completion of all possible test case writing for the responsible modules and their review, the testing team concentrates on test execution to detect defects in the build.
1 Test Execution Levels
The development team releases an initial build, on which the testing team runs Level-0 (Sanity/BVT/TAT). Once a stable build is available, the team takes up test automation and Level-1 (Comprehensive Testing), reporting defects to the development team. After defect fixing, each modified build goes through Level-2 (Regression Testing), typically over 8-9 cycles of bug resolving and re-testing, and finally Level-3 (Final Regression).
2) Test Execution vs Test Cases
Level-0 -- All P0 test cases
Level-1 -- All P0, P1 & P2 test cases as batches
Level-2 -- Selected P0, P1 & P2 test cases w.r.t. the modification
Level-3 -- Selected P0, P1 & P2 test cases w.r.t. the build
3) Build Version Control: The testing team receives builds from the development team as below: the build is kept in a softbase on the development side and transferred to the test environment through FTP.
From the above model, the testing team receives builds from development through "File Transfer Protocol". To distinguish between old and modified builds, the development team uses a unique version-numbering system that is understandable to the test engineers. For this version controlling, the development team uses Visual SourceSafe.
4) Level-0: After receiving the initial build from the development team, the testing team covers the basic functionality of that build to estimate its stability. During this testing, the team applies the below factors to check whether the build is stable enough for complete testing:
Understandable
Operable
Observable
Consistent
Simple
Controllable
Maintainable
Automatable
These are the Testability Factors used in sanity testing.
Because of these factors, sanity testing is also known as testability testing, BVT or octagonal testing; other quick shake-ups are called smoke testing.
5) Test Harness (readiness for testing)
Test Harness = Test Environment (required HW/SW) + Test Bed (required documents such as test cases & test procedures)
6) Test Automation: After receiving a stable build from development, the testing team concentrates on test automation to create automated test scripts if possible. Test automation can be complete or selective (all P0 and selected P1 test cases). In practice, test engineers follow selective automation, for repeatable and critical test cases only.
7) Level-1 (Comprehensive testing): After receiving a stable build from the development team and completing the possible automation, the testing team concentrates on test execution to detect defects. They execute tests as batches. A test batch is also known as a test suite or test set. Every test batch consists of a set of dependent test cases. During this test case execution, manual or automated, test engineers create a "Test Log". It consists of 3 types of entries:
Passed: all expected values equal the actual values
Failed: any one expected value varies from the actual
Blocked: due to the failure of a parent test
(Status flow: Pass; Fail -> Closed after the fix; Partial Pass/Fail)
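The three test log entries above can be sketched as a status-derivation function. The Python code below is an illustration (the function name and parameters are invented):

```python
# Test log entry derivation: Passed if every expected value equals
# the actual, Failed if any one differs, Blocked if the parent
# (dependency) test did not pass.
def log_status(expected, actual, parent_passed=True):
    if not parent_passed:
        return "Blocked"
    if len(expected) != len(actual):
        return "Failed"
    return "Passed" if all(e == a for e, a in zip(expected, actual)) else "Failed"

assert log_status(["Inbox appears"], ["Inbox appears"]) == "Passed"
assert log_status(["Inbox appears"], ["Error page"]) == "Failed"
assert log_status(["Inbox appears"], ["Inbox appears"],
                  parent_passed=False) == "Blocked"
```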
Level-2 (Regression testing): During comprehensive test execution, test engineers report defects to the development team. After bug resolving, the testing team receives a modified build. Before continuing with the remaining comprehensive testing, the testing team re-executes its previous tests on that modified build to ensure the bug fix works and to check for side effects. This re-execution of tests is called regression testing.
Resolved bug severity vs tests to re-execute:
High   -- All P0, P1 & P2 test cases
Medium -- All P0, all P1, carefully selected P2
Low    -- Some P0, carefully selected P1 & P2
Note: If the development team releases a modified build due to project requirement changes, test engineers execute all P0, all P1 and carefully selected P2 test cases.
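The severity table above can be sketched in code. This Python fragment is an illustration (case records, function name and the "carefully selected" set are hypothetical -- in practice that selection is a tester's judgment):

```python
# Regression test selection by resolved-bug severity:
#   High   -> all P0, P1 and P2 cases
#   Medium -> all P0, all P1, carefully selected P2
#   Low    -> carefully selected cases only
def select_regression(cases, severity, selected_ids=()):
    picked = []
    for c in cases:
        if severity == "High":
            picked.append(c)
        elif severity == "Medium":
            if c["priority"] in ("P0", "P1") or c["id"] in selected_ids:
                picked.append(c)
        else:  # Low
            if c["id"] in selected_ids:
                picked.append(c)
    return picked

suite = [
    {"id": "T1", "priority": "P0"},
    {"id": "T2", "priority": "P1"},
    {"id": "T3", "priority": "P2"},
]
assert len(select_regression(suite, "High")) == 3
assert [c["id"] for c in select_regression(suite, "Medium")] == ["T1", "T2"]
assert [c["id"] for c in select_regression(suite, "Low", {"T2"})] == ["T2"]
```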