Testing notes
Hi! I will be posting all the testing-related stuff here. The content posted here is a collection from different websites.
Wednesday, January 30, 2013
Sample web services hosted on the web
http://www.webservicex.net/WS/wscatlist.aspx
http://www.webservicex.net/globalweather.asmx?WSDL
http://www.webservicex.net/CurrencyConvertor.asmx?WSDL
http://webservices.daehosting.com/services/TemperatureConversions.wso?WSDL
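These endpoints can be exercised outside SoapUI as well. Below is a minimal, hedged Java sketch that posts a SOAP 1.1 request to the GlobalWeather service listed above. The GetWeather operation, its CityName/CountryName parameters, and the http://www.webserviceX.NET namespace are assumptions based on that service's typical WSDL, and the service itself may no longer be online.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class SoapCallSketch {
    public static void main(String[] args) throws Exception {
        // SOAP 1.1 envelope for the assumed GetWeather operation.
        String envelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soap:Body>"
          + "<GetWeather xmlns=\"http://www.webserviceX.NET\">"
          + "<CityName>Berlin</CityName>"
          + "<CountryName>Germany</CountryName>"
          + "</GetWeather>"
          + "</soap:Body>"
          + "</soap:Envelope>";

        URL url = new URL("http://www.webservicex.net/globalweather.asmx");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        // SOAPAction header as conventionally required by .asmx services.
        conn.setRequestProperty("SOAPAction", "http://www.webserviceX.NET/GetWeather");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }

        // Dump the raw SOAP response to the console (throws if the call fails).
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            while (in.hasNextLine()) System.out.println(in.nextLine());
        }
    }
}
```

The same request can be built in SoapUI by importing the WSDL URL above, which generates the envelope automatically.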
SoapUI testing
http://www.soapui.org/ - download SoapUI 4.5.1 from this link (Windows 32-bit installer)
http://www.soapui.org/Downloads/latest-release.html - download SoapUI Pro 4.5.1 (Windows 32-bit installer)
http://www.filehippo.com/download_jre_32/ - JRE software
http://www.filehippo.com/download_notepad/ - Notepad++ software
Friday, December 14, 2012
Decision Analysis and Resolution
We all come across bottlenecks every now and then, not just in our work but also in our personal lives. Then "You must choose... but choose wisely."
The higher the risk and the more sensitive the matter, the more consciously you need to decide. Making a good choice implies analyzing alternatives. Because analyzing alternatives is really making a comparison, you must have consistent criteria defined. Decision Analysis and Resolution helps us choose the right path.
Definition:
Book answer: The purpose of Decision Analysis and Resolution is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
In other words: a process to make key decisions in your organization more objectively and wisely.
A few of the lessons learned from the field:
1. People are generally impatient in decision making.
2. Using a formal decision-making process is not a natural act.
3. Make sure you have criteria in place for when to invoke DAR.
4. Just getting started is half the battle.
5. Implementing is easy, but institutionalizing takes time.
"There's a difference between knowing the path and walking the path." (Morpheus, The Matrix)
Just DAR it
How to perform an effective RCA
Root cause analysis (RCA) is a problem-solving method that tries to identify the root causes of faults or problems that cause operating events. Here are some steps to perform an effective RCA in software development or testing projects:
Step 1: Define the Problem
- What do you see happening?
- What are the specific symptoms?
Step 2: Collect & Analyze Data
- What proof do you have that the problem exists?
- How long has the problem existed?
- What is the impact of the problem?
- Use Pareto charts.
Step 3: Identify Possible Causal Factors
- What sequence of events leads to the problem?
- What conditions allow the problem to occur?
- What other problems surround the occurrence of the central problem?
- Use tools like the Cause & Effect diagram, 5 Whys, etc. (a worked 5 Whys example follows this list).
Step 4: Identify the Root Causes
- Why does the causal factor exist?
- What is the real reason the problem occurred?
Step 5: Recommend and Implement Solutions
- What can you do to prevent the problem from happening again?
- How will the solution be implemented?
- Who will be responsible for it?
- What are the risks of implementing the solution?
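A quick worked illustration of the 5 Whys technique referenced in Step 3 (the scenario is invented for illustration): suppose a nightly regression build fails in the test environment.
- Why? The deployment script could not find a configuration file.
- Why? The file was renamed in the latest commit.
- Why? The naming convention was changed without updating the script.
- Why? The deployment script is not part of the team's change checklist.
- Why? The checklist has no entry for deployment artifacts.
The root cause is the gap in the change-management checklist, not the rename itself, so the fix is to extend the checklist rather than simply restore the file name.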
Thursday, June 14, 2012
Performance Testing - Main Activities
1. Identification of the Test Environment: In this phase the performance team identifies the physical test environment and the production environment, as well as the tools and resources available to conduct the performance tests. The physical environment includes hardware, software, and network configurations. A thorough understanding of the entire test environment from the beginning enables the performance team to design and plan the testing more effectively.
2. Identification of the Performance Acceptance Criteria: In this phase the cross-functional team defines the response-time benchmarks, baselines for throughput, and resource-utilization constraints. In general terms, response time is a user concern, throughput is a business concern, and resource utilization is a system concern. Additionally, the team identifies project success criteria that may not be captured by those goals and constraints; for example, using performance tests to evaluate which blend of configuration settings will result in the most desirable performance results.
3. Planning and Designing of Tests: Here the performance team identifies the key business scenarios to be tested, settles on the variability among representative users and how to simulate that variability, defines test data, and establishes the metrics to be collected for result evaluation. The team then consolidates this information into one or more models of system usage to be implemented, executed, and analyzed.
4. Configure the Test Environment: At this stage the test environment, testing tools, and resources necessary to execute each strategy are prepared as features and components become available for test.
5. Implement the Test Design: Develop the performance tests in accordance with the test design.
6. Test Execution: In this phase the performance tests are executed and monitored. Before executing the actual tests, it is advisable to validate the tests and the test data so that they give accurate results.
7. Analysis of Results, Reporting, and Retesting: After receiving the consolidated performance metrics from the test team, the results are shared with the cross-functional team. After reprioritizing the test objectives, the tests are re-executed until the desired SLAs are achieved. When all of the metric values are within accepted limits, none of the set thresholds have been violated, and all of the desired information has been collected, you have finished testing that particular scenario on that particular configuration.
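To make the metrics in activities 2 and 7 concrete, here is a minimal sketch (not from the original guidance; the sample timings and run duration are invented) of how one might compute average response time, 90th-percentile response time, and throughput from collected sample timings:

```java
import java.util.Arrays;

public class MetricsSketch {
    public static void main(String[] args) {
        // Elapsed times of individual samples, in milliseconds (invented data).
        long[] elapsedMs = {120, 95, 240, 180, 110, 310, 150, 130, 170, 90};
        long testDurationMs = 5_000; // total wall-clock duration of the run

        // Average response time across all samples.
        double avg = Arrays.stream(elapsedMs).average().orElse(0);

        // 90th-percentile response time: sort, then take the value below
        // which 90% of the samples fall.
        long[] sorted = elapsedMs.clone();
        Arrays.sort(sorted);
        int index = (int) Math.ceil(0.9 * sorted.length) - 1;
        long p90 = sorted[index];

        // Throughput: completed samples per second over the whole run.
        double throughput = elapsedMs.length / (testDurationMs / 1000.0);

        System.out.printf("avg=%.1f ms, p90=%d ms, throughput=%.1f req/s%n",
                avg, p90, throughput);
    }
}
```

An SLA check in activity 7 then reduces to comparing values like p90 and throughput against the agreed thresholds.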
Disclaimer:
- The original work can be found under this link.
- All credit to J.D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, and Dennis Rea, Microsoft Corporation®.
Performance Testing
Performance testing is defined as the technical investigation done to determine or validate the speed, scalability, and/or stability characteristics of the product under test. Performance-related activities, such as testing and tuning, are concerned with achieving response times, throughput, and resource-utilization levels that meet the performance objectives for the application under test. Because performance testing is a general term that covers all of its various subsets, every value and benefit listed under the other performance test types below can also be considered a potential benefit of performance testing in general.
Key Types of Performance Testing
The following are the most common types of performance testing for Web applications.
| Term | Purpose |
| --- | --- |
| Performance test | To determine or validate speed, scalability, and/or stability. |
| Load test | To verify application behavior under normal and peak load conditions. |
| Stress test | To determine or validate an application's behavior when it is pushed beyond normal or peak load conditions. |
| Capacity test | To determine how many users and/or transactions a given system will support and still meet performance goals. |
Summary Matrix of Benefits by Key Performance Test Types
| Term | Benefits | Challenges and Areas Not Addressed |
| --- | --- | --- |
| Performance test | | |
| Load test | | |
| Stress test | | |
| Capacity test | | |
Additional Concepts / Terms
You will often see or hear the following terms when conducting performance testing. Some of these terms may be common in your organization, industry, or peer network, while others may not. These terms and concepts have been included because they are used frequently enough, and cause enough confusion, to make them worth knowing.
| Term | Notes |
| --- | --- |
| Component test | A component test is any performance test that targets an architectural component of the application. Commonly tested components include servers, databases, networks, firewalls, clients, and storage devices. |
| Investigation | Investigation is an activity based on collecting information related to the speed, scalability, and/or stability characteristics of the product under test that may have value in determining or improving product quality. Investigation is frequently employed to prove or disprove hypotheses regarding the root cause of one or more observed performance issues. |
| Smoke test | A smoke test is the initial run of a performance test to see if your application can perform its operations under a normal load. |
| Unit test | In the context of performance testing, a unit test is any test that targets a module of code where that module is any logical subset of the entire existing code base of the application, with a focus on performance characteristics. Commonly tested modules include functions, procedures, routines, objects, methods, and classes. Performance unit tests are frequently created and conducted by the developer who wrote the module of code being tested. |
| Validation test | A validation test compares the speed, scalability, and/or stability characteristics of the product under test against the expectations that have been set or presumed for that product. |
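As a hedged illustration of the performance unit test concept above (the method under test and the 200 ms budget are invented for this sketch), a developer might time a single module directly:

```java
import java.util.concurrent.TimeUnit;

public class SortPerformanceUnitTest {

    // The module under test: a stand-in for any logical unit of the code base.
    static int[] moduleUnderTest(int[] data) {
        int[] copy = data.clone();
        java.util.Arrays.sort(copy);
        return copy;
    }

    public static void main(String[] args) {
        int[] data = new java.util.Random(42).ints(1_000_000).toArray();

        // Warm up the JIT so the measurement reflects steady-state behavior.
        for (int i = 0; i < 5; i++) moduleUnderTest(data);

        long start = System.nanoTime();
        moduleUnderTest(data);
        long elapsedMs = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);

        // The performance expectation for this unit (invented budget).
        long budgetMs = 200;
        System.out.println("elapsed=" + elapsedMs + " ms, budget=" + budgetMs
                + " ms, " + (elapsedMs <= budgetMs ? "PASS" : "FAIL"));
    }
}
```

In practice such checks are usually wrapped in a unit-testing framework so a blown budget fails the build.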
Performance testing is a broad and complex activity that can take many forms, address many risks, and provide a wide range of value to an organization.
It is important to understand the different performance test types in order to reduce risks, minimize cost, and know when to apply the appropriate test over the course of a given performance-testing project. To apply different test types over the course of a performance test, you need to evaluate the following key points:
- The objectives of the performance test.
- The context of the performance test; for example, the resources involved, cost, and potential return on the testing effort.
Disclaimer:
- The original work can be found under this link.
- All credit to J.D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, and Dennis Rea, Microsoft Corporation®.
Introduction to Apache JMeter
Apache JMeter is a 100% pure Java desktop application designed to load-test functional behavior and measure performance. It was originally designed for testing web applications but has since expanded to other test functions. Apache JMeter may be used to test performance both on static and dynamic resources (files, servlets, Perl scripts, Java objects, databases and queries, FTP servers, and more). It can be used to simulate a heavy load on a server, network, or object to test its strength, or to analyze overall performance under different load types. You can use it to make a graphical analysis of performance or to test your server/script/object behavior under heavy concurrent load.
Stefano Mazzocchi of the Apache Software Foundation was the original developer of JMeter. He wrote it primarily to test the performance of Apache JServ (a project that has since been replaced by the Apache Tomcat project).
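Before walking through JMeter's features and components, here is a minimal, hedged sketch of what a JMeter test looks like when assembled programmatically with the JMeter Java API. It assumes the JMeter jars are on the classpath; the installation path and target host are placeholders. It wires together the same elements described later in this post: a Test Plan, a Thread Group, a Logic Controller, and an HTTP sampler.

```java
import org.apache.jmeter.control.LoopController;
import org.apache.jmeter.engine.StandardJMeterEngine;
import org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy;
import org.apache.jmeter.testelement.TestPlan;
import org.apache.jmeter.threads.ThreadGroup;
import org.apache.jmeter.util.JMeterUtils;
import org.apache.jorphan.collections.HashTree;

public class JMeterSketch {
    public static void main(String[] args) {
        // Point JMeter at a local installation so it can load its properties.
        JMeterUtils.setJMeterHome("/path/to/apache-jmeter"); // placeholder path
        JMeterUtils.loadJMeterProperties(
                "/path/to/apache-jmeter/bin/jmeter.properties");

        // HTTP sampler: one GET request against a placeholder host.
        HTTPSamplerProxy sampler = new HTTPSamplerProxy();
        sampler.setDomain("www.example.com");
        sampler.setPort(80);
        sampler.setPath("/");
        sampler.setMethod("GET");

        // Loop controller: run the sampler once per thread iteration.
        LoopController loop = new LoopController();
        loop.setLoops(1);
        loop.setFirst(true);
        loop.initialize();

        // Thread group: 5 simulated users started over a 5-second ramp-up.
        ThreadGroup threads = new ThreadGroup();
        threads.setNumThreads(5);
        threads.setRampUp(5);
        threads.setSamplerController(loop);

        // Assemble the test plan tree and run it.
        TestPlan plan = new TestPlan("Programmatic sketch");
        HashTree tree = new HashTree();
        HashTree planTree = tree.add(plan);
        HashTree threadTree = planTree.add(threads);
        threadTree.add(sampler);

        StandardJMeterEngine engine = new StandardJMeterEngine();
        engine.configure(tree);
        engine.run();
    }
}
```

The same tree is what the JMeter GUI builds for you; saving it produces the .jmx file that the command-line runner consumes.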
Apache JMeter features include:
- Can load and performance test many different server types: Web - HTTP, HTTPS, SOAP, databases via JDBC, LDAP, JMS, mail - POP3(S) and IMAP(S), etc.
- Complete portability and 100% Java purity.
- Full multithreading framework allows concurrent sampling by many threads and simultaneous sampling of different functions by separate thread groups.
- Careful GUI design allows faster operation and more precise timings.
- Caching and offline analysis/replaying of test results.
- Highly extensible:
  - Pluggable samplers allow unlimited testing capabilities.
  - Several load statistics may be chosen with pluggable timers.
  - Data analysis and visualization plugins allow great extensibility as well as personalization.
  - Functions can be used to provide dynamic input to a test or provide data manipulation.
  - Scriptable samplers (BeanShell is fully supported; there is also a sampler which supports BSF-compatible languages).
Essential Components of JMeter:
1. Test Plan: The Test Plan is where the overall settings for a test are specified. Static variables can be defined for values that are repeated throughout a test, such as server names. For example, the variable SERVER could be defined as www.example.com, and the rest of the test plan could refer to it as ${SERVER}. This simplifies changing the name later.
2. Thread Group: A Thread Group defines a pool of users that will execute a particular test case against your server. In the Thread Group GUI, you can control the number of users simulated (number of threads), the ramp-up time (how long it takes to start all the threads), the number of times to perform the test, and, optionally, a start and stop time for the test. When using the scheduler, JMeter runs the thread group until either the number of loops or the duration/end-time is reached, whichever occurs first. Note that the condition is only checked between samples; when the end condition is reached, that thread will stop. JMeter does not interrupt samplers which are waiting for a response, so the end time may be delayed arbitrarily.
3. WorkBench: The WorkBench simply provides a place to temporarily store test elements while not in use, for copy/paste purposes, or any other purpose you desire. When you save your test plan, WorkBench items are not saved with it. Your WorkBench can be saved independently, if you like (right-click on WorkBench and choose Save).
Certain test elements are only available on the WorkBench:
- HTTP Proxy Server
- HTTP Mirror Server
- Property Display
4. Samplers: Samplers perform the actual work of JMeter. Each sampler (except Test Action) generates one or more sample results. The sample results have various attributes (success/fail, elapsed time, data size, etc.) and can be viewed in the various listeners.
The various types of samplers are listed below:
- FTP Request
- HTTP Request
- JDBC Request
- Java Request
- SOAP/XML-RPC Request
- WebService (SOAP) Request
- LDAP Request
- LDAP Extended Request
- Access Log Sampler
- BeanShell Sampler
- BSF Sampler
- JSR223 Sampler
- TCP Sampler
- JMS Publisher
- JMS Subscriber
- JMS Point-to-Point
- JUnit Request
- Mail Reader Sampler
- Test Action
- SMTP Sampler
5. Logic Controllers: Logic Controllers determine the order in which Samplers are processed.
The various types of Logic Controllers are listed below:
- Simple Controller
- Loop Controller
- Once Only Controller
- Interleave Controller
- Random Controller
- Random Order Controller
- Throughput Controller
- Runtime Controller
- If Controller
- While Controller
- Switch Controller
- ForEach Controller
- Module Controller
- Include Controller
- Transaction Controller
- Recording Controller
6. Listeners: These are the means to view, save, and read saved test results. Listeners are processed at the end of the scope in which they are found. The saving and reading of test results is generic. The various listeners have a panel whereby one can specify the file to which the results will be written. By default, the results are stored as XML files, typically with a ".jtl" extension. Results can be read from XML or CSV format files.
The various types of Listeners are listed below:
- Sample Result Save Configuration
- Graph Full Results
- Graph Results
- Spline Visualizer
- Assertion Results
- View Results Tree
- Aggregate Report
- View Results in Table
- Simple Data Writer
- Monitor Results
- Distribution Graph (alpha)
- Aggregate Graph
- Mailer Visualizer
- BeanShell Listener
- Summary Report
- Save Responses to a file
- BSF Listener
- JSR223 Listener
- Generate Summary Results
- Comparison Assertion Visualizer
7. Configuration Elements: Configuration elements can be used to set up defaults and variables for later use by samplers. Note that these elements are processed at the start of the scope in which they are found, i.e., before any samplers in the same scope.
The various types of Configuration Elements are listed below:
- CSV Data Set Config
- FTP Request Defaults
- HTTP Authorization Manager
- HTTP Cache Manager
- HTTP Cookie Manager
- HTTP Request Defaults
- HTTP Header Manager
- Java Request Defaults
- JDBC Connection Configuration
- Keystore Configuration
- Login Config Element
- LDAP Request Defaults
- LDAP Extended Request Defaults
- TCP Sampler Config
- User Defined Variables
- Random Variable
- Counter
- Simple Config Element