To do performance testing for web-services

Use and application of the eValid server loading (LoadTest) capability, including monitoring and loading in the cloud computing context.

To do performance testing for web-services

Postby ksteve » Fri Jun 22, 2012 3:29 pm

Hi friends,

I am asked to do performance testing for web-services.

The scenario is: log in to a portal, enter a username and password, and upload an XML file. The file goes to a web service, which checks it and returns another XML file. (The web service takes 3 inputs -- username, password, and the input XML file -- and returns an XML file as output.) All of this happens in a web-based application.

But the concern here is to do performance testing on the web service: to measure how long it takes to accept the input parameters, do whatever processing it does, and return the output XML file. I also have to check whether it can handle 3000 input/output processes at a time.

Can you suggest how to do this?
ksteve
 
Posts: 3
Joined: Thu Nov 17, 2011 4:42 pm

Re: To do performance testing for web-services

Postby eValid » Fri Jun 22, 2012 3:48 pm

Here is an outline of the approach we would use to address the problem you describe:

(1) The eValid browser-based test engine will let you record a functional test session that handles the login to the portal, uploads a file, sends the file to the web service, and then waits for the service's processed result to arrive.

The last step, measuring how long the final result takes, probably would be done in eValid with a DOM-based test playback synchronization command.

Timing precision of the result would be to the nearest 10 msec; this is the minimum retry interval for the DOM-based playback synchronization command.
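eValid's playback commands are its own scripting language, but the synchronization idea above is easy to illustrate in a language-neutral way. As a minimal sketch (the function name and parameters are illustrative, not eValid API), here is the retry loop that a DOM-based synchronization command effectively performs -- re-testing a condition every 10 msec until it holds or a timeout expires:

```python
import time

def wait_for_condition(check, retry_interval=0.010, timeout=60.0):
    """Poll check() until it returns True or `timeout` seconds elapse.

    retry_interval=0.010 mirrors the 10 msec minimum retry interval
    described above; the achievable timing precision is therefore
    bounded by that interval.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True          # condition met: playback can proceed
        time.sleep(retry_interval)
    return False                 # timed out: report a sync failure
```

In the real test, `check()` would correspond to the DOM condition that signals the result XML has arrived in the page.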

A single test like this probably could be created and perfected in eValid in somewhat less than 1 day.

(2) Next you would provision this test for use in a LoadTest run that would launch 100 Browser Users (BUs) on a single user account on your basic machine.

At this step we would modify the script to report just the single response time parameter plus a timestamp to a local file. This is a very low overhead way to collect basic data from within each playback into a single easily handled file.
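The per-playback logging described above is deliberately simple: one timestamp plus one response-time value appended to a local file. As an illustrative sketch (file format and function name are assumptions, not eValid's actual reporting command), the equivalent in Python is:

```python
import csv
import time

def log_response_time(path, response_time_sec):
    """Append one (timestamp, response time) record to a local CSV file.

    Appending a single short line per playback keeps the collection
    overhead very low, and all records land in one easily handled file.
    """
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [f"{time.time():.3f}", f"{response_time_sec:.3f}"]
        )
```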

From experience, we know that 100 BUs would probably be the limit for a single user account, due to limitations in heap space and RAM.

(3) To get to 1,000 BUs, you would need a strong machine -- we use Amazon's "m2.4xlarge" images -- with 10 separate user accounts, where each user account is running 100 BUs.

It's relatively inexpensive to use the large machines in the cloud compared with buying the hardware for your own lab.

(4) To get to your goal of 3000 BUs, simply repeat (3) on three different Amazon images.

(5) At the end of this experiment it is a simple matter to collect the files of timestamped response time data into a spreadsheet and generate a response time versus imposed load graph.
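To show how little post-processing step (5) requires, here is a hedged sketch (file-name pattern `bu-*.csv` and the two-column timestamp/response-time format are assumptions carried over from the logging step, not a fixed eValid convention) that merges the per-BU files and computes summary statistics ready for graphing:

```python
import csv
import glob
import statistics

def summarize(pattern="bu-*.csv"):
    """Merge all per-BU response-time files and summarize them.

    Each file is assumed to hold rows of (timestamp, response_time)
    as written during the load run. The summary feeds directly into
    a response-time-versus-imposed-load graph in a spreadsheet.
    """
    times = []
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            for _timestamp, response_time in csv.reader(f):
                times.append(float(response_time))
    return {
        "n": len(times),                      # total samples collected
        "mean": statistics.mean(times),       # average response time
        "max": max(times),                    # worst-case response time
    }
```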

Here is a detailed writeup of a load-test run we made, driving 1,000 "mobile device" users with this same approach:

http://www.e-Valid.com/Products/Documentation.9/Mobile/1000-BUs.html

________________________
eValid Tech Support
eValid
 
Posts: 2392
Joined: Tue Jan 01, 2008 12:48 pm
Location: USA


Return to Performance/Load Testing
