Thursday, January 02, 2014

Modifying JMeter scripts programmatically

Problem : Is there any scripting or GUI way to massively change from WebService(SOAP)
Request (DEPRECATED) to SOAP/XML-RPC Request? http://jmeter.512774.n5.nabble.com/Question-about-WebService-SOAP-Request-DEPRECATED-td5718693.html

Solution : The JMeter script file is an actual XML file, so any solution that manipulates XML can be used to solve the above problem. As Michael Kay's excellent books on XSLT were staring at me, the way I implemented this was in XSLT.
The first thing we need to do is use what is called an identity transform:
<xsl:template match="@*|node()">
  <xsl:copy>
    <xsl:apply-templates select="@*|node()" />
  </xsl:copy>
</xsl:template>

i.e. essentially copy the source XML to the destination. With that done we can now get to the problem at hand - intercepting all the deprecated requests and transforming them into their new form.

In order to do so we can create a JMeter script which has both of these elements and look at the XML that gets generated, so that we know the source and destination XML we want.
The deprecated XML
 <WebServiceSampler guiclass="WebServiceSamplerGui" testclass="WebServiceSampler" testname="WebService(SOAP) Request (DEPRECATED)" enabled="true">
          <elementProp name="HTTPsampler.Arguments" elementType="Arguments">
            <collectionProp name="Arguments.arguments"/>
          </elementProp>
          <stringProp name="HTTPSampler.domain">server</stringProp>
          <stringProp name="HTTPSampler.port">port</stringProp>
          <stringProp name="HTTPSampler.protocol">http</stringProp>
          <stringProp name="HTTPSampler.path">path</stringProp>
          <stringProp name="WebserviceSampler.wsdl_url"></stringProp>
          <stringProp name="HTTPSampler.method">POST</stringProp>
          <stringProp name="Soap.Action">ws-soapaction</stringProp>
          <stringProp name="HTTPSamper.xml_data">ws-SOAP-DATA</stringProp>
          <stringProp name="WebServiceSampler.xml_data_file">C:\Users\dshetty\Downloads\apache-jmeter-2.9\bin\logkit.xml</stringProp>
          <stringProp name="WebServiceSampler.xml_path_loc"></stringProp>
          <stringProp name="WebserviceSampler.timeout"></stringProp>
          <stringProp name="WebServiceSampler.memory_cache">true</stringProp>
          <stringProp name="WebServiceSampler.read_response">false</stringProp>
          <stringProp name="WebServiceSampler.use_proxy">false</stringProp>
          <stringProp name="WebServiceSampler.proxy_host"></stringProp>
          <stringProp name="WebServiceSampler.proxy_port"></stringProp>
          <stringProp name="TestPlan.comments">comment</stringProp>
        </WebServiceSampler>


The non-deprecated SOAP request
        <SoapSampler guiclass="SoapSamplerGui" testclass="SoapSampler" testname="SOAP/XML-RPC Request" enabled="true">
          <elementProp name="HTTPsampler.Arguments" elementType="Arguments">
            <collectionProp name="Arguments.arguments"/>
          </elementProp>
          <stringProp name="SoapSampler.URL_DATA">url</stringProp>
          <stringProp name="HTTPSamper.xml_data">data</stringProp>
          <stringProp name="SoapSampler.xml_data_file">c:/test.xml</stringProp>
          <stringProp name="SoapSampler.SOAP_ACTION">soapaction</stringProp>
          <stringProp name="SoapSampler.SEND_SOAP_ACTION">true</stringProp>
          <boolProp name="HTTPSampler.use_keepalive">false</boolProp>
          <stringProp name="TestPlan.comments">comment</stringProp>
        </SoapSampler>

Knowing this, the template is easy:
  <xsl:template match="WebServiceSampler">
    <xsl:element name="SoapSampler">
        <xsl:attribute name="guiclass">SoapSamplerGui</xsl:attribute>
        <xsl:attribute name="testclass">SoapSampler</xsl:attribute>
        <xsl:choose>
            <xsl:when test="@testname ='WebService(SOAP) Request (DEPRECATED)' ">
                <xsl:attribute name="testname">SOAP/XML-RPC Request</xsl:attribute>
            </xsl:when>
            <xsl:otherwise>
                <xsl:attribute name="testname"><xsl:value-of select="@testname"/></xsl:attribute>
            </xsl:otherwise>
        </xsl:choose>
        <xsl:attribute name="enabled"><xsl:value-of select="@enabled"/></xsl:attribute>
        <elementProp name="HTTPsampler.Arguments" elementType="Arguments">
            <collectionProp name="Arguments.arguments"/>
        </elementProp>
        <stringProp name="TestPlan.comments"><xsl:value-of select="stringProp[@name='TestPlan.comments']/text()" /></stringProp>
        <stringProp name="SoapSampler.URL_DATA"><xsl:value-of select="stringProp[@name='HTTPSampler.protocol']/text()" />://<xsl:value-of select="stringProp[@name='HTTPSampler.domain']/text()" />:<xsl:value-of select="stringProp[@name='HTTPSampler.port']/text()" /><xsl:value-of select="stringProp[@name='HTTPSampler.path']/text()" /></stringProp>
        <stringProp name="HTTPSamper.xml_data"><xsl:value-of select="stringProp[@name='HTTPSamper.xml_data']/text()" /></stringProp>
        <stringProp name="SoapSampler.xml_data_file"><xsl:value-of select="stringProp[@name='WebServiceSampler.xml_data_file']/text()" /></stringProp>
        <stringProp name="SoapSampler.SOAP_ACTION"><xsl:value-of select="stringProp[@name='Soap.Action']/text()" /></stringProp>
        <stringProp name="SoapSampler.SEND_SOAP_ACTION"><xsl:value-of select="(stringProp[@name='Soap.Action']/text() != '')" /></stringProp>            
        <boolProp name="HTTPSampler.use_keepalive">false</boolProp>        
    </xsl:element>
  </xsl:template>

Here we just map the older elements into their newer forms (wherever possible). So, for example, we match an element named "WebServiceSampler" and replace it with "SoapSampler". We transform the testname to either use the new default (if the deprecated element used the default) or keep the name the user specified. We copy attributes and nested elements.
You can then use Ant, or any other way of applying a stylesheet, to transform your input scripts into newer ones, for one or many files:
<target name="transformWS">
    <xslt
        in="CompareXML-RPCRequest.xml"
        out="test.jmx"
        style="transformWS.xsl"/>

</target>
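If Ant isn't handy, the same stylesheet can also be applied from a few lines of plain Java using the JDK's built-in javax.xml.transform API - a minimal sketch, reusing the file names from the Ant example above:

import java.io.File;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TransformWS {
    public static void main(String[] args) throws Exception {
        // load the stylesheet once and transform the old script into the new .jmx
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("transformWS.xsl")));
        transformer.transform(new StreamSource(new File("CompareXML-RPCRequest.xml")),
                new StreamResult(new File("test.jmx")));
    }
}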
An alternate approach would have been to write Java code that parses the XML and removes/replaces objects, but the identity transform mechanism makes the XSLT technique hard to beat. Any object binding mechanism would run into issues whenever JMeter changes its schema, whereas the XSLT technique only has to worry about the source and destination elements.
I've also always toyed with the idea of creating a functional suite of tests that users could run at will, without being limited to the tests defined in the file, and this is the approach I would use there as well - more to come.

Example - https://skydrive.live.com/?cid=1BD02FE33F80B8AC&id=1BD02FE33F80B8AC!892

Monday, April 22, 2013

Simulating abandoned flows in JMeter

A user wanted to simulate abandoned flows in JMeter - e.g. a multi-step checkout scenario where some number of users drop off at every step.

This was my suggestion

+Sampler1
+Throughput Controller (90, percent executions, uncheck "per user" => 10% abandon)
++Sampler2
++Throughput Controller (80, percent executions, uncheck "per user" => 20% more abandon at this step)
+++Sampler3
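Since the percentages of nested Throughput Controllers compound, roughly 90% of users execute Sampler2 and about 0.9 * 0.8 = 72% go on to execute Sampler3 - i.e. a cumulative drop-off of about 28% by the last step.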



Wednesday, March 27, 2013

Sharing session ids across threads in JMeter

Problem : I want to use the same JSESSIONID across the test plan, which includes several Thread Groups (http://mail-archives.apache.org/mod_mbox/jmeter-user/201303.mbox/%3Cloom.20130311T124606-446@post.gmane.org%3E).
This is actually two problems:
a. How do I share data between threads in JMeter?
b. How do I connect to an existing session in JMeter given the session id?
Note : A good question to ask at this stage is: do I really, really want to do this? Usually a JMeter thread maps to the actions a single user can take (AJAX aside) - doing something like the above goes against the JMeter model of things, and an alternative to consider is: why don't I change my test so that a user's session is only needed within a single thread? (Mini rant - this is a common scenario in support: there is a problem X, the user thinks that doing A will solve it, A needs B to be done, B needs C and C needs D, and the user then asks "How do I do D?" without revealing his cards - in this case, why is the test plan for a single user split across multiple thread groups? The user should really be asking how to solve problem X, because someone might then say "why not do action Y, which is simpler?").

But because we have a blog post to write, we will assume the answer to the above question is: yes, I really want to do this and this is the best way to solve the problem I have.

How to share data between threads in JMeter
JMeter's only out-of-the-box ways to share data between threads are the properties object and the shared namespace in BeanShell [2], both of which have limitations. The properties object needs you to come up with your own naming scheme for the shared data, and also needs you to write your own code to handle, for example, the case where you consume data much faster than you can generate it.
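For reference, a minimal sketch of the properties approach (the property name shared.jsessionid is just something I made up for illustration; you still have to invent your own scheme and handle the timing yourself):

// BeanShell in the producing thread group - publish the value as a JMeter property
props.put("shared.jsessionid", vars.get("jsessionid"));

// BeanShell preprocessor in the consuming thread group - read it back into a thread variable
String id = props.getProperty("shared.jsessionid");
if (id != null) {
    vars.put("jsessionid", id);
}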

If ALL the data to be shared can be obtained before the next threads need it, then one option is to use the CSV Data Set config - the first set of threads writes to a file and the next set of threads reads from that file - but essentially any shared storage (e.g. a file, a database etc.) will do. However, if you need to read and write this data at the same time, then another way is to use the standard Java data structures which are thread safe.

In this post we will look at two ways:
a. The CSV Data Set config
b. Using a Java object
Note that there is a plugin available which will do b. - the InterThreadCommunication plugin from JMeter Plugins [1]. If you aren't a programmer then the plugin is probably best for you - though I will say that good testers these days need to have programming and scripting skills.

However we will just write our own for fun. You also might need to do this to implement a complicated scheme.

Before we begin we will need an application to test whatever we write against, so I have deployed a web application with two URLs:
a. set.jsp, which sets a session attribute named uuid to a randomly generated value and also returns that value as part of its output. Since this is the first page we access, it will also create a session and set a cookie.


b. get.jsp, which reads the value of the attribute uuid and returns it as its output. If you have the same session and previously accessed set.jsp then you should see the same value as in step a.
If you don't have a session (or didn't access set.jsp) then you will see the value "null".
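For reference, the two pages could be as simple as something like this (a sketch consistent with the behaviour described above, not the actual deployed code):

set.jsp:
<%
    // store a randomly generated value in the session under "uuid" and echo it back
    String uuid = java.util.UUID.randomUUID().toString();
    session.setAttribute("uuid", uuid);
    out.print(uuid);
%>

get.jsp:
<%
    // prints the stored value, or "null" if the session has no such attribute
    out.print(session.getAttribute("uuid"));
%>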


So let us first write a simple test to verify that all is well. Here is a screen of the test.


We simply access set.jsp and extract the value from its response. Then we access get.jsp and assert that the value we extracted matches whatever is returned by get.jsp. If the same session is used for both requests then the assertion will pass, otherwise it will fail.

Remember that Java web applications allow sessions to be maintained in two ways - the session ID is passed either as part of the URL or as a cookie (usually named JSESSIONID) - or both.
For this example we will assume we are using cookies.

Let's cause the test to fail - a simple way of doing this is by disabling the HTTP Cookie Manager. If we disable it, JMeter won't accept cookies and hence won't be able to maintain sessions, so every request will be a new session.

So we disable the HTTP Cookie Manager and run the test. It fails, as we expected. Let's verify that it is indeed because of the session:
a. Note the first request gets a Set-Cookie back from the server - the server is trying to get the client to store the SessionID so that the next request can send the SessionID back

b. Note that the next request does not send a cookie header from JMeter

c. Note that the server responds to this request with another Set-Cookie (because it tries to create a new session as it didn't receive any existing session cookie).


Now let's enable the HTTP Cookie Manager. It works! Let's verify the cookie was passed.


Now that we have some degree of confidence that our assertions work (i.e. failures are flagged as failures and successes as successes), let's rerun the test with multiple threads and multiple iterations. All pass. We can also see that different set requests get different cookies. This is because the Cookie Manager is scoped to a single thread (so each thread gets its own cookie and session) and because we have checked "Clear cookies at each iteration" on the Cookie Manager.

Sharing data using Java objects in real time
Let's first create a class that will allow data to be shared. For this purpose we use a LinkedBlockingQueue [4]. The javadoc for the BlockingQueue interface says:
"Note that a BlockingQueue can safely be used with multiple producers and multiple consumers."
Which is a fancy way of saying it can be accessed by multiple threads without the programmer needing a degree in Computer Science. Since we will be reading and writing from different threads in different thread groups, the fact that this data structure natively supports thread safety frees us from having to implement it - no synchronized keywords are needed in our code.
However we do need to store this queue object somewhere, so we simply create a wrapper class that holds on to the queue statically and provides put and get methods. Because we don't know the rate at which we will read/write, we simply put a timeout on the get.
package org.md;

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class Queue {
    // held statically so that every thread (and thread group) in the JVM shares the same queue
    private static LinkedBlockingQueue<Object> q = new LinkedBlockingQueue<Object>();
    public static void put(Object o) throws Exception{
        q.put(o);
    }
    // returns null if nothing becomes available within the timeout (in milliseconds)
    public static Object get(long timeout) throws Exception{
        return q.poll(timeout, TimeUnit.MILLISECONDS);
    }
    public static void clear() throws Exception{
        q.clear();
    }
    public static void main(String [] args) throws Exception{
        
        Queue.put("one");
        Queue.put("two");
        //q.clear();
        System.out.println(Queue.get(1000));
        System.out.println(Queue.get(1000));
        System.out.println(Queue.get(1000));
    }
}


We run a simple single-threaded test on this class to see that everything is fine (it is), compile it into a jar, place the jar in JMeter's lib directory, and our class is ready to be used.
The test structure is shown in the screen below.



a. The setup thread group clears any data that might already be in the queue. Because we are holding on to the Queue statically and the JMeter GUI is a single JVM, we need to clear out any data left over from a previous run. This is the code in the BeanShell sampler:

org.md.Queue.clear();

b. We create a thread group and use a request to create the sessions, extracting the data that we need - the session id and the value returned by the page - so that we can later assert that we are connecting to the same session. We then call the class we have just created to put these values into the queue (as an array with two values), and finally we introduce a random delay.
The regular expression to extract the cookie is in the screenshot above.
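Since the screenshot isn't reproduced here, the extractor is a Regular Expression Extractor applied to the Response Headers, with settings roughly like the following (the exact pattern is an assumption - adjust it to your server's Set-Cookie format):

Field to check: Response Headers
Reference Name: jsessionid
Regular Expression: JSESSIONID=([^;\r\n]+)
Template: $1$
Match No.: 1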

And here is the BeanShell code to add the session id (the cookie value) and the extracted uuid to the Queue:

String[] data = new String[2];
data[0] = vars.get("jsessionid");
data[1] = vars.get("uuid");
org.md.Queue.put(data);

c. Next we create a thread group that is going to use the session ids set by the previous thread group, using a BeanShell preprocessor. The preprocessor has the code to add the session id cookie so that requests will connect to the existing session. As before, we assume the server manages sessions with cookies, but we could also pass the session id as part of the URL once we are able to read it successfully.
We also configure the thread groups to have different numbers of threads so that there will be some variation in the rate of requests. Note that if you create more sessions in the "set" thread group than you can process in the "get" thread group, you will eventually run out of memory unless you make the queue bounded in size. If you have more threads in the "get" thread group then they will keep waiting (hence the timeout when we ask for a session id).

import org.apache.jmeter.protocol.http.control.CookieManager;
import org.apache.jmeter.protocol.http.control.Cookie;
String[] data = (String[])org.md.Queue.get(60000);
if(data != null) {
    vars.put("jsessionid",data[0]);
    vars.put("uuid",data[1]);
    Cookie cookie = new Cookie("JSESSIONID",vars.get("jsessionid"),"localhost","/", false,-1);
    CookieManager manager = sampler.getCookieManager();
    manager.add(cookie);
} 

The code above simply asks the queue for a value and will wait up to 60 seconds for it (the wait only happens if we are consuming the data faster than we can produce it). Then, if we did get a valid value, we use the JMeter APIs [3] to access the Cookie Manager and add a cookie to it. Note that we do need a Cookie Manager in the plan for this to work. Some values have been hardcoded for simplicity (like the domain name).

We run the test and it's all successful! Remember we do have assertions, so a success is indeed a success. We can also verify using the View Results Tree listener and check that the cookie is being sent in the GET request.


Cons : Anything that needs data to be shared between threads, with its waits and synchronization (irrespective of whether you write it or the JVM does it for you), adds overhead to the test - so the amount of load you can generate from a single JMeter instance will be reduced. Also, if you need a more sophisticated scheme to share data (we used a simple FIFO model), then your code is more complicated, more prone to errors and likely to add more overhead.

Sharing data using files
A file is usually not a good candidate for session ids because session ids are short lived and may time out. You also need to have everything available in the file before you can start reading it. Files are better suited to use cases like testing "remember me" or "keep me signed in" type cookies. However, the flexibility the CSV Data Set config provides may make it worth your time.
Here is the structure of the test.

The setUp thread group sends a set of requests to create the sessions and get the values as before. To write these values to a file we use a Simple Data Writer and configure JMeter (by modifying jmeter.properties) to write only the two variables we are interested in. Note that this is probably not the cleanest way to do it - in addition, JMeter appends data to an existing file, so either you need to delete the file before you run this or use a property that varies dynamically so that a new file is created.
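For example (just one possible approach), the Simple Data Writer's filename could reference a property, something like sessions_${__P(runid)}.csv, and each run could then be started with a different value, e.g. jmeter -n -t plan.jmx -Jrunid=run1 (the file and property names here are only placeholders).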

Remember that the setUp thread group will always execute before any other thread group, so this file will be created and will have all the data before the next thread group runs. In the Simple Data Writer we uncheck all the checkboxes, and we edit jmeter.properties to add the following line:


 sample_variables=jsessionid,uuid

The next thread group simply reads the file and attaches the session id to the Cookie Manager.

We can have as many threads as we want (we ensure that "Clear cookies at each iteration" is checked so that every time we request something we use the cookie set by the BeanShell preprocessor and not one the Cookie Manager had previously saved). The BeanShell code to add the cookie is the same as before:

import org.apache.jmeter.protocol.http.control.CookieManager;
import org.apache.jmeter.protocol.http.control.Cookie;

Cookie cookie = new Cookie("JSESSIONID",vars.get("jsessionid"),"localhost","/", false,-1);
CookieManager manager = sampler.getCookieManager();
manager.add(cookie);
 
We run the test - all successful. The generated file has the values:


3C7C0BD5CA87C16FA65EAECEEBB54CF6,2db51396-6db2-44cb-9336-c732e52f8ac1
5E83FA9D286583DE32E6CC7C4E799599,f173a3ac-9779-408e-8cdb-7d23dfd5e2df
AFB458860FDF0376B9C8B9F52117A668,9f2da7f4-e848-4267-af9d-806e229f9c9f
540A42700742141A8BB8C2941666C8C9,84355a19-d2e1-45b7-ac42-23e9ab03a195
A7791553761A3613064E6BF7564B76E9,c3b14e92-93bc-4500-93be-ecbe43e70a5a
DB79953CEBA4C6E4FE708AE127E53F26,c99a880d-297d-4c00-9ff2-964449d9d106
3A7BBBC042EDB07D03901EEED8D37DC1,b2b29cb8-5261-4345-9405-80973d9f7334
913AFEC1A2D22EF6C64D7B99675DC86B,13bb3aa1-36cc-4e60-942c-d3a29a74b99f
DD06AEC9AF9D7CFF4FFB0AA2462E63C0,51b7b486-e503-43f8-a3ca-e36eb0df88a8

The cookies get added to the request

We can change the CSV Data Set config to keep reading the same file instead of stopping at the end of it - that works as well. You can also check that if you disable the setUp thread group and rerun the test, it still works (because the sessions are still active - assuming they haven't timed out) - so if the data saved in this file is still valid it can be used anytime and anywhere. You can also bounce the server on which the JSP web application is deployed. This invalidates all the sessions and hence makes the file's data invalid. Now if you run the test, all the samples fail because all the sessions are invalid - even though we added the session id, the server has no data for it.

Cons : As mentioned above, this isn't real-time sharing; it relies on you already having the data to be used, and on that data being valid.

Files used in this post are available at https://skydrive.live.com/#cid=1BD02FE33F80B8AC&id=1BD02FE33F80B8AC!886

References
[1] InterThread Communication- http://code.google.com/p/jmeter-plugins/wiki/InterThreadCommunication
[2] BeanShell Shared namespace - http://jmeter.apache.org/usermanual/best-practices.html#bsh_variables
[3] Cookie Manager - http://jmeter.apache.org/api/org/apache/jmeter/protocol/http/control/CookieManager.html
[4] LinkedBlockingQueue - http://docs.oracle.com/javase/6/docs/api/java/util/concurrent/LinkedBlockingQueue.html



Friday, May 25, 2012

JSON in JMeter

http://jmeter.512774.n5.nabble.com/Require-help-in-Regular-Expression-td5713320.html
Roughly: the user gets a JSON response and wants to use some of that data for the next JSON request. Using a regular expression to extract multiple related pieces of data is sometimes painful. So the alternative I suggest is to have JMeter parse the JSON object and then use it (the other option would be for JMeter to support JSON natively, but unless JMeter also has a mapping tool, this only saves two lines of code, so I don't think it's that useful).

So first we download Douglas Crockford's reference implementation of JSON in Java, compile it, generate a jar and copy it into JMeter's lib directory (please ensure you compile it with a Java version that is compatible with whatever you will use to run JMeter) - and don't use this library for evil.

Next launch JMeter.


To spoof this example, we create a BeanShell sampler that returns the JSON string we are interested in. To this sampler we add a BeanShell post processor that parses the response and forms the object that we need for the next request. The post processor code is given below:

import org.json.JSONArray;
import org.json.JSONObject;

String jsonString = prev.getResponseDataAsString();
JSONArray equipmentParts = new JSONArray(jsonString);
JSONArray parts = new JSONArray();

for(int i=0;i<equipmentParts.length();i++ ){
    JSONObject equipmentPart = equipmentParts.getJSONObject(i).getJSONObject("equipmentPart");
    JSONObject allAttributes = equipmentPart.getJSONObject("allAttributes");
    JSONObject part = new JSONObject();
    part.put("partId",allAttributes.getLong("equipmentPartId"));
    part.put("partNumber",allAttributes.getString("compositePartName"));
    // add more here
    parts.put(part);
}

vars.put("jsonResponse", parts.toString());
Note that we first get the response as a string. Then we use the JSON library to parse this string into a JSON array (equipmentParts). After that we iterate over every object, extract the attributes we are interested in (you could even have an array of attribute names and a simpler loop) and build the object we want to post (i.e. the parts JSON array and the part JSON objects).

Next we simply convert the parts JSON array into a String and set it into the JMeter variable jsonResponse, to be used by the next sampler.
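For completeness, the spoofing BeanShell sampler at the start of the plan can be as simple as returning a hard-coded string; the sketch below mirrors the names the post processor expects (equipmentPart, allAttributes, equipmentPartId, compositePartName), while the values are invented:

// the return value of a BeanShell Sampler script is used as the sampler's response data
return "[{\"equipmentPart\":{\"allAttributes\":"
     + "{\"equipmentPartId\":12345,\"compositePartName\":\"PART-12345\"}}}]";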


The next sampler simply uses a POST and only specifies the variable ${jsonResponse} in the value field (so that it is used as-is in the body).
We run the script. The Debug Sampler is used to verify that the value we set up is correct and that it does get posted in the next request.



Sample code is available here https://skydrive.live.com/#cid=1BD02FE33F80B8AC&id=1BD02FE33F80B8AC!876

Wednesday, April 25, 2012

JMeter

Paraphrased Question on the JMeter mailing list
While working on JMeter I prepared some Test Plans in JMeter and executed
them with different combinations and I put forward the Results,graphs etc.
so please guide me what kind of details/screens/graphs/files should my
presentation include and what should not?


The first Shakespearean to-be-or-not-to-be dilemma I have when I read the above is: should I try to answer this question? On one hand I'd like to help the person asking it, and this is more of a theoretical exercise where I can blab away without fear of consequence or the effort needed to actually do what I'm saying. On the other hand the question is so generic that much of what I say will need so many caveats to be accurate, and there will be a bunch of follow-up questions to answer, that my blabbing away might mislead said person, because I don't really know what the person wants and as an engineer I hate to speculate without knowing. Perhaps a brusque "what are your requirements?", with an additional dose of the sad state of software testing where people don't know their goals or their requirements if I'm in a bad mood - and a slightly guilty conscience, since I've been here, done this and asked such questions myself. Should I feed the person or teach the person to fish? Since the tester seems to be a beginner, will my long-drawn, caveat-ridden, opinionated, biased opinion help the tester or make matters worse?

Any Road.

The single most important thing to know is: when you ran the tests, what did you want to find out about the system? What did you expect to prove? What would you consider a successful test? What would you consider a failure? Or, in other words, what were your requirements?
If you have ever read a technical work on testing systems, you should know that there are various types of tests, and indeed there are even various types of performance tests. Unless you know this, you don't know what type of tests you need to run and you don't know how to interpret the results; indeed, you probably don't even know which system to look at.

To illustrate with some examples
a. You want to know, for a specified load, whether all your operations succeed (an operation could be anything, like placing an order or searching for a value).
b. You want to know, for a specified load, whether your system performs acceptably.
c. You want to know whether, under a specified load that has been kept on the system for a sufficiently long time, your system shows acceptable characteristics.

You will notice there are a lot of weasel words in the examples above. What is a "specified" load and how do we determine it? What is "performs acceptably" and how do we determine that? What are "acceptable characteristics" and how do we determine those?

And the answer is - it depends, hence the weasel words.
Take specified load, for example - how many users? How many of them hitting the site concurrently? What think time?
And the answer goes back to "why are you running your tests?".
Suppose you have changed some functionality - say you refactored some code. Then usually the specified load is the same as the one you see in production, i.e. you are trying to find out whether your change has any impact. Success is determined by the system performing the same or better with your change.
Suppose you are doing capacity planning - then the specified load is the number of users you expect to access your site (perhaps sometime in the future as your site grows). Success is determined by the response times and other system characteristics.
Say you are going live with a new site. Then the specified load is the maximum load you expect the site to see. Success is determined by the response times being lower than those specified and the system characteristics being acceptable.
Say you want to check if your site can take the added shopping load when Thanksgiving arrives. Then the specified load is the peak burst you expect to see, and so on.

It isn't easy. Most testers don't even think this is part of their job - I guess it isn't. But asking these questions should be.

Now let's complicate it some more. Let's say that you, like the large majority of testers, are interested in the "response time" for some load. Let's break this up into two parts:
a. The accuracy of the simulation, and
b. The limitations of the test tool.

a) When you run any load test, you are running a simulation (except in a very rare number of cases). The accuracy of the simulation depends on how good your test script is, but it also depends on your test environment. This deserves its own blog post, but unless you know your simulation is reasonably accurate, or you have safety margins built in, your tests aren't any good. This isn't particularly earth shattering - but how many testers actually validate that their simulation is accurate? Many don't. If you run JMeter with 500 threads on a single Windows machine, with no think time, and you don't get an out of memory error, is your simulation accurate? In most cases the answer would be no. The JMeter machine will not be able to generate the same load as 500 Windows machines each running a single user. Having 500 machines is not practical for most people (perhaps this will change with the cloud) - but the key question is: if I can live with some margin of error, how many machines should I have and how do I determine this? The answer is again - it depends. It depends on the application, on your test, on the machine you are running from, on the network, etc.
One of the ways to validate this is to run the test and note some of the values you are getting from JMeter. Now, while the test is running (this is important!), use some other machine that is as independent from the JMeter machine as possible to access the site under test with a browser (or even a single-user JMeter thread) and see the value you get. If this is close enough to the average JMeter value - great! If it varies wildly then you might have an issue. Another way is to halve the load on the JMeter client machine and add another JMeter client machine with the other half. Do you see substantial changes in response times? Do you see many more concurrent threads on the server? If so, you have a problem. Repeat this until you see no significant difference. Or you can monitor the JMeter client machine - is memory/CPU pegged at the highest value? If so, the test probably isn't reliable.

b) JMeter is not a browser, yadda yadda. Do not expect to be able to tell browser render times without some additional tests. Running tests from an intranet is probably not going to give you the same times as when a user from some far-off country accesses your site and gets a page of 500KB. Remember to factor in the network (either guesstimate or, if possible, have machines in different locations).

Remember however that you can never have 100% accuracy (well perhaps you could - it depends :). An important part of engineering is knowing what tradeoffs to make when you have limited time and resources.



Dynamically changing sampler name in a JMeter test

This came up when a user was reading URLs from a file and needed to extract a value from the response that he wanted to use as the sampler name (so that his report would show this information).

The solution is roughly:
a. Extract the value using a regex post processor or equivalent
b. Add a BeanShell Post Processor as a child of the sampler and write code like
prev.setSampleLabel(vars.get("variablenamefromstepa")); // replace variablenamefromstepa with the name of the variable you extracted in step a

Wednesday, February 08, 2012

The 17th law of defects

"If a functionality is not tested for 3 months by business users, the maximum severity that can be assigned to it is S3"

Source : Deepak's book of software testing laws

Saturday, November 26, 2011

And the winner is - Stripes!

Recently I had the unpleasant experience of having to select a "framework" for a J2EE application.

In my day job, the decisions have usually been made for me (WebLogic Portal aka BEA WebLogic Portal aka Oracle WebLogic Portal). Pageflow was OK (even though the stateful behavior made many normal HTTP operations painful). But the portal parts were decent and we did have reusable (the holy grail!) portlets.

When it was time to choose something for the web application I wanted to develop, there were quite a few options. Years ago, this would have been simple. Free, documented, widely used == Struts. Now it isn't quite so simple.

The first thing I eliminated was a portal container. The standard ones use JSR-168 or 286 and, in their pursuit of WSRP, federation and other portal goodies, forget that they are actually using HTTP. A web-based framework that makes you jump through hoops to get the HttpServletRequest usually indicates that its creators weren't thinking their actions through. If there ever was a J2EE spec worse than EJB, it's JSR 168. In the crazy pursuit of the lowest common denominator, we get crippled products. No thank you.

I did however want something that would let me easily reuse menus, footers, right-hand modules etc. jsp:include just doesn't do this well - where does the underlying code that fetches data for these common fragments reside? If the JSP is to be just a dumb view, this is a problem.

Struts shows its age (and you need Tiles to support the above, and Tiles is painful). Perhaps my views are clouded by a project I worked on in a past life. I assume the architect(s) wanted to beef up their CVs, so they had Struts, Tiles, Spring, Hibernate, iBatis, OFBiz, FreeMarker, OFBiz's persistence framework and every single open source technology that was somewhat popular. Getting all of these to work together was an exercise in frustration (note this was years ago) - but as soon as I am told to use Tiles I remember this project and I stop.

Enter Spring. When I first used it, I was impressed with how well it did basic stuff. I loved Spring JDBC - this was better than crappy ORM frameworks (yes Hibernate, you too). It let people who knew SQL and enjoyed SQL work with SQL without the normal associated JDBC trappings (to this day, my homegrown SQL utilities closely resemble the Spring JDBC APIs). Spring MVC should be good, right? Perhaps it is. But the documentation sucks and I don't have that much time to invest. It also felt like Spring MVC was more concerned with Spring integration and the MVC pattern than with solving the problem of an easy-to-use web framework (a totally biased, totally unjustified view, based on a cursory reading of the documentation - and apologies to the Spring team, whom I salute - but really, your documentation should be better if you want to draw in people like me).

Enter Wicket - but then two lines from the documentation:
"JSP is by far the worst offender" "Wicket does not introduce any special syntax to HTML. Instead, it extends HTML in a standards-compliant way via a Wicket namespace that is fully compliant with the XHTML standard".
Look at the feature list - http://wicket.apache.org/meet/features.html . Does it start with "simple to use web framework, supports all web-related features"? No, it starts with "we use POJOs" (i.e. you can unit test!). Wow. Because unit testing a web application is value for money.
Bye bye Wicket, you've been clean bowled. I never understood the dislike-JSP gang. It's like looking at bad Java code and saying Java is by far the worst offender. I like tag libraries when they don't attempt to be HTML encapsulators. I like being able to read Java code instead of making the mental shift needed to interpret yet another template framework's syntax. I like the ability to embed Java code instead of new syntax - I use it sparingly, and it works well, thank you very much. I understand you might have a valid, different philosophy, but it's not for me.

Enter the Play framework. Seemed simple (though not JSP). All in all it looked great and simple, seemed to get web applications, and targets RESTful structures. It would be nice to learn a new framework and see what it does better and what it does worse. But the tutorials seemed too command-line heavy. There's nothing wrong with that, of course. One problem I had is that, since I was implementing this project for a friend and would have to hand it over to someone, I needed to use something that could be easily learnt or was widely used. Play could be it - but it is so different from the rest that I hesitate.

Enter Stripes. Did I like the documentation. The HowTo seemed to cover everything I want, and the framework seemed to hit the right balance between what it did and what you had to do. Yay we have a winner. The proof of simplicity , my wife who is a PHP developer with rudimentary java knowledge picked up this framework in a couple of hours (or she is exceptionally smart and yes she reads this blog)