Do not abuse JMeter for complex Automated Functional Web Service Testing

More and more often, JMeter is used to create complex test suites that execute functional test scripts against a Web Service. It is perfectly possible with JMeter, but is it the most efficient way? That depends on the project's scope and priorities, I think. As stated on their website, JMeter is designed to test performance.

“Apache JMeter may be used to test performance both on static and dynamic resources (files, Servlets, Perl scripts, Java Objects, Data Bases and Queries, FTP Servers and more). It can be used to simulate a heavy load on a server, network or object to test its strength or to analyze overall performance under different load types.” (JMeter website)

In my view, JMeter is not the best tool for writing a bunch of functional test scripts against a Web Service (infrastructure). You will mostly end up with unmaintainable, throw-away test automation scripts that took quite some effort to create. JMeter is, however, one of the best performance testing tools.

You might want to perform some pre-actions (insert data in a database) and post-actions (assert complex XML / JSON responses) while executing functional test scripts. For this, JMeter is not that handy and flexible. It is better to write your test scripts in a programming language: this allows you to do more complex things and to create a maintainable test suite using abstraction. You can apply programming principles such as DRY (Don't Repeat Yourself) and KISS (Keep It Simple, Stupid), and you can create an abstraction layer, so that each significant piece of functionality is implemented in just one place in the source code.
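To illustrate what such an abstraction layer might look like, here is a minimal Python sketch (all names, endpoints and payloads are hypothetical, not from an actual framework): request building and shared assertions live in one place, so the test scripts themselves stay short and maintainable.

```python
# Minimal sketch of an abstraction layer for functional web service tests.
# The service, endpoint and payload shape are made-up examples.
import json


class CustomerServiceClient:
    """Single place that knows how to talk to the (hypothetical) service."""

    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")

    def create_customer_request(self, name, email):
        # One place to change if the endpoint or payload format changes (DRY).
        return {
            "method": "POST",
            "url": f"{self.base_url}/customers",
            "body": json.dumps({"name": name, "email": email}),
        }


def assert_created(response):
    # Shared assertion helper: complex JSON checks live here,
    # not duplicated in every test script.
    assert response["status"] == 201, f"unexpected status: {response['status']}"
    return json.loads(response["body"])["id"]


# A test script now reads as intent, not as plumbing:
client = CustomerServiceClient("http://localhost:8080/api")
request = client.create_customer_request("Alice", "alice@example.com")
print(request["url"])  # http://localhost:8080/api/customers
```

Because every script goes through the client and the helpers, a change in the service contract is fixed in one file instead of in dozens of throw-away scripts.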

I will make my ‘RESTful Web Service Automation Testing Framework’ publicly available on GitHub if I find some spare time (this will include a detailed example).

Functional testing over a proxy with BrowserMob Proxy

Last week we experienced a little outage on our websites (we have a website with a lot of dependencies on external systems and third-party content). It gave us the opportunity to take a closer look at our products while having a production issue. One thing we noticed is that we do not tell the end user that we are having issues. We have learned that we have to build in a more resilient user experience during such issues.

We can simulate these kinds of failures by testing over a proxy. BrowserMob Proxy can be controlled through a REST interface and has some great capabilities, such as blacklisting and whitelisting certain URL patterns, simulating various bandwidths and latencies, and controlling DNS and request timeouts.

As I said, BrowserMob Proxy is controlled through REST calls. In order to perform those calls from the command line, we have to install cURL.

Windows

  • Download cURL from http://curl.haxx.se/
  • Extract the compressed file
  • Put curl.exe in C:\Windows\ (or any other directory on your PATH) to enable curl from the command prompt

Linux

  • sudo apt-get install curl

Or use another installation command related to your Linux distribution.

Starting the proxy

In the first command prompt

  • navigate to the bin directory
  • Start the proxy by entering
    browsermob-proxy.bat -port 9090

    the Linux equivalent

    ./browsermob-proxy -port 9090

In the second command prompt

  • Perform the following curl command to create a new proxy:
    curl -X POST http://localhost:9090/proxy
  • This will return a new proxy port, something like: {"port":9091}
  • You can create a new HAR to start recording data, like this:
    curl -X PUT -d 'initialPageRef=newHar' http://localhost:9090/proxy/9091/har

[Screenshot: Internet Options dialog]

Set the proxy in browser

  • Navigate to Control Panel -> Internet Options
  • Go to LAN settings in the Connections tab
  • Tick ‘Use a proxy server for your LAN’
  • Fill in the address: localhost
  • Fill in the port: 9091 (the one returned from the curl command)
  • Click OK twice

Limit the connection speed

Perform the following curl command in the command prompt:

curl -X PUT -d "downstreamKbps=50" http://localhost:9090/proxy/9091/limit

Now you can visit the website with a low connection speed.

Blacklist third-party content

Perform the following curl command in the command prompt:

curl -X PUT -d "regex=http://example\.com/.*&status=404" http://localhost:9090/proxy/9091/blacklist

Now you can visit the website while blacklisting some content.
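Once you need more than a few of these calls, it can be handy to script them. Below is a small Python sketch of the same REST calls against the proxy started above (localhost:9090, created proxy port 9091); the helper functions are hypothetical, but the endpoints and parameters are the ones used in the curl examples. The helpers only build the requests, so the actual sending stays a one-liner you run against a live proxy.

```python
# Sketch: drive the BrowserMob Proxy REST API from Python instead of curl.
# Assumes the proxy API is listening on localhost:9090, as started above.
from urllib import parse, request

API = "http://localhost:9090/proxy"


def build_limit_request(proxy_port, downstream_kbps):
    """PUT /proxy/<port>/limit with a form-encoded body, like the curl example."""
    data = parse.urlencode({"downstreamKbps": downstream_kbps}).encode()
    return request.Request(f"{API}/{proxy_port}/limit", data=data, method="PUT")


def build_blacklist_request(proxy_port, regex, status=404):
    """PUT /proxy/<port>/blacklist to return <status> for matching URLs."""
    data = parse.urlencode({"regex": regex, "status": status}).encode()
    return request.Request(f"{API}/{proxy_port}/blacklist", data=data, method="PUT")


# With the proxy running, the requests can be sent like this:
# request.urlopen(build_limit_request(9091, 50))
# request.urlopen(build_blacklist_request(9091, r"http://example\.com/.*"))
```

Keeping request construction separate from sending makes the script easy to extend with the other API commands from the readme.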

Reference

Check the BrowserMob Proxy readme (on GitHub) for all available API commands.

Share This:

Performance testing of an AJAX web application

For my current assignment, I was asked to test the performance of an AJAX-based web application, which was quite interesting. The customer connects different local authorities, so that they can use the same solution for their services and operations. Hosting and development are done by two different parties, and they blame each other for the (bad) performance of the web application. The customer hired Polteq as an independent party to measure server response times as well as client-side rendering times.

Server response measurement
JMeter was the tool of choice to simulate all the requests towards the server and measure their response times. Besides the obvious issue of JMeter not rendering JavaScript, the tricky part was that the server responded differently to the same request about 6 out of 10 times. This was solved by implementing JMeter's If Controllers. The next thing was making sure that all the relevant request headers and request parameters were present, inherited and reused. You can easily record all requests and responses with the Fiddler 2 proxy.
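An If Controller branches on a condition expression; the exact condition depends on your test plan, but a common sketch is JMeter's built-in variable that reports whether the previous sampler succeeded:

```
${JMeterThread.last_sample_ok}
```

When the condition evaluates to true, the controller's child samplers are executed, which lets you send a follow-up request only for one of the server's response variants.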

A requirement for this project was to make it as maintainable and transferable as possible, since the client is not too technical. This can be achieved by implementing the ‘CSV Data Set Config’ and ‘HTTP Request Defaults’ elements in JMeter.

One of the most important parts of a performance test is the reporting facility. Martijn de Vrieze suggested that I use jmeter-plugins, with which we can easily generate sophisticated reports: response times vs. threads, response times over time, response latencies over time and transactions per second.

Client-side rendering measurement
Maven, TestNG and Selenium WebDriver were the tools of choice to measure the client-side rendering times. I built a command line tool which executes test scripts based on given command line arguments (browser, test script, runs). The tool executes the test scenario in the browser and stores some measurements while running the scenario. Based on the measurements, a graph is created after execution.
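The measurements my tool stores are not shown here, but as an illustration, one common way to obtain client-side timings through WebDriver is to read the browser's Navigation Timing data. The sketch below (hypothetical helper, not the actual project code) keeps the calculation as a plain function, with the WebDriver call shown as a comment.

```python
# Sketch: derive client-side timings from a Navigation Timing snapshot.

def load_time_ms(timing):
    """Derive key durations (ms) from a window.performance.timing snapshot."""
    return {
        # Time until the first response byte arrives (network + server).
        "backend": timing["responseStart"] - timing["navigationStart"],
        # Time the browser spends rendering after the response starts.
        "frontend": timing["loadEventEnd"] - timing["responseStart"],
        "total": timing["loadEventEnd"] - timing["navigationStart"],
    }


# With Selenium WebDriver, the snapshot could be fetched like this:
# timing = driver.execute_script("return window.performance.timing")
# print(load_time_ms(timing))

sample = {"navigationStart": 1000, "responseStart": 1300, "loadEventEnd": 2500}
print(load_time_ms(sample))  # {'backend': 300, 'frontend': 1200, 'total': 1500}
```

Splitting backend from frontend time is exactly what you need when hosting and development parties point at each other.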

The fun thing was that I could put the server under load with the JMeter scripts and then measure the client-side impact with my WebDriver tool. The first test results showed a badly performing system and a lot of 404s.

I will put the censored client-side measurement code on my public GitHub after finishing this project.

Test automation, it couldn’t be more fun!

As a technical test consultant, I visit many companies to support them with test automation. Every time, I notice that there are still a lot of things to improve in this area; I mean test automation at all levels, from planning to the specification and execution of tests. Companies seem to think that the step towards implementing a structured test automation framework is too big, because the return on investment is not immediately visible. Next to that, there are not a lot of success stories yet, which strengthens the restraint. Furthermore, sometimes the illusion arises that test automation makes the work of testers superfluous.

However, the opposite is true. The main benefits and added value of test automation are described below.

Tool integration
Test automation has many benefits which contribute to better quality software. In the first place, test automation indirectly forces the testing process to a fairly mature level: you need to have specified test cases which you can automate. If there are no test cases specified, test automation is still possible, but you will have to think about two things at once while automating test scripts (the desired behavior and how to automate it). With a proper test automation implementation, all tools, such as specification tooling, the bug tracker, test execution tooling, etc., communicate and integrate with each other. Through this total integration, a lot of manual administrative tasks are no longer required and you can easily relate the test results to the original requirements.

Shortens the feedback loop
Adding value to a product is done by releasing new functionality. The frequency at which this happens is the heartbeat of a project. In larger organizations, you see that they can release only a few times a year. That could be a lot more! By implementing test automation, you can execute the regression tests much faster, so new functionality can be released more often.

Consistent quality factor
One of the goals of test automation is that the tester is able to execute test scripts repeatedly, without doing a lot of maintenance on those test scripts. It is good practice to start with the core functionality, because this functionality is the least subject to change. You can speak of a consistent quality factor when all automated test scripts give the same positive results every time.

Tester’s motivation
By implementing test automation, you can repeatedly execute the exact same checks. The power of a check is that the outcome is always binary: it is either right or wrong; there is no human interpretation involved in verifying the result. The automated checks remove a lot of work from the functional tester. Testers now have the opportunity to focus on other (more challenging and creative) testing techniques, such as exploratory testing (the simultaneous learning, designing and execution of tests). In addition, functional testers can also take up the challenge of learning a programming language, so that they can write the test scripts themselves.

Modern software architecture
Today’s software architectures are ideally suited for test automation. Think of a service-oriented architecture where the business logic is separated from the application and the application integrates with one or more interfaces. These interfaces can be invoked by multiple applications, so it is imperative that they work well and that no regression occurs.

Broaden skills
Test automation done by a tester is only possible if they are able to develop a broader skill set, so they can test applications from a more technical perspective. Especially in agile software development, due to the iterative process, it is increasingly important that tests are automated and that the team gets technical testers who can do that. Not doing test automation means falling behind: the work can no longer be done manually, so testers need to become more technical!

I look forward to the day when organizations recognize the benefits of test automation and start working goal-oriented instead of tool-oriented.

Test automation, let’s do it!

This article is based on an article originally published in Dutch on the Polteq website (see http://www.polteq.com/weblog/testautomatisering-leuker-kunnen-we-het-niet-maken/).

Performance Testing

Over the last months, I have been working on several performance testing projects. They were all Commercial Off-the-Shelf (COTS) applications, mostly SharePoint. For all projects, it was mandatory to measure both server-side and client-side performance. With server-side performance testing, it is interesting to see how the server behaves under different conditions of load. With client-side performance testing, it is interesting to see how the application presents the content to the user. I rather like those projects, because all kinds of expertise come together: XML, JSON, regular expressions, XPath, JMeter, Selenium WebDriver, and all the different monitoring tools.

During my journey to select the right tools, I came across the JMeter Plugins WebDriver Set. The plugin allows you to use the features of JMeter (test executor and reporting engine) as well as the features of Selenium WebDriver (controlling a browser). The major benefit is that you end up with similar graphs and result tables. So, it was the perfect choice.

In the next few weeks, I will post more in-depth instructions on how to create a performance test plan.

Use Sonar to Check the Quality of Test Automation Code

I have been working for more than a year now on a test automation project with thousands of lines of code. I started this project on my own, but during the year many more people got involved, everyone with their own coding style. We needed to guarantee the quality of the code written by those people. Therefore we introduced Sonar, which performs static code analysis and can find violations of standards. The analysis includes:

  • Coding standards;
  • Code duplication;
  • Code complexity;
  • Potential bugs;
  • Code comments;
  • Unit Test coverage;
  • And more…

Writing test automation code is like doing “normal” development, so you have to apply coding standards, patterns to avoid duplication and reduce complexity, and code comments that describe what each function does. Sometimes, when testing safety-critical systems, you have to write unit tests for the test automation code as well.

Sonar gives you insight into all those areas. It became very clear to me that some test automation developers have bad or uncommon practices (see the screenshot of the dashboard).

[Screenshot: Sonar dashboard]

Sonar gives you the ability to fix or resolve those bad habits.

Installing Sonar is fairly easy.

Analyzing your project is even easier and can be done in three ways:

  • Sonar Runner
  • Ant Task
  • Maven Goal

That said, I think the first week of this year was very effective; we have taken the automation code to a higher level.

2013 Retrospective: a year’s reflection

Just a couple of days and the year 2013 comes to an end. It’s time to look back and reflect on what I have done this year. Quite a lot of things came across my path: daily work, sales meetings, testing conferences, writing articles, training people and my promotion. Yes, you read that right, I have been promoted to a different business unit: consultancy 🙂 ! (This means that I will do a bigger variety of assignments across the Netherlands.) I worked for the following customers in 2013:

SpilGames Implementation of an automated testing framework using the open source stack.

Dimpact Implementation of performance test scripts to measure the server-side performance (JMeter) as well as the client-side performance (WebDriver).

Steinweg Tool selection for a new test automation implementation for their entire landscape. (ongoing)

Sanoma Implementation of performance test scripts to measure the server-side performance (JMeter) as well as the client-side performance (WebDriver). (ongoing)

Hema Test Automation POC using C# / WebDriver.

Postcode loterij Test Automation implementation using Java / WebDriver.

Richard Oosterhof PHP Selenium Automation. (ongoing)

Below is a list of articles I wrote in 2013:

May 2013: “Implementeren van Behaviour Driven Development” in the Spring Special of TestNet Nieuws (TestNet Nieuws 2013 – 1, page 14)

June 2013: “Behave yourself – BDD for testers” in Professional Tester (Issue 21)

June 2013: “The benefits of BDD” in Professional Tester (Volume 5: Issue 3)

September 2013: “Implement Maintainable Test Scripts by applying Design Patterns” in Software Developer’s Journal (Selenium 2 WebDriver)

Below is a list of conferences I visited in 2013:

May 2013: “Het overbruggen van de communicatiekloof met Behavior Driven Development/Testing (BDD/T)” at the Spring event of TestNet (Event)

June 2013: “Structured functional automated web service testing” at Test Automation Day in Rotterdam (Event | slides)

July 2013: “Aan de slag met Selenium WebDriver” at TestNet Summerschool (Event | slides – “Aan de slag met Selenium”)

November 2013: “Getting started with Selenium WebDriver” at Software QS Tag in Nürnberg (Event)

Test Automation Day NL 2013

Herewith a very short update. I was very glad to be part of Test Automation Day 2013 with my talk on “Structured Functional Automated Web Service Testing”. I was even more pleased to be introduced by Dorothy Graham.

Please find the slides of my presentation below:


I really liked the final keynote by Emily Bache (@emilybache) with her vision on “The Future of Test Automation”. This was a very inspiring talk! She spotted two trends in our industry: the first has to do with Lean Startup, a method for developing successful products; the second is a trend towards distributed architecture, with independently deployable components communicating via messages. Below is a quick summary photo of the keynote.

[Photo: summary of Emily Bache’s keynote]

I’m looking forward to the next Test Automation Day edition!
