
INSIGHTS / Articles

Lessons Learned in Performance Testing

 25 Sep 2018 

It’s some 30 years since I started in testing. Back then, testing was carried out by developers; it wasn’t a separate function.

I was in a government department in the UK doing performance testing, probably before it was even called performance testing. A big lesson I learned early on was not to simply follow a process, but to look at the bigger picture and understand exactly what I was doing and why I was doing it.

That’s a lesson a lot of people are benefitting from now. You get a lot of people on the non-functional side of testing who learn a tool, either automation or performance, and centre everything they do around that tool, rather than first understanding what they need to automate or performance test, and then making the tool do it.

Learning by doing

One of the things I encourage people to do nowadays is to go back and learn what their job is about: know what the aims are, what constitutes a good or bad result, and, if there is a problem, try to understand it. A lot of people now just do a quick Google search and grab a snippet of code to use.

If they understood what they were doing, and the way their tools worked, they could address that issue themselves, and a lot more issues besides. There are a lot of script monkeys out there who have changed the way performance testing is done, because that is the way they think the tool allows them to do it, rather than making the tool do what they want it to do.

When I started in performance testing, there really weren’t any tools. When a tester tells me a tool won’t allow them to do something, I may suggest an alternative way to do it, and they often look at me as if I’m some kind of genius. It’s not that I came up with anything ground-breaking; that’s simply how I had to do it in the past, when the tool genuinely wouldn’t do it.

Another thing is that, as people, we don’t always embrace change. We tend to view it with distrust, particularly when disruptors come into the workplace and profession and start telling us we’re doing it wrong, or that it could be done other ways. Quite often we can see pitfalls in what they’re suggesting, but I’ve learned over the last few years that rather than show resistance, it’s better to engage and go on the journey with them.

That’s because, quite often, the disruptors don’t have the baggage that you may have gathered over the years. They’re often looking at the problem from a different perspective, so they can come up with innovations and new ideas that we would never have imagined.

And when they do hit those pitfalls, you’re part of the team and part of the solution. They’ll also actively engage you, since you’re not seen as a naysayer.

Plan ahead

In today’s market, there is an expectation for people to diversify and take on board knowledge from various fields, from Business Analysis through to development and testing. These additional skills certainly add value to a project and employer, providing a level of flexibility and agility. A modern quality engineer should be flexible enough to work in smaller teams on Sprints and potentially take on board some of the basic functions of other roles as required.

Still, I think there will always be a need for experts in various areas to guide and manage these people. As quality specialists, we need to focus on increasing our breadth and depth of testing skills. There was a reason why testing was separated from development, so we have a lot of education and learning to do.

Another thing we need to be aware of is the need to maintain standards and processes in our performance and automation testing. In the past, test assets such as automated regression scripts were only run during the development phase, and then maybe every six months or a year later when there was a new release.

A lot of the automated regression scripts we’re developing now can be run every night for the life of the service, which could be a decade or more. So if you let Sprint teams use whatever tools they want and develop around them, it will be tough to maintain a regression suite spread across ten different automation tools for your whole system.

We need to have more rigour around the test assets we’re creating so they are reusable and maintainable. Doing that will ensure they remain valuable assets long after the end of the initial development.

Automation and Performance Testing are crucial in ensuring your systems are running optimally, but very few get it right. Contact us today to find out how we can help you get more out of what you have.

Speed as a Key Asset

At Planit, we can help you make performance an asset, not a liability. Our expert consultants can provide testing, assessments and advice to mitigate performance risks and achieve peak results.
Find out how we can help you navigate these challenges, achieve your performance goals, and deliver a rapid, responsive, and reliable experience that delights your customers.


Find out more
