Friday, April 07, 2006

Basics of performance testing

Not too long ago, I had the opportunity to consult for the CIO of a large multi-national. At the beginning of the engagement, he looked at me and said, “I am not seeing any value from the effort my company puts into performance testing. I need you to tell me whether the problem is the tools or us.” They had a centralized “performance certification” process, in which every change to the production environment was required to be tested for performance against pre-established goals. The centralized certification team received the change from the development team along with the defined criteria for testing. The criteria were typically something like a requirement to ramp to 10 virtual users running a specific set of transactions and achieve a response time of better than 7 seconds. There were no criteria for any regression performance testing, and rarely any requirement to simultaneously drive load against any other portion of the application set to simulate what happens in production. It isn’t very surprising that the test results did not match results in production.
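The certification criteria described above can be sketched as a minimal load test: ramp to a fixed number of virtual users, run a scripted transaction concurrently, and compare the worst observed response time against the goal. This is purely illustrative — `run_transaction` is a hypothetical stand-in for whatever the scripted transaction set would be, not the actual tooling the team used.

```python
import concurrent.futures
import time

def run_transaction():
    # Hypothetical stand-in for one scripted transaction; a real test
    # would drive the application under test here.
    time.sleep(0.01)

def timed_transaction():
    # Measure the wall-clock response time of a single transaction.
    start = time.perf_counter()
    run_transaction()
    return time.perf_counter() - start

def load_test(virtual_users=10, iterations=5, goal_seconds=7.0):
    """Drive `virtual_users` concurrent workers and check the
    worst response time against `goal_seconds`."""
    response_times = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(timed_transaction)
                   for _ in range(virtual_users * iterations)]
        for f in concurrent.futures.as_completed(futures):
            response_times.append(f.result())
    worst = max(response_times)
    return worst, worst <= goal_seconds
```

Note that a script like this exercises one application in isolation against a single goal — exactly the limitation described above, since it says nothing about how the change behaves under the mixed load of a real production environment.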

Effective performance testing is expensive. It requires access to a typically constrained test environment, performance-testing subject matter expertise, and infrastructure and application subject matter expertise. It consumes the most expensive resource of all - time. It can save you from the most embarrassing business and customer fiascos. But it is not worth the cost unless it is done right.
