Performance issues inevitably sneak into a project. Tracking down these issues is difficult without the proper tools. Commercial performance-monitoring tools, such as JProbe or OptimizeIt, help pinpoint performance problems. These tools excel at providing performance metrics, but they typically require expert human intervention to run them and interpret the results. They are not designed to execute automatically as part of a continuous integration process, which is where JUnitPerf enters the picture.
JUnitPerf, available from http://www.clarkware.com/software/JUnitPerf.html, is a tool for continuous performance testing. JUnitPerf transparently wraps, or decorates, existing JUnit tests without affecting the original test.[1] Remember that JUnit tests should execute quickly. Figure 8-1 shows the UML diagram for the JUnitPerf TimedTest.
[1] For more information on the decorator pattern, refer to Design Patterns: Elements of Reusable Object-Oriented Software (Addison-Wesley) by Erich Gamma et al.
[Figure 8-1. UML diagram of the JUnitPerf TimedTest]
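To make the decorator idea concrete, here is a minimal sketch of wrapping an existing JUnit test in a TimedTest within a suite() method. The test class ExampleTest and its testSomething() method are hypothetical placeholders; the two-argument TimedTest constructor takes the test to decorate and the maximum allowed time in milliseconds.

    import junit.framework.Test;
    import junit.framework.TestSuite;
    import com.clarkware.junitperf.TimedTest;

    public class ExampleTimedTest {

        public static Test suite() {
            // The existing JUnit test to decorate; it is not modified in any way.
            Test testCase = new ExampleTest("testSomething");

            // Fail the test if it takes longer than 2000 milliseconds to run.
            Test timedTest = new TimedTest(testCase, 2000);

            TestSuite suite = new TestSuite();
            suite.addTest(timedTest);
            return suite;
        }
    }

Because the decorated test is itself a Test, it can be added to any suite and run by any JUnit test runner, including one invoked from a continuous integration build.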
Here's a quick overview of how a JUnitPerf timed test works. The following occurs when the JUnitPerf TimedTest.run(TestResult) method is invoked (a simplified sketch of this logic follows the steps):

1. Retrieve the current time (before the JUnit test executes).

2. Call super.run(TestResult) to run the JUnit test, where super refers to the JUnit TestDecorator.

3. Retrieve the current time again (after the JUnit test executes).

4. If the elapsed time is greater than the maximum allowed time, throw a junit.framework.AssertionFailedError(String); otherwise, the test passes.
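The following sketch illustrates these four steps. It is a simplified illustration, not the actual JUnitPerf source: the class name SimpleTimedTest and the maxElapsedTime field are assumptions made for this example, and the real TimedTest offers additional options (such as waiting for the test to complete before failing).

    import junit.extensions.TestDecorator;
    import junit.framework.AssertionFailedError;
    import junit.framework.Test;
    import junit.framework.TestResult;

    // Simplified sketch of the timed-test logic described above.
    public class SimpleTimedTest extends TestDecorator {

        private final long maxElapsedTime;   // maximum allowed time, in milliseconds

        public SimpleTimedTest(Test test, long maxElapsedTime) {
            super(test);
            this.maxElapsedTime = maxElapsedTime;
        }

        public void run(TestResult result) {
            long beginTime = System.currentTimeMillis();   // step 1: time before the test
            super.run(result);                             // step 2: run the decorated JUnit test
            long elapsedTime =
                    System.currentTimeMillis() - beginTime; // step 3: time after the test
            if (elapsedTime > maxElapsedTime) {             // step 4: fail if the test was too slow
                throw new AssertionFailedError(
                        "Maximum elapsed time exceeded! Expected " + maxElapsedTime
                        + " ms, but was " + elapsedTime + " ms.");
            }
        }
    }

Because all of the timing logic lives in the decorator, the wrapped test remains an ordinary JUnit test and never needs to know it is being timed.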