Rational Performance Tester


Ben Smith and Laurie Williams [Contact Authors]
CSC 712 - Software Engineering
Department of Computer Science
North Carolina State University



Outline
0.0 Background
1.0 Setup
2.0 Getting Started
3.0 Recording an HTTP Test
4.0 Editing and Labeling an HTTP Test
5.0 Creating a Performance Schedule
6.0 Executing/Exporting a Performance Schedule
7.0 Profiling a JUnit Test
8.0 Exercise
9.0 Resources

0.0 Background

Performance testing is a process by which software is placed under a specified demand load and measured for its response times, memory usage, and other performance-related metrics. Applications used by more than one user at a time experience varying loads as demand changes. By examining how an application performs under a given set of loads, we can determine where its performance weaknesses lie and optimize it for efficiency, ensuring that it does not break or slow to a halt under high demand.

Rational Performance Tester is an integrated design and development environment developed by IBM, built on the Eclipse IDE. The goal of Rational Performance Tester is to provide an integrated environment for Java developers to gather performance metrics and optimize applications for memory and execution time. The following tutorial will demonstrate gathering performance metrics with the web application iTrust and Rational Performance Tester 7.0.0.2.

By recording specific use cases with Rational Performance Tester, we can create a performance test: a systematic, repeatable series of user requests of the kind that would typically occur after deployment. Once we have one or more performance tests, we can define a performance schedule, which specifies the mix of performance tests, their frequency, the delay between requests, and so on. Running the performance schedule generates a performance report, which we can use to analyze the recorded performance-related metrics (such as response time or memory usage), determine the bottlenecks in our system, and do our best to eradicate them.
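
To make the idea concrete, here is a minimal sketch in plain Java (not something Rational Performance Tester generates) of what a performance test boils down to: replaying a fixed series of HTTP requests and recording each response time. The only URL it assumes is the tutorial's local iTrust instance; a real recorded test would list every page the simulated user visited, and the tool would aggregate the timings into a report for you.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * A minimal sketch of what a performance test boils down to: a repeatable
 * series of HTTP requests whose response times are recorded. Rational
 * Performance Tester records, replays, and aggregates this for you.
 */
public class ResponseTimeSketch {

    public static void main(String[] args) throws Exception {
        // A recorded test would list every page the user visited, in order.
        String[] pages = { "http://localhost:8080/iTrust/" };

        for (String page : pages) {
            long start = System.nanoTime();
            HttpURLConnection conn =
                (HttpURLConnection) new URL(page).openConnection();
            int status = conn.getResponseCode();   // sends the GET request
            drain(conn.getInputStream());          // include the transfer time
            long elapsedMs = (System.nanoTime() - start) / 1000000;
            System.out.println(status + "  " + elapsedMs + " ms  " + page);
            conn.disconnect();
        }
    }

    // Read and discard the body so the timing covers the whole response.
    private static void drain(InputStream in) throws Exception {
        byte[] buffer = new byte[4096];
        while (in.read(buffer) != -1) { /* discard */ }
        in.close();
    }
}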


1.0 Setup

IMPORTANT: This tutorial assumes you already have the following set up on your computer:

  • Rational Performance Tester
  • iTrust
  • MySQL
  • Tomcat

NOTE: Since the memory requirements of this tutorial are quite high, the VCL image RPT v2 (WinXP) has been created to support its completion. If you have the above list of software configured on your computer, you can skip to the next section.

To use Rational Performance Tester, many students may require access to the Virtual Computing Lab (VCL) at NCSU. The lab currently has a saved image, RPT v2 (WinXP), which is completely prepared for executing this tutorial.

Accessing the lab requires Remote Desktop Connection on a Windows machine. To connect to the image, execute the following steps:

  1. Surf to http://vcl.ncsu.edu.
  2. Click Make a New Reservation. You will be asked to log in with your Unity credentials; NCSU WRAP is the authentication method.
  3. You will then be taken to the New Reservation page. From the environment list, select RPT v2 (WinXP). For when you would like to use the application, select Now.
  4. Wait for the image to load. When the Connect button appears, click it.
  5. You will be given connection information for Remote Desktop Connection. Enter it and log in to the remote machine.
  6. We recommend opening this tutorial in a browser window on the remote machine before proceeding.

2.0 Getting Started

To prepare for execution of the steps in the tutorial, you will need to ensure that iTrust is running and configured properly. Double-click IBM Rational Performance Tester on the desktop. The Test Perspective is used for creating and managing performance tests and schedules. We will start from this perspective. If it is not already open, go to Window -> Open Perspective -> Other... and select Test (default) as shown in Figures 2.1 & 2.2.


Figure 2.1: Opening a new perspective


Figure 2.2: Selecting the Test Perspective

To test proper iTrust installation, execute the following steps:
  1. Go to Window -> Show View -> Other... as shown in Figure 2.3.

    Figure 2.3: Opening a new view

  2. In the box that appears, select Servers as shown in Figure 2.4.

    Figure 2.4: Servers

  3. Your Servers view will appear. Click the play button in the Servers view to start the server.
  4. Browse to http://localhost:8080/iTrust/ and verify that the page loads correctly.
  5. Log in by clicking the HCP link provided for test purposes. If you see the homepage for Kelly Doctor, the instance of iTrust is properly configured. Otherwise, consult the iTrust documentation to ensure that iTrust and its components are functional.
  6. You may close the browser containing this test of iTrust.

3.0 Recording an HTTP Test

To test iTrust's response times under a specific user load, we will need to create an HTTP test from recording. This is achieved by the following procedure:

  1. Click the Create New Test From Recording icon. A dialog box will appear.

    Figure 3.1: Create a New Test From Recording

  2. Select Create a New Test From Recording and the HTTP Recorder as shown in Figure 3.1. Click Next.
  3. You will be asked to specify a name and location for the test. Call it hcptest and store it in the root of the project.
  4. Surf to http://localhost:8080/iTrust/ within the browser provided by the recorder control.
  5. Log in by clicking the "HCP" link above the login form.
  6. When the iTrust main menu appears, select Epidemic Detection.
  7. For Show State Counts in, select North Carolina.
  8. Select Malaria (84.x) in the combobox below. Your page will now look like Figure 3.2.

    Figure 3.2: Detect an Epidemic

  9. Click Check for Epidemic. The page will respond No epidemic detected.
  10. Click the Log Out button in the upper-right hand corner.
  11. All HTTP requests from the browser window have now been recorded in a repeatable HTTP test. You may close the browser window.

4.0 Editing and Labeling an HTTP Test

When you close the browser window, the recorder control will generate your performance test and save it to the test file you specified above. You have now simulated a healthcare professional checking for epidemics in his or her area. The performance test contents will be displayed within Rational Performance Tester as shown in Figure 4.1.


Figure 4.1: Performance Test Contents

The titles recorded for each page as you simulated a user's path through the iTrust system are useful, but they are too generic for understanding the performance differences among the steps of a given use case. Each Test Element Node in the contents represents an HTTP request/response pair. To determine and set a better label for each node, you will want to see the HTTP response that was recorded for each element (a short sketch after the list below shows that the page title is sitting right in that response). Follow these steps:

  1. Select a Test Element Node as shown in Figure 4.2.

    Figure 4.2: Selecting a Node

  2. Select the Protocol Data control and the Browser tab as shown in Figure 4.3.

    Figure 4.3: Viewing the HTTP Response

  3. Look at the unformatted view of the browser display shown in the tab and determine a descriptive title. For example, Results is a much better title for the first node in the list than iTrust - Diagnostic Trends {1}.
  4. Find the Page Title field within the Test Element Details section and type in your new title as shown in Figure 4.4.

    Figure 4.4: Naming a Test Element

  5. Repeat this procedure until all nodes are named appropriately.
  6. When you are finished, go to File -> Save to save the changes to your test.
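
As an aside, the descriptive title you want is usually nothing more than the HTML <title> element of the recorded response. The following small sketch in plain Java (independent of RPT) fetches a page and pulls out its title; it assumes only that the local iTrust instance from this tutorial is still running.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Fetches one page and prints its <title>, the same information you can
 * read in the Protocol Data view when choosing a descriptive node name.
 */
public class PageTitlePeek {

    public static void main(String[] args) throws Exception {
        String url = "http://localhost:8080/iTrust/";
        HttpURLConnection conn =
            (HttpURLConnection) new URL(url).openConnection();
        BufferedReader reader = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), "UTF-8"));

        // Read the whole response body into memory.
        StringBuilder body = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            body.append(line).append('\n');
        }
        reader.close();

        // Pull the <title> element out of the response body.
        Matcher m = Pattern.compile("<title>(.*?)</title>",
                Pattern.CASE_INSENSITIVE | Pattern.DOTALL).matcher(body);
        System.out.println(m.find() ? m.group(1).trim() : "(no title found)");
    }
}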

Now you will need to create a datapool to collect response statistics over time. Execute the following steps:

  1. Select the hcptest node in the Test Contents view.
  2. Within the Common Options tab as shown in Figure 4.5, click Add Datapool.

    Figure 4.5: Add Datapool

  3. The Import Datapool window will appear as shown in Figure 4.6. Click Create Default.

    Figure 4.6: Create Default

  4. Go to File -> Save to save your performance test.

5.0 Creating a Performance Schedule

Your performance test has been created and appropriately annotated. But to see how iTrust responds to multiple users, we must create a performance schedule with this test. This is achieved with the following procedure:

  1. Right-click on the iTrust project, and go to New -> Performance Schedule as shown in Figure 5.1.

    Figure 5.1: Creating a Performance Schedule

  2. You will be asked for a name and destination for your performance schedule. Place it in the root of the project and call it hcpschedule. Click Finish. The view in Figure 5.2 will appear.

    Figure 5.2: Performance Schedule Contents

  3. Add the test created in Section 3.0 to the schedule: right-click on User Group 1 and go to Add -> Test as shown in Figure 5.3.

    Figure 5.3: Adding a Test

  4. The Select Performance Tests window will appear. Select hcptest as shown in Figure 5.4.

    Figure 5.4: Select hcptest

  5. Your test has been added. Now select the hcpschedule node in the Schedule Contents list and click the Think Time tab in the Schedule Element Details view as shown in Figure 5.5.

    Figure 5.5: Think Time

  6. Leave Limit think times to a maximum value checked and change 10 seconds to 100 milliseconds (a short sketch of how think time plays out appears after this list).
  7. Go to File -> Save and save your performance schedule.
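
For readers who prefer code to dialogs, here is a rough sketch of what the schedule you just built amounts to: a group of virtual users running the same test concurrently, each pausing for a bounded think time between requests. The user count and the doWork() placeholder are invented for illustration; the 100 millisecond cap matches the value set above.

import java.util.Random;

/**
 * A rough sketch of a performance schedule: several virtual users run the
 * same test concurrently, pausing between requests for a bounded think time.
 */
public class MiniSchedule {

    static final int VIRTUAL_USERS = 5;        // stand-in for User Group 1
    static final int MAX_THINK_TIME_MS = 100;  // think-time cap from the step above

    public static void main(String[] args) throws InterruptedException {
        Thread[] users = new Thread[VIRTUAL_USERS];
        for (int i = 0; i < VIRTUAL_USERS; i++) {
            final int id = i;
            users[i] = new Thread(() -> runUser(id));
            users[i].start();
        }
        for (Thread user : users) {
            user.join();  // wait for every virtual user to finish
        }
    }

    static void runUser(int id) {
        Random random = new Random();
        for (int request = 0; request < 3; request++) {
            long start = System.nanoTime();
            doWork();  // placeholder for one recorded HTTP request
            long elapsedMs = (System.nanoTime() - start) / 1000000;
            System.out.println("user " + id + " request " + request
                    + " took " + elapsedMs + " ms");
            try {
                // Simulated user pause, bounded by the configured maximum.
                Thread.sleep(random.nextInt(MAX_THINK_TIME_MS));
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    static void doWork() {
        // In the real schedule this is the recorded hcptest HTTP sequence.
    }
}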

6.0 Executing/Exporting a Performance Schedule

Now you have created a performance schedule with multiple healthcare professionals checking for an epidemic. We will now execute it to gather our performance statistics. To do this, right-click on hcpschedule within the Test Navigator pane, and go to Run As -> Performance Schedule as shown in Figure 6.1.


Figure 6.1: Running a Performance Schedule

Your performance report will now appear; the Overall tab shows the status of the test data being collected, and the statistical categories are arranged along the bottom tabs as shown in Figure 6.2.


Figure 6.2: Viewing Results
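
The report's numbers are simply aggregations of many individual observations. The toy sketch below (with invented sample values) shows the kind of summary involved: a maximum response time far above the average points to a page worth investigating.

import java.util.Arrays;

/**
 * A toy illustration of the aggregation a performance report performs:
 * summarizing many observed response times into a few statistics.
 */
public class ResponseTimeSummary {

    public static void main(String[] args) {
        // Invented sample response times, in milliseconds.
        long[] responseTimesMs = { 42, 51, 39, 120, 47, 44, 300, 41 };

        long min = Long.MAX_VALUE, max = Long.MIN_VALUE, sum = 0;
        for (long t : responseTimesMs) {
            min = Math.min(min, t);
            max = Math.max(max, t);
            sum += t;
        }
        double average = (double) sum / responseTimesMs.length;

        System.out.println("samples: " + Arrays.toString(responseTimesMs));
        System.out.println("min " + min + " ms, avg " + average
                + " ms, max " + max + " ms");
    }
}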

Export your results to HTML by completing the following steps:

  1. In the Performance Test Runs view, expand your results, which will be titled hcpschedule followed by the date.
  2. Expand the All Hosts node beneath it.
  3. Right-click Performance Report and go to Export to HTML... as shown in Figure 6.3.

    Figure 6.3: Export to HTML

  4. Select a location and a title and click Finish.

Looking at the code for each of the pages accessed with this schedule (their URLs can be found in the page element details section) can help us discover bottlenecks in our system. Which page exhibits the highest response time? What iTrust Objects are involved in a request of that type? Why could they be slow? What could be done about it?


7.0 Profiling a JUnit Test

Accessing a web application is not the only form of performance testing. We can gather statistics on memory usage and execution time for any Java program using RPT's profiling tools. Since iTrust has a thorough test suite, we will use JUnit to gather execution-time statistics for the iTrust bean validators. To profile these JUnit tests, execute the following steps:

  1. From the Profile menu, click Profile as shown in Figure 7.1.

    Figure 7.1: Click Profile

  2. In the Profile Configuration dialog, right-click on the JUnit node and select New... as shown in Figure 7.2.

    Figure 7.2: Make a new JUnit Test

  3. In the Test tab, select "Run all tests in a specified..." and click Search.
  4. In the folder selection dialog box, expand iTrust -> unittests and select edu.ncsu.csc.itrust.validate.regex. Click OK. Your Test pane should now look like Figure 7.3.

    Figure 7.3: JUnit Settings

  5. Click the Monitor pane in the configuration dialog and be sure that Basic Memory Analysis and Execution Time Analysis are checked and that Agent Discoverer is unchecked as shown in Figure 7.4.

    Figure 7.4: Monitoring Settings

  6. Click Profile.

Your JUnit test results will appear, and all tests should pass. Now look at your results: you can find them in the Profiling Monitor tab under DefaultMonitor. Expand every subnode in the tree until you see something like Figure 7.5.


Figure 7.5: Opening Results

Double-click Execution Time Analysis to see your results. You should see something like Figure 7.6. You may want to click the % sign to view the statistics relative to one another. You can also sort the list by any column by clicking its header.


Figure 7.6: Sample Results

Note that the junit.framework package should not be included in your analysis of performance bottlenecks, because it will not be included when the application is executed in deployment. But why is edu.ncsu.csc.itrust.validate so high?

Expand the edu.ncsu.csc.itrust.validate package's node and continue expanding the node with the highest Base Time percentage. You will reach the second checkFormat method as shown in Figure 7.7. Why does this method consume so much execution time? Right-click on it and select Open Source to analyze it at the source-code level; a short illustrative sketch of one common cause follows Figure 7.7.


Figure 7.7: checkFormat
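
Before opening the source, it helps to know one common culprit in validator code of this shape (an illustrative guess, not a statement about the actual iTrust implementation): recompiling a regular expression on every call instead of reusing a precompiled Pattern. The sketch below uses a made-up pattern to show how large that difference can be.

import java.util.regex.Pattern;

/**
 * Compares recompiling a regular expression on every validation call
 * against reusing a pattern compiled once. The pattern is invented.
 */
public class RegexCostSketch {

    private static final Pattern NAME =
        Pattern.compile("[A-Za-z' -]{1,20}");   // compiled once

    // Slow variant: recompiles the pattern on every validation call.
    static boolean checkFormatRecompile(String input) {
        return Pattern.matches("[A-Za-z' -]{1,20}", input);
    }

    // Fast variant: reuses the precompiled pattern.
    static boolean checkFormatPrecompiled(String input) {
        return NAME.matcher(input).matches();
    }

    public static void main(String[] args) {
        int runs = 200000;

        long t0 = System.nanoTime();
        for (int i = 0; i < runs; i++) checkFormatRecompile("Kelly Doctor");
        long recompileMs = (System.nanoTime() - t0) / 1000000;

        long t1 = System.nanoTime();
        for (int i = 0; i < runs; i++) checkFormatPrecompiled("Kelly Doctor");
        long precompiledMs = (System.nanoTime() - t1) / 1000000;

        System.out.println("recompile every call: " + recompileMs + " ms");
        System.out.println("precompiled pattern:  " + precompiledMs + " ms");
    }
}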


8.0 Exercise
You now know how to create performance tests and schedules and how to gather performance results using Rational Performance Tester. For this exercise:

  • Create an HTTP test in which an HCP
    • logs in,
    • changes demographic information in his/her profile,
    • edits an existing patient,
    • creates a new patient (create your own demographic information),
    • and logs out.
  • Label the HTTP responses appropriately in the test
  • Add a datapool to the test
  • Create a performance schedule with think times limited to a maximum of 500 milliseconds
  • Run the performance schedule. Export the results to an HTML file.
  • E-mail your answers to the following questions to your TA. Attach your results to the e-mail message.
    • Where is the bottleneck in the use cases analyzed in this HTTP test of iTrust?
    • What is the cause of this bottleneck (at the Java class level)?
    • What could be done to improve this performance issue?

9.0 Resources

Rational Performance Tester Tutorial ©2003-2009 North Carolina State University, Ben Smith and Laurie Williams.
Email the authors with any questions or comments about this tutorial.
Last Updated: Sunday, August 26, 2007 7:49:20 PM