This task describes how to design a performance test.
To learn more about performance test design, see Performance Test Design Overview.
Make sure the relevant scripts and tests have been uploaded or saved to Performance Center. You can use VuGen scripts and JMeter scripts for performance testing, as well as UFT GUI tests and UFT API tests.
VuGen scripts: For details on uploading VuGen scripts, see Upload VuGen Scripts.
JMeter scripts: For details on using JMeter scripts, see Upload JMeter scripts to Performance Center.
UFT GUI tests: For details on saving UFT GUI Tests, see the HPE Unified Functional Testing User Guide.
UFT API tests: For details on saving UFT API tests, see the HPE Unified Functional Testing User Guide.
Under Lab Resources, select Testing Hosts and make sure that there is at least one Controller, one load generator, and one data processor in the host pool of your project. If not, contact your administrator to add them.
For optional pre-design best practices, see Performance Test Design Best Practices.
Create a new performance test
- On the My Performance Center navigation bar, select Test Management > Test Plan.
- In the Test Plan Tree, select the Subject root folder and click New Folder. Type a folder name and click OK.
- Select the folder from the tree.
- Click New Test. Fill in the fields in the Create New Performance Test dialog box. For details, see Test Plan Module.
Tip: To simplify the process of creating, designing, and running performance tests, you can use Test Express Designer to guide you through each step. For details, see Test Express Designer.
Design a workload for the test
Designing a workload involves creating Vuser groups, distributing Vusers among the Vuser groups, assigning hosts to the Vuser groups, and defining a run schedule for the test. For task details, see How to Define a Performance Test Workload.
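The run-schedule concepts above can be sketched in code. The following is an illustrative example only: the function name, parameters, and numbers are assumptions made for this sketch, not part of Performance Center or its API. It shows how a gradual ramp-up schedule distributes Vusers over time.

```python
# Hypothetical sketch of a ramp-up run schedule. All names and values
# are illustrative assumptions, not Performance Center functionality.

def ramp_up_steps(total_vusers, vusers_per_step, interval_s):
    """Return (time_offset_s, running_vusers) pairs for a gradual start."""
    steps = []
    running = 0
    t = 0
    while running < total_vusers:
        running = min(running + vusers_per_step, total_vusers)
        steps.append((t, running))
        t += interval_s
    return steps

# Start 100 Vusers, 10 every 30 seconds:
schedule = ramp_up_steps(100, 10, 30)
print(schedule[0])   # (0, 10): first 10 Vusers start immediately
print(schedule[-1])  # (270, 100): full load reached after 270 seconds
```

In the actual product, you define this kind of schedule graphically in the Performance Test Designer rather than in code; the sketch is only meant to clarify how step size and interval determine the time to reach full load.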
Note: Non-English national characters are not supported in group names.
Integrate virtualized services - optional
Configure and integrate virtualized services into the performance test. For task details, see How to Add Virtualized Services to Performance Tests.
You can start adding projects that contain virtualized services to your performance test from the Performance Test Designer at any point in the design process, but we recommend adding them after you have added the relevant scripts to the test.
Select a topology for the test - optional
Note: Before you can select a topology for a test, you must design the topology. To design topologies, see How to Design Topologies.
In the Performance Center Designer's Topology tab, click Select Topology and select a topology for the test. For user interface details, see Performance Test Designer > Topology.
Select or create monitor profiles to monitor the test - optional
In the Performance Center Designer's Monitors tab, click Add Monitor Profile or Add Monitor OFW. The respective tree opens on the right.
To select existing profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors.
Note: Before you can select monitors for a test, you must configure monitor machines and create monitor profiles. For details, see Create and configure monitor profiles.
Similarly, you must define monitor-over-firewall agents in the system before you can select them to monitor a test.
To create a new monitor profile or monitor-over-firewall agent, click the Create Monitor or Create Monitor OFW button.
For a monitor profile: In the Create Monitor Profile dialog box, enter a name and description, and select a folder for storing the profile. Click OK, and create a new monitor profile as described in Create and configure monitor profiles.
For a monitor-over-firewall agent: In the Create Monitor Over Firewall dialog box, enter a name and the machine key, select a folder for storing the profile, and select the MI Listener with which the monitor is to connect. Click OK.
After you have created new monitor profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors/monitors over firewall.
For user interface details, see Performance Test Designer > Monitors.
Enable and configure J2EE/.NET diagnostics - optional
Enable and configure diagnostic modules to collect J2EE/.NET diagnostics data from the test run. For details, see Working with Diagnostics.
Define service level agreements for the test - optional
Define service level agreements to measure performance metrics against performance goals. For details, see How to Define Service Level Agreements.
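A service level agreement compares a measured performance metric against a goal. The sketch below is a hypothetical illustration of that idea, checking a percentile of transaction response times against a goal value; the function names, threshold, and sample data are assumptions for the example and are not values or APIs from Performance Center.

```python
# Illustrative SLA-style check: is the 90th-percentile response time
# within the goal? All names and numbers here are assumptions.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numeric samples."""
    ordered = sorted(samples)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def sla_met(response_times_s, goal_s, pct=90):
    """True if the pct-th percentile response time is within the goal."""
    return percentile(response_times_s, pct) <= goal_s

times = [0.8, 1.1, 0.9, 2.5, 1.0, 1.2, 0.7, 1.3, 0.95, 1.05]
print(percentile(times, 90))          # 1.3
print(sla_met(times, goal_s=2.0))     # True: 1.3s is within the 2.0s goal
```

In Performance Center you define these goals in the SLA wizard rather than in code, and the analysis engine evaluates them against the run results automatically.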
When you save the test, it goes through a validation process; the test is valid only if it contains no errors. The validation result appears at the bottom of the Test Designer window.
Click the link to open the Test Validation Results dialog box and view the details of the validation results. For user interface details, see Test Validation Results Dialog Box.
Note: If you make changes to the test, and the test is linked to a timeslot, the timeslot is updated with these changes automatically.