Performance Testing (PTS) organizes load tests into scenarios that contain one or more sessions running in parallel. Each session defines an ordered sequence of API calls. Understanding how sessions, APIs, data files, and checkpoints interact helps you design scenarios that accurately simulate real-world traffic patterns and avoid unexpected data consumption behavior.
## Key concepts
| Term | Description |
|---|---|
| Test API | An API request that simulates a specific call to your application. Each test API represents one step in a session. |
| Session | An ordered sequence of test APIs, similar to a transaction. APIs within a session run sequentially -- each API waits for the previous one to complete or time out before sending requests. |
| Data file (associate data file) | A file that provides parameterized values to test APIs. A test API is associated with a data file when it uses parameters from that file. If multiple file parameters are used from different data files, multiple data files are associated. |
| Data configuration node | A configuration step required before any API in a session can use file parameters. If an API needs file parameters or defines secondary file parameters (such as MD5-encoded values), configure them in this node first. |
| Checkpoint (assertion) | A validation rule that checks whether an API response meets expected values. If a checkpoint fails, PTS skips all subsequent APIs in that execution sequence. |
| One data poll | A data consumption mode where each row in the data file is used only once, preventing duplicate request data. When enabled, that data file becomes the benchmark. |
| Output parameter | A value extracted from an upstream API response and passed as a parameter to downstream APIs. |
| Data export instruction | An instruction that exports data (such as cookies or output parameters) from one session and shares it globally so that other sessions can consume it. |
## How sessions and APIs run
Sessions in a scenario run in parallel by default. Within each session, APIs run sequentially -- PTS sends requests for each API only after the preceding API completes or times out.
Because APIs run in order, a downstream API sends no more requests than its upstream counterpart, and usually fewer: when the test ends, some requests to intermediate APIs are still in progress and never trigger the next API in the sequence.
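The execution model above, parallel sessions with strictly sequential APIs inside each session, can be sketched in Python. This is an illustrative model only; `run_session` and `run_scenario` are hypothetical names, not PTS APIs:

```python
import threading

def run_session(apis, results):
    # APIs inside a session run strictly in order: each call starts
    # only after the previous one completes (or times out).
    for api in apis:
        results.append(api())

def run_scenario(sessions):
    # Sessions in a scenario run in parallel by default.
    results = {name: [] for name in sessions}
    threads = [
        threading.Thread(target=run_session, args=(apis, results[name]))
        for name, apis in sessions.items()
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Calling `run_scenario({"session1": [logon_api, order_api]})` preserves the logon-then-order ordering inside the session while any other sessions proceed concurrently.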
## Data allocation rules
Before any API in a session can use file parameters, configure a data configuration node for that session.
### Benchmark file selection
When multiple APIs in a session reference different data files, PTS aligns their rows using a benchmark file -- the data file that determines how many rows all other files contribute. The benchmark is selected as follows:
| Condition | Benchmark selection |
|---|---|
| One data poll not enabled for any parameter | The data file with the fewest rows becomes the benchmark. |
| One data poll enabled for a parameter | That parameter's data file becomes the benchmark. |
### Row alignment across data files
After selecting the benchmark file, PTS truncates all other data files to match its row count and polls the truncated rows repeatedly.
Example: Session 1 has three APIs that reference three data files:

| API | Data files referenced |
|---|---|
| API 1 | File 2 (200 rows) |
| API 2 | File 3 (1,000 rows) |
| API 3 | File 2 (200 rows) and File 1 (100 rows) |
With one data poll disabled for every parameter, PTS selects File 1 (100 rows) as the benchmark because it has the fewest rows. PTS then truncates File 2 and File 3 to their first 100 rows and polls those rows repeatedly during the test.
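The benchmark selection and row alignment rules can be expressed as a short Python sketch; `select_benchmark` and `align_rows` are illustrative helpers, not part of PTS:

```python
def select_benchmark(files, one_data_poll=None):
    """Pick the benchmark data file for a session.

    files: mapping of data file name -> row count.
    one_data_poll: name of the file whose parameter has one data
    poll enabled, or None if the option is not used.
    """
    if one_data_poll is not None:
        return one_data_poll
    # Without one data poll, the file with the fewest rows wins.
    return min(files, key=files.get)

def align_rows(files, benchmark):
    # All other files are truncated to the benchmark's row count;
    # the truncated rows are then polled repeatedly during the test.
    limit = files[benchmark]
    return {name: min(rows, limit) for name, rows in files.items()}
```

Running this on the example above selects File 1 as the benchmark and truncates File 2 and File 3 to 100 rows each.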
### One data poll behavior

When one data poll is enabled for a parameter:

- That parameter's data file becomes the benchmark.
- The combination selects the same number of rows as the source file contains.
- PTS stops the test on the current API after the required number of requests has been sent.
- The session ends after the specified number of requests in the combination has been sent.
## Checkpoint behavior

If a checkpoint (assertion) is set for an API:

- **Pass:** PTS continues to the next API in the session.
- **Fail:** PTS skips all remaining APIs and instructions in that execution sequence. Other parallel executions of the same session are not affected.
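The skip-on-failure rule applies to a single execution sequence, which can be modeled as a loop that breaks on the first failed assertion. This is a hypothetical sketch, not PTS internals:

```python
def run_sequence(steps):
    """Run one execution sequence of a session.

    steps: list of (api, checkpoint) pairs, where api is a callable
    returning a response and checkpoint is a predicate on that
    response, or None when no assertion is set.
    """
    responses = []
    for api, checkpoint in steps:
        response = api()
        responses.append(response)
        if checkpoint is not None and not checkpoint(response):
            # Failed checkpoint: skip all remaining APIs and
            # instructions in this execution sequence only.
            break
    return responses
```

Other concurrent executions of the same session would each run their own `run_sequence` and are unaffected by this failure.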
## Set up a logon-dependent scenario
When a test scenario requires user authentication before running business logic, the logon session must complete and share credentials before other sessions can start.
### Configure the logon session

1. **Enable one data poll for the logon API.** In the data configuration node of the logon session, select one data poll for a parameter. This ensures that each user account is used only once, preserving logon uniqueness.
2. **Add a data export instruction.** After the logon API, add a data export instruction to export session data (such as standard cookies or output parameters) for other sessions to consume. A maximum of five parameters can be exported.
3. **Set the export level.** For each exported parameter, specify an Export Level value. Other sessions start only after the number of successful exports reaches this threshold; at that point, the logon information is shared with the other sessions in the scenario.
**Note:** The Export Level value must be less than or equal to the number of rows in the logon API's data file. For example, if the data file has 200 rows, set the export level to a value of 200 or less, such as 100.
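The export-level threshold behaves like a barrier: dependent sessions block until enough successful exports have occurred. A minimal sketch, assuming a hypothetical `ExportGate` class (PTS does not expose such a class):

```python
import threading

class ExportGate:
    """Barrier that blocks dependent sessions until `level`
    successful exports (the export level) have occurred."""

    def __init__(self, level):
        self.level = level
        self.count = 0
        self.lock = threading.Lock()
        self.ready = threading.Event()

    def export(self):
        # Called once per successful logon export.
        with self.lock:
            self.count += 1
            if self.count >= self.level:
                self.ready.set()  # unblock dependent sessions

    def wait(self, timeout=None):
        # Dependent sessions call this before their first API.
        return self.ready.wait(timeout)

    def release(self):
        # Manual unblock, analogous to "Release Global Preparations".
        self.ready.set()
```

With `level=100`, business sessions calling `wait()` start only after the 100th successful logon export, and `release()` models the manual override described below.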
### Impact on parallel sessions
By default, all sessions run in parallel. When a session uses data export instructions, it becomes a blocking dependency -- other sessions wait until the export threshold is met before they start. Data export is the only mechanism that prevents sessions from running in parallel.
### Handle export failures
If the export level cannot be reached during a test (for example, because some load generators disconnect), click Release Global Preparations at the bottom of the test page to manually unblock the remaining sessions.