Flow actions

The flow-related actions that you can use are: Action Group, Assert, Decision, Pass, Fail, Run Test, Sleep, Fetch Test Data, Lookup Test Data, Iterate, Iterate Test Data, Iterate While, and Iterate through multi-value tags.

The following flow-based actions are available for HCL OneTest™ API tests:

Action Group

An Action Group combines multiple actions and treats them as a single test step. For example, you can use an action group to control how subscription-based operations are started and how they process their received messages. Messages in an Action Group are processed in parallel, as opposed to one after the other when used as normal steps in a test sequence.

A failure within an action group does not propagate to the Action Group container within the test tree. After you start the execution sequence for an Action Group, that sequence is independent of the execution of the containing test.

In the example that is shown below, an Action Group contains two subscribers on different subjects. When executed, the test starts both subscribers and validates their corresponding messages, regardless of the order in which they are received. The Action Group is considered complete after both messages are received.

Action group with different subjects

In the following example, two subscribers have the same subject (that is, they both receive the first subject message simultaneously). This enables separate filters to be applied at the same time.

Action group with the same subject

Action Group | Assert | Decision | Pass | Fail | Run Test | Sleep | Fetch Test Data | Lookup Test Data | Iterate actions | Iterate | Iterate Test Data | Iterate While | Iterate through multi-value tags

Assert

The Assert action is a conditional action that lets you construct a list of expressions that determines whether a test proceeds. If the Assert action passes, the test continues. If the Assert action fails, the test fails and no further test actions are executed.
Assert action window
Expressions can be written in any of the following forms:
  • ECMAScript (such as JavaScript)
  • Legacy functions
  • Other scripting languages (requires customization)
For more information, see Scripts within tests and stubs.

Expressions must evaluate to true or false. Any other result causes the test to fail with a console message that identifies the non-boolean result.

To enter an expression, click Add and type into the empty input field. To remove an expression, select it and click Delete. To test the expressions, click Test and enter temporary values for any tags shown.

You can use list tags in these expressions. When you click Test, you must supply the complete list of values for the list tag, enclosed in braces.
The value for account_balance is an array, enclosed in braces.

For more information about list tags, see Repeating elements and list tags.

Note: String comparisons are performed if the parameters are enclosed in double quotation marks; otherwise, numeric comparisons are performed. When you compare Boolean values, treat them as strings (for example, eq("%%myTag%%", "false")).
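The quoting rule can be illustrated with a small sketch. This is a hypothetical eq helper for illustration only, not the product's implementation:

```javascript
// Sketch only: illustrates the quoting rule above, not the product's eq().
// Quoted parameters arrive as strings and compare as strings; unquoted
// parameters arrive as numbers and compare numerically.
function eq(a, b) {
  if (typeof a === "string" || typeof b === "string") {
    return String(a) === String(b); // string comparison
  }
  return Number(a) === Number(b);   // numeric comparison
}

console.log(eq(7, 7.0));            // true: numeric comparison
console.log(eq("007", "7"));        // false: string comparison
console.log(eq("false", "false"));  // true: Booleans treated as strings
```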

By default, multiple expressions are joined with a logical AND: all expressions must pass for the action to pass and the test to proceed. To join expressions with a logical OR, enable the ‘OR’ Expressions option. When this option is enabled, only one of the expressions needs to pass for the action to pass and the test to proceed.
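The AND/OR combination can be sketched as follows. The helper name and expression functions are hypothetical; the product evaluates expressions through its own script engine:

```javascript
// Sketch (not the product API): how multiple Assert expressions combine.
// Each expression must evaluate to a boolean; any other result is an error.
function evaluateAssert(expressions, useOr = false) {
  const results = expressions.map((expr) => {
    const value = expr();
    if (typeof value !== "boolean") {
      throw new Error(`Non-boolean result: ${value}`);
    }
    return value;
  });
  // Default: logical AND (all must pass). 'OR' Expressions: any may pass.
  return useOr ? results.some(Boolean) : results.every(Boolean);
}

// Example expressions, standing in for tag comparisons such as
// eq("%%myTag%%", "false") in the product's legacy-function syntax.
const exprs = [() => 2 + 2 === 4, () => "ok".length === 2, () => 1 > 5];
console.log(evaluateAssert(exprs));        // AND: false (last expression fails)
console.log(evaluateAssert(exprs, true));  // OR: true (first expression passes)
```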


Decision

A Decision is a conditional action whose body acts as a function and whose result controls the test flow. A Decision has one of two outcomes: true or false.
Decision action with true and false results

Any HCL OneTest API function can be added as an expression from the Decision editor menu (for example, simple expressions that use XPath value comparison operators, such as equals or less than). Expression arguments can be tags or literals.

Expressions can be written in any of the following forms:
  • ECMAScript (such as JavaScript)
  • Legacy functions
  • Other scripting languages (requires customization)
For more information, see Scripts within tests and stubs.

Expressions must evaluate to true or false. Any other result causes the test to fail with a console message that identifies the non-boolean result.

Multiple expressions are processed as being joined by a logical AND. To override (that is, join them with a logical OR), enable the ‘OR’ Expressions option.

Note: The Decision action cannot test whether a tag is populated.

To enter an expression, click Add and type it into the empty input field. To remove an expression, select it and click Delete. To test the expressions, click Test.

Decision action window
You can use list tags in these expressions. When you click Test, you must supply the complete list of values for the list tag, enclosed in braces.
The value for account_balance is an array, enclosed in braces.

For more information about list tags, see Repeating elements and list tags.

Note: String comparisons are performed if the parameters are enclosed in double quotation marks; otherwise, numeric comparisons are performed. When you compare Boolean values, treat them as strings (for example, eq("%%myTag%%", "false")).

You can drag another action onto the branches of a Decision action. For example, to send a message to the console with information about the negative outcome of the Decision function, you can drag a Log action onto the False branch.
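The branch behavior can be sketched like this. The helper and branch names are hypothetical, standing in for the actions you drag onto the True and False branches:

```javascript
// Sketch of Decision flow (hypothetical helper names, not the product API):
// the expression result selects the True or False branch.
function decide(expression, branches) {
  const result = expression();
  if (typeof result !== "boolean") {
    throw new Error(`Non-boolean result: ${result}`);
  }
  return result ? branches.onTrue() : branches.onFalse();
}

const outcome = decide(
  () => 100 < 50,
  {
    onTrue: () => "balance is low",
    onFalse: () => "balance is OK",   // e.g. a Log action on the False branch
  }
);
console.log(outcome); // "balance is OK"
```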


Pass

When a Pass action is encountered during the execution of a test, the test is deemed to pass and execution jumps to the Tear-down step. This action is commonly used as one of the paths under a decision step to make a test pass its current iteration.
In the Pass action editor, you can set the location of the logging output file by either of these two methods:
  • By typing the full path to the file (which can include tags) or by clicking Browse to locate the file
  • By selecting an entry from a list of the most recently used files
If no output file is set, no logging occurs.

Once an output file is designated, two more file options can be enabled or disabled: Append and Flush.

  • If Append is enabled, the test action adds new logging to the existing file contents. Otherwise, the existing contents are overwritten.
  • If Flush is enabled, the test action writes the message to the file immediately. Otherwise, messages are buffered and written in batches (which can improve performance).


Fail

When a Fail action is encountered during the execution of a test, the test is deemed to fail and execution jumps to the Tear-down step. This action is commonly used as one of the paths under a Decision step to force a test to fail its current iteration.
In the Fail action editor, you can set the location of the logging output file by either of these two methods:
  • By typing the file’s full path (which can include tags) or by clicking Browse to locate the file
  • By selecting an entry from a list of the most recently used files
If no output file is set, no logging occurs.

Once an output file is designated, two more file options can be enabled or disabled: Append and Flush.

  • If Append is enabled, the test action adds new logging to the existing file contents. Otherwise, the existing contents are overwritten.
  • If Flush is enabled, the test action writes the message to the file immediately. Otherwise, messages are buffered and written in batches (which can improve performance).


Run Test

The Run Test action executes another test as a step (subtest) of the current test.

The following table describes the general options available for the action under the Config tab:

Table 1. General options for the Run Test action
Run Settings
  Test: Click Browse to select the test to execute. Click Clear to reset the field.
  Run process in parallel: Enable this option to execute the selected test in parallel with the parent test (that is, the test that contains this Run Test action).
  Note: If this option is enabled, the Store tab is unavailable.
Parallel Settings (used with "Run process in parallel")
  Parent waits for child to complete: The parent test waits for the selected test to finish before it terminates.
  Child exits when parent completes: The selected test is terminated as soon as the parent test finishes executing.
Child Subscriber Settings (controls message delivery when parent and child tests subscribe to the same message subjects)
  Default: Use independent subscribers in parallel and shared subscribers in series.
  Shared: Subscribers retrieve messages in turn from a single message queue.
  Independent: Separate message queues deliver messages to both processes.
Note: Ensure that the selected test does not contain a Run Test action that calls the current test sequence (that is, a mutual recursion).

Within the Run Test action, the Value and Store pages are used to pass data between the parent and child tests.

The Value page passes data to the child test, using the following procedure:
  1. In the child test, create a set of tags that can be used as inputs, using the Tag Data Store or another method.
  2. On the Properties page of the child test, select All Tags in the Input section. Individual tags can also be selected using the other controls on this page. Either method makes the chosen tags visible to the parent test.
  3. In the parent test, open the Run Test step and click the Value tab.
  4. On the Value page you can now see all the input tags from the child test in the Unmapped list. Select all tags for which you want to provide a value and move them to the Mapped list. Any tag that is not mapped uses the default value provided in the Tag Data Store for the child test.
  5. Select each tag in the Mapped list in turn and provide a value for it in the Value field. This value can be a constant, or a tag used by the parent test.
  6. When the parent test runs, it passes the specified values to the child test.
Similarly, the Store page retrieves output data from the child test, using the following procedure:
  1. In the child test, create a set of tags that can be used as outputs, using the Tag Data Store or another method. The child test should provide values to these tags when it runs.
  2. On the Properties page of the child test, select All Tags in the Output section. Individual tags can also be selected using the other controls on this page. Either method makes the chosen tags visible to the parent test.
  3. In the parent test, create a set of tags to hold the output values, using the Tag Data Store or another method.
  4. In the parent test, open the Run Test step and click the Store tab.
  5. On the Store page, you can now see all the tags from the parent test in the Unmapped list. Select all tags in which you want to store a value and move them to the Mapped list. Any tag that is not mapped keeps the same value that it had before the Run Test action.
  6. Select each tag in the Mapped list in turn and provide a value for it in the Value field. This value is typically a tag used by the child test.
  7. When the parent test runs, it reads the data provided by the child test and stores that data into the set of specified tags.

Input and output tags are optional. You can run the child test without passing any data between the two tests.

Note: If you want to pass data between parent and child tests, make sure that the Run process in parallel option is cleared.
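The Value and Store procedures above can be sketched as a small data-flow model. The runner, test objects, and tag names here are hypothetical; the product manages this mapping through the Tag Data Store:

```javascript
// Sketch of the Value/Store tag mapping between parent and child tests
// (hypothetical runner; the product manages this through the tag data store).
function runChildTest(childTest, valueMappings, storeMappings, parentTags) {
  // Value page: push mapped inputs into the child; unmapped tags keep the
  // child's defaults from its Tag Data Store.
  const childInputs = { ...childTest.defaults };
  for (const [childTag, parentValue] of Object.entries(valueMappings)) {
    childInputs[childTag] = parentValue;
  }
  const childOutputs = childTest.run(childInputs);

  // Store page: copy mapped outputs back; unmapped parent tags keep
  // whatever value they held before the Run Test action.
  for (const [parentTag, childTag] of Object.entries(storeMappings)) {
    parentTags[parentTag] = childOutputs[childTag];
  }
  return parentTags;
}

const child = {
  defaults: { userName: "guest" },
  run: (inputs) => ({ greeting: `hello, ${inputs.userName}` }),
};
const parentTags = runChildTest(
  child,
  { userName: "alice" },            // Value page mapping
  { childGreeting: "greeting" },    // Store page mapping
  {}
);
console.log(parentTags.childGreeting); // "hello, alice"
```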


Sleep

A Sleep action pauses the execution of a test for a specified period, in milliseconds. You can specify a fixed time, a uniformly distributed random time, or a random time with a Gaussian distribution. The time range for the Uniform and Gaussian delays is defined by the values in the Minimum Delay and Maximum Delay fields.

A Sleep action can be used to make a test wait until previous processing is complete, or to simulate delays in adapter stubs.
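The three delay modes can be sketched as follows. The mode names and the exact distribution parameters (for example, the standard deviation used for the Gaussian mode) are assumptions for illustration, not the product's documented internals:

```javascript
// Sketch of the three Sleep delay modes (assumed semantics, in milliseconds).
function sleepDelay(mode, { fixed, min, max } = {}) {
  switch (mode) {
    case "fixed":
      return fixed;
    case "uniform":
      // Uniformly distributed between min and max.
      return min + Math.random() * (max - min);
    case "gaussian": {
      // Gaussian around the midpoint of [min, max], clamped to the range.
      const mean = (min + max) / 2;
      const stddev = (max - min) / 6; // ~99.7% of samples inside the range
      const u1 = 1 - Math.random(), u2 = Math.random();
      const z = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
      return Math.min(max, Math.max(min, mean + z * stddev));
    }
    default:
      throw new Error(`Unknown mode: ${mode}`);
  }
}

console.log(sleepDelay("fixed", { fixed: 500 }));            // 500
console.log(sleepDelay("uniform", { min: 100, max: 300 }));  // e.g. 217.4
```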


Fetch Test Data

The Fetch Test Data action can be used to map data from a test data source to existing test tags.

Click Browse to locate and select the data source from the project resource dialog. The "Group data by column" list, which is sorted alphabetically, helps when you work with test data sets that contain repeating elements. Select the column that uniquely defines the rows that belong to the group (usually a parent ID).

In the Mappings section, select the data source field to map for each of the wanted tags.

If you run the Fetch Test Data action in multiple iterations, select the After this row has been mapped, advance to the next row option to map the next row of data for each subsequent iteration. If you do not select this option, each iteration reads the same row of data.
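The row-advance behavior can be sketched with a simple cursor. The fetcher here is hypothetical (and ignores the Loop Data option); the product drives this from the action's mappings:

```javascript
// Sketch of the "advance to the next row" behavior (hypothetical cursor;
// Loop Data wrapping is not modeled here).
function makeFetcher(rows, advanceEachIteration) {
  let cursor = 0;
  return function fetchRow() {
    const row = rows[Math.min(cursor, rows.length - 1)];
    if (advanceEachIteration && cursor < rows.length - 1) cursor++;
    return row;
  };
}

const rows = [{ id: 1 }, { id: 2 }, { id: 3 }];
const advancing = makeFetcher(rows, true);
const fixed = makeFetcher(rows, false);
console.log(advancing().id, advancing().id); // 1 2  (next row each iteration)
console.log(fixed().id, fixed().id);         // 1 1  (same row over and over)
```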

Mappings section of Fetch Test Data window

If new columns are added to the test data, the values in the new columns will be used as test data if the following conditions are met:
  • A tag must already exist that exactly matches each new column name
  • The column count property must not be set
  • The Auto map new columns to tags at runtime check box must be selected (the default state) in the data source editor
A message is displayed in the console when the data is added to the test.


Lookup Test Data

The Lookup Test Data action provides keyed access to a data set. For example, a stub might receive a request message that contains a tag value for customerName. You can search the data set, keyed on customerName, and return data that you can use to send a reply.

The action has two pages, where you can complete the following fields:

Table 2. Fields in the Lookup Test Data window
Config page
  Test data set: The data set, previously configured in HCL OneTest API (see Test data sources), to use for looking up data.
  Lookup values: The Column Key within the data set to use as the lookup key, and the Lookup Value to search for within the data set based on that key. The value is likely to be dynamic and come from a tag that is populated earlier in the test sequence. To use multiple columns, click Add Lookup. To remove a column, select it and click Remove Lookup.
Store page
  Output format: Select Store all matching rows into lists to write list tags instead of single values. See Repeating elements and list tags.
  Data field: The column value from a returned row in the data set to which the tag is mapped. For example, if %%customerName%% is found, the values in the ADDRESS and PHONE columns of the same row are mapped to tags named %%Address%% and %%Phone%%.
  Tag name: The name of an existing tag within the test. Alternatively, you can click the pencil icon at the right side of the field to create a tag name.

Like the Decision action, Lookup Test Data provides two execution paths. If a row is found in the data set, the Found branch is executed. Otherwise, the Not Found branch is executed. More actions can be added to the two paths to provide different success or failure results.
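The keyed lookup with its Found and Not Found paths can be sketched like this. The data set, column names, and helper are hypothetical; the product configures them through the Config and Store pages:

```javascript
// Sketch of a keyed lookup against a data set, with Found / Not Found paths
// (hypothetical data and helper, not the product API).
const dataSet = [
  { CUSTOMER: "smith", ADDRESS: "1 Main St", PHONE: "555-0101" },
  { CUSTOMER: "jones", ADDRESS: "2 High St", PHONE: "555-0102" },
];

function lookupTestData(rows, keyColumn, lookupValue, storeMappings) {
  const row = rows.find((r) => r[keyColumn] === lookupValue);
  if (!row) return { found: false, tags: {} };          // Not Found branch
  const tags = {};
  for (const [tagName, column] of Object.entries(storeMappings)) {
    tags[tagName] = row[column];                        // Store page mapping
  }
  return { found: true, tags };                         // Found branch
}

const result = lookupTestData(dataSet, "CUSTOMER", "jones", {
  Address: "ADDRESS",
  Phone: "PHONE",
});
console.log(result.found, result.tags.Address); // true "2 High St"
```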

If new columns are added to the test data, the values in the new columns will be used as test data if the following conditions are met:
  • A tag must already exist that exactly matches each new column name
  • The column count property must not be set
  • The Auto map new columns to tags at runtime check box must be selected (the default state) in the data source editor
A message is displayed in the console when the data is added to the test.
If a warning is displayed that the Lookup Test Data action did not find a match, you can add the missing test data to the data source:
  1. Right-click the warning message and click Add Data.
  2. The Automation failed window is displayed because automatic addition is not implemented for any test data set types. Click Copy to Clipboard to copy the displayed data, and then click OK.
  3. In the File Data Source window, next to the data source file name, click Open.
  4. The data source is opened. Paste the data from the clipboard into the data source, save the file, and rerun the test.


Iterate actions

Use iterate actions to specify repeated test steps or repeated test data.

When you specify test iterations with the Creates new test iteration option selected, the built-in tag TEST/ITERATION/NUMBER is set to 1 and is incremented for each loop inside the iterator. During the test, the Progress column in Test Lab displays the progress as a percentage of the current iteration against the total iterations.

Note: If this option is not used, the entire test runs in a single iteration, and the progress bar moves from 0% to 100% upon completion. If this option is enabled in a test and two iterators are used, the progress bar runs through to 100% in the first iterator, and then again for the second iterator.
You can drag other actions onto any of the iterate actions. For example, you can drag a Send Request and a Receive Reply action onto the Iterate Test Data action. Doing so ensures that you send a request and receive a reply for each line of data in your data source.
The send and receive actions are now under the Iterate Test Data action.
The following iterate actions are available:

Iterate

The Iterate action runs the actions in its grouping through a specified range of iterations. The actions can also be run while iterating over a multi-value tag.
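The Iterations field on the action's Config tab (described in the table below) accepts single numbers, comma-separated lists, and ranges. A sketch of how such a specification might expand into iteration numbers, assuming any %%tag%% references are substituted first (the helper is hypothetical, not the product's parser):

```javascript
// Sketch of expanding an iterations specification such as "1,3" or "1-5"
// into the iteration numbers to run (hypothetical helper).
function expandIterations(spec) {
  const numbers = new Set();
  for (const part of spec.split(",")) {
    const range = part.trim().split("-");
    if (range.length === 2) {
      for (let i = Number(range[0]); i <= Number(range[1]); i++) numbers.add(i);
    } else {
      numbers.add(Number(range[0]));
    }
  }
  return [...numbers].sort((a, b) => a - b);
}

console.log(expandIterations("1,3"));   // [1, 3]  (iteration 2 is skipped)
console.log(expandIterations("9"));     // [9]     (only the ninth iteration)
console.log(expandIterations("1-3,5")); // [1, 2, 3, 5]
```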

The Iterate action

The following options are available for configuring the action:

Table 3. Configuration fields for Iterate action
Option Description
Iterations: The numbers of the specific iterations to run. For example, if you specify "1,3", no results are stored for the second iteration or for any iteration from 4 onward. You can include tags, as in the following example:
1-%%limit%%

If you specify "9", HCL OneTest API does not run nine iterations; only the ninth iteration is run.

You can also specify a multi-value tag to be processed, one value for each iteration. For more information, see Iterate through multi-value tags.

Pacing: The following options are available in the Pacing section:
Enable pacing
Select this option to control the frequency of the iterations.
Pacing mode
If pacing is enabled, select one of the following modes:
Minimum iteration time
If an iteration completes in less than the specified number of seconds, pause for the remaining time. For example, if you specify 5 seconds in the Period field and an iteration completes in 3 seconds, a 2 second pause is inserted before the next iteration.
Pause between iterations
In the Period field, specify the number of seconds during which execution should pause between the finish of one iteration and the start of the next. No pause is inserted after the final iteration.
Period (seconds)
If pacing is enabled, enter the value associated with the option in the Pacing mode field.
Runtime settings: The following options are available:
Creates new test iteration
If this option is selected, repeated iterations are considered to be real iterations in console output and reports.
Continue on fail
If this option is selected, a failure within the iteration group does not force the entire test to fail. If this option is cleared, an action in the iteration group that fails causes the entire test to fail and any remaining steps to be canceled.
Iteration timing: The following time limit options are available:
Limit the length of each iteration
If this option is selected, the active iteration fails if it takes longer than the value of the Maximum iteration time in seconds field. If you selected the Continue on fail option, the next iteration is run; if not, then the entire test fails.
Limit the length of the entire iterate action
If this option is selected, the iterator is canceled if the total time taken by all iterations so far exceeds the value of the Maximum total time in seconds field.
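The two pacing modes in the table above can be sketched as a pause calculation (assumed semantics; times in seconds, matching the 5-second/3-second example in the table):

```javascript
// Sketch of the two pacing modes: "minimum iteration time" pads short
// iterations up to the period, while "pause between iterations" always
// inserts the full period after an iteration.
function pacingPause(mode, periodSeconds, iterationSeconds) {
  if (mode === "minimum") {
    return Math.max(0, periodSeconds - iterationSeconds);
  }
  if (mode === "pause") {
    return periodSeconds;
  }
  throw new Error(`Unknown pacing mode: ${mode}`);
}

console.log(pacingPause("minimum", 5, 3)); // 2  (pad a 3 s iteration to 5 s)
console.log(pacingPause("minimum", 5, 7)); // 0  (no pause; already over)
console.log(pacingPause("pause", 5, 3));   // 5  (fixed pause regardless)
```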


Iterate Test Data

The Iterate Test Data action, similar to the Iterate action, specifies an action or group of actions that can be executed multiple times. The number of iterations, however, is controlled by the number of filter matches found in the selected data set.

The Iterate Test Data action

The following options are available for configuring the action:

Table 4. Configuration fields for Iterate test data action
Option Description
Test data set: The data set to search.
Group data by column: The optional column within the selected data set to use as the basis for repeating elements. When used, all rows within the data set that have the same value in the grouping column are grouped and sorted alphabetically as repeating elements.
Iterations: The numbers of the specific iterations to run. For example, if you specify "1,3", no results are stored for the second iteration or for any iteration from 4 onward. You can include tags, as in the following example:
1-%%limit%%

If you specify "9", HCL OneTest API does not run nine iterations; only the ninth iteration is run.

If you specify an iteration number that is higher than the highest row number in your test data, one of the following results occurs:
  • If you selected the Loop Data option when you defined your test data, then multiple passes are made to satisfy the iteration numbers you specified.
  • If the Loop Data option is cleared, iteration numbers that exceed the number of rows are ignored.

See also Iterate through multi-value tags in this topic.

Pacing: The following options are available in the Pacing section:
Enable pacing
Select this option to control the frequency of the iterations.
Pacing mode
If pacing is enabled, select one of the following modes:
Minimum iteration time
If an iteration completes in less than the specified number of seconds, pause for the remaining time. For example, if you specify 5 seconds in the Period field and an iteration completes in 3 seconds, a 2 second pause is inserted before the next iteration.
Pause between iterations
In the Period field, specify the number of seconds during which execution should pause between the finish of one iteration and the start of the next. No pause is inserted after the final iteration.
Period (seconds)
If pacing is enabled, enter the value associated with the option in the Pacing mode field.
Runtime settings: The following options are available:
Creates new test iteration
If this option is selected, repeated iterations are considered to be real iterations in console output and reports.
Continue on fail
If this option is selected, a failure within the iteration group does not force the entire test to fail. If this option is cleared, an action in the iteration group that fails causes the entire test to fail and any remaining steps to be canceled.
Iteration timing: The following time limit options are available:
Limit the length of each iteration
If this option is selected, the active iteration fails if it takes longer than the value of the Maximum iteration time in seconds field. If you selected the Continue on fail option, the next iteration is run; if not, then the entire test fails.
Limit the length of the entire iterate action
If this option is selected, the iterator is canceled if the total time taken by all iterations so far exceeds the value of the Maximum total time in seconds field.
Note: If the tags you populate with this action are later used to populate a message, ensure that any repeating fields in the data set are marked as such in the message. If repeating elements are not marked, errors are generated.

Under the Filter tab, you can specify values to match in columns within the data set. If no filters are specified, everything passes the filter. Text that is entered in the Required Value field is used as an "Equality" match by default. To modify the filter type, click the icon, select the appropriate filter action, and enter the matching criteria as needed.

Only the rows that contain data that matches all filters are used. If the Disable filter when Tag is empty option is selected, the filter is not applied when an empty value ("" or NULL) is found in the tag.

The Filter page
Note: The number of iterations that are run depends on the following factors:
  • The number of rows that match all filters
  • The iteration numbers specified

Under the Store tab you can map the data in specific columns (within matching rows) to existing tags in the test.

If new columns are added to the test data, the values in the new columns will be used as test data if the following conditions are met:
  • A tag must already exist that exactly matches each new column name
  • The column count property must not be set
  • The Auto map new columns to tags at runtime check box must be selected (the default state) in the data source editor
A message is displayed in the console when the data is added to the test.


Iterate While

Use the Iterate While action to specify an action or group of actions that is executed as long as the specified conditions are met. You can specify one or more conditions.

The following options are available for configuring the action:

Table 5. Configuration fields for Iterate while action
Option Description
Condition: Complete the following steps:
  1. Optional: Provide a text description that summarizes the condition in the while loop.
  2. Specify the type of script you want to enter:
    ECMAScript
    Any standard ECMAScript, such as JavaScript. Enter or paste the script for the condition in the space provided. The script must evaluate to either true or false.
    Legacy
    Use this option to call a built-in or custom function from HCL OneTest API version 5 or earlier. Click Add, right-click in the space provided, click Select Function, and choose the function that you want to add.

    For general information about scripting, see Scripts within tests and stubs.

  3. Click Test to verify that the script is coded properly.
Iterations: The numbers of the specific iterations to run. For example, if you specify "1,3", no results are stored for the second iteration or for any iteration from 4 onward. You can include tags, as in the following example:
1-%%limit%%
Pacing: The following options are available in the Pacing section:
Enable pacing
Select this option to control the frequency of the iterations.
Pacing mode
If pacing is enabled, select one of the following modes:
Minimum iteration time
If an iteration completes in less than the specified number of seconds, pause for the remaining time. For example, if you specify 5 seconds in the Period field and an iteration completes in 3 seconds, a 2 second pause is inserted before the next iteration.
Pause between iterations
In the Period field, specify the number of seconds during which execution should pause between the finish of one iteration and the start of the next. No pause is inserted after the final iteration.
Period (seconds)
If pacing is enabled, enter the value associated with the option in the Pacing mode field.
Runtime settings: The following options are available:
Creates new test iteration
If this option is selected, repeated iterations are considered to be real iterations in console output and reports.
Continue on fail
If this option is selected, a failure within the iteration group does not force the entire test to fail. If this option is cleared, an action in the iteration group that fails causes the entire test to fail and any remaining steps to be canceled.
Iteration timing: The following time limit options are available:
Limit the length of each iteration
If this option is selected, the active iteration fails if it takes longer than the value of the Maximum iteration time in seconds field. If you selected the Continue on fail option, the next iteration is run; if not, then the entire test fails.
Limit the length of the entire iterate action
If this option is selected, the iterator is canceled if the total time taken by all iterations so far exceeds the value of the Maximum total time in seconds field.
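The Iterate While semantics can be sketched as a condition-driven loop with an optional iteration cap (the runner is hypothetical; the condition script must evaluate to true or false, as described in the Condition row above):

```javascript
// Sketch of Iterate While: repeat the body while the condition script
// evaluates to true, with an optional cap from the Iterations field.
function iterateWhile(condition, body, maxIterations = Infinity) {
  let iteration = 0;
  while (iteration < maxIterations) {
    const result = condition();
    if (typeof result !== "boolean") {
      throw new Error(`Condition must be true or false, got: ${result}`);
    }
    if (!result) break;
    iteration++;
    body(iteration);   // TEST/ITERATION/NUMBER-style counter, starting at 1
  }
  return iteration;
}

let total = 0;
const ran = iterateWhile(() => total < 10, () => { total += 4; });
console.log(ran, total); // 3 12
```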


Iterate through multi-value tags

For Iterate and Iterate Test Data actions, you can specify a tag that contains a list (a multi-value tag) rather than a literal number. Ensure that the tags have numbers stored in them.

For example, assume that you have a tag containing a list of user names called UserList. Specifying %%UserList%% in the Iterations field on the Config tab of the Iterate action means that the test steps managed by the Iterate action will be repeated once for each value in the list.

Note: If your multi-value tag contains only a single value, only one iteration is executed. To use a single-value tag, you must add a numeric iteration definition, for example, "1", "1-3", or "1,2-5".

To store the value used in the current iteration into a tag, go to the Store tab of the Iterate action. Click New and add a tag to the action. Enter a name for the tag so that you can refer to it within the test steps managed by the Iterate action.
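The multi-value iteration described above can be sketched as follows. The helper, tag names, and UserList data are hypothetical; the built-in TEST/ITERATION/NUMBER counter is from the source:

```javascript
// Sketch of iterating a multi-value tag such as %%UserList%%, storing the
// current value into a per-iteration tag (hypothetical names).
function iterateMultiValueTag(listTag, storeTagName, body) {
  listTag.forEach((value, index) => {
    const tags = {
      "TEST/ITERATION/NUMBER": index + 1,  // built-in counter, starts at 1
      [storeTagName]: value,               // tag added on the Store tab
    };
    body(tags);
  });
  return listTag.length;
}

const UserList = ["alice", "bob", "carol"];
const seen = [];
iterateMultiValueTag(UserList, "currentUser", (tags) =>
  seen.push(`${tags["TEST/ITERATION/NUMBER"]}:${tags.currentUser}`)
);
console.log(seen.join(", ")); // "1:alice, 2:bob, 3:carol"
```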

For more information, see Repeating elements and list tags.

