Custom keyword libraries: design/approach question

Hi, I am developing a custom keyword library for a test framework and have run into a design/approach question.
The product under test has a significant number of similar entities, each with a similar set of controls.

~200 controls for each entity
~560 entities
~500 global controls
560 * 200 + 500 = 112,500 controls in total, approximately

Some of the controls are toggles; the rest are continuous.
The API consists of common function calls with arguments, like set_entity_control(entity_type, entity_index, ctrl_name, value).
From the user's perspective, a single user action adjusts a single control of a single entity at a time, and the goal is to write keywords in accordance with that.

So far I have been trying to reduce this number by decomposing it into different feature libraries and parameterizing the keywords that describe a single user action. With this approach I have faced the following two cons:

  1. longer keyword names
  2. 2-3 arguments, which make keywords less readable and look more like function calls than user actions

Ideally, the goal is to have test cases composed of keywords with no arguments for toggle controls and one argument for continuous controls, each representing a single user action, like “Do Something This Entity” (see the sketch below). Writing and maintaining all of that manually seems crazy; generative AI or something similar could produce it, but that would not avoid further maintenance. What is a better approach to manage this?
Maybe there are some tools that can generate a number of keywords dynamically by wrapping a single parameterized keyword?
Or is the current approach fine?
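
For illustration, the target test-case style would look something like this (the entity and control names here are hypothetical):

*** Test Cases ***
Adjust Single Channel
    # Toggle control: no arguments, reads as one user action
    Mute Channel 3
    # Continuous control: exactly one argument for the value
    Set Channel 3 Gain    0.5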

I would appreciate any design suggestions or materials.


This probably does not answer your question, but have you considered the following core features:


Side note about embedded arguments: if you don’t use a space between the argument and a word in the keyword name, then you can have it as an “optional” argument (though a default value is not supported, so you need to manage this manually). If you put spaces around the embedded argument, then it is mandatory to specify it.

For example:

*** Variables ***
${DEFAULT_VALUE}    0

*** Test Cases ***
Using default value
    Log My Device ID
Regular usage
    Log My Device2 ID
    Keyword With Spaces Around 2 Fails If No Value

Keyword not found
    Keyword With Spaces Around Fails If No Value

Second part of keyword is seen as argument
    Keyword With Spaces Around  Fails If No Value

Needs empty string so not so useful
    Keyword With Spaces Around ${EMPTY} Fails If No Value

Not this empty string though
    Keyword With Spaces Around "" Fails If No Value

*** Keywords ***
Log My Device${dev id} ID
    IF    $dev_id == ""    VAR    ${dev id}    ${DEFAULT_VALUE}
    Log    Dev ID is: ${dev id}    level=CONSOLE

Keyword With Spaces Around ${dev_id} Fails If No Value
    IF    $dev_id == ""    VAR    ${dev id}    ${DEFAULT_VALUE}
    Log    Dev ID is: ${dev id}    level=CONSOLE

Hi Denys,

To extend on Francis’ answers,

A popular approach is to have multiple layers of custom keywords. The idea being you have:

  1. Keywords that are called at the test level; these are business-logic keywords that have friendly names and minimal configuration information, and may use embedded arguments
  2. The next layer down (optional) would be keywords that describe the steps of the keywords from the level above; these are lower-level business-logic keywords that may still have friendly names but likely have longer names and more configuration arguments
  3. Low-level keywords, called by either level above; these would be the keywords you already created. They describe the actual action in the system, may have longer or uglier names and more arguments, and may be your custom keywords or library keywords

This layering approach helps separate the business logic from the controls. It is also useful if the controls change: you only update keywords in the control layer.

You would keep each layer of keywords in separate resource files (or folders of resource files) related to its layer, as in the sketch below.
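
A minimal sketch of the three layers in one place (the entity, control, and keyword names are hypothetical, and Set Entity Control stands in for your wrapper around the product API):

*** Test Cases ***
Operator Mutes The Channel
    # Layer 1: friendly business-level keyword with an embedded argument
    Mute Channel 3

*** Keywords ***
Mute Channel ${index}
    # Layer 2: business step delegating to the control layer
    Set Channel Control    ${index}    mute    ${True}

Set Channel Control
    [Arguments]    ${entity_index}    ${ctrl_name}    ${value}
    # Layer 3: control layer wrapping the actual API call
    Set Entity Control    channel    ${entity_index}    ${ctrl_name}    ${value}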

Another useful approach is to use variable files to define many of the variables for an environment. Then, when a level-1 keyword called something like Open ${environmentname} for Appname gets called, it loads all the variables for that environment, so they don’t have to be defined in the test case.
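
For example (the environment file layout and its contents are hypothetical; Import Variables is a BuiltIn keyword):

*** Keywords ***
Open ${environmentname} for Appname
    # Called as e.g. "Open staging for Appname"; loads environments/staging.py
    Import Variables    ${CURDIR}/environments/${environmentname}.py
    Log    Application URL for this environment: ${APP_URL}

where environments/staging.py might contain plain Python variables:

APP_URL = "https://staging.example.com"
DEFAULT_TIMEOUT = "10s"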

Hopefully that helps answer your question,

Dave.


If you want to avoid “normal” arguments on the test case level, which is often a good idea, using embedded arguments, as already proposed above, could work well. Starting from RF 7.0, such library keywords also accept other arguments. User keywords got support for mixed arguments already in RF 6.1.
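
For example, a user keyword combining an embedded and a normal argument (the keyword and control names are hypothetical):

*** Keywords ***
Set Gain On Channel ${index}
    [Arguments]    ${value}=${0}
    Set Entity Control    channel    ${index}    gain    ${value}

*** Test Cases ***
Mixed Arguments
    Set Gain On Channel 3    0.5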


Thanks for the suggestions. I was indeed missing the embedded arguments feature, which seems handy for making keyword names shorter and more readable.

Layering keywords down from top-level business logic to the API calls is the approach I am currently following, but it still has the two cons mentioned in the original post, though embedded arguments can help mitigate both of them.

Another thing I am looking into is generating methods/keywords in Python by wrapping the API calls, iterating through their possible arguments.

An abstract example:

# arg*_values and do_something are placeholders for the real control values and API call.
for arg1 in arg1_values:
    for arg2 in arg2_values:
        for arg3 in arg3_values:
            for arg4 in arg4_values:
                function_name = f"wrapper_{arg1}_{arg2}_{arg3}_{arg4}"

                # Default arguments bind the current loop values; a plain closure
                # would see only the last iteration's values.
                def wrapper_function(arg1=arg1, arg2=arg2, arg3=arg3, arg4=arg4):
                    return do_something(arg1, arg2, arg3, arg4)

                # Give the wrapper its generated name so logs and docs stay readable.
                wrapper_function.__name__ = function_name
                globals()[function_name] = wrapper_function

Debugging without explicitly defined keywords seems to be a potential problem (apart from others I am not aware of at the moment), so generating documentation along with the wrapper functions seems mandatory in this case.

I would appreciate any suggestions or pointers to potential problem spots with this approach.

If you want to generate keywords dynamically, I recommend getting familiar with the dynamic library API.
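
A minimal sketch of such a dynamic library (the entity and control names are hypothetical, and set_entity_control is stubbed in place of the real product API):

# EntityControls.py - dynamic library generating one keyword per toggle control

def set_entity_control(entity_type, entity_index, ctrl_name, value):
    """Stub standing in for the real product API call."""
    print(f"set {entity_type}[{entity_index}].{ctrl_name} = {value}")


class EntityControls:

    def __init__(self):
        # Pre-compute the keyword name -> API arguments mapping.
        self._keywords = {}
        for entity_index in range(1, 4):            # hypothetical entity subset
            for ctrl_name in ("mute", "solo"):      # hypothetical toggle controls
                name = f"Toggle {ctrl_name.title()} On Channel {entity_index}"
                self._keywords[name] = ("channel", entity_index, ctrl_name)

    def get_keyword_names(self):
        return sorted(self._keywords)

    def get_keyword_arguments(self, name):
        return []    # toggle keywords take no arguments

    def get_keyword_documentation(self, name):
        entity_type, entity_index, ctrl_name = self._keywords[name]
        return f"Toggles '{ctrl_name}' on {entity_type} {entity_index}."

    def run_keyword(self, name, args, named):
        entity_type, entity_index, ctrl_name = self._keywords[name]
        set_entity_control(entity_type, entity_index, ctrl_name, value=True)

With get_keyword_documentation implemented, Libdoc can generate documentation for the generated keywords, and each one shows up individually in the logs.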