Remove iteration numbers of loops

Issue #2 new
laurent bristiel created an issue

When we use the tool with FOR statements, each iteration appears as a keyword. That is neither readable nor useful, I think. Example:

*** test cases ***
loop  
    sleep  1
    :FOR  ${index}  IN RANGE  42
    \  should be equal  ${index}  ${index}

Result:

Total time (s) |   Calls | avg time (s) | median time (s) | stdev (s) | stdev/avg time % | Keyword name
         1.001 |       1 |        1.001 |           1.001 |       0.0 |              0.0 | "BuiltIn.Sleep"
         0.024 |       1 |        0.024 |           0.024 |       0.0 |              0.0 | "${index} IN RANGE [ 42 ]"
         0.013 |      42 |          0.0 |             0.0 |       0.0 |                0 | "BuiltIn.Should Be Equal"
         0.001 |       1 |        0.001 |           0.001 |       0.0 |              0.0 | "41"
         0.001 |       1 |        0.001 |           0.001 |       0.0 |              0.0 | "24"
         0.001 |       1 |        0.001 |           0.001 |       0.0 |              0.0 | "26"

Would it be possible to remove those entries that are not really keywords?
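One way such filtering could look, sketched in Python (the row format and function names here are hypothetical, purely for illustration; the tool's internal data structures may differ): drop rows whose keyword "name" is purely numeric, since FOR-loop iterations show up as "0", "1", and so on.

```python
# Hypothetical row format: (total_time_s, calls, keyword_name).
# Assumption: FOR-loop iterations appear with purely numeric "names"
# such as "41", while real keywords contain letters or a library prefix.
# (The "${index} IN RANGE [ 42 ]" row would need extra logic to drop.)

def is_iteration_marker(name):
    """Return True for pseudo-keywords like "41" created by FOR loops."""
    return name.strip('"').isdigit()

def filter_stats(rows):
    """Drop rows that are loop-iteration markers, not real keywords."""
    return [row for row in rows if not is_iteration_marker(row[2])]

rows = [
    (1.001, 1, '"BuiltIn.Sleep"'),
    (0.013, 42, '"BuiltIn.Should Be Equal"'),
    (0.001, 1, '"41"'),
    (0.001, 1, '"24"'),
]

for row in filter_stats(rows):
    print(row)
```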

Comments (4)

  1. Mikko Korpela

    Do you have a concrete case where this is a problem?

    I would guess that in realistic situations the iterations wouldn't be in the first 100 keywords.

  2. laurent bristiel reporter

    You are right, this is not a real problem. I just stumbled upon it when I was reviewing the stats for my suite and did not understand what those keywords named "0", "1", etc. were. I just don't think it is relevant to have them there, and I thought that if there was an easy way to remove them, then maybe we should. But like I said, no real issue.

  3. Mikko Korpela

    But the idea of filtering out keywords somehow is, in my opinion, valuable. I'll have to see whether more comments or issues about it come in.

    I've personally just used grep (and modified the tool when more complex logic is needed). The variance-based filtering was discovered by accident; it really helps when working with large data sets where the easy optimisations have already been done.
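For reference, the grep approach mentioned above could look like the sketch below (an assumption, not the commenter's exact command): drop report lines whose keyword "name" is just a quoted number, which is how FOR-loop iterations appear. Here the report is fed in on stdin; in practice you would redirect the tool's saved output.

```shell
# Remove rows ending in a quoted number (e.g. "41"), i.e. the
# FOR-loop iteration markers, keeping real keyword rows.
grep -Ev '"[0-9]+"[ ]*$' <<'EOF'
1.001 | 1 | "BuiltIn.Sleep"
0.001 | 1 | "41"
0.013 | 42 | "BuiltIn.Should Be Equal"
EOF
```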
