Provide option to get result per iteration #2307

@SamPosh

Feature Description

Our team uses k6 for scale and reliability testing rather than as a performance testing tool. The main reason we chose k6 is its built-in concurrency: multi-user and multi-iteration (parallel execution) support.

If we used a functional testing framework like pytest, we would have to handle this concurrency and multi-user/multi-iteration scenario ourselves, so we decided to use k6. The problem is that k6 is built for large-scale performance testing: the results/metrics are reported in a consolidated form across all iterations, whereas we need the results/metrics for each individual iteration. I know the intended design of k6 is large-scale performance testing, but getting a result per iteration would be really useful for simple scale and reliability tests.

The iteration result object should contain all the result parameters provided by the --summary-export CLI option.
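For reference, the closest approximation I know of today is to tag a custom metric with the iteration number from k6/execution and post-process the JSON output (k6 run --out json=results.json). A minimal sketch, assuming a placeholder endpoint and the per-vu-iterations executor:

```javascript
// Workaround sketch: tag samples with the iteration number so per-iteration
// values can be grouped from the JSON output afterwards.
import http from 'k6/http';
import exec from 'k6/execution';
import { Trend } from 'k6/metrics';

// Custom trend metric; the second argument marks it as a time value.
const iterationHttpDuration = new Trend('iteration_http_duration', true);

export const options = {
  scenarios: {
    reliability: {
      executor: 'per-vu-iterations',
      vus: 5,
      iterations: 10,
    },
  },
};

export default function () {
  const res = http.get('https://test.k6.io'); // placeholder endpoint
  // exec.scenario.iterationInTest is the unique iteration number across all VUs.
  iterationHttpDuration.add(res.timings.duration, {
    iteration: String(exec.scenario.iterationInTest),
  });
}
```

This only captures the metrics we tag explicitly, not the full set of parameters that --summary-export produces, which is why a built-in per-iteration result would help.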

Suggested Solution (optional)

Currently, the handleSummary() callback function returns the end-of-test result. If a handleIterationSummary(data, scenarioData) {} callback were added, this problem might be resolved.

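A minimal sketch of how such a callback might be used in a script. handleIterationSummary and the fields on its scenarioData argument are hypothetical and not part of the current k6 API; handleSummary is the existing end-of-test callback:

```javascript
// handleSummary(data) exists today and runs once at the end of the test.
export function handleSummary(data) {
  // Existing behaviour: one consolidated result for the whole test run.
  return { 'summary.json': JSON.stringify(data) };
}

// Hypothetical API sketch: handleIterationSummary does not exist in k6 today.
// The proposed callback would run once per iteration, with `data` carrying the
// same parameters as --summary-export, scoped to that single iteration.
export function handleIterationSummary(data, scenarioData) {
  // `scenarioData.scenario` and `scenarioData.iteration` are assumed,
  // illustrative fields identifying which scenario/iteration this result is for.
  return {
    [`iteration-${scenarioData.scenario}-${scenarioData.iteration}.json`]:
      JSON.stringify(data),
  };
}
```

The return convention here simply mirrors handleSummary, which maps output file names to content.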

Already existing or connected issues / PRs (optional)

No response

Metadata

Labels

evaluation needed (proposal needs to be validated or tested before fully implementing it in k6), feature
