Hello everyone, I want to see how the load (http_reqs) is distributed across each API. How do I configure the k6 script?
Can anyone provide a demo?
You can define thresholds on `http_reqs` sub-metrics based on the `scenario` tag, which will make them show up in the end-of-test summary. See this and this for more information and examples.
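For example, here is a minimal sketch of that approach (the scenario names, exec functions, and URLs are placeholders, not from the original thread). A permissive dummy threshold on each per-scenario sub-metric makes k6 track it and print it in the end-of-test summary:

```javascript
import http from 'k6/http';

export const options = {
  scenarios: {
    login:  { executor: 'constant-vus', vus: 5,  duration: '30s', exec: 'login' },
    browse: { executor: 'constant-vus', vus: 10, duration: '30s', exec: 'browse' },
  },
  thresholds: {
    // 'count>=0' always passes; its only purpose is to force k6 to track
    // these sub-metrics so the per-scenario request counts appear in the summary.
    'http_reqs{scenario:login}':  ['count>=0'],
    'http_reqs{scenario:browse}': ['count>=0'],
  },
};

export function login()  { http.get('https://test.k6.io/'); }
export function browse() { http.get('https://test.k6.io/news.php'); }
```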
Hi @ned, thanks for your pointer. Now I'd like to know how to remove the sub-metric (expected_response:true) from the end-of-test summary.
Unfortunately you can’t delete the sub-metric right now; we have to implement Add explicit tracking and ignoring of metrics and sub-metrics · Issue #1321 · grafana/k6 · GitHub first.
However, you can hide it by implementing a custom handleSummary() function that prints the end-of-test summary without it; see End of test.
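A sketch of such a handleSummary(), using the textSummary helper from jslib (this is one possible way to do it, not the only one; it needs the k6 runtime to execute):

```javascript
import { textSummary } from 'https://jslib.k6.io/k6-summary/0.0.1/index.js';

export function handleSummary(data) {
  // Delete every "{expected_response:true}" sub-metric entry before rendering.
  for (const name of Object.keys(data.metrics)) {
    if (name.includes('{expected_response:true}')) {
      delete data.metrics[name];
    }
  }
  // Print the usual text summary to stdout, minus the deleted entries.
  return { stdout: textSummary(data, { indent: ' ', enableColors: true }) };
}
```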
@ned, I found a way to hide { expected_response:true } by restricting the system tags, but with that approach the http_reqs and iterations numbers in the end-of-test summary don't match. It feels like a k6 bug.
Good point about systemTags, I didn’t realize that by excluding expected_response from them you’d also hide the default sub-metric on it… Just keep in mind that with your current setting of just ['scenario', 'status'], you are also telling k6 not to attach a lot of other useful system tags to your metric measurements. You might need those, since I see you are using an InfluxDB output.
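For reference, a sketch of what that restriction looks like in the options (the comment lists some of the default system tags that get dropped):

```javascript
export const options = {
  // Only these tags will be attached to metric samples. Compared to k6's
  // defaults, this drops e.g. method, url, name, group, check, error,
  // error_code, and expected_response — tags an InfluxDB/Grafana setup
  // often relies on for filtering and grouping.
  systemTags: ['scenario', 'status'],
};
```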
Anyway, to answer your question, I think this difference between iterations and http_reqs is not a bug in k6, nor is it caused by the systemTags option you specified. It’s most likely a result of the gracefulStop: '0s' option you have in your scenarios. That option causes k6 to immediately abort any iterations that are still running at the end of the scenario, and k6 only counts fully completed iterations in the iterations metric. So I assume that in one of the iterations at the end of the test, the HTTP request finished (and was counted in http_reqs), but the iteration was interrupted before it was also counted, and that’s why you see a difference.
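In other words, the mismatch comes from a scenario configured roughly like this (a sketch; the scenario name and executor settings are placeholders):

```javascript
export const options = {
  scenarios: {
    my_scenario: {
      executor: 'constant-vus',
      vus: 10,
      duration: '1m',
      // '0s' aborts in-flight iterations the moment the scenario ends, so a
      // completed HTTP request can count in http_reqs while its surrounding
      // iteration never counts in iterations. The default is '30s', which
      // gives running iterations time to finish gracefully.
      gracefulStop: '0s',
    },
  },
};
```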
Thanks @ned, I got it. Now I've run into another problem: when I use the InfluxDB output with multiple scenarios, I can't see VUs, RPS, errors, or response times per scenario in Grafana, even though I defined tags to categorize k6 entities on each scenario's requests and checks.
Sorry, I can’t help you much here, but I suggest editing the dashboard and filtering the metrics by the scenario tag; see InfluxDB data source | Grafana documentation.
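For example, assuming the k6 InfluxDB v1 output (measurements like http_reqs with a scenario tag on each point), a per-scenario requests-per-second panel might use an InfluxQL query along these lines (a sketch; adjust measurement and field names to your setup):

```sql
-- Requests per second, grouped by scenario (InfluxQL sketch; names assumed)
SELECT sum("value") FROM "http_reqs"
WHERE $timeFilter
GROUP BY time($__interval), "scenario" fill(null)
```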
Anyway, thank you for your help @ned!