I am working on a script that connects to InfluxDB hosted on a separate Linux server, and there is a layer of ingress between k6 and InfluxDB. Whenever I run the scripts I get the two messages below:
WARN[0058] The flush operation took higher than the expected set push interval. If you see this message multiple times then the setup or configuration need to be adjusted to achieve a sustainable rate. output=InfluxDBv1 t=11.915723758s
ERRO[0058] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
Below is my k6 setup in a Docker Compose file:
k6_crud_tests:
  container_name: crud_tests
  image: grafana/k6:latest
  ports:
    - "6568:6568"
  environment:
    - K6_OUT=influxdb=https://myinfluxdb/k6
    - K6_INFLUXDB_CONCURRENT_WRITES=6
  volumes:
    - ./scripts:/scripts
    - ./utilities:/utilities
    - ./config:/config
  command: run //scripts//CRUD_Tests.js
and below are the environment variables for InfluxDB:
- INFLUXDB_DB=k6
- INFLUXDB_DATA_MAX_SERIES_PER_DATABASE=0
- INFLUXDB_DATA_MAX_VALUES_PER_TAG=0
- INFLUXDB_HTTP_MAX_BODY_SIZE=90000000
The ingress is also configured with a maximum body size of 90000000.
Any ideas how I can get rid of these two messages, please?
Thank you!
Hi @21.prash,
the timeout error tells you that the InfluxDB endpoint you provided cannot be reached from the k6 container.
You should check the Docker networking/connectivity configuration in your infrastructure. It could help to replicate the infrastructure locally first; you can find a Docker Compose file for doing that in the k6 repository.
I hope this helps.
Hello @codebien - this error occurs locally as well when I run it inside Docker Compose. Locally there is nothing between k6 and InfluxDB:
ERRO[0233] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
ERRO[0234] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
ERRO[0235] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
ERRO[0236] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
ERRO[0237] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
ERRO[0238] Couldn't write stats error="{\"error\":\"timeout\"}\n" output=InfluxDBv1
function prepareHeader(token)
{
    return {
        headers: { "Content-Type": "application/json", "Authorization": "Bearer " + token },
        timeout: "4m"
    };
}

export function DeleteItem(item, token)
{
    let res = http.del(appUrl + item, null, prepareHeader(token));
    return res;
}
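For context, these helpers are called from the test roughly like this (the base URL, item id, and token below are simplified placeholders, not my real values):

import { check } from "k6";
// DeleteItem is the helper shown above.

const appUrl = "https://my-api.example.com/items/"; // placeholder base URL

export default function () {
  const token = "dummy-token"; // placeholder; the real token comes from an auth step
  const res = DeleteItem("123", token);
  check(res, { "delete succeeded": (r) => r.status >= 200 && r.status < 300 });
}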
Am I doing something wrong here??
Thank you!
Did you use the same docker compose configuration I shared?
Have you checked whether the k6 container is configured properly to connect to the InfluxDB container? For example, are you able to manually establish a connection?
If you don't know how to do that, you can learn how to set up proper connectivity in Docker from the documentation: Networking with standalone containers | Docker Documentation.
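One quick way to test that from inside the k6 container, without involving the metrics output at all, is a tiny k6 script that hits InfluxDB's /ping health endpoint (adjust the influxdb hostname and port to whatever your Compose service uses):

// ping_influxdb.js - a minimal connectivity check, separate from the CRUD tests.
import http from "k6/http";
import { check } from "k6";

export default function () {
  // InfluxDB 1.x answers /ping with 204 No Content when it is reachable.
  const res = http.get("http://influxdb:8086/ping");
  check(res, { "influxdb reachable (status 204)": (r) => r.status === 204 });
}

You could drop that into the /scripts folder you already mount and run it through the same k6 service. If the check fails, the problem is Docker networking rather than the output configuration; if it passes, the timeouts point at InfluxDB struggling to keep up with the writes.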
Below is my compose file with the config details…
version: '3.7'

networks:
  default:
    name: my-api-network

services:
  influxdb:
    image: influxdb:1.8
    restart: always
    ports:
      - "8086:8086"
    environment:
      - INFLUXDB_DB=k6
      - INFLUXDB_DATA_MAX_SERIES_PER_DATABASE=0
      - INFLUXDB_DATA_MAX_VALUES_PER_TAG=0
      - INFLUXDB_HTTP_MAX_BODY_SIZE=50000000

  grafana:
    depends_on:
      - influxdb
    image: grafana/grafana:latest
    restart: always
    ports:
      - "3000:3000"
    environment:
      - GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
      - GF_AUTH_ANONYMOUS_ENABLED=true
      - GF_AUTH_BASIC_ENABLED=false
    volumes:
      - ./grafana:/etc/grafana/provisioning/

  k6_crud_tests:
    depends_on:
      - grafana
    image: grafana/k6:latest
    ports:
      - "6568:6568"
    environment:
      - K6_OUT=influxdb=http://influxdb:8086/k6
      - K6_INFLUXDB_CONCURRENT_WRITES=6
    volumes:
      - ./tests/scripts:/scripts
      - ./tests/utilities:/utilities
      - ./tests/config:/config
    command: run //scripts//CRUD_Tests.js
Hi @21.prash,
I don’t see any specific issue with your docker compose config. Can you try to run a simple script like this from the k6 repository, please?
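For reference, a minimal script in the spirit of samples/http_get.js looks roughly like this (the exact file in the repository may differ slightly):

import http from "k6/http";

export default function () {
  http.get("https://test.k6.io");
}

Running it with your K6_OUT setting left as is keeps the InfluxDB output in play while taking the CRUD script itself out of the picture.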
I got your configuration running as expected with that http_get.js script. Which command are you using for running the containers?
Let me know.
It's working fine now. The DevOps team made some changes on the InfluxDB and Grafana side, and it works like a charm! Thank you!