gRPC shared iterations

Hello,

I was checking the gRPC client and it seems that a new channel is being opened for every iteration. Is there any configuration with which I can have only one channel and stress test that particular channel?

Thank you

Hello, @Rawad
Could you please take a look at this answer and tell us whether it solves the issue?

Thank you @PaulM for the reply, I checked it. I did remove the client close, but I don't think that's my problem. For each iteration it seems I'm getting a new TCP handshake, and I'm not sure whether that's normal or not.

@Rawad Maybe this example will help.
k6 only supports unary calls (info here).
In the example below, 20 connections (preAllocatedVUs and maxVUs) will be established (similar to keep-alive) and they will keep their connections until the end of the test.

import grpc from 'k6/net/grpc';
import { check, sleep } from 'k6';

export let options = {
  scenarios: {
    CreateLoad: {
      executor: 'ramping-arrival-rate',
      startRate: 1,
      stages: [
        { target: 10, duration: '2m' },
        { target: 10, duration: '2m' },
      ],
      preAllocatedVUs: 20,
      maxVUs: 20,
      exec: 'createload',
    },
  },
};

const client = new grpc.Client();
client.load(['definitions'], 'hello.proto');

export function createload() {
  if (__ITER == 0) {
    // Connect only on the VU's first iteration; the channel is then reused
    // for all subsequent iterations instead of opening a new one each time.
    client.connect('grpcbin.test.k6.io:9001', {
      // plaintext: false
    });
  }

  const data = { greeting: 'Bert' };
  const response = client.invoke('hello.HelloService/SayHello', data);

  check(response, {
    'status is OK': (r) => r && r.status === grpc.StatusOK,
  });
}

Thank you so much @PaulM, it does help. I still have one question: in this example, is the rate limited, so that only the preAllocatedVUs can run per second?

@Rawad Yes, there is a limit on both the number of requests per second (rate / target) and the number of VUs. Here is a description.
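For illustration, here is a minimal constant-arrival-rate sketch (the FixedRate scenario name and fixedrate function are just placeholders; the endpoint and proto file are the same ones used in the example above): rate together with timeUnit sets how many iterations are started per second, while preAllocatedVUs / maxVUs cap how many VUs (and therefore gRPC channels) can be active at once.

import grpc from 'k6/net/grpc';
import { check } from 'k6';

export let options = {
  scenarios: {
    FixedRate: {
      executor: 'constant-arrival-rate',
      rate: 10,             // start 10 iterations...
      timeUnit: '1s',       // ...every second
      duration: '2m',
      preAllocatedVUs: 20,  // VUs (and gRPC channels) prepared up front
      maxVUs: 20,           // hard cap on concurrent VUs
      exec: 'fixedrate',
    },
  },
};

const client = new grpc.Client();
client.load(['definitions'], 'hello.proto');

export function fixedrate() {
  if (__ITER == 0) {
    // One connect per VU; the same channel is reused for all later iterations.
    client.connect('grpcbin.test.k6.io:9001', {});
  }

  const response = client.invoke('hello.HelloService/SayHello', { greeting: 'Bert' });

  check(response, {
    'status is OK': (r) => r && r.status === grpc.StatusOK,
  });
}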

OK, thank you for your help.