How to do serial difference in grafana ES datasource? I have a time series that tracks bytes written (wr) and bytes read (rd) reported by the containers in my K8s node. These fields are total count fields. I want to get the serial difference() over a period of time for the wr and rd, like wr[n+1]-wr[n]. How can I do that with the script field? ES 6.8.10 grafana 6.1.1
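For clarity, this is the computation being asked for, as a minimal Python sketch with made-up sample values for a cumulative `wr` counter (the numbers are illustrative only):

```python
# Hypothetical sample of cumulative "wr" (bytes written) counter values,
# one reading per scrape interval.
wr = [100, 250, 400, 400, 900]

# Serial difference: wr[n+1] - wr[n], i.e. bytes written per interval.
wr_diff = [b - a for a, b in zip(wr, wr[1:])]
print(wr_diff)  # → [150, 150, 0, 500]
```

This is exactly what Elasticsearch's derivative pipeline aggregation produces per histogram bucket.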
P.S. I posted this question as a reply to a similar topic, but thought it should have its own thread.
Use Derivative instead of Max
I don’t have Derivative in the drop-down list; is it available in newer Grafana versions? There is “Extended Stats”, which has standard deviation, Sigma, and “script”. That’s why I think script is my best bet. I’m trying to do something like _value-doc[t-1].value
Can it be done with the metaqueries plugin? I installed the plugin but I don’t know the query syntax.
In my configuration using Grafana 7.3.3, Derivative is available. I think starting from version 7.x, Derivative is already there.
I upgraded Grafana to 7.3.2 but kept my current ES version. I see Derivative now, but it does not work on the field “wr” directly (the drop-down is empty, see image 1). I had to choose Max and then Derivative of Max (see image 2). Why can’t I just do Derivative on field ‘wr’ directly? Which metric gives me the value of the ‘wr’ sample and not max/sum/etc…?
Thanks
You need to define the first metric, e.g. Avg, Max, Min, Count, etc. Then add a new metric to take the derivative of the previous one, because Elasticsearch needs to build the buckets first; then, from each previous bucket, it calculates the derivative.
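For reference, the raw Elasticsearch query Grafana builds behind the panel looks roughly like this (a sketch for ES 6.8 syntax; the field names `@timestamp` and `wr`, the 5m interval, and the aggregation names are assumptions you should adapt to your mapping):

```json
{
  "size": 0,
  "aggs": {
    "per_interval": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "5m",
        "min_doc_count": 0
      },
      "aggs": {
        "wr_max": { "max": { "field": "wr" } },
        "wr_rate": { "derivative": { "buckets_path": "wr_max" } }
      }
    }
  }
}
```

The derivative is a pipeline aggregation: its `buckets_path` must point at a sibling metric (`wr_max` here) inside a parent histogram, which is why the UI won’t let you apply Derivative to the raw field without picking a metric first.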
To view the derivative only, just click the eye icon on the first metric until it is disabled; then you only have the derivative value in the Graph chart.
Yes, that’s what I did. It did NOT work until I manually set the “Date Histogram” interval to 5 minutes instead of “auto”. So yes, it worked. Thanks.
But after a few hours the dashboard quit displaying and threw an error: “Index -1 out of bounds for length 106”. I’m not sure why that is.
Anyway, I’ll mark this as a solution. I’ll look for the resolution of this new issue in another thread.