Grafana not able to connect to InfluxDB

Hi Guys,
I need help with the issue below:

I am using the TIG stack versions below:
Grafana v6.0.0
InfluxDB v1.7.4

We have InfluxDB and Grafana installed on the same server.
Ownership and permissions on the important Grafana and InfluxDB directories are correct.

The InfluxDB service status also looks good, as data is being written into the Influx database without problems.
I get the error below when I check the grafana-server status:

2019/12/18 16:33:55 http: proxy error: dial tcp ip-address:8086: connect: connection refused

The error below occurs only when some dashboard panels go blank with the error "Network Error: undefined(undefined)"; after some time the whole dashboard goes blank with "Network error: Bad Gateway (502)".

I also checked the logs in grafana.log and found a couple of issues: one is the error above, and the second is given below:

t=2019-12-10T15:25:45+0530 lvl=eror msg=“Request error” logger=context userId=1 orgId=1 uname=admin error=“net/http: abort Handler” stack=“/usr/local/go/src/runtime/panic.go:513 (0x435098)\n/usr/local/go/src/net/http/httputil/reverseproxy.go:284 (0xabb577)\n/go/src/github.com/grafana/grafana/pkg/api/pluginproxy/ds_proxy.go:99 (0xd0f66c)\n/go/src/github.com/grafana/grafana/pkg/api/dataproxy.go:35 (0xfed2b2)\n/go/src/github.com/grafana/grafana/pkg/api/api.go:258 (0x1020d03)\n/usr/local/go/src/runtime/asm_amd64.s:522 (0x46385a)\n/usr/local/go/src/reflect/value.go:447 (0x4c1223)\n/usr/local/go/src/reflect/value.go:308 (0x4c0cb3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:177 (0x7966d3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:137 (0x796039)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:121 (0x7b2330)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:112 (0x7b224e)\n/go/src/github.com/grafana/grafana/pkg/middleware/request_tracing.go:25 (0xa4f16c)\n/usr/local/go/src/runtime/asm_amd64.s:522 (0x46385a)\n/usr/local/go/src/reflect/value.go:447 (0x4c1223)\n/usr/local/go/src/reflect/value.go:308 (0x4c0cb3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:177 (0x7966d3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:137 (0x796039)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:121 (0x7b2330)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:112 (0x7b224e)\n/go/src/github.com/grafana/grafana/pkg/middleware/request_metrics.go:17 (0xa4ea87)\n/usr/local/go/src/runtime/asm_amd64.s:522 (0x46385a)\n/usr/local/go/src/reflect/value.go:447 (0x4c1223)\n/usr/local/go/src/reflect/value.go:308 (0x4c0cb3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:177 (0x7966d3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:137 (0x796039)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:121 (0x7b2330)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:112 (0x7b224e)\n/go/src/github.com/grafana/grafana/pkg/middleware/recovery.go:147 (0xa4e9a0)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:79 (0x7b21d0)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:157 (0x7963c0)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:135 (0x79612b)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:121 (0x7b2330)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:112 (0x7b224e)\n/go/src/github.com/grafana/grafana/pkg/middleware/logger.go:34 (0xa4c1ed)\n/usr/local/go/src/runtime/asm_amd64.s:522 (0x46385a)\n/usr/local/go/src/reflect/value.go:447 (0x4c1223)\n/usr/local/go/src/reflect/value.go:308 (0x4c0cb3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:177 (0x7966d3)\n/go/src/github.com/grafana/grafana/vendor/github.com/go-macaron/inject/inject.go:137 (0x796039)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/context.go:121 (0x7b2330)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/router.go:187 (0x7c2b27)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/router.go:303 (0x7bd29e)\n/go/src/github.com/grafana/grafana/vendor/gopkg.in/macaron.v1/macaron.go:220 
(0x7b67f1)\n/usr/local/go/src/net/http/server.go:2741 (0x6b379a)\n/usr/local/go/src/net/http/server.go:1847 (0x6afb35)\n/usr/local/go/src/runtime/asm_amd64.s:1333 (0x465570)\n”

After this, InfluxDB reads all the shards/files again, data points show up on the dashboard, and the whole cycle repeats, even though data loading into InfluxDB seems fine. I think the issue arises from Grafana itself, but I am unable to pinpoint it.

Please let me know if you all need more information.

Warm Regards,
//Ashlesh Mandke

Hey, @ashlesh!

Did you check your:

  • local network settings,
  • proxy settings,
  • firewall settings
  • and datasource settings (incl. credentials)?

Looks like at least one of those isn’t set up properly so Grafana can’t connect to this datasource.
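
One quick way to rule out a network or firewall problem is to hit InfluxDB's /ping endpoint directly from the Grafana host (assuming the default port 8086):

curl -i http://localhost:8086/ping

A healthy InfluxDB 1.x instance should answer with HTTP 204 No Content; a "connection refused" here would point at InfluxDB not listening on that address/port rather than at Grafana.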

Best regards,
Lukas

Hey @lukaszsiatka, thanks for replying.

I have checked all the settings you mentioned and everything seems fine. There is no proxy, no firewall issue, and the datasource connects successfully. For a minute or so I am able to see data points in Grafana, and then it goes blank (without restarting any services). Please let me know what else I should check.

The last remaining option is to upgrade Grafana and InfluxDB to their latest versions; let's keep that aside for now.

The traceback looks like InfluxDB is closing the connection, or you are getting a timeout.

You can get more detail by exporting these environment variables:

GF_LOG_LEVEL=debug
GF_DATAPROXY_LOGGING=true

Or by setting the equivalent options in your defaults.ini config file:

[log]
level=debug
[dataproxy]
logging=true

If the queries are taking a long time to respond (Influx under heavy load, or any other reason), try increasing the timeout (the default is 30 seconds):

[dataproxy]
timeout=60

The equivalent environment variable would be GF_DATAPROXY_TIMEOUT=60.
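
If Grafana runs as a systemd service (a package install on Ubuntu/CentOS, for example), one way to set these variables is a unit override; a minimal sketch, assuming the unit is named grafana-server:

sudo systemctl edit grafana-server
# in the editor that opens, add:
[Service]
Environment="GF_LOG_LEVEL=debug"
Environment="GF_DATAPROXY_LOGGING=true"
Environment="GF_DATAPROXY_TIMEOUT=60"
# then apply it:
sudo systemctl daemon-reload
sudo systemctl restart grafana-server

With debug and dataproxy logging on, grafana.log should then contain entries for each proxied data source request, which helps tell a timeout apart from a refused connection.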

Hi there.
I have exactly the same issue with a plain Grafana & InfluxDB install on the same machine. No passwords on InfluxDB, no firewalls on the machine. Bad Gateway 502.
tcpdump shows no traffic at all to the InfluxDB tcp/8086 port.
Log, pcap and screenshot are at https://owncloud5.cpm.ru/index.php/s/8LN0HA2FUZeEuKu
because I can't attach them here.
grafana 6.5.2
influxdb 1.7.8-1
platform: ubuntu 18.04.3 LTS
An otherwise identical installation (Grafana 5.4.2 / InfluxDB 1.7.8-1 / Ubuntu 16.04.6 LTS) works perfectly.

The screenshot doesn't show the URL filled in. Here's what mine looks like (running Grafana and InfluxDB on the same system: Ubuntu 18.04, Grafana 6.5.2, Influx 1.7).

I didn’t see anything in the logfile related to influx queries.

Oh, my fault. Thank you, Brian.
I thought the greyed-out hint "http://localhost:8086/" would work as a default and did not fill it in, since it was already shown there, which slightly confused me.
With the URL above actually filled in, everything's fine.
:ok_hand:

I have run into that issue in the past too. The grey localhost placeholder suggests that it is a default value that does not need filling in.

Hi All,

An update to the issue-
I have observed high memory & cpu usage.
I still have a question: could high memory usage cause a timeout on the HTTP endpoint? If queries take longer to process due to low RAM, Influx may time out, resulting in the refused connections.
CPU usage: varying from 90% to 800%
Memory usage: constantly above 95%
This results in slow responses from the server, and I am now also seeing the error below:
TSM compaction (start)" engine=tsm1 tsm1_strategy=full
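
If the box really is running out of memory, a quick check I can run (assuming a Linux host with systemd) is whether the kernel has OOM-killed influxd and whether the service keeps restarting:

dmesg -T | grep -i -E "out of memory|killed process"
journalctl -u influxdb | grep -i -E "oom|started|stopped"

If influxd is being killed and restarted, that would explain the "connection refused" while it is down and the "Bad Gateway" while it reloads shards.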

For my setup, I added these environment variables to reduce the impact of indexing/reporting, and it has been stable on a small VM (1 CPU, 2 GB RAM), using roughly 1.3 GB with Grafana + Influx in Docker.

Without these, I would see memory max out and Influx would terminate.

INFLUXDB_REPORTING_DISABLED=true
INFLUXDB_DATA_INDEX_VERSION=tsi1
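
If you're not running Influx in Docker, the equivalent settings in influxdb.conf (1.x layout) would be roughly:

reporting-disabled = true

[data]
  index-version = "tsi1"

Note that switching index-version only affects newly created shards; existing shards may also need their index rebuilt (influx_inspect buildtsi) before they stop using the in-memory index.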

We implemented the parameters you suggested, but it is still giving the same error.