Datasource 403 Forbidden "not included in whitelist"

Overview of Environment Setup

I’ve just created two Docker containers—one for Grafana and one for Prometheus—on the same host, a home Linux server. I haven’t specified a Docker network for either of them, so they are both on the default network. Both Grafana and Prometheus load on their respective ports (192.168.1.75, ports :3000 / :9090), but I cannot seem to add Prometheus as a datasource.

Specific error encountered

Within Grafana, using “Server” (proxy) access mode, I get a 403 back with this message: {"message":"Data proxy hostname and ip are not included in whitelist"}{"message":"target URL is not a valid target"}. I don’t know what this means, and I can’t seem to find the answer online. (“Browser” mode for the datasource doesn’t work due to a CORS error, but I shouldn’t need browser mode anyway, right?)

Interestingly, other requests work; only the ‘query’ request returns the error…

This has to be a common problem. Can anyone help?

Hi @daevski - not sure it’s a “common” problem necessarily; I think it’s the first time I’ve seen something like this 🙂

Couple of questions:

  1. Can you share your actual datasource configuration?
  2. I suspect there will be some errors in your Grafana server log also, can you share those?

When running in Docker, one common mistake (from what I’ve seen) is to assume that the Grafana server “sees” the same network environment (ports on localhost, etc) that is present on the host machine. It sounds like you know what you’re doing in this respect, but thought I’d put it out there.
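To illustrate what I mean (the host IP and port are taken from your post above): inside the Grafana container, localhost refers to the container itself, so a datasource URL like http://localhost:9090 would never reach Prometheus. Pointing at the host IP avoids that, e.g. in a provisioning file (just a sketch; the file path and datasource name are examples, not something from your setup):

```yaml
# e.g. /etc/grafana/provisioning/datasources/prometheus.yaml
apiVersion: 1
datasources:
  - name: Prometheus
    type: prometheus
    access: proxy                    # "Server" access mode in the UI
    url: http://192.168.1.75:9090    # host IP, not localhost
```

The same URL works when entered manually in the datasource settings page, of course.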

Finally, the fact you’re getting a CORS error when running in browser mode also seems a little fishy, as I don’t think that should be the case. But I haven’t used Prometheus myself so maybe that’s to be expected.

Certainly, @svetb, thanks for the response!

This is how it’s configured in the web UI:

[screenshot: Prometheus datasource configuration]

As you can see, the datasource was added successfully, but the test fails. And this is what happens in a dashboard that’s using that datasource:

[screenshot: dashboard panel showing the datasource error]

Here are the Docker logs; let me know if some other logs would help as well. I’m not sure where any of them live yet.

t=2021-06-09T15:41:33+0000 lvl=info msg="Starting Grafana" logger=server version=7.5.7 commit=91de51771c branch=HEAD compiled=2021-05-17T19:16:45+0000
t=2021-06-09T15:41:33+0000 lvl=info msg="Config loaded from" logger=settings file=/usr/share/grafana/conf/defaults.ini
t=2021-06-09T15:41:33+0000 lvl=info msg="Config loaded from" logger=settings file=/etc/grafana/grafana.ini
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from command line" logger=settings arg="default.paths.data=/var/lib/grafana"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from command line" logger=settings arg="default.paths.logs=/var/log/grafana"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from command line" logger=settings arg="default.paths.plugins=/var/lib/grafana/plugins"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from command line" logger=settings arg="default.paths.provisioning=/etc/grafana/provisioning"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from command line" logger=settings arg="default.log.mode=console"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from Environment variable" logger=settings var="GF_PATHS_DATA=/var/lib/grafana"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from Environment variable" logger=settings var="GF_PATHS_LOGS=/var/log/grafana"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from Environment variable" logger=settings var="GF_PATHS_PLUGINS=/var/lib/grafana/plugins"
t=2021-06-09T15:41:33+0000 lvl=info msg="Config overridden from Environment variable" logger=settings var="GF_PATHS_PROVISIONING=/etc/grafana/provisioning"
t=2021-06-09T15:41:33+0000 lvl=info msg="Path Home" logger=settings path=/usr/share/grafana
t=2021-06-09T15:41:33+0000 lvl=info msg="Path Data" logger=settings path=/var/lib/grafana
t=2021-06-09T15:41:33+0000 lvl=info msg="Path Logs" logger=settings path=/var/log/grafana
t=2021-06-09T15:41:33+0000 lvl=info msg="Path Plugins" logger=settings path=/var/lib/grafana/plugins
t=2021-06-09T15:41:33+0000 lvl=info msg="Path Provisioning" logger=settings path=/etc/grafana/provisioning
t=2021-06-09T15:41:33+0000 lvl=info msg="App mode production" logger=settings
t=2021-06-09T15:41:33+0000 lvl=info msg="Connecting to DB" logger=sqlstore dbtype=sqlite3
t=2021-06-09T15:41:33+0000 lvl=info msg="Starting DB migrations" logger=migrator
t=2021-06-09T15:41:33+0000 lvl=info msg="migrations completed" logger=migrator performed=0 skipped=279 duration=759.319µs
t=2021-06-09T15:41:33+0000 lvl=info msg="Starting plugin search" logger=plugins
t=2021-06-09T15:41:34+0000 lvl=info msg="Registering plugin" logger=plugins id=input
t=2021-06-09T15:41:34+0000 lvl=info msg="Registering plugin" logger=plugins id=flant-statusmap-panel
t=2021-06-09T15:41:34+0000 lvl=info msg="Registering plugin" logger=plugins id=yesoreyeram-boomtable-panel
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368622Z","queryType":"exponential_heatmap_bucket_data"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368689Z","queryType":"linear_heatmap_bucket_data"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368717Z","queryType":"random_walk"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368747Z","queryType":"predictable_pulse"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368770Z","queryType":"predictable_csv_wave"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368799Z","queryType":"random_walk_table"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368822Z","queryType":"slow_query"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368854Z","queryType":"no_data_points"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368888Z","queryType":"datapoints_outside_range"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368914Z","queryType":"manual_entry"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368943Z","queryType":"csv_metric_values"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368966Z","queryType":"streaming_client"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.368991Z","queryType":"live"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369024Z","queryType":"grafana_api"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369051Z","queryType":"arrow"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369081Z","queryType":"annotations"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369103Z","queryType":"table_static"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369133Z","queryType":"random_walk_with_error"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369156Z","queryType":"server_error_500"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369181Z","queryType":"logs"}
{"@level":"debug","@message":"datasource: registering query type handler","@timestamp":"2021-06-09T15:41:34.369209Z","queryType":"node_graph"}
{"@level":"debug","@message":"datasource: registering query type fallback handler","@timestamp":"2021-06-09T15:41:34.369230Z"}
t=2021-06-09T15:41:34+0000 lvl=info msg="HTTP Server Listen" logger=http.server address=[::]:3000 protocol=http subUrl= socket=
t=2021-06-09T15:42:55+0000 lvl=info msg="Request Completed" logger=context userId=0 orgId=0 uname= method=GET path=/ status=302 remote_addr=192.168.1.131 time_ms=0 size=29 referer=
2021/06/09 15:42:58 http: superfluous response.WriteHeader call from gopkg.in/macaron%2ev1.(*responseWriter).WriteHeader (response_writer.go:60)
t=2021-06-09T15:42:58+0000 lvl=info msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/datasources/proxy/2/api/v1/query status=403 remote_addr=192.168.1.131 time_ms=24 size=116 referer=http://192.168.1.75:3000/datasources/edit/2/

For good measure, here is the Grafana docker-compose service definition; the Grafana Docker image version is v7.5.7 (91de51771c).

grafana:
    container_name: grafana
    image: grafana/grafana
    restart: unless-stopped
    volumes:
      - grafana-storage:/var/lib/grafana
      # Binding file to container
      - ./configs/grafana/grafana.ini:/etc/grafana/grafana.ini:rw
    ports:
      - "3000:3000"

Ok, so none of that looks obviously wrong.
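One aside while we’re here: since you’re not using an explicit network, Grafana has to reach Prometheus via the host IP. If you put both services on a named compose network, Grafana can reach Prometheus by service name instead. A minimal sketch (the Prometheus image and service name are assumptions on my part, since your excerpt only shows Grafana):

```yaml
services:
  grafana:
    image: grafana/grafana
    networks: [monitoring]
  prometheus:
    image: prom/prometheus
    networks: [monitoring]
networks:
  monitoring: {}
# The datasource URL would then be http://prometheus:9090
```

Not the cause of your 403, but worth knowing about.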

I went ahead and googled “Data proxy hostname and ip are not included in whitelist”. The first result is this GitHub issue, raised by a user who (accidentally?) set Grafana’s data_source_proxy_whitelist setting and got the same error. Did you by any chance set that in your Grafana config? That would likely explain what you’re seeing…

Yes, actually. It was set. When I comment out that configuration, I am able to use that datasource!
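For anyone who finds this later, this is the grafana.ini setting that was tripping things up (the section name and value format are per my reading of Grafana’s configuration docs; the host:port value shown is from my setup):

```ini
[security]
; Either comment the whitelist out entirely (disables the restriction)...
;data_source_proxy_whitelist =
; ...or keep it and explicitly allow the Prometheus endpoint
; (space-separated host:port entries):
data_source_proxy_whitelist = 192.168.1.75:9090
```

With the setting commented out, the data proxy will forward requests to any datasource URL.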

Thanks for the assist on this. I may have looked at that GitHub bug and assumed that couldn’t possibly be the issue, which is why it’s nice to have someone else challenge your assumptions. Appreciate it!
