I'm trying to use Promtail to scrape Kubernetes pod logs. I want Promtail to attach some of the pod annotations as labels, but it's not behaving at all how I expected. I'm running Promtail 2.7.4, installed from the Helm chart.
Here are some of the issues I’m seeing:
- I couldn't get Promtail to attach ANY pod annotations as labels until I added the following to my config:

  ```yaml
  snippets:
    extraRelabelConfigs:
      # keep all kubernetes annotations
      - action: labelmap
        regex: __meta_kubernetes_pod_annotation_(.+)
  ```
- However, when I do the above, it won't let me selectively drop labels afterwards. Also, a bunch of 'standard' labels show up regardless.
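For completeness, my selective-drop attempt looked roughly like this: a `labeldrop` action after the `labelmap` (the regex is just an example of the annotation labels I'd want gone):

```yaml
snippets:
  extraRelabelConfigs:
    # keep all kubernetes annotations as labels
    - action: labelmap
      regex: __meta_kubernetes_pod_annotation_(.+)
    # ...then drop the noisy ones again (this never seems to take effect)
    - action: labeldrop
      regex: job_runner_gitlab_com_(before_sha|sha|ref|url)
```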
My `scrape_configs` looks like the following (I only have one job):

```yaml
scrape_configs:
  - job_name: kubernetes-pods
    kubernetes_sd_configs:
      - role: pod
    relabel_configs:
      - source_labels: [__meta_kubernetes_namespace]
        action: keep
        regex: gitlab-runner
      - source_labels: [__meta_kubernetes_pod_node_name]
        target_label: '__host__'
      - source_labels: [__meta_kubernetes_pod_annotation_project_runner_gitlab_com_id]
        target_label: project_id
      - source_labels: [__meta_kubernetes_pod_annotation_job_runner_gitlab_com_name]
        target_label: job_name
    pipeline_stages:
      - labelallow:
          - project_id
          - job_name
          - node_name
          - namespace
```
Promtail does seem to be filtering out everything that isn't in the gitlab-runner namespace, which is good. However, it does not relabel the annotations as specified (I assume the extraRelabelConfigs is overriding everything?), and it does not remove any labels even though there's a labelallow stage. I also tried using labeldrop, without luck either.
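The labeldrop attempt was essentially the labelallow stage above swapped for something like this (the label names are just examples from the list I get back):

```yaml
pipeline_stages:
  - labeldrop:
      - job_runner_gitlab_com_before_sha
      - job_runner_gitlab_com_sha
      - job_runner_gitlab_com_ref
```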
Here are all the labels that get pushed to Loki with this config, which is far more than I want or need, and I expect it will cause cardinality issues:

```
app
container
filename
job
job_runner_gitlab_com_before_sha
job_runner_gitlab_com_id
job_runner_gitlab_com_name
job_runner_gitlab_com_ref
job_runner_gitlab_com_sha
job_runner_gitlab_com_url
namespace
node_name
pod
project_runner_gitlab_com_id
stream
```
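For contrast, the only labels I actually want on each stream are the four from the labelallow stage:

```
project_id
job_name
node_name
namespace
```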
Looking for any guidance on how to fix this.