Hello, I want to use Loki for several log data sources. For example:
the first Netflow data source (netflow1 for short) needs Loki write (1 pod) + Loki query (30 pods) + Loki frontend (2 pods)
the second Netflow data source (netflow2 for short) needs Loki write (1 pod) + Loki query (30 pods) + Loki frontend (2 pods)
the third Netflow data source (netflow3 for short) needs Loki write (1 pod) + Loki query (30 pods) + Loki frontend (2 pods)
the first DNS data source (dns1 for short) needs Loki write (1 pod) + Loki query (20 pods) + Loki frontend (2 pods)
the second DNS data source (dns2 for short) needs Loki write (1 pod) + Loki query (20 pods) + Loki frontend (2 pods)
the third DNS data source (dns3 for short) needs Loki write (1 pod) + Loki query (20 pods) + Loki frontend (2 pods)
and so on.
If I set up the whole stack for each data source separately, there will be about (1+30+2)*3 + (1+20+2)*3 = 168 pods. That means a lot of resources (such as CPU and memory) are wasted sitting idle when there are no queries.
Is there any way to share the resources? For example, keep 1 Loki write pod per data source, but have all of them share the same 40 Loki query pods + 4 Loki frontend pods.
That would mean only 6+40+4 = 50 pods for all data sources.
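To make the idea concrete, here is a rough sketch of what I have in mind, assuming Loki multi-tenancy is enabled (auth_enabled: true) and assuming a hypothetical shared gateway address (loki-gateway.example.local). Each data source would keep its own tenant ID via the X-Scope-OrgID header, so a single shared pool of query/frontend pods could serve all of them while the data stays logically separated:

```python
import time
import requests

# Hypothetical address of the shared Loki gateway / read path; adjust to your setup.
LOKI_URL = "http://loki-gateway.example.local:3100"


def push_line(tenant: str, labels: dict, line: str) -> None:
    """Push one log line to Loki under the given tenant (X-Scope-OrgID)."""
    payload = {
        "streams": [{
            "stream": labels,
            # Loki expects [timestamp in nanoseconds as a string, log line]
            "values": [[str(time.time_ns()), line]],
        }]
    }
    resp = requests.post(
        f"{LOKI_URL}/loki/api/v1/push",
        json=payload,
        headers={"X-Scope-OrgID": tenant},
        timeout=10,
    )
    resp.raise_for_status()


def query_range(tenant: str, logql: str, minutes: int = 5) -> dict:
    """Run a LogQL range query against the shared read path for one tenant."""
    now_ns = time.time_ns()
    resp = requests.get(
        f"{LOKI_URL}/loki/api/v1/query_range",
        params={
            "query": logql,
            "start": now_ns - minutes * 60 * 10**9,
            "end": now_ns,
        },
        headers={"X-Scope-OrgID": tenant},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


# Each data source uses its own tenant ID but the same shared query pods.
push_line("netflow1", {"job": "netflow"}, "flow src=10.0.0.1 dst=10.0.0.2")
push_line("dns1", {"job": "dns"}, "query example.com A")
print(query_range("netflow1", '{job="netflow"}'))
```

This is only a sketch of the header-based separation I am thinking of; the pod counts and the gateway name are placeholders.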
Is there any method to solve this problem? Thanks very much.
Best wishes.