How to Configure Kafka Burrow for a SASL Kafka Cluster

Written by Vishwa Teja Vangari

Published on Jun 29, 2019

Reading time: 2 min read

  • Data & Analytics
  • Observability
  • Technical Deep Dive

Configuring LinkedIn's Kafka Burrow on a SASL Kafka cluster to check consumer lag, topics, and consumers.

Burrow: Kafka Consumer Lag Monitoring Tool

In the modern application development era, many organizations are moving toward event-driven microservices architectures. Apache Kafka is one of the most widely adopted distributed event-streaming platforms for communication between microservices, thanks to its scalability, performance, fault tolerance, durability, reliability, and many other features.

As these organizations adopt an Apache Kafka cluster, applications evolve rapidly to use many hundreds of Kafka topics and even more producer and consumer applications. It becomes quite hard to keep track of hundreds of topics in a cluster, along with information about consumers, offsets, consumer lag, etc. A few open-source monitoring tools for Kafka clusters have emerged to help, such as Yahoo's Kafka Manager, LinkedIn's Burrow, and Landoop's Kafka Tools.

This blog focuses on configuring LinkedIn's Kafka Burrow on a SASL Kafka cluster to check consumer lag, topics, and consumers.

Sample Grafana dashboard after integrating with Burrow

Configuring Burrow using a SASL Connection:
As we already know, SASL authentication for a Kafka cluster can be done in the following three ways:

  • SASL_PLAIN
  • SASL_SCRAM_256
  • SASL_SCRAM_512

As of now, the open-source LinkedIn Burrow supports only SASL PLAIN authentication. To add support for all SASL mechanisms, I forked the base Burrow repository and added support for SASL_SCRAM_256 and SASL_SCRAM_512 at this GitHub repo. A Docker image of this fork is available on Docker Hub (vishwavangari/burrow).

Spin up a local Kafka cluster that accepts clients over SASL authentication, as described in my previous article.
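For reference, enabling SASL/SCRAM on a broker roughly involves the broker listener settings in server.properties and a JAAS file with the broker's own credentials. The snippet below is a minimal sketch under assumed hostnames, ports, and usernames; your actual setup comes from the previous article.

```toml
# server.properties (sketch): expose a SASL listener and enable SCRAM
# listeners=SASL_PLAINTEXT://:9093
# security.inter.broker.protocol=SASL_PLAINTEXT
# sasl.enabled.mechanisms=SCRAM-SHA-256
# sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256

# kafka_server_jaas.conf (sketch): broker-side SCRAM login module;
# "admin"/"admin-secret" are placeholder credentials
# KafkaServer {
#   org.apache.kafka.common.security.scram.ScramLoginModule required
#   username="admin"
#   password="admin-secret";
# };
```

SCRAM user credentials themselves are stored in ZooKeeper and are created with the `kafka-configs.sh` tool, per the standard Kafka SASL/SCRAM setup.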

The burrow.toml configuration file is used when building the Burrow Docker image, so we need to pass in the required cluster parameters while spinning up the Burrow Docker instance.
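A minimal sketch of what such a burrow.toml can look like is shown below. The section layout follows upstream Burrow's TOML config format; all hostnames, ports, and credentials are placeholder values, and in the Docker setup these would be injected as parameters.

```toml
# Sketch of a burrow.toml for a SASL cluster; values are placeholders.
[zookeeper]
servers=[ "zookeeper:2181" ]

[client-profile.sasl-profile]
client-id="burrow-client"
kafka-version="2.0.0"
sasl="kafka-sasl"

# Burrow's SASL credentials for connecting to the cluster
[sasl.kafka-sasl]
username="burrow-user"
password="burrow-password"
handshake-first=true

[cluster.local]
class-name="kafka"
servers=[ "kafka:9093" ]
client-profile="sasl-profile"
topic-refresh=60
offset-refresh=30

[consumer.local-consumers]
class-name="kafka"
cluster="local"
servers=[ "kafka:9093" ]
client-profile="sasl-profile"

[httpserver.default]
address=":8000"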

Using the vishwavangari/burrow and joway/burrow-dashboard Docker images, we pass in the required configuration to spin up Burrow for the SASL Kafka cluster with a docker-compose file. Thanks to joway for building the Burrow Dashboard to visualize consumers, consumer lag, topics, etc.

If the Kafka cluster users were created with the SHA-256 SCRAM algorithm, we can set the Burrow environment variable SHA_ALGORITHM to _sha256 in the docker-compose file.
Note: If the connection between the Burrow Docker container and the local Kafka cluster fails, start the docker-compose file with network_mode set to host.
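Putting the pieces together, a docker-compose file along these lines wires up the two images. The image names come from this article; the service names, ports, and environment variable for the dashboard backend are illustrative assumptions.

```yaml
# Sketch of a docker-compose.yml; ports and env values are assumptions.
version: "3"
services:
  burrow:
    image: vishwavangari/burrow
    environment:
      SHA_ALGORITHM: _sha256       # select SCRAM-SHA-256 (fork-specific)
    ports:
      - "8000:8000"                # Burrow HTTP API
    # network_mode: host           # uncomment if the container cannot
                                   # reach the local Kafka cluster
  burrow-dashboard:
    image: joway/burrow-dashboard
    environment:
      BURROW_BACKEND: http://burrow:8000   # point dashboard at Burrow
    ports:
      - "8095:80"                  # dashboard at http://localhost:8095/
```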

Burrow APIs:

We could leverage these Burrow APIs and integrate them with visualization and alerting frameworks like Grafana or Splunk dashboards, to visualize lag and alert support teams by email or message, by configuring a few rules that fire in case of a cluster disaster, consumers lagging far behind, or anything else abnormal.
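For example, Burrow's v3 HTTP API exposes a consumer-group lag endpoint at `/v3/kafka/<cluster>/consumer/<group>/lag`. The sketch below parses a response of that shape and pulls out the fields an alerting rule would typically look at; the JSON body here is a made-up sample, with field names following Burrow's documented response format.

```python
import json

# Illustrative response body shaped like Burrow's
# GET /v3/kafka/<cluster>/consumer/<group>/lag endpoint;
# values are made up for the example.
sample = json.loads("""
{
  "error": false,
  "message": "consumer status returned",
  "status": {
    "cluster": "local",
    "group": "my-consumer-group",
    "status": "WARN",
    "complete": 1.0,
    "totallag": 1500,
    "partitions": [
      {"topic": "orders", "partition": 0, "status": "WARN",
       "current_lag": 1200},
      {"topic": "orders", "partition": 1, "status": "OK",
       "current_lag": 300}
    ]
  }
}
""")

def lag_summary(body):
    """Return (group status, total lag, partitions that are not OK)."""
    status = body["status"]
    bad = [p for p in status["partitions"] if p["status"] != "OK"]
    return status["status"], status["totallag"], bad

group_status, total_lag, bad_partitions = lag_summary(sample)
print(group_status, total_lag, len(bad_partitions))
```

A rule as simple as "alert when the group status is not OK, or totallag exceeds a threshold" already covers most lag incidents.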

We can hit http://localhost:8095/ for the Burrow dashboard:

Dashboard views: Cluster Detail, Topic, Offset


Burrow, a widely used Kafka monitoring tool, integrates with other visualization and alerting frameworks like InfluxDB, Grafana, and Splunk. If you want to read more about Burrow, refer to this wiki; the Telegraf plugin for Burrow is described at this link.

