
provectus / kafka-ui

Open-Source Web UI for Apache Kafka Management

License: Apache License 2.0

apache-kafka big-data cluster-management event-streaming hacktoberfest kafka kafka-brokers kafka-client kafka-cluster kafka-connect kafka-manager kafka-producer kafka-streams kafka-ui opensource streaming-data streams web-ui

kafka-ui's Introduction

UI for Apache Kafka

Versatile, fast and lightweight web UI for managing Apache Kafka® clusters. Built by developers, for developers.



Docs • Quick Start • Community Discord
AWS Marketplace • ProductHunt

UI for Apache Kafka is a free, open-source web UI to monitor and manage Apache Kafka clusters.

UI for Apache Kafka is a simple tool that makes your data flows observable, helps you find and troubleshoot issues faster, and delivers optimal performance. Its lightweight dashboard makes it easy to track key metrics of your Kafka clusters: Brokers, Topics, Partitions, Production, and Consumption.

DISCLAIMER

UI for Apache Kafka is a free tool built and supported by the open-source community. Curated by Provectus, it will remain free and open-source; no paid features or subscription plans will be added in the future. Looking for help from Kafka experts? Provectus can help you design, build, deploy, and manage Apache Kafka clusters and streaming applications. Discover Professional Services for Apache Kafka to unlock the full potential of Kafka in your enterprise!

Set up UI for Apache Kafka with just a couple of easy commands to visualize your Kafka data in a comprehensible way. You can run the tool locally or in the cloud.

Interface

Features

  • Multi-Cluster Management — monitor and manage all your clusters in one place
  • Performance Monitoring with Metrics Dashboard — track key Kafka metrics with a lightweight dashboard
  • View Kafka Brokers — view topic and partition assignments, controller status
  • View Kafka Topics — view partition count, replication status, and custom configuration
  • View Consumer Groups — view per-partition parked offsets, combined and per-partition lag
  • Browse Messages — browse messages with JSON, plain text, and Avro encoding
  • Dynamic Topic Configuration — create and configure new topics with dynamic configuration
  • Configurable Authentication — secure your installation with optional GitHub/GitLab/Google OAuth 2.0
  • Custom serialization/deserialization plugins — use a ready-to-go serde for your data like AWS Glue or Smile, or code your own!
  • Role-based access control — manage permissions to access the UI with granular precision
  • Data masking — obfuscate sensitive data in topic messages

The Interface

UI for Apache Kafka wraps major functions of Apache Kafka with an intuitive user interface.

Interface

Topics

UI for Apache Kafka makes it easy to create topics in your browser with a few clicks, supply your own parameters, and view the topics in the list.

Create Topic

It's possible to jump from the connectors view to the corresponding topics, and from a topic to its consumers (back and forth), for more convenient navigation.

Connector_Topic_Consumer

Messages

Let's say we want to produce messages for a topic. With UI for Apache Kafka, we can send data/messages to Kafka topics without effort by specifying the parameters, then view the messages in the list.

Produce Message

Schema registry

Three types of schemas are supported: Avro®, JSON Schema, and Protobuf.

Create Schema Registry

Before producing Avro- or Protobuf-encoded messages, you have to add a schema for the topic in the Schema Registry. All of these steps take just a few clicks in a user-friendly interface.

Avro Schema Topic

Getting Started

To run UI for Apache Kafka, you can use either a pre-built Docker image or build it (or a jar file) yourself.

Quick start (Demo run)

docker run -it -p 8080:8080 -e DYNAMIC_CONFIG_ENABLED=true provectuslabs/kafka-ui

Then access the web UI at http://localhost:8080

This command is sufficient to try things out. Once you're done, you can proceed with a persistent installation.

Persistent installation

services:
  kafka-ui:
    container_name: kafka-ui
    image: provectuslabs/kafka-ui:latest
    ports:
      - 8080:8080
    environment:
      DYNAMIC_CONFIG_ENABLED: 'true'
    volumes:
      - ~/kui/config.yml:/etc/kafkaui/dynamic_config.yaml
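With DYNAMIC_CONFIG_ENABLED set, the mounted file holds the cluster definitions. A minimal sketch of such a config, assuming the project's standard `kafka.clusters` structure; the cluster name and bootstrap server address are placeholders:

```yaml
kafka:
  clusters:
    - name: local                    # display name shown in the UI (placeholder)
      bootstrapServers: kafka:9092   # placeholder; point at your brokers
```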

Please refer to our configuration page to proceed with further app configuration.

Some useful configuration related links

Web UI Cluster Configuration Wizard

Configuration file explanation

Docker Compose examples

Misc configuration properties

Helm charts

Quick start

Building from sources

Quick start with building

Liveness and readiness probes

The liveness and readiness endpoint is at /actuator/health.
The info endpoint (build info) is located at /actuator/info.
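These endpoints can back container health checks. For example, a hypothetical Kubernetes pod spec fragment (the paths come from the lines above; port and timings are illustrative assumptions):

```yaml
livenessProbe:
  httpGet:
    path: /actuator/health   # liveness/readiness endpoint per the README
    port: 8080
  initialDelaySeconds: 60    # illustrative value; tune for your startup time
readinessProbe:
  httpGet:
    path: /actuator/health
    port: 8080
```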

Configuration options

All of the environment variables/config properties can be found here.

Contributing

Please refer to our contributing guide; we'll guide you from there.

kafka-ui's People

Contributors

5hin0bi, anezboretskiy, antipova926, apetrovs, arthurniedial, azatsafin, burusha16, david-db88, dependabot-preview[bot], dependabot[bot], gataniel, germanosin, gneyhabub, guzel738, haarolean, habrahamyanpro, iliax, kristi-dev, lazzy-panda, mchukmarov, mgrdich, narekmat, neirubugz, nelydavtyan, razizbekyan, rustamgimadiev, simonyandev, vladsenyuta, winnie-chiu, workshur


kafka-ui's Issues

Does NOT work with cloud ZooKeeper

I've tried to deploy in the cloud along with MSK, based on https://github.com/provectus/quickstart-provectus-streaming-data-platform-kafka
The KAFKA_CLUSTERS_0_ZOOKEEPER param is passed as a string, for example:
"z-1.mskcluster-test1.q75ugu.c3.kafka.us-west-2.amazonaws.com:2181,
z-2.mskcluster-test1.q75ugu.c3.kafka.us-west-2.amazonaws.com:2181,
z-3.mskcluster-test1.q75ugu.c3.kafka.us-west-2.amazonaws.com:2181"

The UI itself is available and represents the passed cluster with brokers, the topics list, topic messages, and settings.

Under the Brokers tab, "ZOOKEEPER STATUS" is present, but it shows that ZooKeeper is offline.

ZooKeeper itself is accessible from the Kafka-UI Fargate task; the Kafka-UI log proves this. The log contains "connection established" messages for each ZooKeeper node.

I've tried to find the reason by enabling debug logging in log4j2, but didn't find any valuable information.

An example of the Kafka-UI log is attached:
kafka-ui-cloudlog.txt

Topic creation flow is invalid

The frontend sends a GET topic request before the POST action.
There is no need to make the additional GET; the info could be retrieved from the POST response.

Consumers UI component

Acceptance criteria:

  • consumer groups component added
    • the user can see a list of groups with the number of consumers & number of topics
    • the user can filter groups by name
  • consumer group component added
    • the user can see a list of consumers with topic, partition, lag, messages behind, current offset, end offset

blocked by #9

Show ISR & OSR

Update In Sync Replicas & Out of Sync Replicas on the broker's page to use the ISR & OSR params from the cluster's payload

blocked by: #5

Topic Messages API

Acceptance criteria:

  • New endpoint to get the list of topic messages added
  • Implement the ability to filter topic messages by:
    • partition
    • offset
    • timestamp
  • Endpoint should be pageable
  • Add schema registry URL config
  • Add schema registry to docker compose
  • Provision one topic with a message schema in the schema registry
  • Treat default content as text
  • Topics can be configured as:
    • text
    • json
    • avro (schema registry)

Delegation tokens

Implement CRUD for cluster tokens.
It should be under a new tab in the cluster menu, below "Consumers".

Consumer Groups API

Acceptance criteria:

  1. Provide a list of consumer groups connected to the cluster
  2. For each consumer group, provide a list of consumer IDs with topic, partition, lag, messages behind, current offset, and end offset
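For reference, per-partition lag ("messages behind") is the distance between the partition's end offset and the consumer's current offset, and the combined lag is their sum. A minimal sketch of that arithmetic (plain Python, not project code):

```python
def partition_lag(end_offset: int, current_offset: int) -> int:
    """Messages behind for a single partition."""
    return max(end_offset - current_offset, 0)

def combined_lag(partitions: list) -> int:
    """Sum of per-partition lags; each tuple is (end_offset, current_offset)."""
    return sum(partition_lag(end, cur) for end, cur in partitions)

# Example: two partitions; one is 10 messages behind, one is caught up
print(combined_lag([(100, 90), (50, 50)]))  # → 10
```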

Update Readme for all components.

We need to add:

  • clear instructions on how to install and run the app
    • e.g. brew cask install java doesn't seem to work anymore
  • info about supported Node versions

NullPointerException

Some requests with TIMESTAMP fail, e.g.

/api/clusters/local/topics/users/messages?limit=100&seekType=TIMESTAMP&seekTo=0::1594120200949&seekTo=1::1594120200949

returns

message: "java.lang.NullPointerException"
path: "/api/clusters/local/topics/users/messages"
requestId: "71f158dc"
status: 500
timestamp: 1594120799702
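The seekTo query parameter encodes partition::timestamp pairs. One way to guard such input is to validate each pair before seeking; an illustrative sketch in Python (a hypothetical helper, not the project's actual Java code):

```python
def parse_seek_to(params):
    """Parse seekTo values of the form 'partition::timestamp' into a dict,
    rejecting malformed pairs instead of failing later with a null."""
    result = {}
    for raw in params:
        partition, sep, timestamp = raw.partition("::")
        if not sep or not partition.isdigit() or not timestamp.isdigit():
            raise ValueError(f"malformed seekTo value: {raw!r}")
        result[int(partition)] = int(timestamp)
    return result

print(parse_seek_to(["0::1594120200949", "1::1594120200949"]))
```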

Topic Messages UI component

Acceptance criteria

  1. New component added
  2. The user can filter messages by keyword & partition
  3. The user can render messages from offset or timestamp

blocked by: #10

Get rid of all unexpected "any" types.

  1. Run npm run lint from /kafka-ui-react-app
  2. Specify a correct type instead of any for all cases from the list
src/components/Topics/Details/Messages/Messages.tsx
  Line 41:29:   Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any
  Line 76:13:   Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any
  Line 101:52:  Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any
  Line 181:43:  Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any
  Line 230:53:  Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any

src/redux/store/configureStore/dev.ts
  Line 9:16:  Unexpected any. Specify a different type  @typescript-eslint/no-explicit-any

Add custom parameters to topic creation form

AC

The user is able to:

  • add multiple custom parameters to the topic creation form
  • select an available custom topic parameter from a list
  • update the value of an added parameter
  • remove an added parameter
  • review validation errors & update custom params


blocked by: #7

Prometheus node exporter metrics source

  1. Add Prometheus node exporter to one cluster (for all nodes)
  2. Add config to resolve node exporter addresses per broker
  3. Add config to read metrics from the node exporter or JMX
  4. Implement metrics reading from the node exporter

Stop reading on empty poll

Now we wait for n messages or a timeout. Let's check the max partition offset before starting to read.
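The proposed logic can be sketched with a stub poll function: stop as soon as a poll returns no records, or once the last polled offset reaches the partition's end offset (illustrative Python, not the actual consumer code; record shape is assumed):

```python
def read_until_empty(poll, end_offset, limit):
    """Collect up to `limit` records, stopping early on an empty poll
    or once the last polled offset reaches end_offset."""
    records = []
    while len(records) < limit:
        batch = poll()
        if not batch:  # empty poll: nothing left to read, stop waiting
            break
        records.extend(batch)
        if batch[-1]["offset"] >= end_offset - 1:  # reached the known end
            break
    return records[:limit]

# Stub poll yielding two non-empty batches, then an empty one
batches = iter([[{"offset": 0}, {"offset": 1}], [{"offset": 2}], []])
print(len(read_until_empty(lambda: next(batches), end_offset=10, limit=100)))  # → 3
```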

Topic messages: Smart filters

  1. Add a query parameter js-filter with a JavaScript predicate function
  2. Add the Nashorn dependency
  3. Filter messages with this function (applied to each record) before the string-inclusion search
  4. Pass variables to the function:
    4.1 key (key object)
    4.2 value (content object)
    4.3 headers
    4.4 offset
    4.5 partition

UPD: the design for the frontend part is already done in Figma

  • Support message preview (partial selection of json fields)
  • Support searching by message headers (requested in #780)
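The flow described above — apply the user's predicate to each record, then run the optional string-inclusion search — can be illustrated with a plain-Python stand-in (the real feature would evaluate user-supplied JavaScript via Nashorn; record shape and function names here are assumptions):

```python
def smart_filter(records, predicate, query=None):
    """Yield records that pass the user predicate and, if given,
    contain `query` as a substring of the stringified value."""
    for rec in records:
        if not predicate(rec["key"], rec["value"], rec["headers"],
                         rec["offset"], rec["partition"]):
            continue  # predicate runs before the string-inclusion search
        if query is None or query in str(rec["value"]):
            yield rec

records = [
    {"key": "a", "value": "user-1 login", "headers": {}, "offset": 0, "partition": 0},
    {"key": "b", "value": "user-2 logout", "headers": {}, "offset": 1, "partition": 1},
]
# Keep only partition 1, then search for "logout"
hits = list(smart_filter(records, lambda k, v, h, o, p: p == 1, query="logout"))
print(len(hits))  # → 1
```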
