obsidiandynamics / kafdrop
Kafka Web UI
License: Apache License 2.0
Is there any plan or enhancement in the pipeline for alerting and notifications (event-based, etc.) in Kafdrop 3?
Hi
Thanks for the excellent project.
Compiling from source, or using the Bintray .jar file (any of the recent versions) in our own Docker image, results in "nested exception is java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonNode" when trying to view Avro messages using a Schema Registry.
When using the "DEFAULT" format the messages show, but obviously still serialised.
Trace as below:
errorAtts: {timestamp=Tue Oct 29 04:09:27 GMT 2019, status=500, error=Internal Server Error, message=Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonNode, trace=java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonNode
at kafdrop.util.AvroMessageDeserializer.getDeserializer(AvroMessageDeserializer.java:28)
at kafdrop.util.AvroMessageDeserializer.<init>(AvroMessageDeserializer.java:15)
at kafdrop.controller.MessageController.getDeserializer(MessageController.java:190)
at kafdrop.controller.MessageController.viewMessageForm(MessageController.java:125)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:190)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:892)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:797)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1039)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:645)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at kafdrop.config.CorsConfiguration$1.doFilter(CorsConfiguration.java:88)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:88)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:114)
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:104)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:132)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:269)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:133)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:130)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:249)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:99)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:376)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.JsonNode
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 80 more, path=/topic/md-ack-events/messages}
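The missing class org.codehaus.jackson.JsonNode belongs to the legacy Jackson 1.x line, which older Avro releases use for schema handling. A possible workaround (a sketch only, untested here; the coordinates and version are assumptions to verify against the Avro version on your classpath) is to add the legacy artifact to the build explicitly:

```xml
<!-- Legacy Jackson 1.x, used by older Avro releases; jackson-mapper-asl
     pulls in jackson-core-asl (which contains org.codehaus.jackson.JsonNode).
     1.9.13 was the final 1.x release. -->
<dependency>
  <groupId>org.codehaus.jackson</groupId>
  <artifactId>jackson-mapper-asl</artifactId>
  <version>1.9.13</version>
</dependency>
```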
Thanks
Hi,
How can I disable/enable topic creation?
Is there currently a way to connect to Kafdrop securely (using https instead of http)?
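Since Kafdrop is a Spring Boot application, one way to serve HTTPS directly (a sketch, assuming a keystore already exists; the path, password, and port are placeholders) is via the standard Spring Boot TLS properties:

```properties
# application.properties: terminate TLS in the embedded server
server.port=8443
server.ssl.key-store=/path/to/keystore.p12
server.ssl.key-store-type=PKCS12
server.ssl.key-store-password=changeit
```

The common alternative is to leave Kafdrop on plain HTTP and terminate TLS at a reverse proxy or ingress in front of it.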
Currently the pipeline fails on PRs because it doesn't have access to secret variables. At minimum, we should be able to run a simple mvn build.
Does this support SASL_SSL?
Environment:
Kubernetes on CentOS 7, running version 3.18.1 of Kafdrop behind an nginx ingress.
Love this tool, but I would like to lock down the non-read operations (thus far, creating topics). I currently have a server config rule in the ingress file to block that path, but is there a flag I can set to block those kinds of operations, since you might expand that sort of functionality in the future? Any advice on how to do so?
Thanks a ton!
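The path-blocking workaround described above can be sketched as a plain nginx rule (a sketch only; the exact write path is an assumption and should be checked against the routes your Kafdrop version actually exposes):

```nginx
# Deny the topic-creation endpoint while leaving read-only paths open.
# Adjust the path to match your Kafdrop version's write routes.
location = /topic/create {
    return 403;
}
```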
Docker registry: add a description of the available environment variables.
2019-09-07 11:19:43.034 ERROR 6 [ XNIO-1 task-10] i.u.s.a.LoggingExceptionHandler : UT005023: Exception handling request to /topic/trip/messages
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is io.confluent.common.config.ConfigException: Invalid value null for configuration schema.registry.url: Expected a comma separated list.
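The ConfigException above indicates that schema.registry.url was never populated in the underlying Confluent deserializer config. In Kafdrop this is normally supplied at startup; a sketch (the flag name follows the Kafdrop README of this era and the URL is a placeholder, so verify both against your version):

```shell
java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  -jar kafdrop.jar \
  --kafka.brokerConnect=localhost:9092 \
  --schemaregistry.connect=http://localhost:8081
```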
Hi,
I packaged and ran it locally, trying to connect to my organization's test cluster using the following standalone Java jar invocation. Kindly help me set it up with the proper config. I tried the same command without the client JAAS auth argument, with the same result.
java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  -jar target/kafdrop-3.18.0-SNAPSHOT.jar \
  --kafka.brokerConnect=host1:port1,host2:port2,host3:port3,host4:port4,host5:port5 \
  -Djava.security.auth.login.config=~/Desktop/client_jaas.conf
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is kafdrop.service.KafkaAdminClientException: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1013)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:645)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at kafdrop.config.CorsConfiguration$1.doFilter(CorsConfiguration.java:88)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:88)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:114)
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:104)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:132)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:269)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:133)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:130)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:249)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:99)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:376)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
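One likely cause of the timeout above: JVM system properties such as -Djava.security.auth.login.config only take effect when they appear before -jar; anything after the jar path is handed to the application as a program argument, so the JAAS config would silently never be loaded. A sketch of a corrected invocation and a minimal SASL/PLAIN client_jaas.conf (broker hosts and credentials are placeholders; note the absolute path, since ~ may not expand in that position):

```shell
java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  -Djava.security.auth.login.config=/home/user/Desktop/client_jaas.conf \
  -jar target/kafdrop-3.18.0-SNAPSHOT.jar \
  --kafka.brokerConnect=host1:port1,host2:port2
```

```
// client_jaas.conf: SASL/PLAIN login module for the Kafka client
KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="alice"
  password="alice-secret";
};
```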
Even though I specify a port other than 9000, Kafdrop still listens on 9000.
ps ax | grep -i "kafdrop.jar" | egrep -v "grep" | awk '{print $1}'
1199
netstat -plant | grep 9000
tcp6 0 0 :::9000 :::* LISTEN 1199/java
netstat -plant | grep 9010
tcp6 0 0 :::9010 :::* LISTEN 1199/java
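Since Kafdrop is a Spring Boot application, the HTTP port is normally driven by server.port; the second listener in the netstat output above is likely the actuator/management port, which is configured separately. A sketch (port values are examples, and the property names should be verified against your Kafdrop version):

```shell
java -jar kafdrop.jar \
  --server.port=9010 \
  --management.server.port=9010 \
  --kafka.brokerConnect=localhost:9092
```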
I get the below error when I run behind a reverse proxy at /kafdrop (Istio in my case).
I tried setting servlet.contextPath: /kafdrop in values.yml, but I get:
2019-11-08 17:01:04.349  INFO 15 [ XNIO-1 task-1] o.s.w.s.FrameworkServlet : Completed initialization in 8 ms
errorAtts: {timestamp=Fri Nov 08 17:01:27 GMT 2019, status=404, error=Not Found, message=Not Found, path=/kafdrop}
17:01:27/0 ERROR [XNIO-1 task-11]: Error executing FreeMarker template
FreeMarker template error:
The following has evaluated to null or missing:
==> error.trace [in template "error.ftl" at line 10, column 3]
----
Tip: It's the step after the last dot that caused this error, not those before it.
----
Tip: If the failing expression is known to legally refer to something that's sometimes null or missing, either specify a default value like myOptionalVar!myDefault, or use <#if myOptionalVar??>when-present<#else>when-missing</#if>. (These only cover the last step of the expression; to cover the whole expression, use parenthesis: (myOptionalVar.foo)!myDefault, (myOptionalVar.foo)??
----
----
FTL stack trace ("~" means nesting-related):
- Failed at: ${error.trace} [in template "error.ftl" at line 10, column 1]
----
Java stack trace (for programmers):
----
freemarker.core.InvalidReferenceException: [... Exception message was already printed; see it above ...]
at freemarker.core.InvalidReferenceException.getInstance(InvalidReferenceException.java:134)
at freemarker.core.EvalUtil.coerceModelToTextualCommon(EvalUtil.java:467)
at freemarker.core.EvalUtil.coerceModelToStringOrMarkup(EvalUtil.java:389)
at freemarker.core.EvalUtil.coerceModelToStringOrMarkup(EvalUtil.java:358)
at freemarker.core.DollarVariable.calculateInterpolatedStringOrMarkup(DollarVariable.java:100)
at freemarker.core.DollarVariable.accept(DollarVariable.java:63)
at freemarker.core.Environment.visit(Environment.java:330)
at freemarker.core.Environment.visit(Environment.java:336)
at freemarker.core.Environment.process(Environment.java:309)
at freemarker.template.Template.process(Template.java:384)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.processTemplate(FreeMarkerView.java:389)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.doRender(FreeMarkerView.java:302)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.renderMergedTemplateModel(FreeMarkerView.java:253)
at org.springframework.web.servlet.view.AbstractTemplateView.renderMergedOutputModel(AbstractTemplateView.java:178)
at org.springframework.web.servlet.view.AbstractView.render(AbstractView.java:316)
at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1371)
at org.springframework.web.servlet.DispatcherServlet.processDispatchResult(DispatcherServlet.java:1117)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1056)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:645)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:81)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:251)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchToPath(ServletInitialHandler.java:186)
at io.undertow.servlet.spec.RequestDispatcherImpl.error(RequestDispatcherImpl.java:501)
at io.undertow.servlet.spec.RequestDispatcherImpl.error(RequestDispatcherImpl.java:419)
at io.undertow.servlet.spec.HttpServletResponseImpl.doErrorDispatch(HttpServletResponseImpl.java:196)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:276)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:133)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:130)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:249)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:99)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:376)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
2019-11-08 17:01:27.631 ERROR 15 [ XNIO-1 task-11] i.u.s.a.LoggingExceptionHandler : UT005023: Exception handling request to /error
java.lang.RuntimeException: org.springframework.web.util.NestedServletException: Request processing failed; nested exception is freemarker.core.InvalidReferenceException: The following has evaluated to null or missing:
==> error.trace [in template "error.ftl" at line 10, column 3]
----
Tip: It's the step after the last dot that caused this error, not those before it.
----
Tip: If the failing expression is known to legally refer to something that's sometimes null or missing, either specify a default value like myOptionalVar!myDefault, or use <#if myOptionalVar??>when-present<#else>when-missing</#if>. (These only cover the last step of the expression; to cover the whole expression, use parenthesis: (myOptionalVar.foo)!myDefault, (myOptionalVar.foo)??
----
----
FTL stack trace ("~" means nesting-related):
- Failed at: ${error.trace} [in template "error.ftl" at line 10, column 1]
----
at io.undertow.servlet.spec.HttpServletResponseImpl.doErrorDispatch(HttpServletResponseImpl.java:198)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:276)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:133)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:130)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:249)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:78)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:99)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:376)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: org.springframework.web.util.NestedServletException: Request processing failed; nested exception is freemarker.core.InvalidReferenceException: The following has evaluated to null or missing:
==> error.trace [in template "error.ftl" at line 10, column 3]
----
Tip: It's the step after the last dot that caused this error, not those before it.
----
Tip: If the failing expression is known to legally refer to something that's sometimes null or missing, either specify a default value like myOptionalVar!myDefault, or use <#if myOptionalVar??>when-present<#else>when-missing</#if>. (These only cover the last step of the expression; to cover the whole expression, use parenthesis: (myOptionalVar.foo)!myDefault, (myOptionalVar.foo)??
----
----
FTL stack trace ("~" means nesting-related):
- Failed at: ${error.trace} [in template "error.ftl" at line 10, column 1]
----
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1013)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:645)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:81)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:251)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchToPath(ServletInitialHandler.java:186)
at io.undertow.servlet.spec.RequestDispatcherImpl.error(RequestDispatcherImpl.java:501)
at io.undertow.servlet.spec.RequestDispatcherImpl.error(RequestDispatcherImpl.java:419)
at io.undertow.servlet.spec.HttpServletResponseImpl.doErrorDispatch(HttpServletResponseImpl.java:196)
... 14 more
Caused by: freemarker.core.InvalidReferenceException: The following has evaluated to null or missing:
==> error.trace [in template "error.ftl" at line 10, column 3]
----
Tip: It's the step after the last dot that caused this error, not those before it.
----
Tip: If the failing expression is known to legally refer to something that's sometimes null or missing, either specify a default value like myOptionalVar!myDefault, or use <#if myOptionalVar??>when-present<#else>when-missing</#if>. (These only cover the last step of the expression; to cover the whole expression, use parenthesis: (myOptionalVar.foo)!myDefault, (myOptionalVar.foo)??
----
----
FTL stack trace ("~" means nesting-related):
- Failed at: ${error.trace} [in template "error.ftl" at line 10, column 1]
----
at freemarker.core.InvalidReferenceException.getInstance(InvalidReferenceException.java:134)
at freemarker.core.EvalUtil.coerceModelToTextualCommon(EvalUtil.java:467)
at freemarker.core.EvalUtil.coerceModelToStringOrMarkup(EvalUtil.java:389)
at freemarker.core.EvalUtil.coerceModelToStringOrMarkup(EvalUtil.java:358)
at freemarker.core.DollarVariable.calculateInterpolatedStringOrMarkup(DollarVariable.java:100)
at freemarker.core.DollarVariable.accept(DollarVariable.java:63)
at freemarker.core.Environment.visit(Environment.java:330)
at freemarker.core.Environment.visit(Environment.java:336)
at freemarker.core.Environment.process(Environment.java:309)
at freemarker.template.Template.process(Template.java:384)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.processTemplate(FreeMarkerView.java:389)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.doRender(FreeMarkerView.java:302)
at org.springframework.web.servlet.view.freemarker.FreeMarkerView.renderMergedTemplateModel(FreeMarkerView.java:253)
at org.springframework.web.servlet.view.AbstractTemplateView.renderMergedOutputModel(AbstractTemplateView.java:178)
at org.springframework.web.servlet.view.AbstractView.render(AbstractView.java:316)
at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1371)
at org.springframework.web.servlet.DispatcherServlet.processDispatchResult(DispatcherServlet.java:1117)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1056)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
... 31 more
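The 404 for /kafdrop suggests the context path never reached the application itself. In Spring Boot 2 (which Kafdrop of this era runs on) the servlet context path is set via server.servlet.context-path; a sketch (whether your chart's values.yml maps through to this property is an assumption to verify):

```shell
java -jar kafdrop.jar \
  --server.servlet.context-path=/kafdrop \
  --kafka.brokerConnect=localhost:9092
```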
The Kafdrop host at :9090/ always gives me a Whitelabel error page.
2019-08-04 21:44:52.125 WARN 6 [ XNIO-1 task-3] k.c.NetworkClient$DefaultMetadataUpdater : [Consumer clientId=kafdrop-client, groupId=kafdrop-consumer-group] 50 partitions have leader brokers without a matching listener, including [__consumer_offsets-0, __consumer_offsets-10, __consumer_offsets-20, __consumer_offsets-40, __consumer_offsets-30, __consumer_offsets-39, __consumer_offsets-9, __consumer_offsets-11, __consumer_offsets-31, __consumer_offsets-13]
[the same warning repeats every ~100 ms]
I tried clearing my log.dir and restarting both Kafka and ZooKeeper.
I have successfully configured the networking between the cluster and the Kafdrop instance, but these log messages do not change.
When I reset the log.dir directory on all my Kafka nodes, the Kafdrop logs change to
remove broker 0
add broker 0
and the same for brokers 1 and 2.
Hi all,
In order to run Kafdrop in my context, I need to override the properties-file locations.
I can provide a PR, if you want.
Is that OK with you?
Hello,
I'm getting the following error message when trying to use our Avro registry server:
UT005023: Exception handling request to /topic/events.jobs.errors/messages
I can make curl requests against our schema registry and get valid JSON back, but the path is under /subjects/events/jobs/errors, not /topic/events.jobs.errors/messages.
curl -X GET -i -H "Content-Type: application/vnd.schemaregistry.v1+json" http://[machine]:[port]/subjects/events/jobs/errors
HTTP/1.1 200 OK
Server: gunicorn/20.0.0
Date: Thu, 19 Dec 2019 22:26:40 GMT
Connection: close
Content-Type: application/json
Content-Length: 70
{"specs": ["fpe", "memory", "segfault", "si"], "version"...}
Am I missing a configuration setting for Kafdrop? I did not configure our Avro installation, so I'm a bit hamstrung with how it currently works.
Currently it is difficult to track the last received message; the ability to view all messages from a topic at once would make this much easier.
Hey guys, does Kafdrop support the old Kafka version 0.10.1.0?
The requirements say Kafdrop supports Kafka (version 0.10.0 or newer), but I tested with a compiled kafdrop-3.19.0-SNAPSHOT.jar and it seems unsupported (an UnsupportedVersionException occurred)...
Is there any way to make Kafdrop compatible with an old Kafka version? Something like setting zookeeper configurations?
...
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is kafdrop.service.KafkaAdminClientException: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.UnsupportedVersionException: The broker does not support DESCRIBE_CONFIGS
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1013)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:645)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at kafdrop.config.CorsConfiguration$1.doFilter(CorsConfiguration.java:88)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:88)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:118)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
...
CentOS 7 Linux Kernel 3.10.0-514.16.1.el7.x86_64
OpenJDK 13.0.1-b9
I have zookeeper and kafka running locally.
When I try to start up Kafdrop and connect to it, it fails with the following message:
[AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.
This is my docker-compose:
zookeeper:
  image: wurstmeister/zookeeper
  ports:
    - "2181:2181"
  networks:
    - monitor-net
kafka:
  build: .
  links:
    - zookeeper
  ports:
    - "9092:9092"
    - "7071:7071"
  environment:
    KAFKA_LISTENERS: PLAINTEXT://:9092
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    KAFKA_OPTS: -javaagent:/usr/app/jmx_prometheus_javaagent.jar=7071:/usr/app/prom-jmx-agent-config.yml
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
  networks:
    - monitor-net
kafdrop:
  image: obsidiandynamics/kafdrop
  ports:
    - "9000:9000"
  environment:
    KAFKA_BROKERCONNECT: localhost:9092
    JVM_OPTS: "-Xms16M -Xmx48M -Xss180K -XX:-TieredCompilation -XX:+UseStringDeduplication"
  depends_on:
    - kafka
  networks:
    - monitor-net
I am able to connect to the broker, fetch the current list of topics, and create a new topic using the scripts from Apache:
>bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
>bin/kafka-topics.sh --list --bootstrap-server localhost:9092
What am I missing?
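For what it's worth, a common cause of this exact symptom (host-side CLI tools connect, the Kafdrop container cannot) is that localhost inside the kafdrop container refers to that container itself, not to the kafka container. A hedged sketch of the usual fix, assuming the service names in the compose file above:

```yaml
# Sketch only: have Kafka advertise a listener other containers can resolve,
# and point Kafdrop at the compose service name rather than localhost.
kafka:
  environment:
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
kafdrop:
  image: obsidiandynamics/kafdrop
  environment:
    KAFKA_BROKERCONNECT: kafka:9092
```

Note that changing the advertised listener to kafka:9092 may in turn require a second listener for host-side clients, since the host cannot resolve the service name.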
By and large, the information we currently obtain from ZooKeeper can be had from the Kafka Admin API, sans a few attributes that are of little relevance (e.g. broker start time). There shouldn't be a compelling reason left to use ZK. Furthermore, some managed Kafka services disallow direct access to ZK for security reasons.
Once removed, also remove the ZooKeeper and Curator libraries.
Deprecate the zookeeper.connect property, printing a warning when it is used. (Eventually to be removed altogether.)
java -version
openjdk version "11.0.4" 2019-07-16 LTS
OpenJDK Runtime Environment 18.9 (build 11.0.4+11-LTS)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.4+11-LTS, mixed mode, sharing)
mvn clean package
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for com.obsidiandynamics.kafdrop:kafdrop:jar:3.11.0-SNAPSHOT
[WARNING] 'dependencyManagement.dependencies.dependency.exclusions.exclusion.artifactId' for org.quartz-scheduler:quartz:jar with value '*' does not match a valid id pattern. @ org.springframework.boot:spring-boot-dependencies:2.1.5.RELEASE, /root/.m2/repository/org/springframework/boot/spring-boot-dependencies/2.1.5.RELEASE/spring-boot-dependencies-2.1.5.RELEASE.pom, line 2608, column 25
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building kafdrop 3.11.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ kafdrop ---
[INFO] Deleting /root/kafdrop/target
[INFO]
[INFO] --- maven-resources-plugin:2.7:copy-resources (prepare-dockerfile) @ kafdrop ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- spring-boot-maven-plugin:2.1.5.RELEASE:build-info (build-info) @ kafdrop ---
[INFO]
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ kafdrop ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 12 resources
[INFO] Copying 37 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ kafdrop ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 42 source files to /root/kafdrop/target/classes
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] javac: invalid target release: 11
Usage: javac <options> <source files>
use -help for a list of possible options
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.770s
[INFO] Finished at: Sun Sep 29 17:50:59 CEST 2019
[INFO] Final Memory: 21M/178M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project kafdrop: Compilation failure
[ERROR] javac: invalid target release: 11
[ERROR] Usage: javac <options> <source files>
[ERROR] use -help for a list of possible options
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
alternatives --config java
There are 2 programs which provide 'java'.
Selection Command
-----------------------------------------------
* 1 java-1.8.0-openjdk.x86_64 (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.el7_7.x86_64/jre/bin/java)
+ 2 java-11-openjdk.x86_64 (/usr/lib/jvm/java-11-openjdk-11.0.4.11-1.el7_7.x86_64/bin/java)
So Java 11 is active, while Java 8 is also installed on the machine.
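A possible explanation (not verified against this exact setup): `alternatives` switches the `java` launcher, but Maven compiles with the JDK named by `JAVA_HOME`, which may still point at JDK 8 and produce "invalid target release: 11". A sketch of how to check and fix, with a hypothetical install path:

```shell
# Maven uses the JDK from JAVA_HOME, not the `alternatives` selection
# for the `java` launcher. First check what Maven actually sees:
mvn -version            # inspect the "Java version" line

# Point JAVA_HOME at the JDK 11 install (path is system-specific):
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-11.0.4.11-1.el7_7.x86_64
export PATH="$JAVA_HOME/bin:$PATH"
mvn clean package
```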
Hi, thanks for this awesome tool. Can you tell me what configuration I need in order to use the ACL page?
Looking forward to a reply.
Best regards,
Ismael
Hello,
I'm trying this out against one of our environments' Kafka clusters using Docker:
winpty docker run -it --rm -p 9002:9000 -e KAFKA_BROKERCONNECT=x.x.x.11:6667,x.x.x.12:6667 -e JVM_OPTS="-Xms32M -Xmx64M" obsidiandynamics/kafdrop:latest
I'm seeing it start then try connecting to Zookeeper at localhost in a constant loop:
2020-01-03 19:05:21.636 INFO 15 [ main] o.a.z.Environment : Client environment:os.memory.total=49MB
2020-01-03 19:05:21.647 INFO 15 [ main] o.a.z.ZooKeeper : Initiating client connection, connectString=localhost:2181 sessionTimeout=5000 watcher=org.apache.curator.ConnectionState@781711b7
2020-01-03 19:05:21.686 INFO 15 [ main] o.a.z.c.X509Util : Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation
2020-01-03 19:05:21.709 INFO 15 [ main] o.a.z.ClientCnxnSocket : jute.maxbuffer value is 4194304 Bytes
2020-01-03 19:05:21.738 INFO 15 [ main] o.a.z.ClientCnxn : zookeeper.request.timeout value is 0. feature enabled=
2020-01-03 19:05:21.801 INFO 15 [localhost:2181)] o.a.z.ClientCnxn$SendThread : Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2020-01-03 19:05:21.834 INFO 15 [localhost:2181)] o.a.z.ClientCnxn$SendThread : Socket error occurred: localhost/127.0.0.1:2181: Connection refused
2020-01-03 19:05:22.170 INFO 15 [ main] k.s.BuildInfo : Kafdrop version: 3.9.0, build time: 2019-09-26T09:17:02.688Z
Is localhost a default for the ZooKeeper server somewhere? I tried searching through the code, but I don't see where it might be hard-coded.
I'm using your project and I found it very user-friendly and simple to configure and run.
Unfortunately, it does not quite fit my monitoring requirements.
I believe that with a few improvements this tool can be awesome.
Some enhancements are:
Hi,
Can we integrate the LAG and HOST information for each partition? That way we could easily see which hosts are blocking.
I'm probably asking the obvious here, but are there any plans to support creating topics and managing ACL rules, or even ZooKeeper users (SCRAM)?
It seems to be the last missing piece in managing a Kafka cluster :)
Does Kafdrop currently have user login? If so, how are user credentials persisted?
If not, is there any plan to provide this feature as an enhancement?
I was starting with the docker-compose in the example which works, but then moved to kubernetes which leads to an error:
2019-07-02 09:48:48.442 INFO 9 [ter.local:2181)] o.a.z.ClientCnxn$SendThread : Opening socket connection to server kafka-zoo-service.default.svc.cluster.local/10.101.40.207:2181. Will not attempt to authenticate using SASL (unknown error)
2019-07-02 09:48:53.305 WARN 9 [ChildrenCache-1] o.a.c.ConnectionState : Connection attempt unsuccessful after 17185 (greater than max timeout of 15000). Resetting connection and trying again with a new connection.
2019-07-02 09:48:53.309 WARN 9 [ter.local:2181)] o.a.z.ClientCnxn$SendThread : Client session timed out, have not heard from server in 5868ms for sessionid 0x0
2019-07-02 09:48:53.415 INFO 9 [ChildrenCache-1] o.a.z.ZooKeeper : Session: 0x0 closed
2019-07-02 09:48:53.415 INFO 9 [ain-EventThread] o.a.z.ClientCnxn$EventThread : EventThread shut down for session: 0x0
2019-07-02 09:48:53.416 INFO 9 [ChildrenCache-1] o.a.z.ZooKeeper : Initiating client connection, connectString=kafka-zoo-service.default.svc.cluster.local:2181 sessionTimeout=5000 watcher=org.apache.curator.ConnectionState@6f3f0ae
2019-07-02 09:48:53.417 INFO 9 [ChildrenCache-1] o.a.z.ClientCnxnSocket : jute.maxbuffer value is 4194304 Bytes
2019-07-02 09:48:53.417 INFO 9 [ChildrenCache-1] o.a.z.ClientCnxn : zookeeper.request.timeout value is 0. feature enabled=
2019-07-02 09:48:53.420 INFO 9 [ter.local:2181)] o.a.z.ClientCnxn$SendThread : Opening socket connection to server kafka-zoo-service.default.svc.cluster.local/10.101.40.207:2181. Will not attempt to authenticate using SASL (unknown error)
2019-07-02 09:48:58.424 WARN 9 [ter.local:2181)] o.a.z.ClientCnxn$SendThread : Client session timed out, have not heard from server in 5005ms for sessionid 0x0
2019-07-02 09:48:58.425 INFO 9 [ter.local:2181)] o.a.z.ClientCnxn$SendThread : Client session timed out, have not heard from server in 5005ms for sessionid 0x0, closing socket connection and attempting reconnect
I want to change these timeout values:
https://github.com/obsidiandynamics/kafdrop/blob/master/src/main/java/kafdrop/config/CuratorConfiguration.java#L55
Is there a way to change them via environment variables, or do you have any other hints?
I have a separate Kafka server. How do I connect Kafdrop to Kafka? I downloaded Kafka from 'https://kafka.apache.org/', and ZooKeeper and Kafka are running separately. I'm a beginner; any help is appreciated.
Hi
I'm testing Kafdrop for the first time against Kafka 2.2. Everything seems to work, and I can easily display messages on topics that have not been written with Kafka transactions.
But on some of our topics, all messages are written transactionally, normally 2 messages on 2 different topics, so those topics contain the Kafka transaction markers (as described here: https://docs.google.com/document/d/11Jqy_GjUGtdXJK94XGsEIK7CP1SnQGdp2eF0wSw9ra8/edit#heading=h.mylukj7bg1rf).
It seems to me that Kafdrop then fails to display ANY messages on these topics; at least the View Message screen remains empty. I can't see any error message in the log, though.
Can you tell me if this is just not currently supported (I might then have a look into it and try to come up with a PR)? Or am I doing something wrong?
Thanks, Joe
Hello. Just want to say that we love this software; very useful.
We would like to have the consumer data available in JSON.
When you make a call to /topic/ you get JSON back, but it does not include the consumer group data table that you see in the HTML view:
Group ID | Combined Lag |
---|---|
**** | 155632 |
Could this data be added to the JSON request for topic/?
Also, could the /consumer/ endpoint support JSON too? When sending Accept: application/json it still returns HTML, not JSON.
Thanks
I was surprised to see that Kafdrop attempts to connect to the public Internet. It seems to access the GitHub star count via an HTTP call.
Is there a reason for keeping this as part of the web UI? To me it seems a red flag. The nature of Kafdrop is to serve as an "Admin UI" for Kafka. This means that for any serious production usage, it shouldn't be attempting to connect to the public Internet at all.
Please consider removing this functionality.
Is it possible to connect to a secured Kafka cluster (SASL) with client authentication?
If so, could you please share a sample configuration?
Hi,
When creating topics I am setting config values (segment.bytes, cleanup.policy, etc.), but these values do not appear in the configuration section for the topic, and the custom config column is set to "no". They would previously have shown when using the HomeAdvisor Kafdrop. Has something changed here?
Thanks
For a cluster with Istio sidecar injection enabled I need to set the following pod annotations
annotations:
  sidecar.istio.io/inject: "false"
If I set the ingress path to /kafdrop, the UI doesn't load.
ingress:
  enabled: true
  path: /kafdrop
I believe this is because we'd also need to set the servlet context path for the Spring Boot app to
server.servlet.context-path=/kafdrop
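If it helps, the Docker image already accepts the context path through the environment (the SERVER_SERVLET_CONTEXTPATH variable appears in the docker run examples elsewhere in these reports), so the chart would presumably need to pass something equivalent:

```
SERVER_SERVLET_CONTEXTPATH="/kafdrop"
```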
There's currently an awkward way of configuring SASL by editing a local file. There is no current way of configuring properties for enabling SSL (TLS) support, providing truststores, keystores, etc.
Ideally, we need a way of combining the above into a flexible configuration.
In addition, we need a way of easily configuring this when using Docker and Kubernetes/Helm.
Once done, the old way of configuring SASL should be deprecated.
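As a sketch of what the consolidated configuration would need to cover, these are the standard Kafka client security properties (names per the Kafka client itself, not Kafdrop-specific; values are placeholders):

```properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="user" password="secret";
ssl.truststore.location=/path/to/kafka.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/path/to/kafka.keystore.jks
ssl.keystore.password=changeit
```

A flexible mechanism would simply pass such a properties file (or the equivalent key-value pairs from the environment) straight through to the underlying consumer and admin clients.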
Hi
I am running the command below:
java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED -jar target/kafdrop-3.9.0-SNAPSHOT.jar --zookeeper.connect=: --kafka.brokerConnect=:
but I am getting the error:
2019-09-07 20:55:41.795 WARN 16276 [ XNIO-1 task-2] o.a.k.c.NetworkClient : [Consumer clientId=kafdrop-client, groupId=kafdrop-consumer-group] Error connecting to node kafka-0.kafka.default.svc.cluster.local:9092 (id: 0 rack: null)
Can you help me see what I am missing here?
I can see a jmx property in the source code. How is it being used (considering the brokers are started with JMX enabled on some port)?
How can I take advantage of JMX metrics using Kafdrop?
Hi!
I've installed Docker container with Kafdrop on my remote Apache Kafka server.
How can I bind the Web UI to a different address, other than 'localhost', say, the public IP of a server?
The following script is used to start Kafdrop as Docker container:
docker run -d --rm -p 9000:9000 \
  -e ZOOKEEPER_CONNECT=host:port,host:port \
  -e KAFKA_BROKERCONNECT=host:port,host:port \
  -e JVM_OPTS="-Xms32M -Xmx64M" \
  -e SERVER_SERVLET_CONTEXTPATH="/" \
  obsidiandynamics/kafdrop:latest
Thanks!
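Since Kafdrop is a Spring Boot application, the bind address can presumably be set through the standard server.address property; with the image's env-var mapping that would look like (an untested assumption):

```
SERVER_ADDRESS=0.0.0.0   # or the specific interface IP to bind to
```

Note that Spring Boot binds to all interfaces by default, so if the UI is unreachable remotely, the -p port publishing and host firewall are the more likely suspects.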
When the base64-encoded values for KAFKA_PROPERTIES, KAFKA_TRUSTSTORE, and KAFKA_KEYSTORE contain blanks, the decoding in the script does not work. The environment variables should be wrapped in double quotes in kafdrop.sh.
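The suggested fix can be illustrated like this (a sketch, not the actual kafdrop.sh contents): base64 text containing whitespace survives only when the variable expansion is quoted:

```shell
# Simulate a base64-encoded value such as KAFKA_PROPERTIES whose decoded
# content contains a blank.
KAFKA_PROPERTIES="$(printf 'security.protocol=SASL_SSL\nssl.truststore.password=secret pass\n' | base64)"

# An unquoted expansion lets the shell word-split the value, which can
# corrupt the stream fed to the decoder; quoting preserves it byte-for-byte:
echo "$KAFKA_PROPERTIES" | base64 -d > kafka.properties
cat kafka.properties
```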
Is it possible to monitor 2 different Kafka-ZooKeeper ecosystems?
I have 5 Broker-3 Zookeeper, running as one ecosystem.
I also have another ecosystem with 7 Broker-3 Zookeeper.
Is it possible to monitor them both?
I may miss something, but it seems that connecting to a secured Kafka cluster (SSL) with client authentication is currently not supported. Could you provide a corresponding configuration option, please?
Deployed the kafdrop helm chart
Got this exception in logs when trying to connect to Confluent cp-helm-chart
I've tried both:
zkConnect: kafka-cp-zookeeper:2888
kafkaBrokerConnect: kafka-cp-kafka:9092
zkConnect: kafka-cp-zookeeper-headless:2888
kafkaBrokerConnect: kafka-cp-kafka-headless:9092
My Kafka cluster is not using SASL at the moment. Do I need to disable SASL in Kafdrop?
kubectl logs kafdrop-5c95cdddc5-l5nkq:
2019-06-28 09:54:13.962 INFO 6 [zookeeper:2888)] o.a.z.ClientCnxn$SendThread : Opening socket connection to server kafka-cp-zookeeper/10.51.246.35:2888. Will not attempt to authenticate using SASL (unknown error)
2019-06-28 09:54:16.632 ERROR 6 [ main] o.a.c.ConnectionState : Connection timed out for connection string (kafka-cp-zookeeper:2888) and timeout (15000) / elapsed (5000)
org.apache.curator.CuratorConnectionLossException: KeeperErrorCode = ConnectionLoss at org.apache.curator.ConnectionState.checkTimeouts(ConnectionState.java:197) at org.apache.curator.ConnectionState.getZooKeeper(ConnectionState.java:88) at org.apache.curator.CuratorZookeeperClient.getZooKeeper(CuratorZookeeperClient.java:116) at org.apache.curator.framework.imps.CuratorFrameworkImpl.getZooKeeper(CuratorFrameworkImpl.java:489) at org.apache.curator.framework.imps.ExistsBuilderImpl$3.call(ExistsBuilderImpl.java:226) at org.apache.curator.framework.imps.ExistsBuilderImpl$3.call(ExistsBuilderImpl.java:215) at org.apache.curator.RetryLoop.callWithRetry(RetryLoop.java:108) at org.apache.curator.framework.imps.ExistsBuilderImpl.pathInForegroundStandard(ExistsBuilderImpl.java:212) at org.apache.curator.framework.imps.ExistsBuilderImpl.pathInForeground(ExistsBuilderImpl.java:205) at org.apache.curator.framework.imps.ExistsBuilderImpl.forPath(ExistsBuilderImpl.java:168) at org.apache.curator.framework.imps.ExistsBuilderImpl.forPath(ExistsBuilderImpl.java:39) at org.apache.curator.framework.recipes.cache.NodeCache.start(NodeCache.java:172) at kafdrop.service.CuratorKafkaMonitor.start(CuratorKafkaMonitor.java:108) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:567) at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:363) at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:307) at 
org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:414) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1770) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:593) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277) at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1248) at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1168) at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857) at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760) at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:218) at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1341) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1187) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:843) at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549) at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775) at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) at org.springframework.boot.SpringApplication.run(SpringApplication.java:316) at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:139) at kafdrop.Kafdrop.main(Kafdrop.java:48) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:567) at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Starting Kafdrop failed; it indicated that the broker address can't be connected to:
2019-08-25 21:52:51.946 INFO 421 [ main] o.a.k.c.c.AbstractConfig : AdminClientConfig values:
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
client.id =
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
2019-08-25 21:52:51.965 INFO 421 [ main] o.a.k.c.u.AppInfoParser$AppInfo : Kafka version: 2.2.1
2019-08-25 21:52:51.965 INFO 421 [ main] o.a.k.c.u.AppInfoParser$AppInfo : Kafka commitId: 55783d3133a5a49a
2019-08-25 21:52:51.987 INFO 421 [ main] k.s.BuildInfo : Kafdrop version: 3.9.0-SNAPSHOT, build time: 2019-08-24T18:07:26.130Z
2019-08-25 21:52:51.994 WARN 421 [| adminclient-1] o.a.k.c.NetworkClient : [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.
2019-08-25 21:52:52.021 INFO 421 [ChildrenCache-1] k.s.CuratorKafkaMonitor : Topic confi
Hi, please could you add support for schema registry and message format (AVRO) in the environment variables of the docker image?
Hi,
I am using the latest Kafdrop Docker image (Kafdrop 3.20.0) and I have a Kafka cluster configured to use SCRAM-SHA-256 without SSL. For Kafdrop I created the following kafka.properties file:
security.protocol=SASL_PLAINTEXT
sasl.method=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required
username="kafdrop"
password="super-secret-password";
When I try to run the container it complains about a missing serviceName. But as far as I understand, a serviceName is only needed for Kerberos.
Thanks, Roland
Startup logs:
Writing Kafka properties into kafka.properties
2020-01-20 15:35:42.719 INFO ${sys:PID} [ main] k.Kafdrop$EnvironmentSetupListener : Initializing JAAS config
2020-01-20 15:35:42.731 INFO ${sys:PID} [ main] k.Kafdrop$EnvironmentSetupListener : env: null .isSecured kafka: false
2020-01-20 15:35:42.731 INFO ${sys:PID} [ main] k.Kafdrop$EnvironmentSetupListener : Env: null
2020-01-20 15:35:42.979 INFO 14 [ main] o.s.b.StartupInfoLogger : Starting application on 94a6e54d533b with PID 14 (started by root in /)
2020-01-20 15:35:42.981 INFO 14 [ main] o.s.b.SpringApplication : No active profile set, falling back to default profiles: default
2020-01-20 15:35:45.019 INFO 14 [ main] i.u.s.s.ServletContextImpl : Initializing Spring embedded WebApplicationContext
2020-01-20 15:35:45.019 INFO 14 [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 1993 ms
2020-01-20 15:35:45.628 INFO 14 [ main] k.c.KafkaConfiguration : Checking truststore file kafka.truststore.jks
2020-01-20 15:35:45.628 INFO 14 [ main] k.c.KafkaConfiguration : Checking keystore file kafka.keystore.jks
2020-01-20 15:35:45.628 INFO 14 [ main] k.c.KafkaConfiguration : Checking properties file kafka.properties
2020-01-20 15:35:45.629 INFO 14 [ main] k.c.KafkaConfiguration : Loading properties from kafka.properties
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.kafka.common.network.SaslChannelBuilder (file:/kafdrop-3.20.0/lib/kafka-clients-2.3.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.kafka.common.network.SaslChannelBuilder
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2020-01-20 15:35:45.710 WARN 14 [ main] o.s.c.s.AbstractApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'aclController' defined in URL [jar:file:/kafdrop-3.20.0/kafdrop-3.20.0.jar!/BOOT-INF/classes!/kafdrop/controller/AclController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'kafkaMonitorImpl' defined in URL [jar:file:/kafdrop-3.20.0/kafdrop-3.20.0.jar!/BOOT-INF/classes!/kafdrop/service/KafkaMonitorImpl.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kafkaHighLevelConsumer': Invocation of init method failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
2020-01-20 15:35:45.728 INFO 14 [ main] ConditionEvaluationReportLoggingListener :
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2020-01-20 15:35:45.730 ERROR 14 [ main] o.s.b.SpringApplication : Application run failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'aclController' defined in URL [jar:file:/kafdrop-3.20.0/kafdrop-3.20.0.jar!/BOOT-INF/classes!/kafdrop/controller/AclController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'kafkaMonitorImpl' defined in URL [jar:file:/kafdrop-3.20.0/kafdrop-3.20.0.jar!/BOOT-INF/classes!/kafdrop/service/KafkaMonitorImpl.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kafkaHighLevelConsumer': Invocation of init method failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:218)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1341)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1187)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:845)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:744)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:391)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:312)
at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:140)
at kafdrop.Kafdrop.main(Kafdrop.java:53)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:51)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:52)
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'kafkaMonitorImpl' defined in URL [jar:file:/kafdrop-3.20.0/kafdrop-3.20.0.jar!/BOOT-INF/classes!/kafdrop/service/KafkaMonitorImpl.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kafkaHighLevelConsumer': Invocation of init method failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:218)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1341)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1187)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1251)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1171)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760)
... 26 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kafkaHighLevelConsumer': Invocation of init method failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:139)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:414)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1770)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:593)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1251)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1171)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760)
... 40 more
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:827)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:664)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:644)
at kafdrop.service.KafkaHighLevelConsumer.initializeClient(KafkaHighLevelConsumer.java:47)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:363)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:307)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136)
... 53 more
Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:160)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:146)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:67)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:99)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:741)
... 63 more
Caused by: java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config
at org.apache.kafka.common.security.kerberos.KerberosLogin.getServiceName(KerberosLogin.java:301)
at org.apache.kafka.common.security.kerberos.KerberosLogin.configure(KerberosLogin.java:92)
at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:60)
at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:104)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:149)
... 67 more
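For what it's worth, the KerberosLogin frames at the bottom of the trace point at the root cause: the client fell back to the default GSSAPI (Kerberos) mechanism, the only one that requires a serviceName. The recognised Kafka client key is `sasl.mechanism` (`sasl.method` is not a known property), and `sasl.jaas.config` must be a single logical line in the properties file. A sketch of the corrected file, keeping the reporter's own values:

```properties
security.protocol=SASL_PLAINTEXT
# `sasl.mechanism` is the recognised key; without it the client
# defaults to GSSAPI (Kerberos), which then demands a serviceName.
sasl.mechanism=SCRAM-SHA-256
# A properties value is one logical line; continue it with
# trailing backslashes if you split it for readability.
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="kafdrop" \
  password="super-secret-password";
```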
Hi all,
I want to connect to a kafka cluster hosted by aiven.io, i have a trustore-file, a keystore-file, and their respectives passwords.
I made a small docker-compose file and provided the environment variables with the Base64-encoded files:
version: "3"
services:
  kafdrop:
    image: obsidiandynamics/kafdrop
    ports:
      - "9000:9000"
    environment:
      KAFKA_BROKERCONNECT: "mybroker..aivencloud.com:24117"
      KAFKA_PROPERTIES: "c3Nblablablrcw=="
      KAFKA_TRUSTSTORE: "/u3blablabla=="
      KAFKA_KEYSTORE: "/u3+7blablabladyLfz/y"
I see these messages in my logs:
kafdrop_1 | 2019-10-10 12:24:25.663 WARN 13 [ main] o.a.k.c.c.AbstractConfig : The configuration 'ssl.truststore.location' was supplied but isn't a known config.
kafdrop_1 | 2019-10-10 12:24:25.663 WARN 13 [ main] o.a.k.c.c.AbstractConfig : The configuration 'ssl.keystore.password' was supplied but isn't a known config.
kafdrop_1 | 2019-10-10 12:24:25.664 WARN 13 [ main] o.a.k.c.c.AbstractConfig : The configuration 'ssl.keystore.location' was supplied but isn't a known config.
kafdrop_1 | 2019-10-10 12:24:25.664 WARN 13 [ main] o.a.k.c.c.AbstractConfig : The configuration 'ssl.truststore.password' was supplied but isn't a known config.
Note: my kafka.properties is very simple:
ssl.keystore.password=blablabla
ssl.truststore.password=blablabla
I don't know what I did wrong, can someone help me? Has anyone connected to a secured cluster using the Docker image and files like these?
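The "isn't a known config" warnings are the Kafka client's way of saying the `ssl.*` settings were parsed but never used, which is what happens when `security.protocol` is left at its PLAINTEXT default. A sketch of a fuller kafka.properties, assuming certificate-based TLS; passwords are placeholders, and the jks file names match what the startup logs in the earlier report show the image checking for (`kafka.truststore.jks` / `kafka.keystore.jks`):

```properties
# Without this line, all ssl.* settings below are silently ignored.
security.protocol=SSL
ssl.truststore.location=kafka.truststore.jks
ssl.truststore.password=blablabla
ssl.keystore.location=kafka.keystore.jks
ssl.keystore.password=blablabla
# Often the same as the keystore password.
ssl.key.password=blablabla
```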
While it is usually a good and common idea to load fonts from Google via links that point to Google's servers, it is quite slow if those servers are not reachable from the Docker container: every page takes 30 seconds to render while it waits for the connection to time out.
Maybe there is a way to bundle the font with Kafdrop?
I am trying to set up Kafdrop with SSL authentication and encryption towards the brokers.
My Kafka cluster (3 brokers, now listening for SSL connections, with PLAINTEXT used for ZooKeeper) is set up with its respective keystore and truststore files.
In Kafdrop, I've placed the same keystore and truststore files in the root directory.
kafka.properties:
security.protocol=SSL
ssl.endpoint.identification.algorithm=
ssl.truststore.location=/generic-server-truststore.jks
ssl.truststore.password=trustkafka
ssl.keystore.location=/kafka1-server-keystore.jks
ssl.keystore.password=kafkakey
application.yml:
kafka:
  brokerConnect: localhost:9082,localhost:9083,localhost:9084
  isSecured: true
  #saslMechanism: "SASL"
  securityProtocol: "SSL"
  truststoreLocation: "/generic-server-truststore.jks"
  propertiesFileLocation: "/kafka.properties}"
  keystoreLocation: "/kafka1-server-keystore.jks"
But on startup I get the error:
Value of 'env' cannot be null if connecting to secured kafka.
Please let me know what I'm doing wrong here.
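If the application.yml route keeps failing, the Docker image offers another way in: the documented `KAFKA_PROPERTIES`, `KAFKA_TRUSTSTORE` and `KAFKA_KEYSTORE` variables take the Base64-encoded contents of the files, the same mechanism the aiven.io report above uses. A sketch with placeholder values (the `network_mode: host` line is an assumption, made only so `localhost` broker addresses resolve from inside the container):

```yaml
version: "3"
services:
  kafdrop:
    image: obsidiandynamics/kafdrop
    network_mode: host
    environment:
      KAFKA_BROKERCONNECT: "localhost:9082,localhost:9083,localhost:9084"
      KAFKA_PROPERTIES: "<base64 of kafka.properties>"
      KAFKA_TRUSTSTORE: "<base64 of generic-server-truststore.jks>"
      KAFKA_KEYSTORE: "<base64 of kafka1-server-keystore.jks>"
```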
Hello,
I've noticed that not all consumer groups are listed in Kafdrop. If I look at Kafka Manager for the same topic, I can see all consumers (including temporary unnamed ones).