docker2logstash's People

Contributors

maksymbilenko pushpak51094

Forkers

mancvso possan

docker2logstash's Issues

java.lang.IllegalStateException: Message not fully read

Hi! Trying this image (with docker-compose) yields this exception:

root@gargantua:~# docker ps
CONTAINER ID        IMAGE               COMMAND             CREATED             STATUS              PORTS               NAMES
root@gargantua:~# docker-compose up
Starting root_elasticsearch_1
Starting root_logstash_1
Starting root_kibana_1
Attaching to root_elasticsearch_1, root_logstash_1, root_kibana_1
elasticsearch_1 | [2015-12-14 15:35:41,840][INFO ][node                     ] [Captain Germany] version[2.1.0], pid[1], build[72cd1f1/2015-11-18T22:40:03Z]
elasticsearch_1 | [2015-12-14 15:35:41,851][INFO ][node                     ] [Captain Germany] initializing ...
elasticsearch_1 | [2015-12-14 15:35:42,326][INFO ][plugins                  ] [Captain Germany] loaded [], sites []
elasticsearch_1 | [2015-12-14 15:35:42,451][INFO ][env                      ] [Captain Germany] using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/disk/by-uuid/1c4b4b7b-5e73-4c87-85b8-cba6e526ede9)]], net usable_space [8.5gb], net total_space [13.7gb], spins? [possibly], types [ext4]
elasticsearch_1 | [2015-12-14 15:35:53,734][INFO ][node                     ] [Captain Germany] initialized
elasticsearch_1 | [2015-12-14 15:35:53,737][INFO ][node                     ] [Captain Germany] starting ...
elasticsearch_1 | [2015-12-14 15:35:54,176][WARN ][common.network           ] [Captain Germany] publish address: {0.0.0.0} is a wildcard address, falling back to first non-loopback: {172.17.0.2}
elasticsearch_1 | [2015-12-14 15:35:54,189][INFO ][transport                ] [Captain Germany] publish_address {172.17.0.2:9300}, bound_addresses {[::]:9300}
elasticsearch_1 | [2015-12-14 15:35:54,262][INFO ][discovery                ] [Captain Germany] elasticsearch/8IqJ2934RQaQjRa617MzPw
elasticsearch_1 | [2015-12-14 15:35:57,503][INFO ][cluster.service          ] [Captain Germany] new_master {Captain Germany}{8IqJ2934RQaQjRa617MzPw}{172.17.0.2}{172.17.0.2:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
elasticsearch_1 | [2015-12-14 15:35:57,560][WARN ][common.network           ] [Captain Germany] publish address: {0.0.0.0} is a wildcard address, falling back to first non-loopback: {172.17.0.2}
elasticsearch_1 | [2015-12-14 15:35:57,562][INFO ][http                     ] [Captain Germany] publish_address {172.17.0.2:9200}, bound_addresses {[::]:9200}
elasticsearch_1 | [2015-12-14 15:35:57,565][INFO ][node                     ] [Captain Germany] started
elasticsearch_1 | [2015-12-14 15:35:57,695][INFO ][gateway                  ] [Captain Germany] recovered [0] indices into cluster_state
root_kibana_1 exited with code 1
logstash_1      | log4j:WARN No appenders could be found for logger (org.elasticsearch.node).
logstash_1      | log4j:WARN Please initialize the log4j system properly.
logstash_1      | log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
elasticsearch_1 | [2015-12-14 15:36:54,872][WARN ][transport.netty          ] [Captain Germany] exception caught on transport layer [[id: 0x5b1a40e7, /172.17.0.3:54240 => /172.17.0.2:9300]], closing connection
elasticsearch_1 | java.lang.IllegalStateException: Message not fully read (request) for requestId [2839], action [internal:discovery/zen/unicast_gte_1_4], readerIndex [59] vs expected [220]; resetting
elasticsearch_1 |   at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:120)
elasticsearch_1 |   at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
elasticsearch_1 |   at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
elasticsearch_1 |   at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
elasticsearch_1 |   at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
elasticsearch_1 |   at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
elasticsearch_1 |   at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
elasticsearch_1 |   at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:75)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
elasticsearch_1 |   at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
elasticsearch_1 |   at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
elasticsearch_1 |   at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
elasticsearch_1 |   at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
elasticsearch_1 |   at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
elasticsearch_1 |   at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
elasticsearch_1 |   at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
elasticsearch_1 |   at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
elasticsearch_1 |   at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
elasticsearch_1 |   at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
elasticsearch_1 |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
elasticsearch_1 |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
elasticsearch_1 |   at java.lang.Thread.run(Thread.java:745)
elasticsearch_1 | [2015-12-11 22:08:21,126][WARN ][transport.netty          ] [Avalanche] exception caught on transport layer [[id: 0x9f0d63a5, /172.17.0.3:36404 => /172.17.0.2:9300]], closing connection

Any hint?
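
A hedged note on the trace above: the rejected action internal:discovery/zen/unicast_gte_1_4 is the pre-2.0 zen unicast ping, so Elasticsearch 2.1.0 appears to be closing a connection from an older Elasticsearch client trying to discover/join the cluster over the native transport port 9300. Within this stack, the usual culprit is Logstash's elasticsearch output running in node or transport mode against a newer server. Below is a minimal sketch of an output that speaks HTTP to port 9200 instead, assuming the compose service is named elasticsearch and resolvable under that hostname; the option names differ between Logstash 1.5.x and 2.x, and the image's actual pipeline config may look different.

  # Logstash 2.x: the elasticsearch output is HTTP-only
  output {
    elasticsearch {
      hosts => ["elasticsearch:9200"]
    }
  }

  # Logstash 1.5.x: force the HTTP protocol so the output does not try to
  # join the 2.x cluster as a node over port 9300
  output {
    elasticsearch {
      host => "elasticsearch"
      port => 9200
      protocol => "http"
    }
  }

The other direction would be to pin Logstash and Elasticsearch in docker-compose.yml to releases whose native transport versions match (for example, both from the 2.x line), if the image relies on the node/transport protocol.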
