
node-red-contrib-rdkafka's Issues

Produce failed: Local: Queue full

I got the error message in the title after running Node-RED for a while (about a week). I am wondering if we need to poll (in order to receive produce callbacks) so as to avoid the problem? I googled this a bit and found the same issue reported against librdkafka: confluentinc/librdkafka#998
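For what it's worth, node-rdkafka exposes producer.setPollInterval(), which polls for delivery reports on a timer and keeps librdkafka's internal queue from filling up. The plain-JS sketch below only models the drain-and-retry idea behind the "Local: Queue full" error; the queue class and helper are illustrative, not the real API:

```javascript
// Model of the drain-and-retry pattern behind "Local: Queue full"
// (librdkafka's ERR__QUEUE_FULL). With node-rdkafka the practical fix
// is producer.setPollInterval(100), so delivery reports are consumed
// and queue slots are freed. This is a stand-in, not the real client.

class BoundedQueue {
  constructor(capacity) {
    this.capacity = capacity;
    this.items = [];
  }
  produce(item) {
    if (this.items.length >= this.capacity) {
      throw new Error('Local: Queue full'); // mirrors librdkafka's error text
    }
    this.items.push(item);
  }
  poll() {
    // Polling delivers outstanding reports, which frees queue slots.
    this.items.length = 0;
  }
}

function produceWithDrain(queue, item) {
  try {
    queue.produce(item);
  } catch (err) {
    queue.poll();          // drain the queue, then retry once
    queue.produce(item);
  }
}
```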

Node-RED stops working after adding more than three Kafka consumers

I am trying to consume Kafka topics in Node-RED using node-red-contrib-rdkafka. For this I successively add corresponding rdkafka input nodes (connecting to the same broker) and deploy them one by one by selecting Modified nodes from the Deploy menu.

This works fine for three input nodes; however, after adding the fourth input node, Node-RED reproducibly crashes on pressing the Deploy button.

Node-RED is started as a Docker container from the official Node-RED image on Docker Hub. As you can see in the console output below, the corresponding deployment log entries for the fourth consumer are missing:

$ docker run --rm -p1880:1880 nodered/node-red-docker 

> [email protected] start /usr/src/node-red
> node $NODE_OPTIONS node_modules/node-red/red.js -v $FLOWS "--userDir" "/data"

10 Sep 09:02:13 - [info] 

Welcome to Node-RED
===================

10 Sep 09:02:13 - [info] Node-RED version: v0.18.7
10 Sep 09:02:13 - [info] Node.js  version: v6.14.2
10 Sep 09:02:13 - [info] Linux 4.15.0-33-generic x64 LE
10 Sep 09:02:14 - [info] Loading palette nodes
10 Sep 09:02:14 - [warn] ------------------------------------------------------
10 Sep 09:02:14 - [warn] [node-red/rpi-gpio] Info : Ignoring Raspberry Pi specific node
10 Sep 09:02:14 - [warn] ------------------------------------------------------
10 Sep 09:02:14 - [info] Settings file  : /data/settings.js
10 Sep 09:02:14 - [info] User directory : /data
10 Sep 09:02:14 - [warn] Projects disabled : editorTheme.projects.enabled=false
10 Sep 09:02:14 - [info] Flows file     : /data/flows.json
10 Sep 09:02:14 - [info] Creating new flow file
10 Sep 09:02:14 - [info] Server now running at http://127.0.0.1:1880/
10 Sep 09:02:14 - [debug] loaded flow revision: d751713988987e9331980363e24189ce
10 Sep 09:02:14 - [debug] red/runtime/nodes/credentials.load : no user key present
10 Sep 09:02:14 - [debug] red/runtime/nodes/credentials.load : no default key present - generating one
10 Sep 09:02:14 - [debug] red/runtime/nodes/credentials.load : keyType=system
10 Sep 09:02:14 - [warn] 

---------------------------------------------------------------------
Your flow credentials file is encrypted using a system-generated key.

If the system-generated key is lost for any reason, your credentials
file will not be recoverable, you will have to delete it and re-enter
your credentials.

You should set your own key using the 'credentialSecret' option in
your settings file. Node-RED will then re-encrypt your credentials
file using your chosen key the next time you deploy a change.
---------------------------------------------------------------------

10 Sep 09:02:14 - [info] Starting flows
10 Sep 09:02:14 - [debug] red/nodes/flows.start : starting flow : global
10 Sep 09:02:14 - [info] Started flows
10 Sep 09:03:06 - [debug] red/runtime/nodes/credentials.export : encrypting
10 Sep 09:03:06 - [debug] saved flow revision: 104fe97f482455c034c59d6ada2b0506
10 Sep 09:03:06 - [info] Stopping modified nodes
10 Sep 09:03:06 - [debug] red/nodes/flows.stop : stopping flow : global
10 Sep 09:03:06 - [info] Stopped modified nodes
10 Sep 09:03:06 - [info] Starting modified nodes
10 Sep 09:03:06 - [debug] red/nodes/flows.start : starting flow : 94de589e.3ca0d
10 Sep 09:03:06 - [info] Started modified nodes
10 Sep 09:03:06 - [rdkafka] Created consumer subscription on topic = aaa
10 Sep 09:03:15 - [debug] saved flow revision: c79908034bbcb133fe2513658a898420
10 Sep 09:03:15 - [info] Stopping modified nodes
10 Sep 09:03:15 - [debug] red/nodes/flows.stop : stopping flow : global
10 Sep 09:03:15 - [debug] red/nodes/flows.stop : stopping flow : 94de589e.3ca0d
10 Sep 09:03:15 - [info] Stopped modified nodes
10 Sep 09:03:15 - [info] Starting modified nodes
10 Sep 09:03:15 - [info] Started modified nodes
10 Sep 09:03:15 - [rdkafka] Created consumer subscription on topic = bbb
10 Sep 09:03:20 - [debug] saved flow revision: f12926c7372236cd29f3d58852c2ca1a
10 Sep 09:03:20 - [info] Stopping modified nodes
10 Sep 09:03:20 - [debug] red/nodes/flows.stop : stopping flow : global
10 Sep 09:03:20 - [debug] red/nodes/flows.stop : stopping flow : 94de589e.3ca0d
10 Sep 09:03:20 - [info] Stopped modified nodes
10 Sep 09:03:20 - [info] Starting modified nodes
10 Sep 09:03:20 - [info] Started modified nodes
10 Sep 09:03:20 - [rdkafka] Created consumer subscription on topic = ccc
10 Sep 09:03:25 - [debug] saved flow revision: 81288df5f3243d7da90970571c3dc4a3
10 Sep 09:03:25 - [info] Stopping modified nodes
10 Sep 09:03:25 - [debug] red/nodes/flows.stop : stopping flow : global
10 Sep 09:03:25 - [debug] red/nodes/flows.stop : stopping flow : 94de589e.3ca0d
10 Sep 09:03:25 - [info] Stopped modified nodes
10 Sep 09:03:25 - [info] Starting modified nodes
10 Sep 09:03:25 - [info] Started modified nodes
10 Sep 09:03:25 - [rdkafka] Created consumer subscription on topic = ddd
=== no log entries corresponding to deployment ===

The Kafka broker is started from the official Confluent repository (examples/cp-all-in-one) with $ docker-compose up -d. However, the same issue occurs with Landoop Kafka.

The closest "explanation" I could find so far is this open issue. But I am still not sure whether this behaviour is by design, and I am actually hitting an implicit limit or following a wrong practice with this approach.

I can't imagine node-red-contrib-rdkafka or librdkafka being limited to three consumers. Do you have any ideas where the failure in what I am trying to do could be?

Thanks a lot!
Jakob

Implement automatic reconnect

Currently, disconnect errors are caught and written to the console and error logs, but the connection is not re-established automatically.
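A sketch of how automatic reconnect could work, assuming an exponential backoff schedule; the function names and defaults below are illustrative, and in node-rdkafka terms this would be triggered from the client's 'disconnected' event:

```javascript
// Exponential backoff schedule for reconnect attempts: the delay
// doubles each attempt, capped at `cap` milliseconds.
function backoffDelays(base = 500, cap = 30000, attempts = 6) {
  const delays = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(base * 2 ** i, cap));
  }
  return delays;
}

// Reconnect loop skeleton around an async connect() function
// (connect() stands in for re-creating and connecting the client).
async function connectWithRetry(connect, delays) {
  for (const delay of delays) {
    try {
      return await connect();                       // success: return the client
    } catch (err) {
      await new Promise(r => setTimeout(r, delay)); // wait, then retry
    }
  }
  throw new Error('reconnect failed after ' + delays.length + ' attempts');
}
```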

Option to remove quotes in final produced messages

First of all thanks for this awesome package. I find it really useful.

I was deploying a pipeline of Node-RED -> Kafka -> Telegraf -> InfluxDB. The pipeline generates some data in Node-RED and sends it to Kafka; Telegraf then consumes the data from Kafka and sends it to InfluxDB. However, it did not work, because messages sent to Kafka are surrounded by quotes. I guess this is due to a stringify call inside the JS code. This results in a parsing error, because the parsers do not expect any quotes.

Maybe this could be an additional option, or simply the quotes could be removed.
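For context, the quotes come from serializing a payload that is already a plain string: JSON.stringify wraps it in quotation marks. A minimal illustration, with a hypothetical helper that skips stringification for strings:

```javascript
// Stringifying a JS string wraps it in quotes, so the bytes on the
// wire become "\"cpu=0.42\"" instead of "cpu=0.42".
const payload = 'cpu=0.42';
const quoted = JSON.stringify(payload);   // '"cpu=0.42"' (note the quotes)

// One possible fix (hypothetical helper): only stringify
// non-string payloads before producing to Kafka.
function toKafkaValue(payload) {
  return typeof payload === 'string' ? payload : JSON.stringify(payload);
}
```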

Implement Consumer Group Commit support

I saw you mentioned you are using the Blizzard node-rdkafka library upstream. I looked, and your implementation doesn't seem to support transactional commits for consumption, instead favoring auto-commit.

That is fine as long as everything downstream works, but if you are using the events to trigger ETL, or to hand off to another queue, it seems like this leaves a gap where you might lose messages. I think adding the ability to control the commit would solve this. Any thoughts?
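To illustrate the requested behaviour: with node-rdkafka this would roughly mean setting 'enable.auto.commit': false and calling consumer.commitMessage(msg) after processing succeeds. The sketch below models that at-least-once pattern with an in-memory stand-in (FakeConsumer is illustrative, not the real API):

```javascript
// Commit-after-processing (at-least-once) semantics: the offset is
// advanced only once the handler has succeeded, so a crash mid-handler
// causes redelivery instead of a lost message.

class FakeConsumer {
  constructor(messages) {
    this.messages = messages; // queued messages, each with an offset
    this.committed = -1;      // highest committed offset
  }
  poll() { return this.messages.shift(); }
  commit(msg) { this.committed = msg.offset; }
}

function processAll(consumer, handler) {
  let msg;
  while ((msg = consumer.poll()) !== undefined) {
    try {
      handler(msg);           // e.g. the ETL step or hand-off to another queue
      consumer.commit(msg);   // commit only after the handler succeeded
    } catch (err) {
      break;                  // offset NOT advanced: message is
    }                         // redelivered after a restart
  }
}
```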

Thanks, John

segfault in librdkafka.so.1

Dear all,

I am using the latest 0.2.2 version, and I noticed that after almost a day, Node-RED crashes. I see in /var/log/messages the message below, which I think is the cause of the Node-RED crash.

Jun 26 02:53:06 iot2000 user.info kernel: rdk:broker-1[437]: segfault at b2aeab9b ip b28f0b80 sp b0 2ff01c error ffff0004 in librdkafka.so.1[b28e9000+10b000]

What should I do to prevent this from happening?
My system is an IOT2040 by Siemens.
Node-RED version: v0.18.4
Node.js version: v6.12.3
Linux 4.4.105-cip15 ia32 LE

Thank you for your support

Producer not connected

Hi

I configured an rdkafka producer node to send messages to an event stream using Node-RED,
but it says "producer not connected".
2020-05-20

Please help me on this

Windows 7 install fails

Time Elapsed 00:00:00.28
gyp ERR! build error
gyp ERR! stack Error: C:\Program Files (x86)\MSBuild\12.0\bin\msbuild.exe fai
gyp ERR! stack at ChildProcess.onExit (C:\Program Files\nodejs\node_modules
gyp ERR! stack at emitTwo (events.js:106:13)
gyp ERR! stack at ChildProcess.emit (events.js:191:7)
gyp ERR! stack at Process.ChildProcess._handle.onexit (internal/child_proce
gyp ERR! System Windows_NT 6.1.7601
gyp ERR! command "C:\PROGRAM FILES\NODEJS\node.exe" "C:\Program Files\node
gyp ERR! cwd C:\Users\sriniva.aiyar\AppData\Roaming\npm\node_modules\node-red-c
gyp ERR! node -v v6.11.4
gyp ERR! node-gyp -v v3.4.0
gyp ERR! not ok
npm ERR! Windows_NT 6.1.7601
npm ERR! argv "C:\Program Files\nodejs\node.exe" "C:\Program Files\nodejs
npm ERR! node v6.11.4
npm ERR! npm v3.10.10
npm ERR! code ELIFECYCLE

npm ERR! [email protected] install: node-gyp rebuild
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] install script 'node-gyp rebuild'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the node-rdkafka package
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! node-gyp rebuild
npm ERR! You can get information on how to open an issue for this project with:
npm ERR! npm bugs node-rdkafka
npm ERR! Or if that isn't available, you can get their info via:
npm ERR! npm owner ls node-rdkafka
npm ERR! There is likely additional logging output above.

npm ERR! Please include the following file with any support request:
npm ERR! c:\Users\sriniva.aiyar.node-red\npm-debug.log

c:\Users\sriniva.aiyar.node-red>

Missing Functionality or Bad Description

The producer information on the node says...

"The output of this node includes key_schema_id, value and offsets which is an array of objects containing the partition, offset, error_code, and error for each of the messages published."

I don't see this output from the producer. I would love to have it so I can store or reference the offset. I would also like a way to set a logical or numerical offset in the Consumer node, so you can read from the beginning or start at a specific offset or timestamp.
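Assuming the output really had the documented shape, consuming it might look like the sketch below; the sample message object is invented for illustration, not captured from the node:

```javascript
// Hypothetical producer output shaped like the node's description:
// key_schema_id, value, and an offsets[] array with partition, offset,
// error_code, and error per published message.
const msg = {
  payload: {
    key_schema_id: 1,
    value: 'hello',
    offsets: [
      { partition: 0, offset: 42, error_code: 0, error: null },
    ],
  },
};

// Collect partition/offset for each successfully published message,
// e.g. to store them or reference them later.
const published = msg.payload.offsets
  .filter(o => o.error_code === 0)
  .map(o => ({ partition: o.partition, offset: o.offset }));
```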

Looks like this repo hasn't been updated in a while; I'm hoping it warms back up. :-)

Thanks.
