
Comments (15)

robomeister commented on August 14, 2024

Hi Emma, thanks. I'll give it a shot. The kafka-node library I've been using was the first one to turn up in a Google search, and it seems to have quite a few more downloads than node-rdkafka. It may be prudent to back-port some support for it.

from event-streams.

robomeister commented on August 14, 2024

Is there a working sample of a Node.js application using node-rdkafka? I'm having a heck of a time with the SASL/SSL settings.


robomeister commented on August 14, 2024

My new sample looks like this:

var kafka = require('node-rdkafka');

var broker = "hostname:port";
var apiKey = "";

console.log("Kafka initializing the consumer client to " + broker);
var consumer = new kafka.KafkaConsumer({
    'group.id': 'kafka',
    'metadata.broker.list': broker,
    'security.protocol': 'sasl_ssl',
    'ssl.certificate.location': 'eventstreams.pem',
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': 'token',
    'sasl.password': apiKey
});

console.log("Connecting...");
consumer.connect();

consumer.on('ready', function() {
    console.log("Consumer Ready");
    consumer.subscribe(['newtopic']);
    consumer.consume();
})
.on('data', function(data) {
    // Output the actual message contents
    console.log(data.value.toString());
})
.on('event.error', function(data) {
    console.log("Error:");
    console.log(data);
});

But I get the following error attempting to connect:
{ Error: Local: SSL error
at Error (native)
origin: 'local',
message: 'ssl error',
code: -1,
errno: -1,
stack: 'Error: Local: SSL error\n at Error (native)' }
Error:
{ Error: Local: All broker connections are down
at Error (native)
origin: 'local',
message: 'all broker connections are down',
code: -1,
errno: -1,
stack: 'Error: Local: All broker connections are down\n at Error (native)' }

(The broker is running fine)


robomeister commented on August 14, 2024

Hi Emma, I got my client application working fine using node-rdkafka. I did want to highlight some client behaviour, though, in case it comes up with others. In my logs, the following error is occasionally thrown:

{ Error: Local: Broker transport failure
origin: 'local',
message: 'broker transport failure',
code: -1,
errno: -1,
stack: 'Error: Local: Broker transport failure' }

Sometimes when this happens, the client manages to recover and keep receiving messages. Ultimately, however, the error gets thrown and the client stops working. I've had to implement the following code to keep the connection active:

consumer.on('event.error', function(err) {
    console.log(err);
    consumer.disconnect();
});

consumer.on('disconnected', function(data) {
    console.log("Disconnected. Reconnecting...");
    consumer.connect();
});

I don't think this is necessarily an issue with Event Streams, but it is a behaviour at the current levels of the Node.js client and Event Streams.

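The disconnect-then-reconnect workaround above reconnects immediately, which can hammer a broker that is still unavailable. A variant with exponential backoff is sketched below; this is not node-rdkafka's own retry logic, just plain JavaScript around the `connect`/`disconnect`/`on` API shown in the thread, and the delay values are illustrative assumptions.

```javascript
// Sketch: reconnect with exponential backoff instead of immediately.
// `consumer` is assumed to expose the node-rdkafka KafkaConsumer API
// used elsewhere in this thread (on, connect, disconnect).

// Delay before the nth reconnect attempt: 1s, 2s, 4s, ... capped at 30s.
function backoffMs(attempt) {
    var base = 1000; // 1 second initial delay (an assumption, tune to taste)
    return Math.min(base * Math.pow(2, attempt), 30000);
}

function installReconnect(consumer) {
    var attempt = 0;
    consumer.on('event.error', function(err) {
        console.log(err);
        consumer.disconnect();
    });
    consumer.on('disconnected', function() {
        var delay = backoffMs(attempt++);
        console.log("Disconnected. Reconnecting in " + delay + "ms...");
        setTimeout(function() { consumer.connect(); }, delay);
    });
    consumer.on('ready', function() {
        attempt = 0; // reset the backoff after a successful connect
    });
}
```

The backoff calculation is library-independent, so it can be unit-tested without a broker.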

robomeister commented on August 14, 2024

Here's the Node.js client sample, scrubbed of security info:

sample.js.txt


EmmaHumber commented on August 14, 2024

Hi @robomeister Thanks for raising the issue - we can reproduce and we're taking a look at the moment.


EmmaHumber commented on August 14, 2024

Hi @robomeister

From looking at the flows going to Kafka, we can see that the protocol used by the Node.js client is quite back-level: the request headers are at API version 2.

A Kafka client and Kafka broker negotiate to the highest API version supported by both the client and server, in this case API version 2:
http://kafka.apache.org/protocol.html#protocol_compatibility

The clients section of our prerequisites page states that Kafka version 2.0 or later clients are supported:
https://ibm.github.io/event-streams/installing/prerequisites/

Unfortunately there's no clear Kafka documentation linking request API versions to the Kafka version they were added in, but I can see that v1.0 of Kafka was at API v5, so API v2 is quite back-level:
https://github.com/apache/kafka/blob/1.0/clients/src/main/java/org/apache/kafka/common/requests/ProduceRequest.java

Kafka should deal with back-level clients OK, but we know, for example, that the Event Streams message browser does not in some cases, and this is due to the back-level protocol.

Our guidance is to use clients based on librdkafka, and the node client for this is node-rdkafka.

I hope that helps, let me know if you need any further information.

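A quick way to confirm which client library and feature set is actually in use is to print the librdkafka version bundled with node-rdkafka. This is a sketch: the `librdkafkaVersion` and `features` exports are assumptions based on recent node-rdkafka releases, so check them against the README for the version you have installed.

```javascript
// Sketch: report the librdkafka version bundled with node-rdkafka.
// `librdkafkaVersion` and `features` are assumed exports; verify against
// your installed node-rdkafka version.
function describeClient() {
    try {
        var kafka = require('node-rdkafka');
        return 'librdkafka ' + kafka.librdkafkaVersion +
               ' (features: ' + kafka.features.join(', ') + ')';
    } catch (e) {
        // The module is a native addon, so it may simply not be installed.
        return 'node-rdkafka not installed: ' + e.message;
    }
}

console.log(describeClient());
```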

robomeister commented on August 14, 2024

Quack! This connection string worked:

var consumer = new kafka.KafkaConsumer({
    'group.id': 'kafka',
    'metadata.broker.list': broker,
    'security.protocol': 'sasl_ssl',
    'ssl.ca.location': require('path').resolve("eventstreams.pem"),
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': 'token',
    'sasl.password': apiKey
});

I was using 'ssl.certificate.location' instead of 'ssl.ca.location'.

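For anyone else hitting the same SSL error: in librdkafka-based clients those two properties do different jobs, which is why the mix-up broke the handshake. A sketch of the distinction (file paths are placeholders):

```javascript
// ssl.ca.location:          the CA certificate used to VERIFY the broker's
//                           certificate -- what this thread needed.
// ssl.certificate.location: the CLIENT's own certificate, only needed for
//                           mutual TLS, where the broker authenticates the client.
var tlsConfig = {
    'security.protocol': 'sasl_ssl',
    'ssl.ca.location': 'eventstreams.pem'         // trust anchor for the broker
    // 'ssl.certificate.location': 'client.pem',  // only for mutual TLS
    // 'ssl.key.location': 'client.key'           // client key, paired with the cert
};
```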

EmmaHumber commented on August 14, 2024

Glad to hear you got it working.

I'll feed back to our docs team and see if this is something we can look at including some info on.


EmmaHumber commented on August 14, 2024

Hi, thanks for the update

https://github.com/edenhill/librdkafka/wiki/FAQ#why-am-i-seeing-receive-failed-disconnected

states that librdkafka reconnects after a disconnect, whatever the reason for it, so the behaviour you describe above is interesting.

Is there anything in the broker side logs, or debug level client side logs that gives an indication of what error is being seen by the client? I'm wondering if there is a timeout being hit somewhere and the broker is reaping the connections, causing a disconnect.

There are also a number of exceptions that the poll method can throw that are deemed unrecoverable, so again, it would be interesting to see if one of these is being returned:
https://kafka.apache.org/20/javadoc/?org/apache/kafka/clients/consumer/KafkaConsumer.html


robomeister commented on August 14, 2024

Hi Emma,

Here are the Kafka broker logs. What's a good debug setting for the client logs?
kafka0.txt
kafka1.txt
kafka2.txt


EmmaHumber commented on August 14, 2024

Hi

Sorry for the delay, I was out of the office.

There's something slightly odd in the logs. Two of the brokers show:

2018-12-03 15:58:09,770] ERROR [MetadataCache brokerId=0] Listeners are not identical across brokers: Map(2 -> Map(ListenerName(INTERNAL_SECURE) -> kafka-ibm-es-kafka-sts-2.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:8084 (id: 2 rack: null), ListenerName(INTERNAL) -> kafka-ibm-es-kafka-sts-2.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:9092 (id: 2 rack: null)), 1 -> Map(ListenerName(INTERNAL_SECURE) -> kafka-ibm-es-kafka-sts-1.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:8084 (id: 1 rack: null), ListenerName(EXTERNAL) -> icp31.robobob.ca:31948 (id: 1 rack: null), ListenerName(INTERNAL) -> kafka-ibm-es-kafka-sts-1.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:9092 (id: 1 rack: null)), 0 -> Map(ListenerName(INTERNAL_SECURE) -> kafka-ibm-es-kafka-sts-0.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:8084 (id: 0 rack: null), ListenerName(INTERNAL) -> kafka-ibm-es-kafka-sts-0.kafka-ibm-es-kafka-headless-svc.kafka.svc.cluster.local:9092 (id: 0 rack: null))) (kafka.server.MetadataCache)

We've only seen this in pre-release versions of Event Streams, and thought we'd resolved it in our main release - in theory it is likely to prevent messages being processed at all. Can you confirm which version of Event Streams you are using, and where you downloaded it from, please?

Running kubectl get pods -o yaml | grep releaseCandidate should give a precise release version string.

With regards to the disconnects, I think I'll need to see concurrent client and server side logs, and an example timestamp of the error being seen to work towards understanding what's going on.

Client side logging :

We tend to set 'debug': 'all' in the consumer options when using node-rdkafka. It's noisy, but whether that's a problem depends on how long you run before hitting the issue - I'm happy to look through lots of logs.
Unfortunately debug output doesn't log timestamps, so it's difficult to tie the client logs to what's going on at the Kafka side at that time. Are you able to add a timestamp to your application logging that writes out when the disconnect happens, so that we can match that time to the equivalent in the Kafka logs?

Server side logs:
As before please, just make sure they cover the same time period as the client side logs and that the error has occurred.

Thanks!

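Since the debug output lacks timestamps, one option is to capture librdkafka's log lines through the client's event.log event and prefix them yourself. A sketch, assuming node-rdkafka's event.log payload carries severity, fac, and message fields:

```javascript
// Prefix each librdkafka debug line with an ISO timestamp so client-side
// logs can be lined up against broker-side logs.
function formatRdkafkaLog(log, now) {
    var ts = (now || new Date()).toISOString();
    return ts + ' [' + log.severity + '] ' + log.fac + ': ' + log.message;
}

// Wiring, assuming a consumer created with { 'debug': 'all', ... }:
// consumer.on('event.log', function(log) {
//     console.log(formatRdkafkaLog(log));
// });
```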

robomeister commented on August 14, 2024

Hi Emma, I'll set it up.

Here's the output of the get pods:
kafka-get-pods

And here's the helm chart:
kafka-helm-chart

I've also attempted the same client against an IBM Cloud Event Streams instance, and I get similar behaviour.


robomeister commented on August 14, 2024

Just emailed you the logs.


EmmaHumber commented on August 14, 2024

Hi Rob

As discussed via email a while back, the next action was to ensure that the Node.js application restarted consuming by including the following in the error callback:

consumer.on('event.error', function(err) {
    console.log("Error: ");
    console.log(err);
    consumer.consume();
});

I'm just tidying up our issues - do you mind if I close this one off, as I've not heard back that the problem is still happening?

Feel free to re-open if it doesn't resolve the problem.

Thanks
Emma

