robotframework-confluentkafkalibrary's People

Contributors

marcosandremartins, nczita, robooo, tminakov

robotframework-confluentkafkalibrary's Issues

ValueError: Consumer or producer group_id is wrong or does not exists!

Running with 2.0.2-3, I am trying to check that a topic exists for a given threaded consumer.
We are using Get Thread Group Id, and it returns a value from our threads, e.g.

Arguments: [ <GetMessagesThread(Thread-2, started daemon 140414778578624)> ]
Return: 'beb1aba9-056e-4877-aca8-403da54b2e4a'
${telemetry_group_id} = beb1aba9-056e-4877-aca8-403da54b2e4a

However, when we try to use this with List Topics ${telemetry_group_id} abc.network.telemetry, we receive the following output.

ValueError: Consumer or producer group_id is wrong or does not exists!
18:09:04.950 DEBUG Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/ConfluentKafkaLibrary/__init__.py", line 118, in list_topics
    raise ValueError('Consumer or producer group_id is wrong or does not exists!')
ValueError: Consumer or producer group_id is wrong or does not exists!

I'm not sure whether this is expected behaviour, as I can see that the threaded consumer's group id is not added to the consumers list. Perhaps it is expected, but it would be useful to be able to use the thread group id for this type of task, rather than having to create another consumer just for getting topic lists and the like.

License Declaration Needed

Summary

As near as can be discerned, a license file/declaration does not exist for this code. A license of whatever type is required for deployment in many organizations.

Solution

Choose a license type that best fits this project - the more liberal the better.

When the connection to Kafka fails, the "connect to Kafka" logs keep coming. Can we stop/limit the retries?

I am using the APIs below from ConfluentKafkaLibrary 1.8.2.post1:

${group_id}= Create Producer Kafka
${resp}= list topics group_id=${group_id}

If Kafka is not running, the error message below appears:
|FAIL|rdkafka#producer-1| [thrd:10.61.251.25:9092/bootstrap]: 10.61.251.25:9092/bootstrap: Connect to ipv4#10.61.251.25:9092 failed: Unknown error (after 2037ms in state CONNECT)

This error message keeps appearing every 2 seconds.

Question: Is there a way to stop or limit these retries? How can I break out of this?
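librdkafka (which confluent-kafka wraps) does not give up reconnecting on its own, but the retry cadence and the log spam can be tuned through standard librdkafka configuration properties. A hedged sketch of such a config dict follows; whether ConfluentKafkaLibrary's Create Producer forwards arbitrary properties like these is an assumption here, not something this issue confirms:

```python
# Plain librdkafka/confluent-kafka configuration properties.
producer_conf = {
    "bootstrap.servers": "10.61.251.25:9092",  # broker address from the report
    # Back off reconnect attempts: start at 1s, cap at 5 minutes between tries
    # (librdkafka defaults are 100ms initial / 10s maximum backoff).
    "reconnect.backoff.ms": 1000,
    "reconnect.backoff.max.ms": 300000,
    # Route client errors to a callback instead of printing them to stderr;
    # a real callback could count failures and abort the test run early.
    "error_cb": lambda err: None,
}
```

Another option is to fail fast at the application level, e.g. by giving List Topics a short timeout and treating a timeout as "broker unreachable", rather than waiting on the client's internal retries.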

No keyword with name 'Start Consumer Threaded' found

I am new to Kafka. I have installed robotframework==3.2.1, robotframework-confluentkafkalibrary==1.5.0-5, and confluent-kafka==1.5.0. While running the examples I am receiving the error below.

(Screenshots attached, taken 2020-12-22 at 23:30:54, 23:35:39, 23:31:38, and 23:31:59)

Please let me know if I am missing any dependencies. Your help is much appreciated.

Tutorial Reference for Kafka Beginners

Summary

Although this Robot Framework library seems very comprehensive, it would be helpful for beginners if there were a step-by-step tutorial on how to initialize a consumer as well as a producer.

Topics to Cover

  • Basic configuration setup needs (authentication, how to specify multiple Bootstrap and Zookeeper servers, etc.)
  • Walkthrough of simple consumer test
  • Walkthrough of simple producer test

I realize that the repo has example RF tests, but it doesn't provide quite enough context for a beginner.

Remove Avro and Schema registry dependency

Some users would prefer to use this library without installing avro + schema registry. This could be achieved with optional dependencies in setup.py plus refactoring of the imports in the src files.
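The usual shape for this is an optional extra in setup.py plus guarded imports in the source files. A minimal sketch, where the HAS_SCHEMA_REGISTRY flag and the [avro] extra name are illustrative, not the project's actual code:

```python
# Guarded import: avro/schema-registry support becomes optional at runtime.
try:
    from confluent_kafka.schema_registry import SchemaRegistryClient
    HAS_SCHEMA_REGISTRY = True
except ImportError:
    SchemaRegistryClient = None
    HAS_SCHEMA_REGISTRY = False

def require_schema_registry():
    """Raise a clear error when the optional extra is not installed."""
    if not HAS_SCHEMA_REGISTRY:
        raise RuntimeError(
            "Schema Registry support requires the optional extra, e.g. "
            "pip install robotframework-confluentkafkalibrary[avro]"
        )
```

Keywords that need the registry would call require_schema_registry() first, so users without the extra get an actionable message instead of an ImportError at library load time.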

VSCode doesn't recognize the library

Hello.
I have already installed both libraries:

pip install robotframework-confluentkafkalibrary==1.9.0-1
pip install confluent-kafka==1.9.0

But VSCode doesn't recognize the library at all, and it shows an error that seems strange to me (please see the attached image).
I already tried installing the modules it suggests, but the error persists.

Have you ever come across something like this?

(image attached)

Accessing Poll results as dictionary

Hello,
The keyword Poll's return type is a list of dictionaries, which I cannot access using RF keywords or syntax. Here's my case:

I created a consumer, subscribed to a topic and executed a poll.

${messages}= Poll group_id=${group_id} max_records=3

This returns a list of dictionaries, and I have some problems accessing the dictionary values by key. What I tried:

  1. I can access the values as a list with ${messages}[0], but not as a dictionary with ${messages}[0][key]. Error: Bytes '${messages}[0]' used with invalid index 'key'. To use '[key]' as a literal value, it needs to be escaped like '\[key]'
  2. ${dictElem} Get From Dictionary ${messages}[0] key Error: TypeError: byte indices must be integers or slices, not str
  3. Log Dictionary ${messages}[0] Error: AttributeError: 'bytes' object has no attribute 'keys'
  4. Convert To Dictionary ${messages}[0] Error: TypeError: cannot convert dictionary update sequence element #0 to a sequence

What it works though is this:

${type}    Evaluate  type(${messages}[0])
Log    ${type} 
${value}  Evaluate   ${messages}[0].get("key")

${type} is <class 'dict'>. I then understand that ${messages}[0] is a dictionary, but I don't understand why RF doesn't recognize it as a dictionary.

Some more info.
Log ${messages}[0] gives {"key":"vaue","key2":"value","key3":3}
Log List ${messages} gives

List length is 3 and it contains following items:
0: b'{"key":"vaue","key2":"value","key3":3}'
1: b'{"key":"vaue","key2":"value","key3":3}'
2: b'{"key":"vaue","key2":"value","key3":3}'
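As the Log List output above shows, each polled item is actually a bytes payload containing JSON, not a Robot Framework dictionary, which explains all four errors. Decoding and parsing it first makes key access work. A plain-Python sketch (the same thing could be done inline with Evaluate and the json module):

```python
import json

# A polled record looks like the Log List output above: raw JSON bytes.
raw = b'{"key":"vaue","key2":"value","key3":3}'

# bytes -> str -> dict; after this, normal dictionary access works.
message = json.loads(raw.decode("utf-8"))
print(message["key"])    # -> vaue
```

After the conversion, ${message}[key] and Get From Dictionary behave as expected, because the value really is a dict rather than bytes whose repr merely looks like one.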

Thank you very much.

Issue While consuming the messages from an Avro Topic

Hi All,
I am facing an issue while consuming data from an Avro topic.
Can anyone provide an example? I am able to consume without giving the schema registry URL, but the data is not in a readable format. When the schema registry URL is given, I am not able to consume messages.
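For reference, a hedged sketch of how a Schema Registry-aware consumer is usually built directly with confluent-kafka (assumes a recent confluent-kafka with the avro extra installed; the URL and names are placeholders, and this has not been run against a live registry):

```python
try:
    # Optional dependencies: require "pip install confluent-kafka[avro]".
    from confluent_kafka import DeserializingConsumer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroDeserializer
    HAVE_CONFLUENT_KAFKA = True
except ImportError:
    HAVE_CONFLUENT_KAFKA = False

def build_avro_consumer(registry_url, brokers, group_id):
    """Consumer whose message values are Avro-decoded via the Schema Registry."""
    registry = SchemaRegistryClient({"url": registry_url})
    return DeserializingConsumer({
        "bootstrap.servers": brokers,
        "group.id": group_id,
        "auto.offset.reset": "earliest",
        # With no explicit schema string, the deserializer fetches the
        # writer schema from the registry using the message's schema id.
        "value.deserializer": AvroDeserializer(registry),
    })
```

Without the deserializer the raw Avro bytes are returned, which matches the "not in readable format" symptom; when the registry URL is set, failures are usually auth/URL problems reported by the deserializer.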

Unable to install robotframework-confluentkafkalibrary on Amazon EC2 graviton instance type

Hello everyone,

I am trying to install **robotframework-confluentkafkalibrary** on an Amazon Linux EC2 Graviton instance, but the installation is failing with the error below. Can someone review and help resolve the issue?

Note: we also tried installing the Python development packages, but still no luck.
yum list python3-devel
Loaded plugins: extras_suggestions, langpacks, priorities, update-motd
229 packages excluded due to repository priority protections
Installed Packages
python3-devel.aarch64 3.7.10-1.amzn2.0.1 @amzn2-core

Python version: Python 3.7.10

ERROR:
Running setup.py install for confluent-kafka: finished with status 'error'
error: subprocess-exited-with-error

× Running setup.py install for confluent-kafka did not run successfully.
│ exit code: 1
╰─> [53 lines of output]
running install
running build
running build_py
creating build
creating build/lib.linux-aarch64-3.7
creating build/lib.linux-aarch64-3.7/confluent_kafka
copying src/confluent_kafka/error.py -> build/lib.linux-aarch64-3.7/confluent_kafka
copying src/confluent_kafka/serializing_producer.py -> build/lib.linux-aarch64-3.7/confluent_kafka
copying src/confluent_kafka/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka
copying src/confluent_kafka/deserializing_consumer.py -> build/lib.linux-aarch64-3.7/confluent_kafka
creating build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/avro.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/error.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/json_schema.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/schema_registry_client.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/protobuf.py -> build/lib.linux-aarch64-3.7/confluent_kafka/schema_registry
creating build/lib.linux-aarch64-3.7/confluent_kafka/serialization
copying src/confluent_kafka/serialization/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/serialization
creating build/lib.linux-aarch64-3.7/confluent_kafka/admin
copying src/confluent_kafka/admin/_acl.py -> build/lib.linux-aarch64-3.7/confluent_kafka/admin
copying src/confluent_kafka/admin/_resource.py -> build/lib.linux-aarch64-3.7/confluent_kafka/admin
copying src/confluent_kafka/admin/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/admin
copying src/confluent_kafka/admin/_config.py -> build/lib.linux-aarch64-3.7/confluent_kafka/admin
creating build/lib.linux-aarch64-3.7/confluent_kafka/avro
copying src/confluent_kafka/avro/error.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro
copying src/confluent_kafka/avro/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro
copying src/confluent_kafka/avro/cached_schema_registry_client.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro
copying src/confluent_kafka/avro/load.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro
creating build/lib.linux-aarch64-3.7/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_client.py -> build/lib.linux-aarch64-3.7/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> build/lib.linux-aarch64-3.7/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_producer.py -> build/lib.linux-aarch64-3.7/confluent_kafka/kafkatest
creating build/lib.linux-aarch64-3.7/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/__init__.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/message_serializer.py -> build/lib.linux-aarch64-3.7/confluent_kafka/avro/serializer
running build_ext
building 'confluent_kafka.cimpl' extension
creating build/temp.linux-aarch64-3.7
creating build/temp.linux-aarch64-3.7/tmp
creating build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2
creating build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6
creating build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src
creating build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka
creating build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka/src
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -moutline-atomics -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python3.7m -c /tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka/src/confluent_kafka.c -o build/temp.linux-aarch64-3.7/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka/src/confluent_kafka.o
In file included from /tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-install-68vas1l2/confluent-kafka_3fb59b05b078474d8f48cf36fc061fd6/src/confluent_kafka/src/confluent_kafka.h:23:10: fatal error: librdkafka/rdkafka.h: No such file or directory
#include <librdkafka/rdkafka.h>

^~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure

× Encountered error while trying to install package.
╰─> confluent-kafka

note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.
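The root cause is visible in the log: the fatal error about librdkafka/rdkafka.h means the librdkafka C headers are missing, so pip's from-source build of confluent-kafka (there is no prebuilt aarch64 wheel for this Python) cannot compile. A hedged sketch of the usual fix; package names vary by distro, and librdkafka-devel is the RHEL/Amazon Linux name, which may require EPEL:

```shell
# Install the librdkafka development headers, then retry the pip install.
sudo yum install -y librdkafka-devel
pip install confluent-kafka robotframework-confluentkafkalibrary

# If the distro package is unavailable, build librdkafka from source first:
#   git clone https://github.com/edenhill/librdkafka
#   cd librdkafka && ./configure && make && sudo make install
```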

Get Messages From Thread : AttributeError: 'NoneType' object has no attribute 'decode'

Using 2.0.2-2, we are seeing random cases of Get Messages From Thread failing with the error "AttributeError: 'NoneType' object has no attribute 'decode'".

  File "/usr/local/lib/python3.12/site-packages/ConfluentKafkaLibrary/consumer.py", line 364, in get_messages_from_thread
    records = self._decode_data(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ConfluentKafkaLibrary/consumer.py", line 324, in _decode_data
    return [record.decode(str(decode_format)) for record in data]
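The traceback shows _decode_data calling record.decode() on every record, and a Kafka message value can legitimately be None (e.g. a tombstone on a compacted topic), which would produce exactly this error intermittently. A None-safe sketch of the decoding step (a hypothetical helper, not the library's actual fix):

```python
def decode_records(data, decode_format="utf-8"):
    """Decode message payloads, passing through records with no value."""
    return [
        record.decode(decode_format) if isinstance(record, bytes) else record
        for record in data
    ]

# A None value (e.g. a tombstone) no longer raises AttributeError:
print(decode_records([b"hello", None, b"world"]))  # -> ['hello', None, 'world']
```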

Installing uuid==1.30

The setup.py specifies uuid==1.30, and pip installs this library: https://pypi.org/project/uuid/#history - but it shadows the standard library's uuid module - https://docs.python.org/3/library/uuid.html (which also exists in Python 2).

And when executing tests, other modules that use uuid (I have a couple that import it, and third-party libraries like boto3 do as well) throw an error:

Importing test library 'C:\xxx\AWSLibrary.py' failed: SyntaxError: invalid syntax (uuid.py, line 138)
Traceback (most recent call last):
  File "C:\xxx\AWSLibrary.py", line 12, in <module>
    import boto3
  File "C:\yyy\Lib\site-packages\boto3\__init__.py", line 16, in <module>
    from boto3.session import Session
  File "C:\yyy\Lib\site-packages\boto3\session.py", line 17, in <module>
    import botocore.session
  File "C:\yyy\Lib\site-packages\botocore\session.py", line 38, in <module>
    from botocore import handlers
  File "C:\yyy\Lib\site-packages\botocore\handlers.py", line 25, in <module>
    import uuid
PYTHONPATH:

Is there a need to pull in this external package (which dates from 2006) when uuid is already in the standard library?
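For comparison, the standard-library uuid module already provides what a Kafka test library typically needs here (the group ids seen elsewhere in these issues are plain uuid4 strings), so the pin looks removable. A quick check:

```python
import uuid  # standard library - no pip package required

# uuid4() yields the familiar 36-character id, e.g.
# 'beb1aba9-056e-4877-aca8-403da54b2e4a' from the List Topics issue above.
group_id = str(uuid.uuid4())
assert len(group_id) == 36 and group_id.count("-") == 4
```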

Thank you.
