
Comments (9)

charithe commented on June 4, 2024

Hi @nicobn

I tried to recreate the problem and didn't succeed. Can you provide a bit more context on the issue, or a test case? For example, how many messages do you read and write? Do you use the class rule? Does it happen even on single tests, or only when you run your full suite?


nicobn commented on June 4, 2024

Below is a test case to reproduce the issue. I tried using a class rule and got the same result.

import com.github.charithe.kafka.KafkaJunitRule;
import com.google.common.collect.ImmutableMap;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.junit.Rule;
import org.junit.Test;

import java.io.IOException;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.TimeoutException;

import static org.junit.Assert.assertEquals;

public class KafkaJUnitTestRule {
    @Rule public KafkaJunitRule kafka = new KafkaJunitRule();

    @Test
    public void test() throws TimeoutException, IOException, ExecutionException, InterruptedException {
        final Map<String, Object> producerProperties = ImmutableMap.<String, Object>builder()
            .put("bootstrap.servers", "127.0.0.1:" + kafka.kafkaBrokerPort())
            .put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
            .put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
            .put("request.required.acks", "1")
            .build();

        final Producer<String, String> producer = new KafkaProducer<>(producerProperties);

        final ProducerRecord<String, String> record = new ProducerRecord<>("roger", "rogercyr", "Roger Cyr is the best printer administrator");

        final Future<RecordMetadata> future = producer.send(record);
        final RecordMetadata recordMetadata = future.get();
        producer.close();

        final List<String> messages = kafka.readStringMessages("roger", 1);
        final String message = messages.get(0);
        assertEquals("Roger Cyr is the best printer administrator", message);
    }
}
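
For reference, the class-rule variant mentioned above only changes how the rule is declared. A minimal sketch (the class name is just illustrative; the public static field is what JUnit requires for @ClassRule, and the test body stays the same as above):

import com.github.charithe.kafka.KafkaJunitRule;
import org.junit.ClassRule;

public class KafkaJUnitClassRuleTest {
    // With @ClassRule the broker is started once per test class instead of once
    // per test method; JUnit requires the field to be public and static.
    @ClassRule public static KafkaJunitRule kafka = new KafkaJunitRule();

    // ... same test method as in the @Rule version above ...
}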


nicobn commented on June 4, 2024

To answer your other questions:

  1. The problem occurs when running the full suite as well as when I run only that test
  2. I'm using the following dependencies:
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.9.0.1</version>
        </dependency>
        <dependency>
            <groupId>com.github.charithe</groupId>
            <artifactId>kafka-junit</artifactId>
            <version>1.9</version>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>


charithe commented on June 4, 2024

So, I am still unable to hit the 30s delay. Your test case consistently finishes in under 5s for me. However, one thing I noticed is that if I bump the version of Kafka to 0.9.0.1 (the project is still on 0.9.0.0), the same test takes 2 more seconds, which looks like some kind of regression. I managed to shave off the extra 2 seconds by adding timeout.ms to the producer config; maybe that will help you as well. Can you please retry with the following producer config and report back the results? Also, which OS are you on?

final Map<String, Object> producerProperties = ImmutableMap.<String, Object>builder()
                .put("bootstrap.servers", "127.0.0.1:" + kafka.kafkaBrokerPort())
                .put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
                .put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
                .put("timeout.ms", "100")
                .build();

I also realised that the built-in configs in the rule are a bit outdated for Kafka 0.9. I'll update them and release a new version soon.


nicobn commented on June 4, 2024

I tried with timeout.ms set to 100 and the execution time was still ~30 s. I also tried switching to 0.9.0 and got similar results.

Here's the relevant code in ZookeeperConsumerConnector:

  def shutdown() {
    val canShutdown = isShuttingDown.compareAndSet(false, true);
    if (canShutdown) {
      logger.info("ZKConsumerConnector shutting down")
      try {
        scheduler.shutdown
        fetcher match {
          case Some(f) => f.shutdown
          case None =>
        }
        sendShudownToAllQueues
        if (zkClient != null) {
          zkClient.close()
          zkClient = null
        }
      }
      catch {
        case e =>
          logger.fatal(e)
          logger.fatal(Utils.stackTrace(e))
      }
      logger.info("ZKConsumerConnector shut down completed")
    }
  }
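
If it helps to pin down where those ~30 seconds go, one rough sketch is to wrap the rule so that setup/teardown time is measured separately from the test body. RuleChain and ExternalResource are plain JUnit 4; the class and field names here are only illustrative.

import com.github.charithe.kafka.KafkaJunitRule;
import org.junit.Rule;
import org.junit.rules.ExternalResource;
import org.junit.rules.RuleChain;

public class TimedKafkaRuleTest {
    private final KafkaJunitRule kafka = new KafkaJunitRule();

    // Outer resource: before() runs before the broker starts and after() runs once
    // the broker (and its consumer connector) has shut down, so the elapsed time
    // includes the shutdown path quoted above.
    private final ExternalResource timer = new ExternalResource() {
        private long start;

        @Override
        protected void before() {
            start = System.nanoTime();
        }

        @Override
        protected void after() {
            System.out.printf("rule setup + test + teardown: %d ms%n",
                    (System.nanoTime() - start) / 1_000_000);
        }
    };

    @Rule
    public RuleChain chain = RuleChain.outerRule(timer).around(kafka);

    // ... test body as in the earlier snippet; printing a timestamp at the end of
    // the test method then shows how much of the total is spent in teardown ...
}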


nicobn commented on June 4, 2024

Also, a few more pieces of information:

  1. I'm using JDK 1.8.0
  2. The result is the same whether I run the test inside IntelliJ or with mvn from the console


nicobn commented on June 4, 2024

I had a hunch we were dealing with something out of the ordinary and I was right.

I'm running Mac OS X. When I execute the test there, it takes >30 seconds; running the same test on the same code inside my Linux virtual machine takes less than 1 second. This issue may not be related to your package at all. What OS are you running the test on?


charithe commented on June 4, 2024

Aha! I suspected it was an OS issue. I was running the tests on a Linux machine, which explains why I couldn't recreate it.

When I get a chance, I'll grab a Mac and see if I can figure out what's happening.


charithe commented on June 4, 2024

Interestingly, I ran the tests on a MacBook Pro running El Capitan and still couldn't recreate the issue. I'm beginning to suspect this is an isolated problem with your particular dev environment. There isn't much I can do for now since I can't recreate it, but if you manage to track down the cause, please post here; I'd be very interested to know.

