springwolf / springwolf-core

Automated documentation for event-driven applications built with Spring Boot

Home Page: https://www.springwolf.dev

License: Apache License 2.0

Java 94.64% CSS 0.22% HTML 0.82% TypeScript 3.72% JavaScript 0.03% Kotlin 0.58%
kafka swagger spring-boot documentation-generator spring-kafka springfox asyncapi spring-rabbitmq ampq spring-cloud-stream

springwolf-core's Introduction


Automated documentation for event-driven applications built with Spring Boot


Modules: springwolf-core, springwolf-asyncapi, springwolf-ui, springwolf-plugins, springwolf-addons

We are on Discord for questions, discussions, feature requests, etc. Join us at https://discord.gg/HZYqd5RPTd


About

This project is inspired by Springfox. It documents asynchronous APIs using the AsyncAPI specification.

springwolf-ui adds a web UI, much like that of Springfox, and allows easy publishing of auto-generated payload examples.

🪇 Demo & 📖 Documentation

Take a look at the Springwolf live demo and a generated AsyncAPI document.

springwolf.dev includes the quickstart guide and full documentation.

✨ Why You Should Use Springwolf

Springwolf takes advantage of the fact that you have already fully described your consumer endpoints (with listener annotations such as @KafkaListener, @RabbitListener, @SqsListener, etc.) and generates the documentation from this information.
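For example, a consumer like the following is already enough for Springwolf to document the channel and its payload schema. The topic name and the Order payload here are hypothetical, shown only for illustration:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderConsumer {

    // Hypothetical payload type, for illustration only.
    record Order(String id, double amount) {}

    // Springwolf derives the channel ("order-created") and the payload
    // schema (Order) from this annotation and the method signature alone.
    @KafkaListener(topics = "order-created", groupId = "order-service")
    public void onOrderCreated(Order order) {
        // business logic
    }
}
```

No extra documentation annotations are required for this baseline case; the listener declaration itself is the source of truth.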

Share API Schema Definition

The AsyncAPI-conformant documentation can be integrated into API hubs (such as Backstage) or shared with others as a JSON/YAML file.

UI Based API Testing

In projects using asynchronous APIs, you may often find yourself needing to manually send a message to some topic, whether you are testing a new feature, debugging, or trying to understand some flow.

Using the automatically generated example payload object as a suggestion, you can publish it to the correct channel with a single click.

🔬 Usage & Example

Protocols not supported natively can still be documented using the @AsyncListener and @AsyncPublisher annotations. More details in the documentation.
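A minimal sketch of such an explicit declaration is shown below. The channel name and payload type are placeholders, and the annotation imports are omitted because their package differs between Springwolf versions; check the documentation for your version's exact attributes:

```java
import org.springframework.stereotype.Service;

@Service
public class ExamplePublisher {

    // Channel name and payload are declared explicitly, so even a broker
    // without a native Springwolf plugin can be documented.
    // ExamplePayload is a placeholder type.
    @AsyncPublisher(operation = @AsyncOperation(
            channelName = "example-topic",
            description = "Emitted when an example event occurs"))
    public void publish(ExamplePayload payload) {
        // actual send via your broker client goes here
    }
}
```

@AsyncListener works the same way for the consuming side of a channel.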

Plugin Example project Current version SNAPSHOT version
AMQP AMQP Example Maven Central Sonatype Nexus (Snapshots)
AWS SNS AWS SNS Example Maven Central Sonatype Nexus (Snapshots)
AWS SQS AWS SQS Example Maven Central Sonatype Nexus (Snapshots)
Cloud Stream Cloud Stream Example Maven Central Sonatype Nexus (Snapshots)
JMS JMS Example Maven Central Sonatype Nexus (Snapshots)
Kafka Kafka Example Maven Central Sonatype Nexus (Snapshots)
Click to expand all artifacts, bindings and add-ons
Artifact Current version SNAPSHOT version
AsyncAPI implementation Maven Central Sonatype Nexus (Snapshots)
Core Maven Central Sonatype Nexus (Snapshots)
UI Maven Central Sonatype Nexus (Snapshots)
Bindings Current version SNAPSHOT version
AMQP Binding Maven Central Sonatype Nexus (Snapshots)
AWS SNS Binding Maven Central Sonatype Nexus (Snapshots)
AWS SQS Binding Maven Central Sonatype Nexus (Snapshots)
Google PubSub Binding Maven Central Sonatype Nexus (Snapshots)
Kafka Binding Maven Central Sonatype Nexus (Snapshots)
JMS Binding Maven Central Sonatype Nexus (Snapshots)
Add-on Current version SNAPSHOT version
Common Model Converter Maven Central Sonatype Nexus (Snapshots)
Generic Binding Maven Central Sonatype Nexus (Snapshots)
Json Schema Maven Central Sonatype Nexus (Snapshots)
Kotlinx Serialization Model Converter Maven Central Sonatype Nexus (Snapshots)

🚀 Who's Using Springwolf

Comment in this PR to add your company and spread the word

โœ๏ธ How To Participate

Check out our CONTRIBUTING.md guide.

Testing SNAPSHOT version

Sonatype snapshots

Add the following to the repositories closure in build.gradle:

repositories {
    // ...
    maven {
        url "https://s01.oss.sonatype.org/content/repositories/snapshots"
    }
}

Or, if you are using Maven, add the repository to your pom.xml:

<repositories>
    <repository>
        <id>oss-sonatype</id>
        <name>oss-sonatype</name>
        <url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>

Local Snapshot Build

To work with local builds, run the `publishToMavenLocal` task (e.g. `./gradlew publishToMavenLocal`). The current version number is set in the .env file.

๐Ÿ‘ Contributors

Thanks goes to these wonderful people (emoji key):

Stav Shamir 💻 · Timon Back 💻 · sam0r040 💻 · Carlos Tasada 💻 · jrlambs 💻 · DmitriButorchin 💻 · Thomas Vahrst 💻 · Yasen Pavlov 💻 · Arthur Geweiler 💻 · CS-BASF 💻 · Jeroen van Wilgenburg 💻 · Michael Strelchenko 💻 · Olivier Gaudefroy 💻 · Omerbea 💻 · Pavel Bodiachevskii 💻 · Sergio Roldán 💻 · Stmated 💻 · Themistoklis Pyrgiotis 💻 · Zach Hubbs 💻 · biergit 💻 · kalarani 💻 · Dipesh Singh 💻 · Sakshi Jain 💻 · Sheheryar Aamir 💻 · jmwestbe 💻 · pdalfarr 💻 · Krzysztof Kwiecień 💻 · Robert Henke 💻 · Raphael De Lio 💻 · Nikita Marunko 💻 · Victor Levitskiy 💻

To add yourself as a contributor, install the all-contributors CLI and run:

  1. all-contributors check
  2. all-contributors add <username> code
  3. all-contributors generate


springwolf-core's Issues

Model for List was not found

Hi. I have a KafkaListener like this:

@KafkaListener(topics = "topic", groupId = "groupId")
public void listen(List<Order> orders) {
	// Magic
}

Now, if I include swagger4kafka (I need to exclude logback and springfox-core, because I'm not using logback and I'm on a more recent springfox version, 2.9.2):

<dependency>
    <groupId>io.github.stavshamir</groupId>
    <artifactId>swagger4kafka</artifactId>
    <version>0.0.1</version>
    <exclusions>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>io.github.stavshamir</groupId>
    <artifactId>swagger4kafka-ui</artifactId>
    <version>0.0.1</version>
</dependency>

When starting up the spring boot application I get the following error:
{"instant":{"epochSecond":1589899103,"nanoOfSecond":976000000},"thread":"main","level":"ERROR","loggerName":"io.github.stavshamir.swagger4kafka.services.ModelsService","message":"Model for List was not found","endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5,"source":{"class":"io.github.stavshamir.swagger4kafka.services.ModelsService","method":"getExample","file":"ModelsService.java","line":110}}

I can contribute on this, but I need a hint in the right direction. Does Swagger not have default models for standard types like lists or sets that could be used here?

AsyncApi doc generation without root tag

Currently, the document generated at http://localhost:8086/springwolf/docs is:

{ "myproject": { "asyncapi": "2.2.0", "info": { .... }, ...... } }

The root node's name, "myproject", comes from this part of the AsyncApiConfig:

    @Bean
    public AsyncApiDocket asyncApiDocket() {
        Info info = Info.builder()
                .version("1.0.0")
                .title("myproject")
                .build();

If I set the title to null, a NullPointerException happens.

When I try to generate a client schema with the AsyncAPI generator
# ag http://localhost:8086/springwolf/docs @asyncapi/java-spring-template
I get the error:

Something went wrong:
Error: The `asyncapi` field is missing.
    at parse (/usr/local/lib/node_modules/@asyncapi/generator/node_modules/@asyncapi/parser/lib/parser.js:72:13)
    at Generator.generateFromString (/usr/local/lib/node_modules/@asyncapi/generator/lib/generator.js:265:28)
    at Generator.generateFromURL (/usr/local/lib/node_modules/@asyncapi/generator/lib/generator.js:298:17)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at /usr/local/lib/node_modules/@asyncapi/generator/cli.js:156:9

In the code of parser.js, the exception happens if there is no "asyncapi" field at the first level of the JSON. How can I generate the asyncapi document without the root project title tag?

Documentation is not correctly generated when having a producer and a consumer for the same topic

Describe the bug

  1. We have a Spring Boot microservice, let's say it is called "test-service"
  2. "test-service" has the following consumers:
    2.1 ConsumerA -> consumes "topic-a"
    2.2 ConsumerB -> consumes "topic-b"
  3. Now let's say we configure an asyncApiDocket bean. Everything works fine and I can see both consumers in the springwolf ui.
  4. Now we add ProducerA -> produces events to "topic-a".
  5. Now the springwolf ui displays only the consumer for topic-b and the producer for topic-a. It seems the consumer documentation for topic-a was overwritten.

Dependencies and versions used
springwolf-kafka version 0.6.1. I tried with 0.7.0 as well.

Code example

@SpringBootApplication(scanBasePackages = {})
public class ExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(ExampleApplication.class, args);
    }
}

@Component
@RequiredArgsConstructor
@Slf4j
@KafkaListener(
    batch = "true",
    topics = "topic-a"
)
public class ConsumerA {
    @KafkaHandler
    public void consume(Object event) {
    }
}
@Component
@RequiredArgsConstructor
@Slf4j
@KafkaListener(
    batch = "true",
    topics = "topic-b"
)
public class ConsumerB {
    @KafkaHandler
    public void consume(Object event) {
    }
}
@EnableAsyncApi
@Configuration
public class AsyncApiConfigurationTest {

    private static final String LISTENERS_SCAN_PACKAGE =
        "example.consumer";

    @Bean
    public AsyncApiDocket asyncApiDocket(
        @Value("${service.name}") String appName,
        @Value("${service.version}") String appVersion,
        @Value("${spring.kafka.bootstrap-servers}") String bootstrapServers) {

        Info info = Info.builder()
            .version(appVersion)
            .title(appName)
            .description("description")
            .build();

        String producerTopicName = "topic-a";
        ProducerData preparedDataProducer = ProducerData.builder()
            .channelName(producerTopicName)
            .operationBinding(ImmutableMap.of("kafka", new KafkaOperationBinding()))
            .payloadType(FetchUserBalances.class)
            .build();

        return AsyncApiDocket.builder()
            .basePackage(LISTENERS_SCAN_PACKAGE)
            .server("kafka", Server.builder().protocol("kafka").url(bootstrapServers).build())
            .producer(preparedDataProducer)
            .info(info)
            .build();
    }
}

(screenshot attached)

Stack trace and error logs
No exceptions in log

Support Multiple Producer On the same Kafka Topic

The Producers should also include the "oneOf" in the "message" property of the generated Async api file.

Currently, adding two producers on the same topic in the AsyncApiDocket configuration results in one overriding the other, and only the last one is included.

Here's the configuration that results in such an issue. You can see that the searchProducer and the retrieveProducer variables have the same topic.

@Bean
public AsyncApiDocket asyncApiDocket(KafkaProperties kafkaProperties) {
        Info info = Info.builder()
                .version(asyncApiVersion)
                .title("Party Adapter Async API")
                .build();

        Server kafkaServer = Server.builder()
                .protocol("kafka")
                .url(String.join(",", kafkaProperties.getBootstrapServers()))
                .build();

        ProducerData searchProducer = ProducerData.builder()
                .channelName(replyTopic)
                .payloadType(PartySearchResultPayload.class)
                .binding(ImmutableMap.of("kafka", new KafkaOperationBinding()))
                .build();

        ProducerData retrieveProducer = ProducerData.builder()
                .channelName(replyTopic)
                .payloadType(PartyRetrieveResultPayload.class)
                .binding(ImmutableMap.of("kafka", new KafkaOperationBinding()))
                .build();

        return AsyncApiDocket.builder()
                .basePackage(BASE_PACKAGE_SCAN)
                .info(info)
                .producers(List.of(searchProducer, retrieveProducer))
                .server("kafka", kafkaServer)
                .build();
}

In the previous release you supported @KafkaHandler, which results in having the "oneOf" property in the generated json. The producer also needs that.

Thank you

Example payload not publishing via asyncui.html

When trying to publish from the example tab in asyncui.html, I get an error stating that the publish failed. Please find the attached screenshot.
(screenshot attached)

After adding AsyncAPI configuration to the RMQ application I'm getting the below error

(screenshot attached)

Replace `asyncapi/types` with jasyncapi types

Motivation

Currently, the project uses an in-house implementation of the various asyncapi types (such as Channel, Operation, etc.).
This in-house implementation is problematic because:

  1. Many bindings are missing for most of the protocols; actually, only the kafka bindings were implemented. For more plugins to be created, we need the bindings for them.
  2. It repeats the great job already done by @Pakisan in jasyncapi.

Therefore, it would be a great step to replace all those in-house implementations (from the asyncapi/types package) with the types provided by jasyncapi-core.

Tasks

  • Add the jasyncapi-core dependency
  • Replace all in-house implementations with the types provided by jasyncapi. Many have the same names, so the switch should be rather smooth, but I did rely on some custom builders, constructors and json serializations that may require additional attention.
  • Fix the tests and verify all pass
  • Remove the local asyncapi/types package (except for ProducerData, which may not belong there, but its future location may be decided later on)
  • Release a local snapshot version, and update the kafka plugin. After it is updated, I will release an actual version.

How to document fields in message payload

Is it possible to provide additional descriptions for the fields in a message payload? For example, could I use some annotation to describe a field, provide an example value, or define whether it is required?

For example, with Swagger annotations, I could do something like

class Payload {
    @ApiModelProperty(value = "my description", example = "my example")
    private final String property;
}
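For what it's worth, later Springwolf versions answer this by reading Swagger (swagger-core v3) annotations on payload fields; a minimal sketch, assuming swagger-annotations is on the classpath:

```java
import io.swagger.v3.oas.annotations.media.Schema;

class Payload {
    // description and example end up in the generated AsyncAPI schema
    @Schema(description = "my description", example = "my example")
    private String property;
}
```

Whether required/nullable markers are honored depends on the Springwolf version in use.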

Plans for future

Hi everyone who develops this tool!
I've tried to use this tool on my production project to make a useful API for Rabbit AMQP that we expose internally.
But I found that the library in fact covers at most 10% of the AsyncAPI specification.
I also found that, for some reason, it was designed in a way that restricts any extension of the existing "Services/Providers/Models"; all the models used are not from an async-api dependency but own-created, and they don't hold enough properties to match the AsyncAPI spec.
It is also not possible to extend the generated spec with custom descriptions of operations/parameters (e.g. via annotations).

So, looking at these facts, I have a single question: is it planned to support this project and make it usable not only for "Hello World" applications?

Kind regards,
Pavlo

Multiple RoutingKeys per Queue not supported? java.lang.IllegalStateException: Duplicate key

Describe the bug
The RabbitChannelsScanner seems not to support Queues with multiple RoutingKeys bound, which is funny :D

Dependencies and versions used
springwolf-amqp version 0.3.1 (throws exception)
springwolf-amqp version 0.2.0 (ignores the 2nd binding)

Code example
One queue, two bindings.

    @Bean
    @Qualifier("featured-list")
    public Queue featuredListQueue(@Value("${featured.items.list.queue}") String queue){
        return QueueBuilder.durable(queue).quorum().build();
    }

    @Bean
    public Binding bindfeaturedImplicit(@Qualifier("channel") Exchange channelExchange, @Qualifier("featured-list") Queue featuredListQueue){
        return BindingBuilder.bind(featuredListQueue).to(channelExchange).with("channel.listMy.rk").noargs();
    }

    @Bean
    public Binding bindfeaturedExplicit(@Qualifier("featured") Exchange featuredExchange,
                                        @Qualifier("featured-list") Queue featuredListQueue,
                                        @Value("${featured.items.list.rk}") String rk){
        return BindingBuilder.bind(featuredListQueue).to(featuredExchange).with(rk).noargs();
    }

Stack trace and error logs

java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by: org.springframework.beans.factory.BeanCreationException: 
	Error creating bean with name 'rabbitChannelsScanner' defined in URL [jar:file:/C:/Users/.../.m2/repository/io/github/springwolf/springwolf-amqp/0.3.1/springwolf-amqp-0.3.1.jar!/io/github/stavshamir/springwolf/asyncapi/scanners/channels/RabbitChannelsScanner.class]: ...
Caused by: org.springframework.beans.BeanInstantiationException: ...
Caused by: java.lang.IllegalStateException: 
	Duplicate key featured.items.list.queue (attempted merging values 
	Binding [destination=featured.items.list.queue, exchange=channel.exchange, routingKey=channel.listMy.rk, arguments={}] and
	Binding [destination=featured.items.list.queue, exchange=featured.exchange, routingKey=featured.items.list.rk, arguments={}])

Support for consumers defined in application.properties using spring.cloud.function methods

Describe the feature request

The "newer" implementation of rabbit/spring cloud stream listeners makes use of spring cloud functions to consume messages with the binding between queues and the method matching on the method name with properties defined in application properties. This library should be able to automatically find these consumers and produce documentation for them.

Motivation

This is required for continued support of automatically documenting consumers using the more-modern definition pattern.

Technical details

"Consumer" methods are defined as beans as follows:
@Bean
public Consumer<String> messageReceiptConsumer() {
    return message -> { /* handle the message */ };
}

queue/binding information is defined in application.properties in the following way:
spring.cloud.stream.bindings.exampleNewConsumerMethod-in-0.destination=exchangeName
spring.cloud.stream.bindings.exampleNewConsumerMethod-in-0.group=example-new-bindings-queue
spring.cloud.stream.rabbit.bindings.exampleNewConsumerMethod-in-0.consumer.bindingRoutingKey=binding.new.routing.key

Note: not all of these properties are required, and spring has a way of automatically creating queue/exchange names when not present. I don't know for sure, but I believe at least one of these properties is required for the binding to occur.

More information on the spring properties supported can be found here:
https://docs.spring.io/spring-cloud-stream/docs/current/reference/html/spring-cloud-stream-binder-rabbit.html#spring-cloud-stream-binder-rabbit-reference
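The convention described above can be sketched as follows; the names mirror the properties in this issue and are otherwise illustrative. The binding name `exampleNewConsumerMethod-in-0` is derived from the bean name plus the input index:

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StreamConfig {

    // Spring Cloud Stream binds this bean to the destination configured
    // under spring.cloud.stream.bindings.exampleNewConsumerMethod-in-0.*
    @Bean
    public Consumer<String> exampleNewConsumerMethod() {
        return message -> System.out.println("received: " + message);
    }
}
```

A scanner would therefore have to correlate `Consumer`/`Function` beans with these property-defined bindings rather than with annotations.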

Describe alternatives you've considered
Supporting manual definition of consumers, in the same way producers are defined.

not validating the queue

When I add the producer configuration and try to publish a message from the UI, it isn't validating the queue. Please help me understand how the queue can be validated.

(screenshot attached)

NullPointerException Can't find properties in the Model

Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.stavshamir.swagger4kafka.services.KafkaEndpointsService]: Constructor threw exception; nested exception is java.lang.NullPointerException
	at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:217) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:117) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:310) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
	... 19 common frames omitted
Caused by: java.lang.NullPointerException: null
	at com.stavshamir.swagger4kafka.services.ModelsService.getEnumProperties(ModelsService.java:89) ~[swagger4kafka-1.0.0.jar:na]
	at com.stavshamir.swagger4kafka.services.ModelsService.setDeserializableEnumValues(ModelsService.java:84) ~[swagger4kafka-1.0.0.jar:na]
	at com.stavshamir.swagger4kafka.services.ModelsService.setDeserializableEnumValues(ModelsService.java:63) ~[swagger4kafka-1.0.0.jar:na]
	at com.stavshamir.swagger4kafka.services.ModelsService.register(ModelsService.java:51) ~[swagger4kafka-1.0.0.jar:na]
	at com.stavshamir.swagger4kafka.services.KafkaListenersScanner.topicToEndpoint(KafkaListenersScanner.java:65) ~[swagger4kafka-1.0.0.jar:na]
	at com.stavshamir.swagger4kafka.services.KafkaListenersScanner.lambda$createKafkaEndpoints$2(KafkaListenersScanner.java:48) ~[swagger4kafka-1.0.0.jar:na]
	at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[na:na]

Version used:

<dependency>
    <groupId>com.stavshamir</groupId>
    <artifactId>swagger4kafka</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>com.stavshamir</groupId>
    <artifactId>swagger4kafka-ui</artifactId>
    <version>1.0.0</version>
</dependency>

Springboot version:

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

Unsatisfied Dependency Issue

I'm getting this error while running the app. I added the below dependencies and then also added the Async configuration:

<dependency>
    <groupId>io.github.springwolf</groupId>
    <artifactId>springwolf-rabbitmq-plugin</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
<dependency>
    <groupId>io.github.springwolf</groupId>
    <artifactId>springwolf-ui</artifactId>
    <version>0.1.1</version>
    <scope>runtime</scope>
</dependency>

Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'defaultAsyncApiService' defined in URL [jar:file:/home/vcap/app/BOOT-INF/lib/springwolf-core-0.2.0.jar!/io/github/stavshamir/springwolf/asyncapi/DefaultAsyncApiService.class]: Unsatisfied dependency expressed through constructor parameter 1; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'defaultChannelsService': Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Only single parameter KafkaListener methods are supported

Thanks in Advance

Failed to inject WebTestClient when using springwolf-core

Hi!
Amazing project.

I built a reactive API with Spring WebFlux and Kafka. I have an issue when I use springwolf-core to document kafka listeners together with WebTestClient in my tests; the log is below:

Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.test.web.reactive.server.WebTestClient' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1777)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1333)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1287)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:640)
... 87 more

The error looks like a conflict with swagger-inflector
This is my test class

@SpringBootTest(classes = [TestConfiguration::class])
@ExtendWith(SpringExtension::class)
@AutoConfigureWebTestClient
class CrudControllerTest {
    @Autowired
    private lateinit var client: WebTestClient
    ...    
}

RabbitMQ Support

I have an already existing spring boot application that listens to a RabbitMQ queue. It consumes a specific message (wrapped in a CloudEvent) under a specific topic. Can this generator assist me in exposing it as AsyncAPI? Does it respect the CloudEvent object?

@RabbitListener scanning still supported?

I tried several times with a rabbit listener, but the channels were not recognized by springwolf.
The http://localhost:8081/springwolf/asyncapi-ui.html opens up fine, showing Server and Info, but not Channels.
In the RabbitMQ Management console, the channels are fine, with each queue name.

I tried with springwolf 0.4.0 and 0.3.1, but I found that the documentation states that the latest support for RabbitMQ is 0.2.0.
So I tried with 0.2.0, where some implementations are different (i.e. basePackage() doesn't exist in AsyncApiDocket); it was also not applicable.

Additionally, I cannot find any debug log after this:
......i.g.s.s.asyncapi.DefaultAsyncApiService : Building AsyncAPI document

Has any testing been done lately?

Spring Kafka - get consumers using @KafkaListener on a class and @KafkaHandler on

Our services typically use @KafkaListener on the class declaration and @KafkaHandler on the method declarations within those classes. springwolf-kafka does not seem to pick these up. I modified the code in one of my handler classes to use @KafkaListener only on the method declarations, and springwolf picks them up correctly.

I don't know whether this is considered an enhancement or a bug, but it'd be super nifty if this worked either way.

Create AsyncApiService

Create an AsyncApiService, which will be responsible for instantiating the AsyncAPI object.

  • User input should be received through an AsyncApiDocket bean.
  • Kafka listener data should be converted to asyncapi objects and retrieved from KafkaListenersScanner, which should be renamed to KafkaChannelsService.

publish/subscribe semantics are mixed up

Description

Currently, listeners are mapped as subscribe channels, and producers are mapped as publish channels. Apparently, it should be the other way around.
See https://www.asyncapi.com/blog/publish-subscribe-semantics.

Thanks to https://github.com/kalarani for notifying me.

Fix

  1. The subscribe operation in AbstractChannelScanner::buildChannel should be replaced with publish.
  2. Similarly, publish should be switched to subscribe in the ProducerChannelScanner.
  3. Recommended: add a unit test for AbstractChannelScanner and ProducerChannelScanner. For AbstractChannelScanner, this would require creating an inner class in the unit test that implements the abstract methods as simply as possible.
  4. After the fix in the core project, I will release a new version, and then the tests in kafka plugin project should be fixed. Also, the UI project should be fixed accordingly.
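The intended swap can be sketched roughly as follows; this is a pseudo-shape of the scanner code, not the actual source:

```java
// Before the fix, a listener endpoint ended up under "subscribe".
// Per AsyncAPI 2.x semantics, the operation through which the application
// *receives* messages is documented as "publish" (others publish to it):
ChannelItem buildChannel(Operation operation) {
    return ChannelItem.builder()
            .publish(operation) // was: .subscribe(operation)
            .build();
}
```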

The multiple Producer configuration is not working

Hello, it seems that the multiple producer support does not work; it was fixed in issue #59.

I'm attaching a project that does not work with multiple producers.

async-api-project.zip

Calling http://localhost:8080/springwolf/docs returns the following json, but only one producer is in it!

{
  "Async-api-test": {
    "asyncapi": "2.0.0",
    "info": {
      "title": "Async-api-test",
      "version": "0.0.1"
    },
    "servers": {
      "kafka": {
        "url": "localhost:9092",
        "protocol": "kafka"
      }
    },
    "channels": {
      "ACCOUNT": {
        "subscribe": {
          "bindings": {
            "kafka": {
              
            }
          },
          "message": {
            "name": "com.example.asyncapi.model.ClassB",
            "title": "ClassB",
            "payload": {
              "$ref": "#/components/schemas/ClassB"
            }
          }
        }
      }
    },
    "components": {
      "schemas": {
        "ClassA": {
          "type": "object"
        },
        "ClassB": {
          "type": "object"
        }
      }
    }
  }
}

Thanks

Cannot handle multiple group-ids with same topic

Describe the bug
When there are multiple consumer group-ids consuming the same topic with different payloads, ClassLevelKafkaListenerScanner throws an exception while scanning, due to a duplicate topic key.

Dependencies and versions used
springwolf-kafka version 0.7.0.

Stack trace and error logs
i.g.s.s.asyncapi.DefaultChannelsService : An error was encountered during channel scanning with io.github.stavshamir.springwolf.asyncapi.scanners.channels.ClassLevelKafkaListenerScanner@1dd76982: Duplicate key CUSTOMER_PRODUCT_ADPT (attempted merging values ChannelItem(description=null, subscribe=null, publish=Operation(operationId=null, summary=null, description=null, tags=null, externalDocs=null, bindings={kafka=KafkaOperationBinding(groupId=GroupA, clientId=null, bindingVersion=null)}, traits=null, message={oneOf=[Message(name=java.lang.Object, title=Object, payload=io.github.stavshamir.springwolf.asyncapi.types.channel.operation.message.PayloadReference@1), Message(name=com.example.PayloadA, title=PayloadA, payload=io.github.stavshamir.springwolf.asyncapi.types.channel.operation.message.PayloadReference@1)]}), parameters=null, bindings={kafka=KafkaChannelBinding()}) and ChannelItem(description=null, subscribe=null, publish=Operation(operationId=null, summary=null, description=null, tags=null, externalDocs=null, bindings={kafka=KafkaOperationBinding(groupId=GroupB, clientId=null, bindingVersion=null)}, traits=null, message={oneOf=[Message(name=java.lang.Object, title=Object, payload=io.github.stavshamir.springwolf.asyncapi.types.channel.operation.message.PayloadReference@1), Message(name=com.example.PayloadB, title=PayloadB, payload=io.github.stavshamir.springwolf.asyncapi.types.channel.operation.message.PayloadReference@1)]}), parameters=null, bindings={kafka=KafkaChannelBinding()}))

IndexOutOfBoundsException when using bindings instead of queues

There is an IndexOutOfBoundsException when using RabbitListener bindings instead of queues.

Here is the stack trace:

Caused by: java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0
	at java.base/jdk.internal.util.Preconditions.outOfBounds(Preconditions.java:64)
	at java.base/jdk.internal.util.Preconditions.outOfBoundsCheckIndex(Preconditions.java:70)
	at java.base/jdk.internal.util.Preconditions.checkIndex(Preconditions.java:248)
	at java.base/java.util.Objects.checkIndex(Objects.java:372)
	at java.base/java.util.ArrayList.get(ArrayList.java:459)
	at io.github.stavshamir.springwolf.asyncapi.scanners.channels.RabbitChannelsScanner.getChannelName(RabbitChannelsScanner.java:45)
	at io.github.stavshamir.springwolf.asyncapi.scanners.channels.RabbitChannelsScanner.getChannelName(RabbitChannelsScanner.java:21)
	at io.github.stavshamir.springwolf.asyncapi.scanners.channels.AbstractChannelScanner.mapMethodToChannel(AbstractChannelScanner.java:81)
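A sketch of a listener declaration that triggers this (queue, exchange, and routing-key names are hypothetical): with the `bindings` attribute, the `queues` attribute is empty, so reading the queue name at index 0 throws.

```java
// Listener declared via bindings instead of queues: RabbitListener.queues()
// is empty, so the scanner's lookup of queues[0] fails.
@RabbitListener(bindings = @QueueBinding(
        value = @Queue("example-queue"),
        exchange = @Exchange("example-exchange"),
        key = "example-routing-key"))
public void listen(String message) {
    // handle message
}
```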

Support Websocket feature

Hi!
Thank you for the project!
I want to know: can I use it to document Spring WebSocket functionality?

For example, I use the following configuration:

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {
	@Override
	public void configureMessageBroker(MessageBrokerRegistry config) {
		config.enableSimpleBroker("/topic", "/queue");
	}

	@Override
	public void registerStompEndpoints(StompEndpointRegistry registry) {
		registry
				.addEndpoint("/custom-endpoint")
				.setAllowedOrigins("*")
				.withSockJS();
	}
}

And the following controller method:

@MessageMapping("/custom-path")
@SendToUser("/queue/custom-path")
public CustomModelResponse getStops(@Payload(required = false) CustomFilter filter) {

	// business logic invocation
        return ...
}

Can I use your project to generate documentation, show it in a Swagger-like format (like Swagger for REST), and give it to a client application? I guess the request and response models and the URL path would be shown there.
Thank you!

gradle plugin for build-time generation

I like the idea of generating documentation at build time.
Adding the idea of an option to generate the AsyncAPI json/yaml at build time using Gradle.
I see generating the HTML UI only as an optional step, as there are usually existing solutions that can visualise the provided spec in a fancy way.

Producers(...) not working

When using springwolf-kafka 0.3.0 and springwolf-ui 0.3.1 and trying to declare multiple producers, the portal does not show any channels.

return AsyncApiDocket.builder()
                .basePackage("foo.bar")
                .info(info)
                .server("kafka", Server.builder().protocol("kafka").url(BOOTSTRAP_SERVERS).build())
                .producers(List.of(
                        topicProducerData1,
                        topicProducerData2
                ))
                .build();

Add support for custom Model Converters

Having the ability to register custom model converters would be quite useful.

I ran into this issue because we are using MonetaryAmount in our POJOs and it breaks the schema generation. There was a similar issue in springdoc that was solved this way: springdoc/springdoc-openapi#606

As per our great discussion on discord, the best way to approach this is to add the ability to register new model converters in the core package, and then to create a module that contains common model converters, like the one for MonetaryAmount, to avoid bloating the core with unnecessary dependencies.

Thank you very much for the great work @stavshamir.
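A hedged sketch of what such a converter could look like, using swagger-core's ModelConverter chain (the class name and example value are assumptions, not part of this proposal):

```java
import java.util.Iterator;
import javax.money.MonetaryAmount;
import com.fasterxml.jackson.databind.JavaType;
import io.swagger.v3.core.converter.AnnotatedType;
import io.swagger.v3.core.converter.ModelConverter;
import io.swagger.v3.core.converter.ModelConverterContext;
import io.swagger.v3.core.util.Json;
import io.swagger.v3.oas.models.media.Schema;
import io.swagger.v3.oas.models.media.StringSchema;

// Hypothetical converter: documents MonetaryAmount as a plain string
// schema instead of letting introspection recurse into the type.
public class MonetaryAmountConverter implements ModelConverter {
    @Override
    public Schema resolve(AnnotatedType type, ModelConverterContext context,
                          Iterator<ModelConverter> chain) {
        JavaType javaType = Json.mapper().constructType(type.getType());
        if (javaType != null
                && MonetaryAmount.class.isAssignableFrom(javaType.getRawClass())) {
            return new StringSchema().example("EUR 99.95");
        }
        // Delegate everything else to the rest of the converter chain.
        return chain.hasNext() ? chain.next().resolve(type, context, chain) : null;
    }
}
```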

maven plugin for build-time generation

Similar to #55: Maven is another popular build tool, so adding it as well.

I like the idea of generating documentation at build time.
Adding the idea of an option to generate the AsyncAPI json/yaml at build time using Maven.
I see generating the HTML UI only as an optional step, as there are usually existing solutions that can visualise the provided spec in a fancy way.

Add support for producers

Goal

Allow users to specify producers, so that the producers will be added to the asyncapi document.

API

The producers will be set in the AsyncApiDocket, in a new field: producers: List<ProducerData>.
The ProducerData should have a builder and contain the following fields:

  • channelName: String, not null
  • payloadType: Class<?>, not null
  • binding: Map<String, ? extends OperationBinding>, not null, and must contain exactly one item. The key must be the name of the protocol.

To make this easier for users, protocol-specific producer data classes can be created where, instead of providing the binding as a map, the user only needs to provide the OperationBinding instance, and the key is hardcoded per protocol.
For example, KafkaProducerData inherits from ProducerData and has a setBinding method that accepts an OperationBinding and puts "kafka": binding into the backing map.

ChannelScanner

A new implementation of ChannelScanner, ProducersChannelScanner, should be added, taking the producer data from the docket and transforming it into Channels.

Usage Example

@Bean
public AsyncApiDocket asyncApiDocket() {
    ProducerData kafkaProducerData = ProducerData.builder()
            .channelName("produce-topic")
            .payloadType(Foo.class)
            .binding(ImmutableMap.of("kafka", new KafkaOperationBinding()))
            .build();

    return AsyncApiDocket.builder()
            .info(...)
            .server(...)
            .producers(Arrays.asList(kafkaProducerData))
            .build();
}

Support Kafka Header

There should be support for adding Kafka headers. For example, the Spring TypeId header should be provided so that Spring can serialize and deserialize payloads. Without this header, the publish button on the example tab cannot work, as it needs a TypeId header to do its job.
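A hedged sketch of what publishing would need to set (the topic name, payload class, and method are assumptions): Spring Kafka's JsonDeserializer reads the `__TypeId__` record header to decide which class to deserialize the payload into.

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.springframework.kafka.core.KafkaTemplate;

class TypeIdPublishSketch {
    // Hypothetical publish method: attaches the __TypeId__ header that
    // Spring Kafka's JsonDeserializer uses to pick the target class.
    void publishWithTypeId(KafkaTemplate<String, String> kafkaTemplate, String payloadJson) {
        ProducerRecord<String, String> record =
                new ProducerRecord<>("example-topic", payloadJson);
        record.headers().add("__TypeId__",
                "com.example.ExamplePayload".getBytes(StandardCharsets.UTF_8));
        kafkaTemplate.send(record);
    }
}
```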

Thanks and keep up the good work.

Add support for STOMP in SpringWolf

Add support for YAML format

Feature request to add a YAML format to the /springwolf/docs endpoint. This can be useful for tools that render AsyncAPI documentation from YAML files.

Configurable docs URL path

Would it be possible to make the docs path configurable via properties?

Example application.properties

springwolf.path=/my/docs/path

ConflictingBeanDefinitionException: Annotation-specified bean name 'kafkaProducer' already existed

Hi, while running the project, an exception occurs:
org.springframework.beans.factory.BeanDefinitionStoreException: Failed to parse configuration class [com.caocao.dic.charge.DicChargeApplication]; nested exception is org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'kafkaProducer' for bean class [io.github.stavshamir.springwolf.producer.KafkaProducer] conflicts with existing, non-compatible bean definition of same name and class [com.caocao.dic.charge.server.mq.producer.KafkaProducer]
If the project already contains a class named KafkaProducer, it conflicts with the class io.github.stavshamir.springwolf.producer.KafkaProducer. How should I solve this problem?

AMQP - Support multiple instances of RabbitTemplate

There's a handful of problems with supporting the case where there are multiple RabbitTemplates.
Here's a use case: instead of using one RabbitTemplate with the default exchange and specifying a concrete exchange and routing key when sending messages, there could be multiple RabbitTemplates with preset exchanges, so that when sending a message you only specify the routing key.
The issues:

  1. The app fails to start when there are multiple RabbitTemplate instances, as the SpringwolfAmqpProducer requires a single instance (I can mark one of the RabbitTemplates with the @Primary annotation, but that brings up the following issue).
  2. The send method in SpringwolfAmqpProducer uses the channelName to send the message, which is fine when using the default exchange - we can just use the queue name - but when we're not, we have to supply a routing key instead.

I think the SpringwolfAmqpProducer bean needs to hold a List of RabbitTemplates and choose the appropriate one to send the message.
In order to send the message correctly, can the channel binding and operation binding info be used? https://github.com/asyncapi/bindings/tree/master/amqp#channel-binding-object (Specifically exchange.name/queue.name from the channel binding and cc from the operation binding)

Here's a starter demo if you want to test it out: https://github.com/DmitriButorchin/amqp-demo/tree/multiple-rabbit-templates
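A minimal sketch of the use case (bean and exchange names are hypothetical): two templates, each bound to its own exchange, which is the setup SpringwolfAmqpProducer currently cannot handle.

```java
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class RabbitTemplatesConfig {

    // Each template has its exchange preset, so senders only pass a routing key.
    @Bean
    RabbitTemplate ordersTemplate(ConnectionFactory connectionFactory) {
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.setExchange("orders-exchange");
        return template;
    }

    @Bean
    RabbitTemplate paymentsTemplate(ConnectionFactory connectionFactory) {
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.setExchange("payments-exchange");
        return template;
    }
}
```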

@ConditionalOnProperty annotation discards @KafkaListener candidate method

Describe the bug
When @ConditionalOnProperty is present on a listener class, the class is discarded from the documentation, even when the condition is true.
I've only tested with @ConditionalOnProperty; I'm not sure if this happens with other Conditional annotations.
Only tested with @KafkaListener at method level.

Dependencies and versions used
springwolf-kafka version 0.7.0.

Code example

@Component
@ConditionalOnProperty("kafka.bootstrap-servers")
public class DenormalizationEventListener {
    @Autowired
    private DenormalizationService denormalizationService;

    @KafkaListener(topics = "test-topic",
                   containerFactory = "denormalizationEventListenerContainerFactory")
    public void transformUsuarioEstructuraToMongo(@Payload Evento event) {
        denormalizationService.denormalize(event);
    }
}

More details
As soon as @ConditionalOnProperty is removed, the channel is included in the docs.
The property is used in many other places, so I'm sure it's present.
