
lagom-samples's Introduction

Gitter: join the contributors chat at https://gitter.im/lagom/contributors · CI: Build and tests · Open Source Helpers

Deprecated - Lagom - July 1 2023

This project will only receive security patches until July 1, 2024; after that point the project will no longer receive any patches.

If you are an existing customer of Lightbend and we have not yet contacted you, please reach out to Support.

We recommend migrating any existing work to:

  • Akka for deeply customized projects with complex infrastructure needs. Akka now contains the vast majority of Lagom features.
  • Kalix for a managed scalable environment with an abstraction above the Akka framework layer to allow you to focus only on business logic.

Lagom - The Reactive Microservices Framework

Lagom is a Swedish word meaning just right, sufficient. Microservices are about creating services that are just the right size, that is, they have just the right level of functionality and isolation to be able to adequately implement a scalable and resilient system.

Lagom focuses on ensuring that your application realizes the full potential of the Reactive Manifesto while delivering a high productivity development environment, and seamless production deployment experience.

Learn More

License

Copyright (C) Lightbend Inc. (https://www.lightbend.com).

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this project except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0.

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

lagom-samples's People

Contributors

abknanda, dwijnand, ennru, erip, ignasi35, ihostage, johanandren, jroper, marcospereira, mergify[bot], mliarakos, norfe, octonato, raboof, silver-soule, ygree


lagom-samples's Issues

Promote withGrpcClient to Akka-gRPC

#6 introduces withGrpcClient, which could be extracted into an akka-grpc-lagom-testkit (tools for writing tests for Lagom services that use gRPC).

Update contributions guidelines

We should be more explicit about which kinds of contributions are welcome, and point out that community samples should be added as links on the main README file.

EOL the merged-in sample repos

  • Update README and/or open an issue and/or update the repo description about the move
  • Archive the repo

Add an automated deployment test

Deployment to Central Park should be part of the CI build. Since the primary purpose of this repository is to demonstrate deployment to OpenShift, we need to ensure there are no regressions and that the process documented in the guide keeps working as the guide is updated.

Deployment jobs failures (RBAC issues?)

For the past few days (weeks?), deployment jobs run on a cron schedule have been failing with the message:

You have access to the following projects and can switch between them with 'oc project <projectname>':

  * console-danny

    lagom-scala-openshift-smoketests

    lagom-shopping-cart-java-maven-travis-1-5-x

    reactive-bbq-danny

Using project "console-danny".

No resources found.

Error from server (Forbidden): projects.project.openshift.io "lagom-shopping-cart-java-sbt-travis-1-5-x" is forbidden: User "play-team" cannot get projects.project.openshift.io in the namespace "lagom-shopping-cart-java-sbt-travis-1-5-x": no RBAC policy matched

What I think is happening is either: (1) the namespace hasn't been fully created on the k8s cluster, causing the User "play-team" cannot get projects.project.openshift.io error, or (2) there are new RBAC requirements to complete the operation.

The error is quite consistent across all 3 jobs and both branches (1.5.x and 1.6.x), but this operation is independent of the programming language and hasn't changed in a while, so that consistency was expected.

related to #116

Run build using Java 11

We currently don't run the Lagom samples on AdoptOpenJDK 11. We need to add it to the build so that we can use the samples to validate Java 11 support.
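A minimal sketch of what this could look like in a Travis CI build matrix (key names per Travis CI's docs; an AdoptOpenJDK-specific distribution may need a custom install step, so treat this as an assumption):

```yaml
# .travis.yml sketch: add a Java 11 entry to the build matrix
jdk:
  - openjdk8
  - openjdk11   # validate the samples against Java 11
```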

gRPC examples: Why are .proto in `impl` and not `api`?

Do the samples currently demonstrate best practices around where .proto files should live?

The examples currently don't share a schema definition between the client and the implementing service. Why is that? Is this not something that we would expect to see in the api package, and then used by both the service making the client call and the service providing the handler?
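For discussion, a sketch of the alternative layout being suggested, with the .proto living in an api project that both sides depend on (project names here are hypothetical, and this assumes the Akka gRPC sbt plugin):

```scala
// build.sbt sketch: share the gRPC schema through the api project
lazy val helloApi = (project in file("hello-api"))
  .enablePlugins(AkkaGrpcPlugin) // generates stubs from src/main/protobuf

lazy val helloImpl = (project in file("hello-impl"))
  .dependsOn(helloApi) // the service implements the generated handler

lazy val helloProxyImpl = (project in file("hello-proxy-impl"))
  .dependsOn(helloApi) // the client reuses the same generated stubs
```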

Grpcc failed with " UNAVAILABLE: TCP Read failed"

I need help with a weird exception that occurred while testing gRPC methods.

I was testing "lagom-java-grpc-example": I ran ssl-lagom, then the runAll command.
HTTP GET requests via curl succeeded.
Then I tried to invoke a gRPC method.
The grpcc console is the one from the njpatel/grpcc GitHub repo.
I ran grpcc as described in the README and got an error: I can connect to the service, but invoking client.sayHello("Alice", printReply) always throws the same exception.

Connecting:
grpcc --proto hello-impl/src/main/protobuf/helloworld.proto --insecure --address 127.0.0.1:11000
Connected:

Connecting to helloworld.GreeterService on 127.0.0.1:11000. Available globals:

  client - the client connection to GreeterService
    sayHello (HelloRequest, callback) returns HelloReply

  printReply - function to easily print a unary call reply (alias: pr)
  streamReply - function to easily print stream call replies (alias: sr)
  createMetadata - convert JS objects into grpc metadata instances (alias: cm)
  printMetadata - function to easily print a unary call's metadata (alias: pm)

Method invoked- exception:

Error:  { Error: 14 UNAVAILABLE: TCP Read failed
    at Object.exports.createStatusError (C:\Users\abcdef\AppData\Roaming\npm\node_modules\grpcc\node_modules\grpc\src\common.js:91:15)
    at Object.onReceiveStatus (C:\Users\abcdef\AppData\Roaming\npm\node_modules\grpcc\node_modules\grpc\src\client_interceptors.js:1204:28)
    at InterceptingListener._callNext (C:\Users\abcdef\AppData\Roaming\npm\node_modules\grpcc\node_modules\grpc\src\client_interceptors.js:568:42)
    at InterceptingListener.onReceiveStatus (C:\Users\abcdef\AppData\Roaming\npm\node_modules\grpcc\node_modules\grpc\src\client_interceptors.js:618:8)
    at callback (C:\Users\abcdef\AppData\Roaming\npm\node_modules\grpcc\node_modules\grpc\src\client_interceptors.js:845:24) code: 14, metadata: {}, details: 'TCP Read failed' }

I also built my own project with Maven, mostly copied from the sample (Maven 3.6.0, Java 1.8, x64, Windows 7), and I get the same error.

Maybe some additional configuration is needed? Or a different gRPC CLI?

Running tests for shopping-cart Scala throws exception when failing to connect to Kafka

To reproduce, run sbt test in the ./shopping-cart/shopping-cart-scala project. The tests pass, but the following exception occurs:

2019-11-01 15:39:12,501 ERROR com.lightbend.lagom.internal.broker.kafka.TopicProducerActor - Unable to locate Kafka service named [kafka_native]. Retrying...
2019-11-01 15:39:12,501 ERROR com.lightbend.lagom.internal.broker.kafka.TopicProducerActor - Unable to locate Kafka service named [kafka_native]. Retrying...
2019-11-01 15:39:12,502 WARN  akka.stream.scaladsl.RestartWithBackoffSource - Restarting graph due to failure. stack_trace:
java.lang.IllegalArgumentException: Unable to locate Kafka service named [kafka_native]. Retrying...
	at com.lightbend.lagom.internal.broker.kafka.TopicProducerActor.$anonfun$eventualBrokersAndOffset$3(TopicProducerActor.scala:184)
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:430)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:92)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:47)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:47)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

2019-11-01 15:39:12,502 WARN  akka.stream.scaladsl.RestartWithBackoffSource - Restarting graph due to failure. stack_trace:
java.lang.IllegalArgumentException: Unable to locate Kafka service named [kafka_native]. Retrying...
	at com.lightbend.lagom.internal.broker.kafka.TopicProducerActor.$anonfun$eventualBrokersAndOffset$3(TopicProducerActor.scala:184)
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:430)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:92)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:47)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:47)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
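If the intent is for these tests to run without a broker, one possible workaround (a sketch based on Lagom's documented Kafka client configuration; verify against the Lagom version in use) is to bypass the kafka_native service lookup in the test configuration:

```conf
# application-test.conf sketch: skip the kafka_native service lookup
lagom.broker.kafka {
  # an empty service name tells Lagom not to use the service locator
  service-name = ""
  # and to connect to this broker list directly instead
  brokers = "localhost:9092"
}
```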

shopping-cart-java not working for maven in Dev Mode

When running mvn lagom:runAll instead of sbt runAll (after following the other setup instructions in the README), the project starts successfully, but the curl commands do not work: every curl command returns the following error:

<!DOCTYPE html>
<html lang="en">
    <head>
        ...
    </head>
    <body>
        <h1>Action Not Found</h1>

        <p id="detail">
            For request 'GET /shoppingcart/123'
        </p>

        

                <h2>
                    These routes have been tried, in this order:
                </h2>

                <div>
                    
                </div>

            

    </body>
</html>

Switch services to ClusterIP

Right now, the Shopping Cart Kubernetes Service definitions use the LoadBalancer service type, but since we use an OpenShift Route to expose the service outside the cluster, the LoadBalancer service type adds unnecessary complexity. Since this example is likely to be copied by users without understanding the implications, we should use the simpler and more secure ClusterIP type in our examples.

See akka/akka-management#574 for more details.
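A sketch of the proposed Service definition (the name and ports here are illustrative, not copied from the actual manifests):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: shopping-cart
spec:
  type: ClusterIP   # instead of LoadBalancer; the OpenShift Route exposes it externally
  selector:
    app: shopping-cart
  ports:
    - port: 80
      targetPort: 9000   # the Lagom service's HTTP port
```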

Dev Mode tests

Currently, the gRPC example projects don't run in dev mode due to lagom/lagom#1857.

It would be nice if we could catch this in CI. I'm opening this issue to start a discussion with the team about how we might be able to do this, and whether it would be worthwhile. I guess we'd need to add scripted tests, which could inflate the build times a lot. Maybe as a nightly job?

ShoppingCart Java tests throw an exception even when the tests are successful

For example:

https://travis-ci.com/lagom/lagom-samples/jobs/251610682#L1512-L1586

This happens for the test using TestServer with defaultSetup().withJdbc():

2019-10-31 20:43:32,013 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Shutdown completed.
2019-10-31 20:43:32,013 INFO  play.api.db.HikariCPConnectionPool - Shutting down connection pool.
2019-10-31 20:43:32,836 WARN  slick.basic.BasicBackend.stream - Error scheduling synchronous streaming
java.util.concurrent.RejectedExecutionException: Task slick.basic.BasicBackend$DatabaseDef$$anon$4@258ceb2 rejected from slick.util.AsyncExecutor$$anon$1$$anon$2@7c123a79[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 71]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at slick.util.AsyncExecutor$$anon$1$$anon$4.execute(AsyncExecutor.scala:161)
	at slick.basic.BasicBackend$DatabaseDef.scheduleSynchronousStreaming(BasicBackend.scala:302)
	at slick.basic.BasicBackend$DatabaseDef.scheduleSynchronousStreaming$(BasicBackend.scala:300)
	at slick.jdbc.JdbcBackend$DatabaseDef.scheduleSynchronousStreaming(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef.streamSynchronousDatabaseAction(BasicBackend.scala:295)
	at slick.basic.BasicBackend$DatabaseDef.streamSynchronousDatabaseAction$(BasicBackend.scala:293)
	at slick.jdbc.JdbcBackend$DatabaseDef.streamSynchronousDatabaseAction(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef.slick$basic$BasicBackend$DatabaseDef$$runInContextInline(BasicBackend.scala:240)
	at slick.basic.BasicBackend$DatabaseDef.runInContextSafe(BasicBackend.scala:148)
	at slick.basic.BasicBackend$DatabaseDef.runInContext(BasicBackend.scala:142)
	at slick.basic.BasicBackend$DatabaseDef.runInContext$(BasicBackend.scala:141)
	at slick.jdbc.JdbcBackend$DatabaseDef.runInContext(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef$$anon$1.subscribe(BasicBackend.scala:118)
	at akka.stream.impl.fusing.ActorGraphInterpreter$BatchingActorInputBoundary.preStart(ActorGraphInterpreter.scala:134)
	at akka.stream.impl.fusing.GraphInterpreter.init(GraphInterpreter.scala:306)
	at akka.stream.impl.fusing.GraphInterpreterShell.init(ActorGraphInterpreter.scala:593)
	at akka.stream.impl.fusing.ActorGraphInterpreter.tryInit(ActorGraphInterpreter.scala:701)
	at akka.stream.impl.fusing.ActorGraphInterpreter.preStart(ActorGraphInterpreter.scala:750)
	at akka.actor.Actor.aroundPreStart(Actor.scala:543)
	at akka.actor.Actor.aroundPreStart$(Actor.scala:543)
	at akka.stream.impl.fusing.ActorGraphInterpreter.aroundPreStart(ActorGraphInterpreter.scala:690)
	at akka.actor.ActorCell.create(ActorCell.scala:637)
	at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:509)
	at akka.actor.ActorCell.systemInvoke(ActorCell.scala:531)
	at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:294)
	at akka.dispatch.Mailbox.run(Mailbox.scala:229)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:242)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
2019-10-31 20:43:32,846 WARN  slick.basic.BasicBackend.stream - Error scheduling synchronous streaming
java.util.concurrent.RejectedExecutionException: Task slick.basic.BasicBackend$DatabaseDef$$anon$4@22f716e0 rejected from slick.util.AsyncExecutor$$anon$1$$anon$2@7c123a79[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 71]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at slick.util.AsyncExecutor$$anon$1$$anon$4.execute(AsyncExecutor.scala:161)
	at slick.basic.BasicBackend$DatabaseDef.scheduleSynchronousStreaming(BasicBackend.scala:302)
	at slick.basic.BasicBackend$DatabaseDef.scheduleSynchronousStreaming$(BasicBackend.scala:300)
	at slick.jdbc.JdbcBackend$DatabaseDef.scheduleSynchronousStreaming(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef.streamSynchronousDatabaseAction(BasicBackend.scala:295)
	at slick.basic.BasicBackend$DatabaseDef.streamSynchronousDatabaseAction$(BasicBackend.scala:293)
	at slick.jdbc.JdbcBackend$DatabaseDef.streamSynchronousDatabaseAction(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef.slick$basic$BasicBackend$DatabaseDef$$runInContextInline(BasicBackend.scala:240)
	at slick.basic.BasicBackend$DatabaseDef.runInContextSafe(BasicBackend.scala:148)
	at slick.basic.BasicBackend$DatabaseDef.runInContext(BasicBackend.scala:142)
	at slick.basic.BasicBackend$DatabaseDef.runInContext$(BasicBackend.scala:141)
	at slick.jdbc.JdbcBackend$DatabaseDef.runInContext(JdbcBackend.scala:37)
	at slick.basic.BasicBackend$DatabaseDef$$anon$1.subscribe(BasicBackend.scala:118)
	at akka.stream.impl.fusing.ActorGraphInterpreter$BatchingActorInputBoundary.preStart(ActorGraphInterpreter.scala:134)
	at akka.stream.impl.fusing.GraphInterpreter.init(GraphInterpreter.scala:306)
	at akka.stream.impl.fusing.GraphInterpreterShell.init(ActorGraphInterpreter.scala:593)
	at akka.stream.impl.fusing.ActorGraphInterpreter.tryInit(ActorGraphInterpreter.scala:701)
	at akka.stream.impl.fusing.ActorGraphInterpreter.preStart(ActorGraphInterpreter.scala:750)
	at akka.actor.Actor.aroundPreStart(Actor.scala:543)
	at akka.actor.Actor.aroundPreStart$(Actor.scala:543)
	at akka.stream.impl.fusing.ActorGraphInterpreter.aroundPreStart(ActorGraphInterpreter.scala:690)
	at akka.actor.ActorCell.create(ActorCell.scala:637)
	at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:509)
	at akka.actor.ActorCell.systemInvoke(ActorCell.scala:531)
	at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:294)
	at akka.dispatch.Mailbox.run(Mailbox.scala:229)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:242)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

Repository for publishing is not specified.

This is regarding the shopping-cart-scala app.

I get the following error when I try to publish the docker images to docker hub:

[error] java.lang.RuntimeException: Repository for publishing is not specified.
[error] 	at scala.sys.package$.error(package.scala:26)
[error] 	at sbt.Classpaths$.$anonfun$getPublishTo$1(Defaults.scala:2644)
[error] 	at scala.Option.getOrElse(Option.scala:121)
[error] 	at sbt.Classpaths$.getPublishTo(Defaults.scala:2644)
[error] 	at sbt.Classpaths$.$anonfun$ivyBaseSettings$48(Defaults.scala:2089)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:44)

I pass the Docker username and registry as arguments to sbt like so:

sbt -Ddocker.username=codingkapoor -Ddocker.registry=index.docker.io

The images do get pushed, but I still see the error when I run docker:publish.

Please suggest. TIA.
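The error comes from sbt's standard publish task, which requires publishTo to be set; pushing the Docker image is a separate task. A sketch of one way to configure this with sbt-native-packager (the setting names exist in the plugin, but the exact values here are assumptions):

```scala
// build.sbt sketch: configure the Docker push target explicitly,
// then run `Docker / publish` rather than the plain `publish` task
Docker / dockerRepository := Some("index.docker.io")
Docker / dockerUsername   := Some("codingkapoor")
```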

Running shopping cart java tests with Maven does not work with Akka's LogCapturing

It works with sbt, and also when running the test in isolation within IntelliJ, but not when running with Maven:

mvn -Dlogback.debug=true test

The following error occurs:

Running com.example.shoppingcart.impl.ShoppingCartReportTest
18:44:33,671 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml]
18:44:33,671 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.groovy]
18:44:33,671 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/Users/marcospereira/Lightbend/lagom/lagom-samples/shopping-cart/shopping-cart-java/shopping-cart/target/test-classes/logback.xml]
18:44:33,722 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
18:44:33,729 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [STDOUT]
18:44:33,734 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
18:44:33,757 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [akka.actor.testkit.typed.internal.CapturingAppender]
18:44:33,824 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [CapturingAppender]
18:44:33,825 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[akka.actor.testkit.typed.internal.CapturingAppenderDelegate]
18:44:33,825 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [org.apache.cassandra] to ERROR
18:44:33,825 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [com.datastax.driver] to WARN
18:44:33,825 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [akka] to WARN
18:44:33,826 |-INFO in ch.qos.logback.classic.joran.action.RootLoggerAction - Setting level of ROOT logger to INFO
18:44:33,826 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[ROOT]
18:44:33,826 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [CapturingAppender] to Logger[ROOT]
18:44:33,826 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
18:44:33,827 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@7fc229ab - Registering current configuration as safe fallback point
18:44:34,296 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
18:44:34,296 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [STDOUT]
18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [akka.actor.testkit.typed.internal.CapturingAppender]
18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [CapturingAppender]
18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[akka.actor.testkit.typed.internal.CapturingAppenderDelegate]
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [org.apache.cassandra] to ERROR
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [com.datastax.driver] to WARN
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [akka] to WARN
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.action.RootLoggerAction - Setting level of ROOT logger to INFO
18:44:34,298 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[ROOT]
18:44:34,298 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [CapturingAppender] to Logger[ROOT]
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
18:44:34,298 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@1921ad94 - Registering current configuration as safe fallback point
2019-10-31 18:44:34,915 INFO  play.api.db.HikariCPConnectionPool - Creating Pool for datasource 'default'
2019-10-31 18:44:34,932 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
2019-10-31 18:44:34,943 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Start completed.
2019-10-31 18:44:34,949 INFO  play.api.db.HikariCPConnectionPool - datasource [default] bound to JNDI as DefaultDS
2019-10-31 18:44:36,763 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Executing cluster start task jdbcCreateTables.
2019-10-31 18:44:36,811 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Executing cluster start task slickOffsetStorePrepare.
2019-10-31 18:44:36,822 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Cluster start task slickOffsetStorePrepare done.
2019-10-31 18:44:36,827 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Cluster start task jdbcCreateTables done.
log4j:WARN No appenders could be found for logger (org.jboss.logging).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
2019-10-31 18:44:38,390 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Shutdown initiated...
2019-10-31 18:44:38,395 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Shutdown completed.
2019-10-31 18:44:38,418 INFO  play.api.db.HikariCPConnectionPool - Shutting down connection pool.
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.975 sec
Running com.example.shoppingcart.impl.ShoppingCartServiceTest
18:44:39,465 |-WARN in Logger[akka.actor.CoordinatedShutdown] - No appenders present in context [default] for logger [akka.actor.CoordinatedShutdown].
18:44:39,534 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
18:44:39,534 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [STDOUT]
18:44:39,534 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
18:44:39,535 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [akka.actor.testkit.typed.internal.CapturingAppender]
18:44:39,535 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [CapturingAppender]
18:44:39,535 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[akka.actor.testkit.typed.internal.CapturingAppenderDelegate]
18:44:39,535 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [org.apache.cassandra] to ERROR
18:44:39,535 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [com.datastax.driver] to WARN
18:44:39,535 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting level of logger [akka] to WARN
18:44:39,535 |-INFO in ch.qos.logback.classic.joran.action.RootLoggerAction - Setting level of ROOT logger to INFO
18:44:39,536 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[ROOT]
18:44:39,536 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [CapturingAppender] to Logger[ROOT]
18:44:39,536 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
18:44:39,536 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@54234569 - Registering current configuration as safe fallback point
2019-10-31 18:44:39,568 INFO  play.api.db.HikariCPConnectionPool - Creating Pool for datasource 'default'
2019-10-31 18:44:39,569 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Starting...
2019-10-31 18:44:39,569 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Start completed.
2019-10-31 18:44:39,569 INFO  play.api.db.HikariCPConnectionPool - datasource [default] bound to JNDI as DefaultDS
2019-10-31 18:44:39,643 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Executing cluster start task jdbcCreateTables.
2019-10-31 18:44:39,653 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Executing cluster start task slickOffsetStorePrepare.
2019-10-31 18:44:39,656 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Executing cluster start task readSideGlobalPrepare-ShoppingCartReportProcessor.
2019-10-31 18:44:39,659 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Cluster start task slickOffsetStorePrepare done.
2019-10-31 18:44:39,661 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Cluster start task jdbcCreateTables done.
2019-10-31 18:44:39,852 INFO  com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor - Cluster start task readSideGlobalPrepare-ShoppingCartReportProcessor done.
2019-10-31 18:44:40,426 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Shutdown initiated...
2019-10-31 18:44:40,433 INFO  com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Shutdown completed.
2019-10-31 18:44:40,445 INFO  play.api.db.HikariCPConnectionPool - Shutting down connection pool.
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.996 sec
Running com.example.shoppingcart.impl.ShoppingCartTest
Tests run: 12, Failures: 0, Errors: 12, Skipped: 0, Time elapsed: 0.083 sec <<< FAILURE!
shouldAllowGettingShoppingCartSummary(com.example.shoppingcart.impl.ShoppingCartTest)  Time elapsed: 0.002 sec  <<< ERROR!
java.lang.IllegalStateException: CapturingAppender not defined for [ROOT] in logback-test.xml
	at akka.actor.testkit.typed.internal.CapturingAppender$.get(CapturingAppender.scala:24)
	at akka.actor.testkit.typed.javadsl.LogCapturing.<init>(LogCapturing.scala:39)
	at com.example.shoppingcart.impl.ShoppingCartTest.<init>(ShoppingCartTest.java:31)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.junit.runners.BlockJUnit4ClassRunner.createTest(BlockJUnit4ClassRunner.java:217)
	at org.junit.runners.BlockJUnit4ClassRunner$1.runReflectiveCall(BlockJUnit4ClassRunner.java:266)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.BlockJUnit4ClassRunner.methodBlock(BlockJUnit4ClassRunner.java:263)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
	at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)

So, the correct file is picked:

18:44:33,671 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/Users/marcospereira/Lightbend/lagom/lagom-samples/shopping-cart/shopping-cart-java/shopping-cart/target/test-classes/logback.xml]

The appender is there:

18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [akka.actor.testkit.typed.internal.CapturingAppender]
18:44:34,297 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [CapturingAppender]

And that it is attached to the ROOT logger:

18:44:39,536 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [CapturingAppender] to Logger[ROOT]
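For context, the wiring that `akka.actor.testkit.typed.javadsl.LogCapturing` looks for — and that the INFO lines above confirm is being loaded — corresponds to a logback test configuration along these lines (a minimal sketch reconstructed from the log output and the Akka Testkit docs, not this repo's actual file):

```xml
<!-- logback-test.xml: sketch of the appender setup LogCapturing expects.
     The appender name "CapturingAppender" and its attachment to the ROOT
     logger match the joran INFO messages quoted above. -->
<configuration>
    <appender name="CapturingAppender"
              class="akka.actor.testkit.typed.internal.CapturingAppender"/>

    <root level="DEBUG">
        <appender-ref ref="CapturingAppender"/>
    </root>
</configuration>
```

Note that the error message hardcodes `logback-test.xml`, while the resource actually found above is `logback.xml` — the file name on the test classpath may matter here.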

Shopping Cart Maven build breaks when used with Tech Hub project starter

Reproduction

  1. Go to https://developer.lightbend.com/start/?group=lagom&project=lagom-samples-lagom-java-shopping-cart-example
  2. Click "Create a project for me"
  3. Unzip the downloaded lagom-samples-lagom-java-shopping-cart-example.zip
  4. cd lagom-samples-lagom-java-shopping-cart-example/
  5. mvn package
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  1.413 s
[INFO] Finished at: 2019-07-09T08:27:34+09:30
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal pl.project13.maven:git-commit-id-plugin:2.2.6:revision (default) on project shopping-cart-api: .git directory is not found! Please specify a valid [dotGitDirectory] in your pom.xml -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :shopping-cart-api

The problem is that the included git-commit-id-plugin expects to be run inside a git repository, but when the project is downloaded from the project starter, the resulting directory isn't a git repo.
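One possible workaround (an assumption based on the plugin's documented options, not a fix confirmed by this repo) is to tell the plugin not to fail when no `.git` directory is present, via its `failOnNoGitDirectory` configuration flag:

```xml
<!-- pom.xml sketch: disable the hard failure when .git is missing.
     Version 2.2.6 matches the one shown in the error output above. -->
<plugin>
    <groupId>pl.project13.maven</groupId>
    <artifactId>git-commit-id-plugin</artifactId>
    <version>2.2.6</version>
    <configuration>
        <failOnNoGitDirectory>false</failOnNoGitDirectory>
    </configuration>
</plugin>
```

Alternatively, running `git init` in the unzipped directory before `mvn package` should also satisfy the plugin, at the cost of the generated git metadata being mostly empty.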
