vert-x / testtools
Vert.x 2.x is deprecated - use vertx-unit instead
Home Page: http://vertx.io/docs/vertx-unit/java/
License: Other
Expected behaviour: JUnit rules are applied as described at https://github.com/junit-team/junit/wiki/Rules
Actual behaviour: @Rule annotations have no effect on test semantics.
If you have a Groovy verticle written as a Groovy class and try to deploy it from within a TestVerticle class, a ClassCastException is thrown when trying to cast it to the Java version of Verticle.
Normally when running JUnit you inherit the tests of your parent class and they get run as well. testtools doesn't do that.
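A likely cause is that the runner collects test methods with getDeclaredMethods(), which omits inherited methods, rather than getMethods(). This is an assumption about the runner's internals, but the difference between the two reflection calls is easy to demonstrate with plain Java:

```java
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class InheritedScan {
    public static class BaseTest { public void testInBase() {} }
    public static class ChildTest extends BaseTest { public void testInChild() {} }

    // Collect names of test* methods, either declared-only or including inherited ones.
    public static List<String> testMethods(Class<?> c, boolean includeInherited) {
        List<String> names = new ArrayList<>();
        Method[] methods = includeInherited ? c.getMethods() : c.getDeclaredMethods();
        for (Method m : methods) {
            if (m.getName().startsWith("test")) names.add(m.getName());
        }
        return names;
    }
}
```

On ChildTest, getDeclaredMethods() sees only testInChild, while getMethods() also picks up the inherited testInBase.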
The expected parameter does not work if the exception is triggered on a different tick of the event loop.
Perhaps a way to get around this is to have a VertxAssert.expected(Class<? extends Exception>) method, and then compare any thrown exceptions with the expected exception in handleError().
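A rough sketch of what that could look like. The expected(...) method and the registry below are hypothetical, not part of the actual VertxAssert API; the idea is simply that the error handler consults a registered expectation before reporting a failure:

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of an expected-exception registry for an async test runner.
public class ExpectedExceptionSketch {
    private static final AtomicReference<Class<? extends Exception>> expected =
            new AtomicReference<>();

    // Called from the test body, mirroring @Test(expected = ...).
    public static void expected(Class<? extends Exception> type) {
        expected.set(type);
    }

    // Called where handleError() currently reports a failure.
    // Returns true (pass) if the throwable matches the registered expectation.
    public static boolean handleError(Throwable t) {
        Class<? extends Exception> want = expected.getAndSet(null);
        return want != null && want.isInstance(t);
    }
}
```

Because the expectation is stored in shared state rather than relying on the exception propagating out of the @Test method, it also covers exceptions thrown on a later tick of the event loop.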
Daryl
I'm thinking of the generateRandom* methods, mainly.
@Test(expected = SomeException.class) does not work
Failing test:
https://gist.github.com/ewolff/5894753
To ensure all failures, in all languages, are reported in the correct way
The old test annotations project had a way of declaring which verticles and modules you wanted to have deployed, ie what your test depended upon, before a test was run. This has been removed in testtools in favour of manual deployment.
The freedom given by manual deployment is good, but it would be helpful if there was some support for declaring things you want to deploy. This could come in the form of an annotation or it could come in the form of a method, for example:
public List<Deployable> getModulesToDeploy() {
    return Arrays.asList(new Deployable("io.vertx~mod-mongo-persistor~2.0.0-beta1", "config.testdb.json"));
}
At the moment when using testtools we find we do this a lot. It's a requirement that verticles are brought up and deployed before tests start executing.
Currently you're sending a message to TESTRUNNER_HANDLER_ADDRESS when a test succeeds/fails. This works fine, but from a developer-debugging perspective it would be helpful if this were a publish.
That way it would be possible to reliably listen to messages sent to this address. It would also help anyone who wanted to write tools on top of the testtools project, such as logging the results of test runs.
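The difference matters because a send is delivered to a single handler, while a publish is delivered to every registered handler, so extra listeners (debuggers, result loggers) can observe the same messages. A stdlib-only toy event bus, not the real Vert.x one, illustrates the two delivery semantics:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Toy event bus illustrating send (one handler) vs publish (all handlers).
public class ToyEventBus {
    private final Map<String, List<Consumer<String>>> handlers = new HashMap<>();

    public void register(String address, Consumer<String> handler) {
        handlers.computeIfAbsent(address, a -> new ArrayList<>()).add(handler);
    }

    // Point-to-point: only one registered handler sees the message.
    public void send(String address, String msg) {
        List<Consumer<String>> hs = handlers.get(address);
        if (hs != null && !hs.isEmpty()) hs.get(0).accept(msg);
    }

    // Broadcast: every registered handler sees the message.
    public void publish(String address, String msg) {
        List<Consumer<String>> hs = handlers.get(address);
        if (hs != null) for (Consumer<String> h : hs) h.accept(msg);
    }
}
```

With send, a tool listening alongside the test runner may never see the result message; with publish, both the runner and the tool receive it.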
To bridge the gap between JUnit's synchronous mindset and Vert.x, it would be nice to be able to run synchronous code inside @Test methods.
The old test utility had the following three methods. They are no longer present in vert-x/testtools, and as a result mod-lang-js has a dependency on both the old and new testing utilities.
TestUtils#generateRandomBuffer
TestUtils#randomUnicodeString
TestUtils#buffersEqual
Thanks!
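Stand-alone equivalents of the three methods above are easy to sketch. This is an assumption about their behaviour, with plain byte arrays standing in for Vert.x Buffers:

```java
import java.util.Arrays;
import java.util.Random;

// Sketch of the three missing TestUtils methods, using byte[] in place of Buffer.
public class TestUtilsSketch {
    private static final Random random = new Random();

    // Roughly TestUtils#generateRandomBuffer.
    public static byte[] generateRandomBuffer(int length) {
        byte[] bytes = new byte[length];
        random.nextBytes(bytes);
        return bytes;
    }

    // Roughly TestUtils#randomUnicodeString: random chars, skipping the
    // surrogate range so the result is always a well-formed string.
    public static String randomUnicodeString(int length) {
        StringBuilder sb = new StringBuilder(length);
        while (sb.length() < length) {
            char c = (char) random.nextInt(0xFFFF);
            if (c < 0xD800 || c > 0xDFFF) sb.append(c);
        }
        return sb.toString();
    }

    // Roughly TestUtils#buffersEqual.
    public static boolean buffersEqual(byte[] a, byte[] b) {
        return Arrays.equals(a, b);
    }
}
```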
@Before and @After annotations as known from JUnit do not work.
@BeforeClass does work, however.
https://gist.github.com/ewolff/5894732 fails in beforeIsExecuted()
https://gist.github.com/ewolff/5894741 fails in afterClass()
I am running the tests in the test-tools-tests project at commit 493fa1314565173ecac2ed8fb5438e358cb687ce.
Tests currently do not fail if testComplete() is not called. This occurs regardless of any @Test(expected) parameters.
An exception is thrown and printed to stderr, but the test is still considered a pass.
JavaClassVerticle.java could perhaps call VertxAssert.fail() to fail the test.
Might be wise to set timeout to a couple of seconds for this one...
@Test(expected = IllegalStateException.class)
public void timeoutTest() {
    // never calls testComplete(), so the test should time out and fail
}
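One way to enforce such a timeout without framework support is a watchdog that fails the run when testComplete() is never called within the limit. A stdlib sketch; the class and method names here are illustrative, not part of the testtools API:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch of a test-completion watchdog: the run only passes if
// testComplete() is called before the timeout expires.
public class CompletionWatchdog {
    private final CountDownLatch done = new CountDownLatch(1);

    // Called by the test when all async assertions have run.
    public void testComplete() {
        done.countDown();
    }

    // Returns true if the test completed in time, false on timeout.
    public boolean await(long timeoutMillis) {
        try {
            return done.await(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```

A runner using this would treat a false return as a failed test instead of silently passing it.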
TestVerticle.startTests() catches InvocationTargetException, but in one of my Groovy-based tests it caught a groovy.lang.MissingMethodException and tried to handle it. This causes a serialization error in VertxAssert on line 51. That exception is uncaught and causes no failure to be logged to Vert.x, so the test passes as successful.
I tried 'fixing' this by surrounding the VertxAssert.handleThrowable(targetEx); call with another try/catch, but that messes up the output stream for some reason. I haven't found time to look deeper into it, so I don't have a patch to offer, I'm afraid. My workaround so far is to override startTests():
@Override
protected void startTests() {
    String methodName = container.config().getString("methodName");
    try {
        Method m = getClass().getDeclaredMethod(methodName);
        m.invoke(this);
    } catch (InvocationTargetException e) {
        // Conflict with Groovy sometimes causes silent InvocationTargetExceptions, so we always fail here.
        VertxAssert.fail(e.getMessage());
    } catch (Throwable t) {
        // Problem with invoking the test method itself
        VertxAssert.handleThrowable(t);
    }
}
The exception causing the problems:
groovy.lang.MissingMethodException: No signature of method: ...
With this in pom.xml:
<parent>
    <groupId>com.unbounce.vertx</groupId>
    <artifactId>vertx-java-module-parent</artifactId>
    <version>2.1RC4.1-SNAPSHOT</version>
</parent>
<artifactId>vertx-http-dispatcher</artifactId>
JavaClassRunner sets vertx.modulename to com.unbounce.vertx~vertx-java-module-parent~2.1RC4.1-SNAPSHOT, i.e. it picks up the parent's artifactId rather than vertx-http-dispatcher.
I have some tests where I want to have two handlers and check that something happens at both of them. At the moment if I put a testComplete() call in either of the handlers then the test will end without checking the asserts in the other handler.
One potential solution is to have an annotation on the test which declares how many testComplete() calls are expected.
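Until such an annotation exists, a common workaround is a countdown: each handler decrements a shared counter and only the last one signals completion. A minimal sketch, assuming the Runnable passed in is VertxAssert::testComplete in a real test:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Countdown completion: the completion callback fires only after
// the expected number of handlers have reported in.
public class CompletionCounter {
    private final AtomicInteger remaining;
    private final Runnable complete; // e.g. VertxAssert::testComplete

    public CompletionCounter(int expectedHandlers, Runnable complete) {
        this.remaining = new AtomicInteger(expectedHandlers);
        this.complete = complete;
    }

    // Each handler calls this after its own assertions have run.
    public void handlerDone() {
        if (remaining.decrementAndGet() == 0) {
            complete.run();
        }
    }
}
```

Each of the two handlers runs its asserts and then calls handlerDone(); the test only completes once both have done so.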