zolyfarkas / spf4j
Simple performance framework for java
Home Page: http://www.spf4j.org
I see the error below when using spf4j-avro and calling SchemaUtils.writeIdlProtocol:
java.lang.IllegalAccessError: class org.spf4j.avro.schema.SchemaUtils tried to access field org.apache.avro.Schema.FACTORY (org.spf4j.avro.schema.SchemaUtils and org.apache.avro.Schema are in unnamed module of loader 'app')
at org.spf4j.avro.schema.SchemaUtils.createJsonGenerator(SchemaUtils.java:212)
at org.spf4j.avro.schema.SchemaUtils.writeIdl(SchemaUtils.java:194)
at org.spf4j.avro.schema.SchemaUtils.writeIdlProtocol(SchemaUtils.java:183)
Trying to use org.spf4j:spf4j-avro:8.9.1.
If I am not wrong, the root cause is that FACTORY in org.apache.avro.Schema is not public; it has the default (package-private) access modifier.
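A quick way to confirm the access-modifier claim from a test (self-contained sketch; `FACTORY` here is a stand-in field declared locally, not Avro's):

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

public class AccessDemo {
    // A field with no access keyword, like Schema.FACTORY in vanilla Avro,
    // has default (package-private) access: invisible outside its package.
    static final String FACTORY = "factory";

    public static void main(String[] args) throws Exception {
        Field f = AccessDemo.class.getDeclaredField("FACTORY");
        int m = f.getModifiers();
        // none of the public/protected/private bits set => package-private
        boolean packagePrivate = !Modifier.isPublic(m)
                && !Modifier.isProtected(m) && !Modifier.isPrivate(m);
        System.out.println("package-private: " + packagePrivate);
    }
}
```

Because org.spf4j.avro.schema.SchemaUtils lives in a different package (and, in the patched Avro fork, relies on the field being public), linking against vanilla Avro where the field is package-private triggers the IllegalAccessError at class-link time.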
Because bintray is now sunset, we are getting the following error when building our avro-schemas project:
[ERROR] Unresolveable build extension: Plugin org.spf4j:maven-avro-schema-plugin:8.8.5 or one of its dependencies could not be resolved: Failed to collect dependencies at org.spf4j:maven-avro-schema-plugin:jar:8.8.5 -> org.spf4j:spf4j-maven-schema-resolver:jar:8.8.5 -> org.apache.avro:avro:jar:1.9.0.20p: Failed to read artifact descriptor for org.apache.avro:avro:jar:1.9.0.20p: Could not transfer artifact org.apache.avro:avro:pom:1.9.0.20p from/to bintray-zolyfarkas-core (https://dl.bintray.com/zolyfarkas/core): Access denied to: https://dl.bintray.com/zolyfarkas/core/org/apache/avro/avro/1.9.0.20p/avro-1.9.0.20p.pom , ReasonPhrase:Forbidden. -> [Help 2]
Our project is using the maven-avro-schema-plugin.
The slf4j-test method annotations are not effective without the support of the test runner.
It would help if slf4j-test would warn when it is used outside the context of a run listener.
(I suspect this may be harder to do programmatically, because some methods do useful things without the run listener, but hopefully there's a way to provide an interface for the common use case that resists misconfiguration.)
It took me a long time to figure out that the reason slf4j-test wasn't working was because another slf4j implementation was getting all the logs instead.
It would help if slf4j-test would yell loudly if you start using TestLogger methods but the logs are going somewhere else.
I meant this one:
'Most test should not log errors, if a error scenario is validated, please assert this behavior using TestLoggers.expect'
Is there some global configuration for that?
If I send the txt output to a file that does not exist, I get this error...
java.nio.file.NoSuchFileException: /tmp/spf4j.metrics.tsdb.txt
Of course, all I need to do is touch the file before performance metrics start gathering and it works as expected...
com.google.common.io.Files.touch(new File("/tmp/spf4j.metrics.tsdb.txt"));
It does create the binary output file if it does not exist. To be consistent, perhaps spf4j should also create the txt output file itself? I'm happy to raise a PR if you agree.
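A plain-JDK equivalent of the touch workaround, in case Guava is not on the classpath (sketch; the temp-dir path is only for the demo):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class EnsureFile {
    // Create the txt output file up front if it is missing, so the metrics
    // writer does not hit NoSuchFileException on first append.
    static void ensureExists(Path p) throws IOException {
        if (Files.notExists(p)) {
            Files.createFile(p);
        }
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempDirectory("spf4j")
                .resolve("spf4j.metrics.tsdb.txt");
        ensureExists(p);
        System.out.println(Files.exists(p));
    }
}
```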
Currently an async method invocation can be done with:
function(...)&
Sometimes we want to conditionally avoid the invocation overhead of an async call...
Something like this could be implemented:
function(....)&(some condition)
org.slf4j.LoggerFactory.getLogger(Class clazz) seems widely used. It would be nice if slf4j-test's annotations could accept a class to specify a category to match.
[ERROR] Failed to execute goal org.spf4j:maven-avro-schema-plugin:8.5.21:avro-dependencies (default-avro-dependencies) on project test-schema: Execution default-avro-dependencies of goal org.spf4j:maven-avro-schema-plugin:8.5.21:avro-dependencies failed. FileSystemAlreadyExistsException -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.spf4j:maven-avro-schema-plugin:8.5.21:avro-dependencies (default-avro-dependencies) on project popcorn-test-schema: Execution default-avro-dependencies of goal org.spf4j:maven-avro-schema-plugin:8.5.21:avro-dependencies failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:200)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default-avro-dependencies of goal org.spf4j:maven-avro-schema-plugin:8.5.21:avro-dependencies failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:148)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:200)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.nio.file.FileSystemAlreadyExistsException
at com.sun.nio.zipfs.ZipFileSystemProvider.newFileSystem (ZipFileSystemProvider.java:113)
at java.nio.file.FileSystems.newFileSystem (FileSystems.java:326)
at java.nio.file.FileSystems.newFileSystem (FileSystems.java:276)
at org.spf4j.io.compress.Compress.unzip (Compress.java:228)
at org.spf4j.maven.plugin.avro.avscp.SchemaDependenciesMojo.execute (SchemaDependenciesMojo.java:89)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:200)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
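The root cause is reproducible outside of spf4j: java.nio's zip file system provider throws FileSystemAlreadyExistsException when FileSystems.newFileSystem is asked to open a zip for which a file system is already open under the same URI. A minimal sketch:

```java
import java.io.IOException;
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystemAlreadyExistsException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.Map;

public class ZipFsDemo {
    public static void main(String[] args) throws IOException {
        Path zip = Files.createTempFile("demo", ".zip");
        Files.delete(zip); // let the zipfs provider create the archive
        URI uri = URI.create("jar:" + zip.toUri());
        Map<String, String> env = Collections.singletonMap("create", "true");
        try (FileSystem fs = FileSystems.newFileSystem(uri, env)) {
            try {
                // second newFileSystem for the same URI while the first
                // is still open -> FileSystemAlreadyExistsException
                FileSystems.newFileSystem(uri, env);
            } catch (FileSystemAlreadyExistsException expected) {
                System.out.println("caught FileSystemAlreadyExistsException");
            }
        }
    }
}
```

A common fix is to reuse the already-open instance via FileSystems.getFileSystem (or close the first one) instead of calling newFileSystem again.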
Currently the supported retry logic is primitive (some arbitrary number of retries).
Potential BUG in Base64, needs to be investigated:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
java.lang.Exception: java.lang.reflect.InvocationTargetException
at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:451)
at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:412)
at java.util.concurrent.ForkJoinTask$AdaptedCallable.exec(ForkJoinTask.java:1424)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:430)
... 6 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 3292
at org.spf4j.base.Base64.decodeBase64(Base64.java:170)
at org.spf4j.base.Base64.decodeBase64(Base64.java:120)
at org.spf4j.base.StringsBenchmark.subStringDecode(StringsBenchmark.java:39)
at org.spf4j.base.generated.StringsBenchmark_subStringDecode_jmhTest.subStringDecode_Throughput(StringsBenchmark_subStringDecode_jmhTest.java:72)
... 11 more
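A reference round-trip with the JDK codec, sized to the failing index from the trace, could help narrow the suspected bug down (sketch; it does not exercise org.spf4j.base.Base64 itself):

```java
import java.util.Arrays;
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        // 3292 is the index from the ArrayIndexOutOfBoundsException above;
        // the JDK codec serves as the known-good reference implementation.
        byte[] data = new byte[3292];
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) i;
        }
        String encoded = Base64.getEncoder().encodeToString(data);
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(Arrays.equals(data, decoded));
    }
}
```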
There is no good reason for not using it for unit tests.
This will improve the quality of this library further.
Your docs say to use perf.ms.config as a JVM arg to specify the output directory. I noticed this wasn't working, so I looked through the code and found that it might be spf4j.perf.ms.config. When I used this instead, it worked as expected.
Am I doing it wrong or do your docs simply need updating?
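For what it's worth, a tiny check like this (hypothetical class name) shows which property name the JVM actually received; it prints the fallback when the -D flag is absent:

```java
public class PropCheck {
    public static void main(String[] args) {
        // The property name the code reportedly reads; pass
        // -Dspf4j.perf.ms.config=/some/dir on the command line to set it.
        String cfg = System.getProperty("spf4j.perf.ms.config", "<unset>");
        System.out.println(cfg);
    }
}
```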
Special syntax/operators for scheduled async exec:
func(...)@instant
func(...)@*interval
Java FJP is a heavily optimized thread pool implementation which outperforms the legacy JDK thread pool and the spf4j Lifo thread pool, as shown by the benchmarks I have recently added to spf4j:
Benchmark Mode Cnt Score Error Units
ThreadPoolBenchmark.spfLifoTpBenchmark thrpt 10 3322.753 ± 196.490 ops/s
ThreadPoolBenchmarkFjp.fjpBenchmark thrpt 10 9679.502 ± 1160.887 ops/s
ThreadPoolBenchmarkStdJdk.stdJdkBenchmark thrpt 10 3466.997 ± 81.594 ops/s
FJP uses Lifo thread scheduling just like spf4j pool, which made me think, yay! I can drop the spf4j implementation... But by looking into the implementation, I noticed that the pool is not as tweak-able as the spf4j pool is... which is quite important for certain use cases...
Additionally, some relatively serious issues in FJP have been fixed recently:
http://bugs.java.com/view_bug.do?bug_id=8078490
These fixes are not available for JDK 1.7 without paying for support.
Since spf4j is all about reliability and performance, I think the next iteration of the spf4j lifo threadpool will be an implementation based on a fork of the JDK FJP.
The performance of FJP is very promising, and will provide a significant boost to ZEL and the rest of the library and apps based on spf4j...
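For context, ForkJoinPool implements ExecutorService, so plain Callable/Runnable tasks can be submitted to it like to any other pool; a minimal sketch:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.Future;

public class FjpDemo {
    public static void main(String[] args) throws Exception {
        // ForkJoinPool can back general task submission, not just
        // fork/join divide-and-conquer workloads.
        ForkJoinPool fjp = new ForkJoinPool(4);
        Future<Integer> f = fjp.submit(() -> 21 * 2);
        System.out.println(f.get());
        fjp.shutdown();
    }
}
```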
Awesome library! Minor issue I ran into with the org.spf4j.stackmonitor.Sampler - the generated filename w/ timestamp contains colon characters, which are not valid in Windows filenames.
Around line 262
/**
* Dumps the sampled stacks to a file. The collected samples are reset.
*
* @param id - the id will be added to the file name.
* @return - the file name where the data was persisted or null if there was no data to persist.
* @throws IOException - io issues while persisting data.
*/
@JmxExport(value = "dumpToSpecificFile", description = "save stack samples to file")
@Nullable
@SuppressFBWarnings("PATH_TRAVERSAL_IN") // not possible the provided ID is validated for path separators.
public synchronized File dumpToFile(
@JmxExport(value = "fileID", description = "the ID that will be part of the file name")
@Nullable final String id) throws IOException {
String fileName = filePrefix + CharSequences.validatedFileName(((id == null) ? "" : '_' + id) + '_'
+ DateTimeFormats.TS_FORMAT.format(Timing.getCurrentTiming().fromNanoTimeToInstant(lastDumpTimeNanos))
+ '_' + DateTimeFormats.TS_FORMAT.format(Instant.now()));
File file = new File(dumpFolder, fileName);
return dumpToFile(file);
}
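A hedged sketch of a Windows-safe variant (hypothetical helper, not the spf4j API): replacing the colons that ISO-8601 timestamps contain is enough to make the name valid on Windows.

```java
import java.time.Instant;
import java.time.format.DateTimeFormatter;

public class SafeFileName {
    // ISO-8601 instants contain ':' which Windows forbids in file names;
    // substituting '-' keeps the timestamp readable and the name portable.
    static String safeTimestamp(Instant t) {
        return DateTimeFormatter.ISO_INSTANT.format(t).replace(':', '-');
    }

    public static void main(String[] args) {
        System.out.println(safeTimestamp(Instant.parse("2020-04-07T08:13:41Z")));
    }
}
```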
Hi,
How do I dynamically change the default log level for a given logger?
I tried via TestLoggers.sys().getLogger("loggerName"),
but the returned TestLogger doesn't have a method to change the log level.
Thanks in advance :-)
Hi There,
This project is still using an old implementation of LGTM's automated code review, which has now been disabled. To continue using automated code review, and receive checks on your Pull Requests, please install the GitHub App on this repository.
Thanks,
The LGTM Team
Is it possible to set up spf4j-slf4j-test such that I can test that certain log messages are produced, but still use logback for logging to the console in the same test (and all other tests in the project, most of which do not test log messages)?
This issue is in regards to the maven-avro-schema-plugin
.
When using schema references, there appears to be an ordering dependency: if schema A depends on B, B must be compiled by the plugin before A. I believe this order is defined by the file system/OS's behavior, and is determined at this point in the code. Here, if the OS decides to list schema file A before B, the plugin will complain with an error like:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.406 s
[INFO] Finished at: 2020-04-07T08:13:41-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.spf4j:maven-avro-schema-plugin:8.8.0:avro-compile (default-avro-compile) on project avro-schemas: Execution default-avro-compile of goal org.spf4j:maven-avro-schema-plugin:8.8.0:avro-compile failed: "EventMetadata" is not a defined name. The type of the "metadata" field must be a defined name or a {"type": ...} expression. -> [Help 1]
In this example, EventMetadata
is schema B from my generalized description above.
Some operating systems will return files ordered alphabetically, some will return them ordered by their created at time, and I'm sure there are other variants. Ideally this plugin would take control over this behavior, thereby keeping it consistent across environments.
The avro-maven-plugin also suffers from this same issue; however, it allows users to manually configure the order in which schema files will be compiled. While this solution isn't the friendliest, as users need to think about the order, it is at least workable, whereas currently the maven-avro-schema-plugin appears to have no workaround available.
Thanks for your good work on this project and taking the time to read this issue 🙌
I tried putting it in src/test/resources, but nothing I put in there has any effect.
Hi, your docs talk about an HTML report for the profiling, but I am unable to get it created. When I try to run it, there is no -f option in the usage output:
Error: "-f" is not a valid option
Usage:
-df VAL : dump folder (default: /tmp)
-di N : the stack dump to file interval in milliseconds (default: 3600000)
-dp VAL : dump file prefix (default: 7850@x220t)
-main VAL : the main class name
-si N : the stack sampling interval in milliseconds (default: 100)
-ss : start the stack sampling thread. (can also be done manually via
jmx) (default: false)
Without the -f option it works fine and I can load the output into the spf4j-ui. I am using v8.3.5. My command line is
-df /tmp -f /tmp/spf4j.profile.html -dp spf4j -ss -si 10 -main performance.Main
Thanks,
James
I'm using Java 11 and Eclipse 2020-03.
If I do this in a unit test (taken from examples on this page: http://www.spf4j.org/spf4j-slf4j-test/index.html):
LogAssert expect = TestLoggers.sys().expect("org.spf4j.test", Level.ERROR,
LogMatchers.hasFormat("Booo"));
I get the warning:
Type safety: A generic array of Matcher<TestLogRecord> is created for a varargs parameter
I think this can be fixed by annotating TestLoggers.expect() (and other similar varargs methods) with @SafeVarargs. Then callers won't get the warning. Otherwise, callers have to add @SuppressWarnings("unchecked") in their own code.
See this article for more info on @SafeVarargs: https://www.informit.com/articles/article.aspx?p=2861454&seqNum=7
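A self-contained illustration of the fix (hypothetical method, not the TestLoggers API): with @SafeVarargs on a generic varargs method, call sites no longer get the unchecked generic array creation warning.

```java
import java.util.Arrays;
import java.util.List;

public class SafeVarargsDemo {
    // Without @SafeVarargs, calling this with generic arguments produces
    // an "unchecked generic array creation" warning at every call site.
    @SafeVarargs
    static <T> List<T> listOf(T... items) {
        return Arrays.asList(items);
    }

    public static void main(String[] args) {
        List<String> l = listOf("a", "b");
        System.out.println(l);
    }
}
```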
maven-avro-schema-plugin:8.5.25:avro-compile fails for a union type with the following error, while maven-antrun-plugin:1.8:avro:schema works fine:
Failed to execute goal org.spf4j:maven-avro-schema-plugin:8.5.25:avro-compile (default-cli) on project messaging-base-contract: Execution default-cli of goal org.spf4j:maven-avro-schema-plugin:8.5.25:avro-compile failed: Can't set properties on a union: [{"type":"record","name":"KeyValue","namespace":"com.foo.message.v1","doc":"The key value type.","fields":[{"name":"key","type":"string","doc":"The key string, must not be null."},{"name":"value","type":["null","string"],"doc":"The value string, value can be null.","default":"null"}]}]
The input avsc looks like:
[{
"type": "record",
"name": "KeyValue",
"namespace": "com.foo.message.v1",
"doc": "The key value type.",
"fields": [
{
"name": "key",
"type": "string",
"doc": "The key string, must not be null."
},
{
"name": "value",
"type": [
"null",
"string"
],
"doc": "The value string, value can be null.",
"default": "null"
}
]
}]
This functionality is similar to Go's defer and Java's try-with-resources, but it would be simpler to use.
{
auto(close) a = new FileInputStream();
auto(close) b = new FileInputStream();
}
Both should be closed at the end of the block.
The more general defer pattern can be implemented as well:
{
a = new FileInputStream();
defer a.close();
...
}
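For comparison, Java's existing try-with-resources already closes resources in reverse declaration order at the end of the block, which is the behavior the proposed auto(close) syntax would mirror (sketch with stand-in resources):

```java
public class DeferDemo {
    // Stand-in resource: close() just reports which resource was closed.
    static AutoCloseable res(String name) {
        return () -> System.out.println("closed " + name);
    }

    public static void main(String[] args) throws Exception {
        // Resources are closed in reverse declaration order (b, then a)
        // when the block exits, even on exception.
        try (AutoCloseable a = res("a"); AutoCloseable b = res("b")) {
            System.out.println("body");
        }
    }
}
```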
The plugin will scan a code base for uses of System.getProperty, Long.getLong, ...
The result will be an avro schema that is the equivalent of all detected System.getProperty calls.
Goals will be to:
The maven-avro-schema-plugin appears not to support the Avro IDL timestamp_ms logical type, while the avro-maven-plugin, and Avro IDL itself, do.
I've put this repository together to demonstrate the issue: https://github.com/mpataki/avdl-demo
The error that is produced is as follows:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.823 s
[INFO] Finished at: 2020-04-07T11:44:28-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.spf4j:maven-avro-schema-plugin:8.8.0:avro-compile (default-avro-compile) on project avdl-demo: cannot add mvnId to IDL /home/mat/code/avdl-demo/src/main/avro/A.avdl, Unable to resolve com.example.avro.timestamp_ms -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
And if you switch to using the avro-maven-plugin
, the project works as expected.
This documentation:
https://github.com/zolyfarkas/spf4j/blob/master/spf4j-slf4j-test/src/site/markdown/index.md
says to use the spf4j-slf4j-test-junit5
artifact for junit 5 support. However, there does not appear to be such an artifact (at least not on public maven repositories).
Is the documentation out-of-date, or is there some step I'm missing?
It took me several readings before I noticed that some properties are
spf4j.test.log.collect*
while others are
spf4j.testLog.root*
one with the period separating "test" from "log" and the other with a capital L.
Reducing the confusion would shorten my "why isn't it showing up!" headaches. 🙂
Besides flush, some other operations should be exposed, like:
export to csv
query for data
We have a stateLock which guards the state of the pool, yet we also needlessly synchronize on state and use an AtomicInteger.
Removing this yields a modest benchmark improvement, from 3400 ops/s to 3600 ops/s.
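A sketch of the simplified scheme (hypothetical names, not the actual pool code): every state access goes through the single stateLock, so neither synchronized blocks nor an AtomicInteger are needed.

```java
import java.util.concurrent.locks.ReentrantLock;

public class PoolState {
    // One lock guards all pool state; a plain int suffices because every
    // read and write happens while stateLock is held.
    private final ReentrantLock stateLock = new ReentrantLock();
    private int threadCount;

    int incrementThreadCount() {
        stateLock.lock();
        try {
            return ++threadCount;
        } finally {
            stateLock.unlock();
        }
    }

    public static void main(String[] args) {
        PoolState p = new PoolState();
        p.incrementThreadCount();
        System.out.println(p.incrementThreadCount());
    }
}
```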
function deterministic fib (x) { fib(x-1) + fib(x-2) }
fib(0) = 0
fib(1) = 1
will throw an NPE.
The following will work:
fib = function deterministic fib (x) { fib(x-1) + fib(x-2) }
fib(0) = 0
fib(1) = 1
Avro 1.9.0 also upgraded the jackson libs.
I've been using this lib for a year or so, then yesterday I suddenly started getting this null pointer exception.
x 11:35:58 x-3: 11:35:58.543 [vert.x-worker-thread-16] WARN o.spf4j.recyclable.impl.ObjectHolder - Validation of com.x.MyClass@93e7651 failed, detail null
x 11:35:58 x-3: 11:35:58.544 [vert.x-worker-thread-16] ERROR org.spf4j.base.AbstractRunnable - Exception in runnable:
x 11:35:58 x-3: java.lang.NullPointerException: null
x 11:35:58 x-3: at org.spf4j.recyclable.impl.ObjectHolder.validateObjectIfNotBorrowed(ObjectHolder.java:132)
x 11:35:58 x-3: at org.spf4j.recyclable.impl.RecyclingSupplierBuilder$AbstractRunnableImpl$1.handle(RecyclingSupplierBuilder.java:116)
x 11:35:58 x-3: at org.spf4j.recyclable.impl.RecyclingSupplierBuilder$AbstractRunnableImpl$1.handle(RecyclingSupplierBuilder.java:113)
x 11:35:58 x-3: at org.spf4j.recyclable.impl.SimpleSmartObjectPool.scan(SimpleSmartObjectPool.java:304)
x 11:35:58 x-3: at org.spf4j.recyclable.impl.ScalableObjectPool.scan(ScalableObjectPool.java:70)
x 11:35:58 x-3: at org.spf4j.recyclable.impl.RecyclingSupplierBuilder$AbstractRunnableImpl.doRun(RecyclingSupplierBuilder.java:113)
x 11:35:58 x-3: at org.spf4j.base.AbstractRunnable.run(AbstractRunnable.java:80)
x 11:35:58 x-3: at com.x.MyClassPoolReturnScheduler.lambda$scheduleWithFixedDelay$0(MyClassPoolReturnScheduler.java:55)
x 11:35:58 x-3: at io.vertx.core.impl.VertxImpl$InternalTimerHandler.handle(VertxImpl.java:812)
x 11:35:58 x-3: at io.vertx.core.impl.VertxImpl$InternalTimerHandler.handle(VertxImpl.java:775)
x 11:35:58 x-3: at io.vertx.core.impl.ContextImpl.lambda$wrapTask$2(ContextImpl.java:337)
x 11:35:58 x-3: at io.vertx.core.impl.TaskQueue.lambda$new$0(TaskQueue.java:60)
x 11:35:58 x-3: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
x 11:35:58 x-3: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
x 11:35:58 x-3: at java.lang.Thread.run(Thread.java:745)
This seems very odd to me considering the code here ensured the object is never null:
https://github.com/zolyfarkas/spf4j/blob/master/spf4j-core/src/main/java/org/spf4j/recyclable/impl/ObjectHolder.java#L128
Any ideas what could be causing this? I am initializing my RecyclingSupplierBuilder with a factory that has the following dispose method:
@Override
public void dispose(MyClass myClass) throws ObjectDisposeException {
if(myClass == null) {
System.out.println("myClass is null?");
}
myClass.close();
}
However, I don't see "myClass is null?" being printed; the NPE seems to happen before that.
When trying to use spf4j-slf4j-test by adding
<dependency>
<groupId>org.spf4j</groupId>
<artifactId>spf4j-slf4j-test</artifactId>
<scope>test</scope>
<version>8.9.1</version>
</dependency>
to my pom.xml I get the following error when running mvn:
Caused by: org.apache.maven.wagon.authorization.AuthorizationException: authentication failed for https://maven.pkg.github.com/zolyfarkas/*/org/apache/avro/avro/1.10.0.2p/avro-1.10.0.2p.pom, status: 401 Unauthorized
I have verified that trying to download the file with curl also gives status 401 Unauthorized.
When spf4j-core 8.9.4 is used with Spring 5.3.18 and Java 17.0.1, I get this:
2022-05-03T15:17:19.854-07:00 [APP/PROC/WEB/0] [OUT] {"ts":"2022-05-03T15:17:19.822-07:00","tid":"main","level":"INFO","class":"o.s.b.Reflections","msg":"Para type stealing from Constructor not supported", "exception":"java.lang.NoSuchFieldException: parameterTypes", "stack":["java.lang.Class.getDeclaredField(^) [:]","o.s.b.Reflections.lambda$static$1(Reflections.java:98) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","j.s.AccessController.doPrivileged(Unknown Source) [:]","o.s.b.Reflections.<clinit>(Reflections.java:95) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","o.s.j.ExportedValue.toAttributeInfo(ExportedValue.java:67) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","o.s.j.ExportedValuesMBean.createBeanInfo(ExportedValuesMBean.java:282) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","^.<init>(^:82) [^]","o.s.j.DynamicMBeanBuilder.replace(DynamicMBeanBuilder.java:234) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","o.s.j.Registry.export(Registry.java:180) [file:/home/vcap/app/BOOT-INF/lib/spf4j-core-8.9.4.jar:8.9.4]","^.export(^:171) [^]","c.p.j.c.a.h.ArtifactorySchemaClient.<clinit>(ArtifactorySchemaClient.java:48) [file:/home/vcap/app/BOOT-INF/lib/core-avro-5.11.0.jar:5.11.0]"
I already have this command line option:
--add-exports java.base/java.lang=ALL-UNNAMED
but it is not helping. Any clue?
Caused by: java.lang.NullPointerException
at org.spf4j.maven.plugin.avro.avscp.validation.impl.SchemaCompatibilityValidator.validateCompatibility (SchemaCompatibilityValidator.java:183)
at org.spf4j.maven.plugin.avro.avscp.validation.impl.SchemaCompatibilityValidator.validate (SchemaCompatibilityValidator.java:133)
at org.spf4j.maven.plugin.avro.avscp.validation.impl.SchemaCompatibilityValidator.validate (SchemaCompatibilityValidator.java:79)
at org.spf4j.maven.plugin.avro.avscp.validation.Validators.validate (Validators.java:79)
at org.spf4j.maven.plugin.avro.avscp.SchemaValidatorMojo.execute (SchemaValidatorMojo.java:75)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:954)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
This is based on the master as of 10/09/2018.
Tests run: 10, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 11.863 s <<< FAILURE! - in org.spf4j.base.RuntimeTest
testSomeParams(org.spf4j.base.RuntimeTest) Time elapsed: 0.242 s <<< FAILURE!
java.lang.AssertionError
at org.junit.Assert.fail(Assert.java:86)
at org.junit.Assert.assertTrue(Assert.java:41)
at org.junit.Assert.assertNotNull(Assert.java:712)
at org.junit.Assert.assertNotNull(Assert.java:722)
at org.spf4j.base.RuntimeTest.testSomeParams(RuntimeTest.java:102)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:383)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:344)
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:417)
Not sure if this is a related failure, but I have seen this type of error on separate builds:
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.558 s - in org.spf4j.net.SntpClientTest
2018-10-09T04:03:03.916Z ERROR o.s.b.AbstractRunnable "defaultExecutor1" "Exception in runnable: "
java.net.BindException: Address already in use (Bind failed)
at j.n.PlainDatagramSocketImpl.bind0(Native Method)[:1.8.0_181]
at j.n.AbstractPlainDatagramSocketImpl.bind(AbstractPlainDatagramSocketImpl.java:93)[^]
at j.n.DatagramSocket.bind(DatagramSocket.java:392)[^]
at ^.(^:242)[^]
at ^.(^:299)[^]
at ^.(^:271)[^]
at o.s.n.SntpClientTest$1$1.doRun(SntpClientTest.java:116)[test-classes/]
at o.s.b.AbstractRunnable.run(AbstractRunnable.java:97)[classes/]
at j.u.c.Executors$RunnableAdapter.call(Executors.java:511)[:1.8.0_181]
at j.u.c.FutureTask.run(FutureTask.java:266)[^]
at o.s.c.LifoThreadPoolExecutorSQP$QueuedThread.execute(LifoThreadPoolExecutorSQP.java:544)[classes/]
at ^.run(^:464)[^]
There are some useful new checks in there
@JmxExport
getSomething(SomeType someParam)
should be exported as an operation, but it looks like it is not...
With this sort of case
LogAssert expect = TestLoggers.sys().expect("org.spf4j.test", Level.WARN,
LogMatchers.hasFormat("Booo"));
LOG.warn("Booo", new RuntimeException());
// oops, lost the `assertObservation`!
the test will not fail if it does not produce the expected logs.
The @ExpectLog
annotation avoids this hazard, but it is limited by the annotation requirement of only using compile-time values.
It'd be nice to have ways that make sure the assertion always runs and that still let you use matchers.
(Exactly how that looks probably depends on the test framework.)
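One way to make sure the assertion always runs while still allowing arbitrary matchers is an AutoCloseable expectation whose close() performs the check, so try-with-resources enforces it even when the author forgets the explicit call. A hypothetical sketch of that design, not the spf4j-slf4j-test API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical sketch (not the spf4j-slf4j-test API): an AutoCloseable
// expectation that verifies itself in close(), so try-with-resources
// guarantees the assertion runs even if the caller forgets it.
public final class LogExpectation implements AutoCloseable {

    private final Predicate<String> matcher;
    private final List<String> seen = new ArrayList<>();
    private boolean asserted = false;

    public LogExpectation(Predicate<String> matcher) {
        this.matcher = matcher;
    }

    // Stands in for a log event reaching the test logger.
    public void record(String message) {
        seen.add(message);
    }

    public void assertObserved() {
        asserted = true;
        if (seen.stream().noneMatch(matcher)) {
            throw new AssertionError("expected log message not observed");
        }
    }

    @Override
    public void close() {
        if (!asserted) {
            assertObserved(); // runs the check even when the caller forgot
        }
    }

    public static void main(String[] args) {
        try (LogExpectation expect = new LogExpectation(m -> m.equals("Booo"))) {
            expect.record("Booo"); // stands in for LOG.warn("Booo", ...)
            // no explicit assertObserved() call; close() still verifies
        }
        System.out.println("expectation verified");
    }
}
```

With this shape, dropping the assertion no longer silently passes; the try block itself fails when the expected log never shows up.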
Should allow for trending object instance counts...
The JVM can print a class histogram; stdout can be captured automatically, and the data parsed and stored in a TSDB.
The data can then be visualized with the UI; new functionality will probably be required...
Proof of concept captured in: https://github.com/zolyfarkas/spf4j/blob/master/spf4j-experimental/src/test/java/org/spf4j/perf/memory/VMHistogramsTest.java
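As a sketch of the "parse and store" step, here is a hypothetical parser for one line of jmap -histo style class-histogram output (the column layout is an assumption; real output can vary by JVM):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: parse one line of `jmap -histo` style output into
// (instance count, total bytes, class name), the kind of record one
// would append to a TSDB for trending.
public final class HistoLineParser {

    // Example data line: "   1:        123456       7890123  java.lang.String"
    private static final Pattern LINE =
        Pattern.compile("\\s*\\d+:\\s+(\\d+)\\s+(\\d+)\\s+(\\S+)");

    public static Object[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) {
            return null; // header / separator lines are skipped
        }
        return new Object[] {
            Long.parseLong(m.group(1)), // instance count
            Long.parseLong(m.group(2)), // total bytes
            m.group(3)                  // class name
        };
    }

    public static void main(String[] args) {
        Object[] rec = parse("   1:        123456       7890123  java.lang.String");
        System.out.println(rec[0] + " instances, " + rec[1] + " bytes of " + rec[2]);
    }
}
```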
Hi, I've annotated my method with:
@JmxExport
@PerformanceMonitor(warnThresholdMillis=1, errorThresholdMillis=100, recorderSource = RecorderSourceInstance.Rs5m.class)
And I get an exception on it:
Exception in thread "AWT-EventQueue-0" java.lang.NoSuchMethodError: org.spf4j.perf.aspects.PerformanceMonitorAspect.aspectOf()Lorg/spf4j/perf/aspects/PerformanceMonitorAspect;
I'm starting the application from Eclipse with JDK 1.7.0_55, with
-javaagent:depend/aspectj/1.8.6/lib/aspectjweaver-1.8.6.jar
and with aop-ajc.xml in the src/META-INF folder.
This way we can start/stop the profiler sampling, and have the profiler run only when we do something...
Library looks very interesting and I'd like to use it in some of my open source projects but LGPL is incompatible with the Apache License 2.0. Would it be possible to dual license this library as MIT or Apache in addition to the current LGPL license?
Although there are several libs already doing this (https://mina.apache.org, http://netty.io, https://github.com/terma/java-nio-tcp-proxy),
the aim here is to have a simple, high-quality implementation that is easy to use for testing distributed services with standard or custom protocols.
When I run my unit tests, I get this warning:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.spf4j.log.SLF4JBridgeHandler$3 (file:/C:/Users/bob.marinier/.m2/repository/org/spf4j/spf4j-slf4j-test/8.8.1/spf4j-slf4j-test-8.8.1.jar) to field java.util.logging.LogRecord.needToInferCaller
WARNING: Please consider reporting this to the maintainers of org.spf4j.log.SLF4JBridgeHandler$3
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
This is a Java 9+ issue. The answer to this question has more info: https://stackoverflow.com/questions/50251798/what-is-an-illegal-reflective-access
I can think of a few ways to potentially address this:
--add-opens=java.logging/java.util.logging=ALL-UNNAMED
--add-opens=java.base/java.lang=ALL-UNNAMED
(The first line addresses the specific warning above. The second line addresses another warning that comes up when the first one is addressed. Note, the previous suggestion would have this information in the module-info.java file, and thus end users wouldn't have to do this themselves.)
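For Maven builds, these flags would typically go into the Surefire argLine so they apply to the forked test JVM; a sketch (plugin version and exact placement are up to the project):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>
      --add-opens=java.logging/java.util.logging=ALL-UNNAMED
      --add-opens=java.base/java.lang=ALL-UNNAMED
    </argLine>
  </configuration>
</plugin>
```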
The existence of the maven-avro-schema-plugin plugin in a project breaks the mvn deploy phase if you're deploying to a Nexus repository.
Here's an example error message:
[INFO] --- maven-deploy-plugin:3.0.0-M1:deploy (default-deploy) @ avro-schemas ---
[INFO] Using alternate deployment repository nexus-snapshot::default::https://nexus.company.com/repository/maven-snapshots
Uploading to : https://nexus.company.com/repository/maven-snapshots/com/company/avro-schemas/20-04-12_20-35_d7d3aa1/avro-schemas-20-04-12_20-35_d7d3aa1.jar
Uploading to : https://nexus.company.com/repository/maven-snapshots/com/company/avro-schemas/20-04-12_20-35_d7d3aa1/avro-schemas-20-04-12_20-35_d7d3aa1.pom
Uploading to : https://nexus.company.com/repository/maven-snapshots/com/company/avro-schemas/20-04-12_20-35_d7d3aa1/avro-schemas-20-04-12_20-35_d7d3aa1-avsc.jar
Uploading to : https://nexus.company.com/repository/maven-snapshots/com/company/avro-schemas/20-04-12_20-35_d7d3aa1/avro-schemas-20-04-12_20-35_d7d3aa1-avroSources.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.721 s
[INFO] Finished at: 2020-04-12T19:25:29-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:3.0.0-M1:deploy (default-deploy) on project avro-schemas: ArtifactDeployerException: Failed to deploy artifacts: Could not transfer artifact com.company:avro-schemas:jar:20-04-12_20-35_d7d3aa1 from/to nexus-snapshot::default (https://nexus.company.com/repository/maven-snapshots): Unauthorized (401) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Don't let the 401 Unauthorized
error code mislead you. Nexus tends to throw this error for a number of reasons that have nothing to do with credentials. I can confirm that the credentials I'm using work, via deploying other projects with them.
Further, if I package my avro application, then comment out the maven-avro-schema-plugin plugin and attempt to deploy my jar, it works just fine. Leaving it present causes the above error.
With the plugin in play, the deploy phase attempts to upload two additional jars, ending with -avroSources.jar and -avsc.jar, to my Nexus repository. I suspect these are violating some naming rule or similar constraint, but I've unfortunately not been able to track down the exact source of the violation from Nexus just yet. If I do, I'll report back.
At least for my use case, I don't believe these additional jars are necessary. If nothing else, it would be helpful if there were some mechanism to turn off this behaviour, as I really just need this plugin for its avro-compile
goal. Of course, if the jars were to upload to Nexus without issue, then I'd have no problem letting the plugin attach them.
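A possible, unverified workaround sketch: declare the plugin with an explicit execution binding only the avro-compile goal mentioned above, in the hope that whatever attaches the extra artifacts never runs (whether this actually suppresses the -avsc/-avroSources attachments is an assumption I have not tested):

```xml
<plugin>
  <groupId>org.spf4j</groupId>
  <artifactId>maven-avro-schema-plugin</artifactId>
  <version>8.8.5</version>
  <executions>
    <execution>
      <goals>
        <goal>avro-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```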
Thanks in advance.