
logstash-gelf's Introduction

logstash-gelf

This project is now archived, after a decade of maintenance, with 36 releases and several occasional contributors. The project is feature complete, and we see little issue traffic. With me being the sole maintainer, it is about time to turn off the lights here and move on to free up time for other duties. K, thx, good bye.


Provides logging to logstash using the Graylog Extended Log Format (GELF 1.0 and 1.1) for use with a variety of Java logging frameworks (JUL, log4j 1.2.x, log4j2, Logback, JBoss AS7/WildFly logging).

As of version 1.14.0, logstash-gelf requires Java 7 or higher; version 1.13.x and older require Java 6. See also http://logging.paluch.biz/ or http://www.graylog2.org/resources/gelf/specification for further documentation.

Including it in your project

Maven:

<dependency>
    <groupId>biz.paluch.logging</groupId>
    <artifactId>logstash-gelf</artifactId>
    <version>x.y.z</version>
</dependency>

Direct download from Maven Central

JBoss AS/WildFly Module Download:

<dependency>
    <groupId>biz.paluch.logging</groupId>
    <artifactId>logstash-gelf</artifactId>
    <version>x.y.z</version>
    <classifier>logging-module</classifier>
</dependency>

Direct download from Maven Central

Using snapshot builds:

<dependency>
    <groupId>biz.paluch.logging</groupId>
    <artifactId>logstash-gelf</artifactId>
    <version>x.y.z-SNAPSHOT</version>
</dependency>

<repositories>
    <repository>
        <id>sonatype-nexus-snapshots</id>
        <url>https://oss.sonatype.org/content/repositories/snapshots</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>

Properties

handlers = biz.paluch.logging.gelf.jul.GelfLogHandler, java.util.logging.ConsoleHandler

.handlers = biz.paluch.logging.gelf.jul.GelfLogHandler, java.util.logging.ConsoleHandler
.level = INFO

biz.paluch.logging.gelf.jul.GelfLogHandler.host=udp:localhost
biz.paluch.logging.gelf.jul.GelfLogHandler.port=12201
biz.paluch.logging.gelf.jul.GelfLogHandler.version=1.1
biz.paluch.logging.gelf.jul.GelfLogHandler.facility=java-test
biz.paluch.logging.gelf.jul.GelfLogHandler.extractStackTrace=true
biz.paluch.logging.gelf.jul.GelfLogHandler.filterStackTrace=true
biz.paluch.logging.gelf.jul.GelfLogHandler.timestampPattern=yyyy-MM-dd HH:mm:ss,SSS
biz.paluch.logging.gelf.jul.GelfLogHandler.maximumMessageSize=8192

# These are static fields
biz.paluch.logging.gelf.jul.GelfLogHandler.additionalFields=fieldName1=fieldValue1,fieldName2=fieldValue2
# Optional: Specify field types
biz.paluch.logging.gelf.jul.GelfLogHandler.additionalFieldTypes=fieldName1=String,fieldName2=Double,fieldName3=Long
biz.paluch.logging.gelf.jul.GelfLogHandler.level=INFO
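
With the logging.properties above in place, no code changes are needed; records written through the standard java.util.logging API are forwarded by the handler. A minimal usage sketch (the class name is illustrative):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class JulGelfDemo {

    private static final Logger LOGGER = Logger.getLogger(JulGelfDemo.class.getName());

    public static void main(String[] args) {
        // Both records pass the handler's INFO threshold configured above and,
        // with the GelfLogHandler installed, are sent to udp:localhost:12201.
        LOGGER.info("Application started");
        LOGGER.log(Level.WARNING, "Disk usage at {0}%", 93);
    }
}
```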

Glassfish/Payara configuration

Install the library with its dependencies (see download above) in Glassfish: place it below the $GFHOME/glassfish/domains/$YOURDOMAIN/lib/ext/ path, then add the Java Util Logging handler configuration to your logging.properties file.

Properties

log4j.appender.gelf=biz.paluch.logging.gelf.log4j.GelfLogAppender
log4j.appender.gelf.Threshold=INFO
log4j.appender.gelf.Host=udp:localhost
log4j.appender.gelf.Port=12201
log4j.appender.gelf.Version=1.1
log4j.appender.gelf.Facility=java-test
log4j.appender.gelf.ExtractStackTrace=true
log4j.appender.gelf.FilterStackTrace=true
log4j.appender.gelf.MdcProfiling=true
log4j.appender.gelf.TimestampPattern=yyyy-MM-dd HH:mm:ss,SSS
log4j.appender.gelf.MaximumMessageSize=8192

# These are static fields
log4j.appender.gelf.AdditionalFields=fieldName1=fieldValue1,fieldName2=fieldValue2
# Optional: Specify field types
log4j.appender.gelf.AdditionalFieldTypes=fieldName1=String,fieldName2=Double,fieldName3=Long

# These are fields using MDC
log4j.appender.gelf.MdcFields=mdcField1,mdcField2
log4j.appender.gelf.DynamicMdcFields=mdc.*,(mdc|MDC)fields
log4j.appender.gelf.DynamicMdcFieldTypes=my_field.*=String,business\..*\.field=double
log4j.appender.gelf.IncludeFullMdc=true
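
With MdcFields and DynamicMdcFields configured as above, values placed into Log4j's MDC are attached to each GELF message. A minimal sketch, assuming log4j 1.2.x is on the classpath (logger and field names are illustrative):

```java
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;

public class Log4jGelfDemo {

    private static final Logger LOGGER = Logger.getLogger(Log4jGelfDemo.class);

    public static void main(String[] args) {
        // "mdcField1" matches the MdcFields setting; "mdcRequestId" matches
        // the DynamicMdcFields pattern "mdc.*".
        MDC.put("mdcField1", "order-4711");
        MDC.put("mdcRequestId", "req-1");
        try {
            LOGGER.info("Order processed"); // GELF message carries both MDC fields
        } finally {
            MDC.remove("mdcField1");
            MDC.remove("mdcRequestId");
        }
    }
}
```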

XML

<appender name="gelf" class="biz.paluch.logging.gelf.log4j.GelfLogAppender">
    <param name="Threshold" value="INFO" />
    <param name="Host" value="udp:localhost" />
    <param name="Port" value="12201" />
    <param name="Version" value="1.1" />
    <param name="Facility" value="java-test" />
    <param name="ExtractStackTrace" value="true" />
    <param name="FilterStackTrace" value="true" />
    <param name="MdcProfiling" value="true" />
    <param name="TimestampPattern" value="yyyy-MM-dd HH:mm:ss,SSS" />
    <param name="MaximumMessageSize" value="8192" />
    
    <!-- These are static fields -->
    <param name="AdditionalFields" value="fieldName1=fieldValue1,fieldName2=fieldValue2" />
    <!-- Optional: Specify field types -->
    <param name="AdditionalFieldTypes" value="fieldName1=String,fieldName2=Double,fieldName3=Long" />
    
    <!-- These are fields using MDC -->
    <param name="MdcFields" value="mdcField1,mdcField2" />
    <param name="DynamicMdcFields" value="mdc.*,(mdc|MDC)fields" />
    <param name="DynamicMdcFieldTypes" value="my_field.*=String,business\..*\.field=double" />
    <param name="IncludeFullMdc" value="true" />
</appender>

Fields

Log4j v2 supports an extensive and flexible configuration, in contrast to other log frameworks (JUL, log4j v1). This allows you to specify exactly which fields you want in the GELF message. An empty field configuration results in a message containing only:

  • timestamp
  • level (syslog level)
  • host
  • facility
  • message
  • short_message

You can add different fields:

  • Static Literals
  • MDC Fields
  • Log-Event fields (using Pattern Layout)

In order to do so, use nested Field elements below the Appender element.

Static Literals

<Field name="fieldName1" literal="your literal value" />

MDC Fields

<Field name="fieldName1" mdc="name of the MDC entry" />

Dynamic MDC Fields

<DynamicMdcFields regex="mdc.*" />

In contrast to other log frameworks, the log4j2 configuration uses one DynamicMdcFields element per regex (rather than a comma-separated list).

Log-Event fields

See also: Pattern Layout

Set the desired pattern and the field will be sent using the specified pattern value.

Additionally, you can add the host-Field, which can supply you either the FQDN hostname, the simple hostname or the local address.

Option: %host{["fqdn"|"simple"|"address"]}

Outputs either the FQDN hostname, the simple hostname, or the local address. You can follow the host conversion word with an option in the form %host{option}:

  • %host{fqdn} (default) outputs the FQDN hostname, e.g. www.you.host.name.com.
  • %host{simple} outputs the simple hostname, e.g. www.
  • %host{address} outputs the local IP address of the resolved hostname, e.g. 1.2.3.4 or affe:affe:affe::1.

XML

<Configuration packages="biz.paluch.logging.gelf.log4j2">
    <Appenders>
        <Gelf name="gelf" host="udp:localhost" port="12201" version="1.1" extractStackTrace="true"
              filterStackTrace="true" mdcProfiling="true" includeFullMdc="true" maximumMessageSize="8192"
              originHost="%host{fqdn}" additionalFieldTypes="fieldName1=String,fieldName2=Double,fieldName3=Long">
            <Field name="timestamp" pattern="%d{dd MMM yyyy HH:mm:ss,SSS}" />
            <Field name="level" pattern="%level" />
            <Field name="simpleClassName" pattern="%C{1}" />
            <Field name="className" pattern="%C" />
            <Field name="server" pattern="%host" />
            <Field name="server.fqdn" pattern="%host{fqdn}" />
            
            <!-- This is a static field -->
            <Field name="fieldName2" literal="fieldValue2" />
             
            <!-- This is a field using MDC -->
            <Field name="mdcField2" mdc="mdcField2" /> 
            <DynamicMdcFields regex="mdc.*" />
            <DynamicMdcFields regex="(mdc|MDC)fields" />
            <DynamicMdcFieldType regex="my_field.*" type="String" />
        </Gelf>
    </Appenders>
    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="gelf" />
        </Root>
    </Loggers>
</Configuration>    
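
Because the appender above sets includeFullMdc="true", entries in Log4j2's ThreadContext map travel with every GELF message. A sketch, assuming log4j-api and log4j-core are on the classpath (logger and key names are illustrative):

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class Log4j2GelfDemo {

    private static final Logger LOGGER = LogManager.getLogger(Log4j2GelfDemo.class);

    public static void main(String[] args) {
        // "mdcField2" matches the <Field mdc="mdcField2"> mapping above; keys
        // matching the DynamicMdcFields regex "mdc.*" are picked up as well.
        ThreadContext.put("mdcField2", "session-42");
        try {
            LOGGER.info("User logged in");
        } finally {
            ThreadContext.remove("mdcField2");
        }
    }
}
```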

YAML

rootLogger:
    level: INFO
    appenderRef.gelf.ref: GelfAppender

appender.gelf:
    type: Gelf
    name: GelfAppender
    host: udp:localhost
    port: 12201
    version: 1.0
    includeFullMdc: true
    mdcProfiling: true
    maximumMessageSize: 32768
    dynamicMdcFields:
        type: DynamicMdcFields
        regex: "mdc.*,(mdc|MDC)fields"
    field:
        - name: fieldName2
          literal: fieldName2 # This is a static field
        - name: className
          pattern: "%C"
        - name: lineNumber
          pattern: "%line"

Include the library as a module (see download above), then add the following lines to your configuration:

standalone.xml

<custom-handler name="GelfLogger" class="biz.paluch.logging.gelf.jboss7.JBoss7GelfLogHandler" module="biz.paluch.logging">
    <level name="INFO" />
    <properties>
        <property name="host" value="udp:localhost" />
        <property name="port" value="12201" />
        <property name="version" value="1.1" />
        <property name="facility" value="java-test" />
        <property name="extractStackTrace" value="true" />
        <property name="filterStackTrace" value="true" />
        <property name="mdcProfiling" value="true" />
        <property name="timestampPattern" value="yyyy-MM-dd HH:mm:ss,SSS" />
        <property name="maximumMessageSize" value="8192" />
        
        <!-- These are static fields -->
        <property name="additionalFields" value="fieldName1=fieldValue1,fieldName2=fieldValue2" />
        <!-- Optional: Specify field types -->
        <property name="additionalFieldTypes" value="fieldName1=String,fieldName2=Double,fieldName3=Long" />
        
        <!-- These are fields using MDC -->
        <property name="mdcFields" value="mdcField1,mdcField2" />
        <property name="dynamicMdcFields" value="mdc.*,(mdc|MDC)fields" />
        <property name="dynamicMdcFieldTypes" value="my_field.*=String,business\..*\.field=double" />
        <property name="includeFullMdc" value="true" />
    </properties>
</custom-handler>

...

<root-logger>
    <level name="INFO"/>
    <handlers>
        <handler name="FILE"/>
        <handler name="CONSOLE"/>
        <handler name="GelfLogger"/>
    </handlers>
</root-logger>

Include the library as a module (see download above). Place it below the $JBOSS_HOME/modules/system/layers/base path, then add the following lines to your configuration:

standalone.xml

<custom-handler name="GelfLogger" class="biz.paluch.logging.gelf.wildfly.WildFlyGelfLogHandler" module="biz.paluch.logging">
    <level name="INFO" />
    <properties>
        <property name="host" value="udp:localhost" />
        <property name="port" value="12201" />
        <property name="version" value="1.1" />
        <property name="facility" value="java-test" />
        <property name="extractStackTrace" value="true" />
        <property name="filterStackTrace" value="true" />
        <property name="mdcProfiling" value="true" />
        <property name="timestampPattern" value="yyyy-MM-dd HH:mm:ss,SSS" />
        <property name="maximumMessageSize" value="8192" />
        
        <!-- These are static fields -->
        <property name="additionalFields" value="fieldName1=fieldValue1,fieldName2=fieldValue2" />
        <!-- Optional: Specify field types -->
        <property name="additionalFieldTypes" value="fieldName1=String,fieldName2=Double,fieldName3=Long" />
        
        <!-- These are fields using MDC -->
        <property name="mdcFields" value="mdcField1,mdcField2" />
        <property name="dynamicMdcFields" value="mdc.*,(mdc|MDC)fields" />
        <property name="dynamicMdcFieldTypes" value="my_field.*=String,business\..*\.field=double" />
        <property name="includeFullMdc" value="true" />
    </properties>
</custom-handler>

...

<root-logger>
    <level name="INFO"/>
    <handlers>
        <handler name="FILE"/>
        <handler name="CONSOLE"/>
        <handler name="GelfLogger"/>
    </handlers>
</root-logger>

Include module-thorntail.xml from the logging module zip (see download above). Place it below the src/main/resources/modules/biz/paluch/logging/main path as module.xml, then add the following lines to your project-stages.yml:

project-stages.yml:

swarm:
  logging:
    custom-handlers:
      GelfLogger:
        attribute-class: biz.paluch.logging.gelf.wildfly.WildFlyGelfLogHandler
        module: biz.paluch.logging
        properties:
            host: "udp:localhost"
            port: 12201
            version: "1.0"
            facility: "java-test"
            extractStackTrace: true
            filterStackTrace: true
            includeLocation: true
            mdcProfiling: true
            timestampPattern: "yyyy-MM-dd HH:mm:ss,SSS"
            maximumMessageSize: 8192
            additionalFields: "fieldName1=fieldValue1,fieldName2=fieldValue2"
            additionalFieldTypes: "my_field.*=String,business\..*\.field=double"
            MdcFields: "mdcField1,mdcField2"
            dynamicMdcFields: "mdc.*,(mdc|MDC)fields"
            includeFullMdc: true
    root-logger:
      level: INFO
      handlers:
      - GelfLogger

logback.xml Example:

<!DOCTYPE configuration>

<configuration>
    <contextName>test</contextName>
    <jmxConfigurator/>

    <appender name="gelf" class="biz.paluch.logging.gelf.logback.GelfLogbackAppender">
        <host>udp:localhost</host>
        <port>12201</port>
        <version>1.1</version>
        <facility>java-test</facility>
        <extractStackTrace>true</extractStackTrace>
        <filterStackTrace>true</filterStackTrace>
        <mdcProfiling>true</mdcProfiling>
        <timestampPattern>yyyy-MM-dd HH:mm:ss,SSS</timestampPattern>
        <maximumMessageSize>8192</maximumMessageSize>
        
        <!-- These are static fields -->
        <additionalFields>fieldName1=fieldValue1,fieldName2=fieldValue2</additionalFields>
        <!-- Optional: Specify field types -->
        <additionalFieldTypes>fieldName1=String,fieldName2=Double,fieldName3=Long</additionalFieldTypes>
        
        <!-- These are fields using MDC -->
        <mdcFields>mdcField1,mdcField2</mdcFields>
        <dynamicMdcFields>mdc.*,(mdc|MDC)fields</dynamicMdcFields>
        <dynamicMdcFieldTypes>my_field.*=String,business\..*\.field=double</dynamicMdcFieldTypes>
        <includeFullMdc>true</includeFullMdc>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
    </appender>

    <root level="DEBUG">
        <appender-ref ref="gelf" />
    </root>
</configuration>
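
Logback appenders are typically driven through the SLF4J API; with includeFullMdc enabled above, MDC entries are added to each GELF message. A sketch, assuming slf4j-api and logback-classic are on the classpath (logger and field names are illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class LogbackGelfDemo {

    private static final Logger LOGGER = LoggerFactory.getLogger(LogbackGelfDemo.class);

    public static void main(String[] args) {
        MDC.put("mdcField1", "request-123"); // matches the mdcFields setting above
        try {
            LOGGER.info("Handling request"); // passes the ThresholdFilter (INFO)
        } finally {
            MDC.remove("mdcField1");
        }
    }
}
```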

License

Contributing

GitHub is for social coding: if you want to write code, contributions through pull requests from forks of this repository are encouraged. Create GitHub tickets for bugs and new features, comment on the ones you are interested in, and take a look at CONTRIBUTING.md.

logstash-gelf's People

Contributors

batigoal, cchet, checktheflow, cumafo, dengliming, dependabot[bot], elektro-wolle, felixhamel, gitter-badger, janheuninck, jeepers, jhorstmann, jjungnickel, kenche, loicmathieu, madmuffin1, matthiasblaesing, mmichailidis, mp911de, netudima, normakm, pax90, salex89, sfuhrm, snyk-bot, stankevichevg, steven-aerts, tkaefer, waldeinburg, yuri1969


logstash-gelf's Issues

Library update

Update to

  • json-simple 1.1.1
  • log4j2 2.0
  • commons-pool2 2.2

and pom.xml updates

The upgrade to log4j2 breaks backwards compatibility with any beta and RC of log4j2, since classes moved and signatures of used methods changed.

Code Cleanup

  • Remove Test-Sender attributes from Code, use Java Services strategy for test-sender classes
  • Adjust test-properties in order to deal with removed test-sender property

Line number

Hello,
I'd like to use the logstash-gelf library to log with the GELF appender to a logstash agent. I am using the logback library and could not find the possibility to include line numbers in GELF log messages. Is it possible, and how can I configure it? Here is my GelfLogbackAppender configuration:

<appender name="GELF" class="biz.paluch.logging.gelf.logback.GelfLogbackAppender">
    <host>udp:localhost</host>
    <port>12201</port>
    <version>1.0</version>
    <extractStackTrace>true</extractStackTrace>
    <filterStackTrace>false</filterStackTrace>
    <mdcProfiling>true</mdcProfiling>
    <timestampPattern>yyyy-MM-dd HH:mm:ss,SSSS</timestampPattern>
    <includeFullMdc>true</includeFullMdc>
    <line>true</line>
</appender>

RuntimeContainer.lookupHostname can take a REALLY long time.

Hi,

I added gelf logging to a project of mine and noticed a considerable drop in performance. The project usually runs within a couple of seconds, but now it took about 15 seconds.

After some profiling and debugging I found that the lookupHostname method in RuntimeContainer takes a long time to process.

The reason for this is (as I understand it) that this class iterates over all network adapters to look up a hostname. I have 3 network adapters (a USB/Ethernet connection, WiFi, and a VirtualBox host-only network adapter). Especially that last one took a long time with its hostname lookup (about 10 seconds).

I just set the logstash-gelf.skipHostnameResolution system property to true and was happy that I got my usual performance back.

Now here's my question: what purpose does hostname resolution serve? Can I just keep it disabled?

Kind Regards,

DenEwout
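
The workaround described in the issue above can also be applied programmatically; setting the property early, before the first GELF sender initializes, avoids the per-adapter reverse lookups. A minimal sketch:

```java
public class SkipHostnameResolution {

    public static void main(String[] args) {
        // Equivalent to passing -Dlogstash-gelf.skipHostnameResolution=true on
        // the command line; must run before logging is initialized, because the
        // resolved hostname is cached on first use.
        System.setProperty("logstash-gelf.skipHostnameResolution", "true");

        // ... configure/obtain loggers only after this point ...
    }
}
```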

Log4j 1.2.x messages are silently dropped if MessageSize is not set

2013-12-20 08:55:04,037 ERROR  [STDERR] java.lang.ArithmeticException: / by zero
2013-12-20 08:55:04,060 ERROR  [STDERR]     at biz.paluch.logging.gelf.intern.GelfMessage.toUDPBuffers(GelfMessage.java:71)
2013-12-20 08:55:04,078 ERROR  [STDERR]     at biz.paluch.logging.gelf.intern.GelfUDPSender.sendMessage(GelfUDPSender.java:40)
2013-12-20 08:55:04,095 ERROR  [STDERR]     at biz.paluch.logging.gelf.log4j.GelfLogAppender.append(GelfLogAppender.java:100)
2013-12-20 08:55:04,103 ERROR  [STDERR]     at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:230)
2013-12-20 08:55:04,113 ERROR  [STDERR]     at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:65)
2013-12-20 08:55:04,120 ERROR  [STDERR]     at org.apache.log4j.Category.callAppenders(Category.java:203)
2013-12-20 08:55:04,128 ERROR  [STDERR]     at org.apache.log4j.Category.forcedLog(Category.java:388)
2013-12-20 08:55:04,148 ERROR  [STDERR]     at org.apache.log4j.Category.error(Category.java:319)
2013-12-20 08:55:04,156 ERROR  [STDERR]     at com.kaufland.dms.filenet.logging.ExceptionLoggingInterceptor.executeOperation(ExceptionLoggingInterceptor.java:20)
2013-12-20 08:55:04,165 ERROR  [STDERR]     at sun.reflect.GeneratedMethodAccessor141.invoke(Unknown Source)
2013-12-20 08:55:04,172 ERROR  [STDERR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2013-12-20 08:55:04,179 ERROR  [STDERR]     at java.lang.reflect.Method.invoke(Method.java:606)
2013-12-20 08:55:04,187 ERROR  [STDERR]     at org.jboss.ejb3.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:118)
2013-12-20 08:55:04,195 ERROR  [STDERR]     at com.kaufland.dms.filenet.profiling.ProfilingInterceptor.interceptInvocation(ProfilingInterceptor.java:36)
2013-12-20 08:55:04,204 ERROR  [STDERR]     at sun.reflect.GeneratedMethodAccessor140.invoke(Unknown Source)
2013-12-20 08:55:04,213 ERROR  [STDERR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2013-12-20 08:55:04,221 ERROR  [STDERR]     at java.lang.reflect.Method.invoke(Method.java:606)
2013-12-20 08:55:04,229 ERROR  [STDERR]     at org.jboss.ejb3.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:118)
2013-12-20 08:55:04,237 ERROR  [STDERR]     at com.kaufland.dms.filenet.ContentEngineDAOInterceptor.executeOperationAsAdmin(ContentEngineDAOInterceptor.java:23)
2013-12-20 08:55:04,246 ERROR  [STDERR]     at sun.reflect.GeneratedMethodAccessor139.invoke(Unknown Source)
2013-12-20 08:55:04,255 ERROR  [STDERR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2013-12-20 08:55:04,263 ERROR  [STDERR]     at java.lang.reflect.Method.invoke(Method.java:606)
2013-12-20 08:55:04,272 ERROR  [STDERR]     at org.jboss.ejb3.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:118)
2013-12-20 08:55:04,284 ERROR  [STDERR]     at org.jboss.ejb3.interceptor.EJB3InterceptorsInterceptor.invoke(EJB3InterceptorsInterceptor.java:63)
2013-12-20 08:55:04,292 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,300 ERROR  [STDERR]     at org.jboss.ejb3.entity.TransactionScopedEntityManagerInterceptor.invoke(TransactionScopedEntityManagerInterceptor.java:54)
2013-12-20 08:55:04,308 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,317 ERROR  [STDERR]     at org.jboss.ejb3.AllowedOperationsInterceptor.invoke(AllowedOperationsInterceptor.java:47)
2013-12-20 08:55:04,325 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,334 ERROR  [STDERR]     at org.jboss.aspects.tx.TxPolicy.invokeInOurTx(TxPolicy.java:79)
2013-12-20 08:55:04,354 ERROR  [STDERR]     at org.jboss.aspects.tx.TxInterceptor$Required.invoke(TxInterceptor.java:191)
2013-12-20 08:55:04,365 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,385 ERROR  [STDERR]     at org.jboss.aspects.tx.TxPropagationInterceptor.invoke(TxPropagationInterceptor.java:95)
2013-12-20 08:55:04,406 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,417 ERROR  [STDERR]     at org.jboss.ejb3.stateless.StatelessInstanceInterceptor.invoke(StatelessInstanceInterceptor.java:62)
2013-12-20 08:55:04,428 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,439 ERROR  [STDERR]     at org.jboss.aspects.security.RoleBasedAuthorizationInterceptor.invoke(RoleBasedAuthorizationInterceptor.java:166)
2013-12-20 08:55:04,450 ERROR  [STDERR]     at org.jboss.ejb3.security.RoleBasedAuthorizationInterceptor.invoke(RoleBasedAuthorizationInterceptor.java:115)
2013-12-20 08:55:04,460 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,471 ERROR  [STDERR]     at org.jboss.aspects.security.AuthenticationInterceptor.invoke(AuthenticationInterceptor.java:77)
2013-12-20 08:55:04,478 ERROR  [STDERR]     at org.jboss.ejb3.security.Ejb3AuthenticationInterceptor.invoke(Ejb3AuthenticationInterceptor.java:110)
2013-12-20 08:55:04,490 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,499 ERROR  [STDERR]     at org.jboss.ejb3.ENCPropagationInterceptor.invoke(ENCPropagationInterceptor.java:46)
2013-12-20 08:55:04,505 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,513 ERROR  [STDERR]     at org.jboss.ejb3.asynchronous.AsynchronousInterceptor.invoke(AsynchronousInterceptor.java:106)
2013-12-20 08:55:04,521 ERROR  [STDERR]     at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:101)
2013-12-20 08:55:04,528 ERROR  [STDERR]     at org.jboss.ejb3.stateless.StatelessContainer.localInvoke(StatelessContainer.java:240)
2013-12-20 08:55:04,537 ERROR  [STDERR]     at org.jboss.ejb3.stateless.StatelessContainer.localInvoke(StatelessContainer.java:210)
2013-12-20 08:55:04,544 ERROR  [STDERR]     at org.jboss.ejb3.stateless.StatelessLocalProxy.invoke(StatelessLocalProxy.java:84)
2013-12-20 08:55:04,551 ERROR  [STDERR]     at com.sun.proxy.$Proxy212.create(Unknown Source)
2013-12-20 08:55:04,558 ERROR  [STDERR]     at com.kaufland.dms.filenet.ws.impl.DocumentServiceImpl.createDocument(DocumentServiceImpl.java:86)
2013-12-20 08:55:04,565 ERROR  [STDERR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2013-12-20 08:55:04,571 ERROR  [STDERR]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2013-12-20 08:55:04,577 ERROR  [STDERR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2013-12-20 08:55:04,581 ERROR  [STDERR]     at java.lang.reflect.Method.invoke(Method.java:606)
2013-12-20 08:55:04,585 ERROR  [STDERR]     at org.jboss.wsf.container.jboss42.InvocationHandlerJSE.invoke(InvocationHandlerJSE.java:106)
2013-12-20 08:55:04,591 ERROR  [STDERR]     at org.jboss.wsf.stack.cxf.AbstractInvoker._invokeInternal(AbstractInvoker.java:154)
2013-12-20 08:55:04,599 ERROR  [STDERR]     at org.jboss.wsf.stack.cxf.AbstractInvoker.invoke(AbstractInvoker.java:104)
2013-12-20 08:55:04,608 ERROR  [STDERR]     at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:57)
2013-12-20 08:55:04,614 ERROR  [STDERR]     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
2013-12-20 08:55:04,621 ERROR  [STDERR]     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
2013-12-20 08:55:04,628 ERROR  [STDERR]     at org.apache.cxf.workqueue.SynchronousExecutor.execute(SynchronousExecutor.java:37)
2013-12-20 08:55:04,635 ERROR  [STDERR]     at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:95)
2013-12-20 08:55:04,642 ERROR  [STDERR]     at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:236)
2013-12-20 08:55:04,650 ERROR  [STDERR]     at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:89)
2013-12-20 08:55:04,658 ERROR  [STDERR]     at org.apache.cxf.transport.servlet.ServletDestination.invoke(ServletDestination.java:99)
2013-12-20 08:55:04,666 ERROR  [STDERR]     at org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:337)
2013-12-20 08:55:04,673 ERROR  [STDERR]     at org.jboss.wsf.stack.cxf.ServletControllerExt.invoke(ServletControllerExt.java:160)
2013-12-20 08:55:04,680 ERROR  [STDERR]     at org.jboss.wsf.stack.cxf.RequestHandlerImpl.handleHttpRequest(RequestHandlerImpl.java:61)
2013-12-20 08:55:04,686 ERROR  [STDERR]     at org.jboss.wsf.stack.cxf.CXFServletExt.service(CXFServletExt.java:134)
2013-12-20 08:55:04,692 ERROR  [STDERR]     at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
2013-12-20 08:55:04,698 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
2013-12-20 08:55:04,704 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
2013-12-20 08:55:04,710 ERROR  [STDERR]     at de.kaufland.util.infra.monitoring.mdc.AbstractMDCFilter.doFilter(AbstractMDCFilter.java:91)
2013-12-20 08:55:04,717 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
2013-12-20 08:55:04,724 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
2013-12-20 08:55:04,730 ERROR  [STDERR]     at com.kaufland.dms.commons.tracing.web.TracingServletFilter.doFilter(TracingServletFilter.java:83)
2013-12-20 08:55:04,737 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
2013-12-20 08:55:04,744 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
2013-12-20 08:55:04,750 ERROR  [STDERR]     at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
2013-12-20 08:55:04,757 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
2013-12-20 08:55:04,764 ERROR  [STDERR]     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
2013-12-20 08:55:04,771 ERROR  [STDERR]     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:230)
2013-12-20 08:55:04,778 ERROR  [STDERR]     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
2013-12-20 08:55:04,784 ERROR  [STDERR]     at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:182)
2013-12-20 08:55:04,791 ERROR  [STDERR]     at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:524)
2013-12-20 08:55:04,798 ERROR  [STDERR]     at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:84)
2013-12-20 08:55:04,805 ERROR  [STDERR]     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
2013-12-20 08:55:04,812 ERROR  [STDERR]     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
2013-12-20 08:55:04,818 ERROR  [STDERR]     at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:157)
2013-12-20 08:55:04,825 ERROR  [STDERR]     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
2013-12-20 08:55:04,833 ERROR  [STDERR]     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:262)
2013-12-20 08:55:04,841 ERROR  [STDERR]     at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:437)
2013-12-20 08:55:04,847 ERROR  [STDERR]     at org.apache.coyote.ajp.AjpProtocol$AjpConnectionHandler.process(AjpProtocol.java:366)
2013-12-20 08:55:04,853 ERROR  [STDERR]     at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:446)
2013-12-20 08:55:04,860 ERROR  [STDERR]     at java.lang.Thread.run(Thread.java:724)

logstash-gelf not sending all logs logged in catalina.out to remote logstash server

Hi,

I am using the logstash-gelf appender in JULI to send Tomcat logs to a remote logstash server. I observed that in my catalina.out, some logs are in a different format and are not being sent to the logstash server.

In the example below, the first two logs in catalina.out, which start with a timestamp, are sent to logstash and indexed in Elasticsearch, but the logs which start with log4j:WARN are not sent to logstash. Any idea what the issue is here?

logs in catalina.out:

Apr 13, 2015 7:23:35 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jsp/jstl/fmt is already defined
Apr 13, 2015 7:23:35 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jsp/jstl/functions is already defined
log4j:WARN Continuable parsing error 9 and column 16
log4j:WARN The content of element type "appender" must match "(errorHandler?,param_,rollingPolicy?,triggeringPolicy?,connectionSource?,layout?,filter_,appender-ref*)".

TCP logger reacts poorly to closed connection, drops messages

We discovered this issue when directing our GELF messages (in a logback configuration) by TCP to an Amazon Web Services Elastic Load Balancer. The default AWS ELB settings close an idle connection after 60 seconds.

When the connection between our application and the ELB timed out (i.e. was idle for 60 seconds), we could observe the following behavior in tcpdump captures:

  • A GELF message sent to the ELB was rejected. (The ELB responded with an RST reset packet.)
  • A new connection to the ELB would subsequently be SYNchronized and acknowledged...
  • ...but the initially-rejected message would not be re-sent.
  • Additionally, any messages that might have occurred between the RST and SYN/ACK were dropped completely.

It appears that the logger correctly re-establishes a dropped connection, but it does not attempt to re-send the message whose write failed, and it also drops messages logged while the connection is being re-established.

We are using logstash-gelf 1.7.2; based on the release notes for 1.8.0, I don't necessarily expect an upgrade will affect this behavior.

An excerpt of our logback configuration (omitting specific hostnames and field names) is below.

    <appender name="graylog" class="biz.paluch.logging.gelf.logback.GelfLogbackAppender">
        <host>tcp:...</host>
        <port>12201</port>
        <version>1.1</version>
        <additionalFields>...</additionalFields>
        <includeFullMdc>true</includeFullMdc>
        <extractStackTrace>true</extractStackTrace>
        <filterStackTrace>true</filterStackTrace>
    </appender>

    <appender name="asyncGraylog" class="ch.qos.logback.classic.AsyncAppender">
        <queueSize>1024</queueSize>
        <appender-ref ref="graylog"/>
    </appender>
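The failure mode described above (a reset connection losing the in-flight message) could be mitigated by retrying the write once on a fresh connection. Below is a minimal sketch of that idea; the class and helper names are hypothetical, and this is not logstash-gelf's actual sender implementation:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;

// Sketch of a "retry once after reconnect" send strategy. On a write
// failure (e.g. the peer sent RST after an idle timeout), the sender
// reconnects and retries the *same* message exactly once.
class RetryingTcpSender {
    private final String host;
    private final int port;
    private SocketChannel channel;

    RetryingTcpSender(String host, int port) {
        this.host = host;
        this.port = port;
    }

    public void send(byte[] message) throws IOException {
        try {
            write(message);
        } catch (IOException firstFailure) {
            reconnect();
            write(message);
        }
    }

    private void write(byte[] message) throws IOException {
        if (channel == null) {
            reconnect();
        }
        ByteBuffer buffer = ByteBuffer.wrap(message);
        while (buffer.hasRemaining()) {
            channel.write(buffer);
        }
    }

    private void reconnect() throws IOException {
        if (channel != null) {
            try { channel.close(); } catch (IOException ignored) { }
        }
        channel = SocketChannel.open(new InetSocketAddress(host, port));
    }
}
```

A real implementation would additionally need to bound the number of retries and buffer messages that arrive while the connection is being re-established, which is exactly the gap the report describes.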

Change log framework dependencies from compile to provided

Change the logging framework dependencies from compile to provided scope so that users do not have to exclude the frameworks. Users who integrate this library usually already have their logging framework libraries in place, so there is no need to pull in all the libraries transitively.

Using UDP appender -> Slowing down jboss application?

Hi,

I have implemented log transfer to logstash via UDP and just realized that this is slowing down my JBoss application. Admittedly, I have set the log level to "ALL", which means I receive a whole lot of messages, especially from Hibernate/JPA...

Any hints on how to work around this problem?

Could it be that my logstash config, which also logs to stdout, is slowing things down?

commons-pool2 2.3 causes ClassNotFoundError on JBoss EAP 6.3

When running logstash-gelf 1.7.2 on JBoss EAP 6.3 with the Redis transport, the GelfHandler is unable to ship logs to Redis:

Unable to create EvictionPolicy instance of type org.apache.commons.pool2.impl.DefaultEvictionPolicy

This does not happen when using commons-pool2 2.2 or 2.4.x. Need to investigate why this happens.

Add prefix-notation for mdcFields

At first: Thanks for this adapter! Lightweight and works like a charm.
It would be great if one could pass all mdcFields with a given prefix to the GELF host.

Example:
log4j.appender.gelf.MdcFields=mdcField1,mdcField2,myFields-*
or
<mdcFields>mdcField1,mdcField2,myFields-*</mdcFields>

Log4j2 seems to be an exception, since every field is configured explicitly there.
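The requested prefix notation could be implemented by treating a trailing `*` as a wildcard when deciding whether an MDC key should be forwarded. A hypothetical sketch (this is not the library's actual configuration parser):

```java
import java.util.Arrays;
import java.util.List;

// Decides whether an MDC key matches a configured field list that may
// contain trailing-* prefix patterns, e.g. "mdcField1,mdcField2,myFields-*".
class MdcFieldMatcher {
    private final List<String> patterns;

    MdcFieldMatcher(String mdcFields) {
        this.patterns = Arrays.asList(mdcFields.split(","));
    }

    public boolean matches(String mdcKey) {
        for (String pattern : patterns) {
            String p = pattern.trim();
            if (p.endsWith("*")) {
                // prefix pattern: match everything starting with the stem
                if (mdcKey.startsWith(p.substring(0, p.length() - 1))) {
                    return true;
                }
            } else if (p.equals(mdcKey)) {
                return true; // exact field name
            }
        }
        return false;
    }
}
```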

List on Graylog Marketplace

We have launched the beta of the Graylog Marketplace: https://marketplace.graylog.org/

The Marketplace is the central directory of all Graylog add-ons and integrations, including GELF libraries.

You would help us a lot if you could submit your GELF library there. :)

Use appropriate data type when parsing values from additional fields

logstash-gelf converts any integer value in an MDC field into a floating-point double right before generating the JSON-formatted GELF message from those fields.
Example MDC field (name, value):
"MyID": "1234" --> converted to --> "MyID":1234.0

A query for [MyID: 1234] in Graylog v1.2.2 won't return any results, since the query ignores entries with [MyID:1234.0]. Graylog v1.0 was less exact here and returned a result, while Graylog v1.2.2 is stricter.

logstash-gelf needs to account for this and convert integer values appropriately.

tested with application running on (source):
JBoss7
WildFly 8.1

graylog versions (sink):
v1.0
v1.2.2
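A fix would be to try `Long.parseLong` before falling back to `Double.parseDouble`, so integral values stay integers in the generated JSON. A minimal sketch of that ordering (a hypothetical helper, not the library's actual code):

```java
// Returns a Long for integral input, a Double for floating-point input,
// and the original String when the value is not numeric at all.
class NumericValueParser {
    public static Object parse(String value) {
        try {
            return Long.parseLong(value);         // "1234" -> 1234, not 1234.0
        } catch (NumberFormatException notALong) {
            try {
                return Double.parseDouble(value); // "12.34" -> 12.34
            } catch (NumberFormatException notANumber) {
                return value;                     // leave non-numeric values as-is
            }
        }
    }
}
```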

"No default-logstash-fields.properties resource present" output breaks Apache Hive

Hi,

"No default-logstash-fields.properties resource present, using defaults": If the file is optional, is it appropriate to print that warning? Or, is creating the file a requirement? Or is there some way to disable this warning?

Hive runs 'hadoop version'; parsing its output fails due to this issue:

$ hive
Unable to determine Hadoop version information.
'hadoop version' returned:
No default-logstash-fields.properties resource present, using defaults Hadoop 2.3.0-cdh5.0.2 [...]

Cloudera Hadoop 5.0.2 on Ubuntu 12.04 with Logstash-Gelf 1.5.1
Hadoop configuration follows https://gist.github.com/mp911de/9130280

Thanks!

Configuration of default field names

Currently, only Log4j2 provides the ability to configure every field. Other implementations (log4j v1.2, logback, JUL) can configure only additional fields (static, mdc). There is no mechanism to set field names for time, severity, etc.

Increase dependency versions

Update

  • slf4j 1.7.5 -> 1.7.9
  • log4j2 2.0 -> 2.1
  • jedis 2.5.1 -> 2.6.2

Leave

  • logback-classic stays at 1.0.13
  • log4j stays at 1.2.14

logback and log4j 1.2.x will stay at older versions to not introduce functionality that is available only on newer releases.

Gelf Appender not working when running on same machine as graylog

I am using log4j and log4j2 in my application.

The Appender in my log4j2.xml looks like this:

<Gelf name="Gelf" host="udp:devbox" port="12202"
            extractStackTrace="true" filterStackTrace="true" maximumMessageSize="8192"
            originHost="server_dev_log4j2">
            <Field name="level" pattern="INFO" />
</Gelf>

In log4j.properties it looks like this:

log4j.appender.gelf=biz.paluch.logging.gelf.log4j.GelfLogAppender
log4j.appender.gelf.Threshold=INFO
log4j.appender.gelf.Host=udp:devbox
log4j.appender.gelf.Port=12202
log4j.appender.gelf.OriginHost=server_dev_log4j
log4j.appender.gelf.ExtractStackTrace=true
log4j.appender.gelf.FilterStackTrace=true
log4j.appender.gelf.MaximumMessageSize=8192

"devbox" is the name of the machine (Ubuntu 14.04) running the server. The graylog server runs inside a Docker container on the same machine.

This configuration works when I start the application locally (Eclipse). On the server, only log4j sends logs to the graylog server. In addition to the Gelf appender, a RollingFile appender is defined in log4j2.xml; it works as expected.

Here are the dependencies of the application:

dependencies {
    compile group: 'commons-collections', name: 'commons-collections', version: '3.2'
    compile 'biz.paluch.logging:logstash-gelf:1.6.0'
    compile 'org.eclipse.jetty:jetty-server:9.3.0.v20150612'
    compile 'org.eclipse.jetty:jetty-servlet:9.3.0.v20150612'
    compile 'org.glassfish.jersey.core:jersey-server:2.19'
    compile 'org.glassfish.jersey.containers:jersey-container-jetty-http:2.19'
    compile 'org.glassfish.jersey.containers:jersey-container-servlet:2.19'
    compile 'org.glassfish.jersey.media:jersey-media-moxy:2.19'
    compile 'org.mongodb:mongo-java-driver:3.0.2'
    compile 'com.google.code.gson:gson:2.3.1'
    compile 'org.slf4j:slf4j-log4j12:1.7.12'
    compile 'org.slf4j:jul-to-slf4j:1.7.12'
    compile 'org.apache.logging.log4j:log4j-core:2.3'
    testCompile group: 'junit', name: 'junit', version: '4.+'
}

Data type specification for MDC fields

MDC fields contain various data types. String and numeric types are currently passed over GELF. When a field contains numeric content (such as 42 or 12.34), the value is parsed using Long.parseLong and Double.parseDouble to determine a parseable format.

Users of Graylog run into issues as soon as the data type of a particular field varies. logstash-gelf needs a way to specify fixed types per field to support Graylog > 1.2.x.

The type specification is not needed in most cases so it should be opt-in but once a type is set, it must be honored. A possible style to define types could be either:

mdcFields=Application,Version=long,SomeOtherFieldName=double,SomeOtherFieldName2=String

or, with a colon:

mdcFields=Application,Version:long,SomeOtherFieldName:double,SomeOtherFieldName2:String

or

mdcFields=Application,Version,SomeOtherFieldName,SomeOtherFieldName2
mdcFieldTypes=Version=long,SomeOtherFieldName=double,SomeOtherFieldName2=String

or, together with full MDC propagation:

includeFullMdc=true
mdcFieldTypes=Version=long,SomeOtherFieldName=double,SomeOtherFieldName2=String

Types

  • long
  • double
  • String

Open points

  • What if a value cannot be parsed (e.g. type double but value is fadsfds)?
    • Possibility 1: Omit the field
    • Possibility 2: Return zero value

Reported by @kebers
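The `mdcFieldTypes` style above could be parsed into a name-to-type map and applied per field. The sketch below omits the field (returns null) when a value does not parse, i.e. possibility 1; all names are hypothetical, not the library's API:

```java
import java.util.HashMap;
import java.util.Map;

// Parses "Version=long,Field2=double,Field3=String" into a name->type map
// and converts values accordingly. Returns null (omit the field) when a
// declared numeric type cannot be parsed from the value.
class MdcTypeSpec {
    private final Map<String, String> types = new HashMap<>();

    MdcTypeSpec(String spec) {
        for (String entry : spec.split(",")) {
            String[] nameAndType = entry.split("=", 2);
            if (nameAndType.length == 2) {
                types.put(nameAndType[0].trim(), nameAndType[1].trim());
            }
        }
    }

    public Object convert(String field, String value) {
        String type = types.get(field);
        if (type == null) {
            return value; // no declared type: pass through unchanged
        }
        try {
            switch (type) {
                case "long":   return Long.parseLong(value);
                case "double": return Double.parseDouble(value);
                default:       return value; // "String" and unknown types
            }
        } catch (NumberFormatException e) {
            return null; // possibility 1: omit the field
        }
    }
}
```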

Could not initialize class biz.paluch.logging.RuntimeContainer

Hi!

I'm facing a problem when deploying to Tomcat (7.0.42).

Both json-simple-1.1.jar and logstash-gelf-1.3.0.jar are in {tomcat home}/endorsed.

When I start tomcat, catalina.out gives a NoClassDefFoundError for RuntimeContainer, see below for stack trace.

Uncommenting "biz.paluch.logging.gelf.jul.GelfLogHandler.originHost=remmelt" in the logging.properties makes it work.
I do not want to do this because the end result of my tests should be this same configuration running on Glassfish4, where the domain admin server distributes the logging.properties file to all its nodes without the ability to edit it per node.

I've copied the RuntimeContainer class to a hello world war, included it in a simple servlet and called log.debug(RuntimeContainer.FQDN_HOSTNAME);
This works well and outputs my machine's hostname.

I'm running OSX 10.9.2.

conf/logging.properties:

#handlers = 1catalina.org.apache.juli.FileHandler, 2localhost.org.apache.juli.FileHandler, 3manager.org.apache.juli.FileHandler, 4host-manager.org.apache.juli.FileHandler, java.util.logging.ConsoleHandler

#.handlers = 1catalina.org.apache.juli.FileHandler, java.util.logging.ConsoleHandler
handlers = biz.paluch.logging.gelf.jul.GelfLogHandler, java.util.logging.ConsoleHandler

.handlers = biz.paluch.logging.gelf.jul.GelfLogHandler, java.util.logging.ConsoleHandler
.level = INFO

biz.paluch.logging.gelf.jul.GelfLogHandler.host=udp:localhost
#biz.paluch.logging.gelf.jul.GelfLogHandler.originHost=remmelt
biz.paluch.logging.gelf.jul.GelfLogHandler.port=12201
biz.paluch.logging.gelf.jul.GelfLogHandler.level=INFO

com.remmelt.level=FINEST

# ... snip ... rest of the file is standard as packaged with tomcat
java.lang.NoClassDefFoundError: Could not initialize class biz.paluch.logging.RuntimeContainer
    at biz.paluch.logging.gelf.GelfMessageAssembler.getOriginHost(GelfMessageAssembler.java:201)
    at biz.paluch.logging.gelf.GelfMessageAssembler.getValues(GelfMessageAssembler.java:141)
    at biz.paluch.logging.gelf.GelfMessageAssembler.createGelfMessage(GelfMessageAssembler.java:89)
    at biz.paluch.logging.gelf.jul.GelfLogHandler.createGelfMessage(GelfLogHandler.java:129)
    at biz.paluch.logging.gelf.jul.GelfLogHandler.publish(GelfLogHandler.java:102)
    at java.util.logging.Logger.log(Logger.java:610)
    at java.util.logging.Logger.doLog(Logger.java:631)
    at java.util.logging.Logger.logp(Logger.java:831)
    at org.apache.juli.logging.DirectJDKLog.log(DirectJDKLog.java:185)
    at org.apache.juli.logging.DirectJDKLog.error(DirectJDKLog.java:151)
    at org.apache.tomcat.util.digester.Digester.startElement(Digester.java:1281)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:509)
    at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:182)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:1342)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2770)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:606)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:848)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:777)
    at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
    at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:649)
    at org.apache.tomcat.util.digester.Digester.parse(Digester.java:1537)
    at org.apache.catalina.startup.Catalina.load(Catalina.java:617)
    at org.apache.catalina.startup.Catalina.load(Catalina.java:665)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.catalina.startup.Bootstrap.load(Bootstrap.java:281)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:455)

NPE on Spring Boot

java.lang.NullPointerException
at biz.paluch.logging.gelf.log4j.Log4jLogEvent.getAllMdcNames(Log4jLogEvent.java:155)
at biz.paluch.logging.gelf.log4j.Log4jLogEvent.getMdcNames(Log4jLogEvent.java:174)
at biz.paluch.logging.gelf.MdcGelfMessageAssembler.createGelfMessage(MdcGelfMessageAssembler.java:39)
at biz.paluch.logging.gelf.log4j.GelfLogAppender.createGelfMessage(GelfLogAppender.java:124)
at biz.paluch.logging.gelf.log4j.GelfLogAppender.append(GelfLogAppender.java:88)
at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
at org.apache.log4j.Category.callAppenders(Category.java:206)
at org.apache.log4j.Category.forcedLog(Category.java:391)
at org.apache.log4j.Category.log(Category.java:856)
at org.slf4j.impl.Log4jLoggerAdapter.log(Log4jLoggerAdapter.java:601)
at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:159)
at org.springframework.boot.StartupInfoLogger.logStarting(StartupInfoLogger.java:52)
at org.springframework.boot.SpringApplication.logStartupInfo(SpringApplication.java:583)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:308)
at de.paluch.heckenlights.Application.main(Application.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53)
at java.lang.Thread.run(Thread.java:745)

Create test cases for slow/not available logstash/redis

Provide test cases for some network/service errors:

  • GELF is not available which means use UDP on a service that is not reachable
  • Redis is not available which means connect Redis using TCP but the service does not respond

Clustered Logging: Identify multiple nodes from one Machine

Is there a way to identify logs from multiple nodes running on one machine?

E.g. one server running two JBoss WildFly instances in a cluster, each instance on a different port. logstash-gelf is configured only in the domain configuration.

I have read about the log event fields (https://github.com/mp911de/logstash-gelf#log-event-fields). As far as I can tell, I was only able to get the IP addresses or the server's name.
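One common approach (an assumption on my part, not a documented feature of this library) is to tag each instance with a static additional field whose value comes from a per-instance system property; WildFly sets `jboss.node.name` per server instance. Sketched in the log4j2 configuration style used elsewhere in this document, with a hypothetical logstash host:

```xml
<Gelf name="Gelf" host="udp:logstash.example.com" port="12201">
    <!-- jboss.node.name is distinct per WildFly server instance -->
    <Field name="node" literal="${sys:jboss.node.name}" />
</Gelf>
```

For the WildFly logging subsystem module, the analogous idea would be a `node=...` entry in the handler's additionalFields property; verify property-expression support in your WildFly version before relying on it.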

Get the full stack trace on WildFly

Hi,

I just set up the WildFly 9 module to send logs to graylog2. It works, but my stack trace appears to be cut off. Is there a way to format the output to see the full stack trace, with the error and all the "Caused by" clauses?

Extracted stacktrace is almost unreadable

I'm using Graylog v1.1.1 + GELF plugin v1.6.0 with ch.qos.logback from a Java app. I've enabled 'extractStackTrace', but Graylog receives a hardly readable, unstructured stack trace message, even when I enable the 'filterStackTrace' option.
Looks like the '\t' (or '\n', or both?) at the end of each line in the stack trace is lost somewhere.
Partial screenshot attached.

originHost set to IP address

The GELF host field value is set to the sender's IP address. By default, this is supposed to be the host's FQDN. I'm using logback.

I tried the following workaround, see, e.g.: http://logback.qos.ch/manual/groovy.html#automaticallyExported

def HOSTNAME=hostname
originHost = ${HOSTNAME}

Still no luck. I was able to set an originHost additional field.

additionalFields = "originHost= ${HOSTNAME}"

logback.groovy

https://gist.github.com/dmourati/7b5c14c735caa348c8cb

example json inserted into redis

https://gist.github.com/dmourati/fe6f5cc7d1c9fa30eb92

Any ideas why host is not set correctly?
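For reference, Java resolves the FQDN via `InetAddress.getCanonicalHostName()`; when reverse DNS is not set up for the host, that call falls back to the textual IP address, which matches the symptom above. A small probe to check what your JVM resolves (diagnostic sketch, not part of the library):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

// Prints what the JVM considers the local host's name, canonical name
// (the FQDN, if reverse DNS works), and address. If getCanonicalHostName()
// returns the bare IP, the GELF host field will carry the IP as well.
class HostnameProbe {
    public static String canonicalHostName() throws UnknownHostException {
        return InetAddress.getLocalHost().getCanonicalHostName();
    }

    public static void main(String[] args) throws UnknownHostException {
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("hostName          = " + local.getHostName());
        System.out.println("canonicalHostName = " + local.getCanonicalHostName());
        System.out.println("hostAddress       = " + local.getHostAddress());
    }
}
```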

logstash-gelf-1.1.0.jar config does not match the docs

I downloaded logstash-gelf-1.1.0.jar, md5sum: 225a6252d14511bb6477c5182e4dd8d7.

When I tried to launch it with the config in the README, I got:

15:29:42,339 |-ERROR in ch.qos.logback.classic.gaffer.AppenderDelegate@78374f9f - Appender [gelf1] of type [biz.paluch.logging.gelf.logback.GelfLogbackAppender] has no appplicable [host] property

15:29:42,340 |-ERROR in ch.qos.logback.classic.gaffer.AppenderDelegate@78374f9f - Appender [gelf1] of type [biz.paluch.logging.gelf.logback.GelfLogbackAppender] has no appplicable [port] property

Looking at the jar, it expects graylogHost and graylogPort. Changing my config to use those elements worked.

This commit shows the issue.

7186df8

Thanks!

Extraneous string characters in SourceClassName and SourceSimpleClassName

I've noticed that SourceClassName and SourceSimpleClassName sometimes pick up extraneous characters at the end of the string. Here's an example.

DEBUG [2013-12-13 21:51:23,418] com.domain.lib.job.AccountProcessQueue: RELEASED lock on account Account[number = 1234567890]

In the JSON message, the characters "$4" are appended to values for SourceSimpleClassName and SourceClassName:

{
"_index": "logstash-2013.12.13",
"_type": "logs",
"_id": "ofr_yI5aQ82NkoJ2s0UBTg",
"_score": null,
"_source": {
"host": "lib.domain.com",
"timestamp": "1386971483.418",
"level": "7",
"facility": "logstash-gelf",
"@timestamp": "2013-12-13T21:51:24.213Z",
"@Version": "1",
"source_host": "10.0.2.162",
"message": "RELEASED lock on account Account[number = 1234567890]",
"SourceSimpleClassName": "AccountProcessQueue$4",
"SourceClassName": "com.domain.lib.job.AccountProcessQueue$4",
"Severity": "DEBUG",
"Thread": "AccountProcess-Job-14",
"Time": "2013-12-13 21:51:23,0418",
"Server": "unknown",
"SourceMethodName": "apply",
"type": "gelf"
},
"sort": [
1386971484213,
1386971484213
]
}

What is causing these extra characters?

I'm trying to create pie graphs around the short names and they are coming back with "4" as the label. This just bit me in a demo.

Thanks!
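The `$4` suffix is not added by the logger: it is the JVM's name for the fourth anonymous inner class declared in `AccountProcessQueue` (most likely the anonymous class whose `apply` method did the logging, cf. `SourceMethodName: apply`). If you want the outer class name for grouping, you can strip everything from the first `$`; a hypothetical helper:

```java
// Maps JVM class names of anonymous/inner classes back to the outer class,
// e.g. "com.domain.lib.job.AccountProcessQueue$4"
//   -> "com.domain.lib.job.AccountProcessQueue".
class ClassNames {
    public static String outerClassName(String className) {
        int dollar = className.indexOf('$');
        return dollar >= 0 ? className.substring(0, dollar) : className;
    }

    public static String simpleName(String className) {
        String outer = outerClassName(className);
        return outer.substring(outer.lastIndexOf('.') + 1);
    }
}
```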

logstash gelf input

Hi,
I am using graylog2 in my infrastructure and would like to migrate to logstash. Before proceeding, I would like to test a scenario like:

gelf input in logstash-shipper => MQ => logstash => kibana. Will this setup work?

Or are the gelf modules only for graylog output:
gelf input in logstash-shipper => graylog2-server => graylog2-web.

I am trying the first option for testing and I am getting a parsing error:

:message=>"Gelfd failed to parse a message skipping", :exception=>#, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/gelfd-0.2.0/lib/gelfd/parser.rb:14:in `parse'", "/opt/logstash/lib/logstash/inputs/gelf.rb:83:in `udp_listener'", "/opt/logstash/lib/logstash/inputs/gelf.rb:60:in `run'", "/opt/logstash/lib/logstash/pipeline.rb:163:in `inputworker'", "/opt/logstash/lib/logstash/pipeline.rb:157:in `start_input'"], :level=>:warn}

Please give your input .

Thanks .

Datenpumpe

Enable using logstash-gelf outside of logging frameworks to submit arbitrary messages via GELF or Redis. This allows pumping events, statistics and messages to logstash. ("Datenpumpe" is German for "data pump".)

Components:

  • GelfMessageBuilder
  • Datenpumpe interface and impl (send/close)
  • Datenpumpe factory (create Datenpumpe)

Passing originHost from log4j2.xml

Hi.

I couldn't find a way to add an issue to https://github.com/mp911de/logstash-gelf

The problem is that it's not possible to pass originHost from log4j2.xml.
I see that it's possible to set originHost in https://github.com/mp911de/logstash-gelf/blob/master/src/main/java/biz/paluch/logging/gelf/GelfMessageAssembler.java
but GelfLogAppender doesn't have such a PluginAttribute: https://github.com/mp911de/logstash-gelf/blob/master/src/main/java/biz/paluch/logging/gelf/log4j2/GelfLogAppender.java

For now, logstash-gelf is the only project I could find that works with Maven and log4j2. Thanks for your work :)

NPE in GelfLogHandler when LEVEL is set but AdditionalFields is not set

java.lang.NullPointerException
at biz.paluch.logging.gelf.jul.GelfLogHandler.setAdditionalFields(GelfLogHandler.java:144)
at biz.paluch.logging.gelf.jul.GelfLogHandler.&lt;init&gt;(GelfLogHandler.java:65)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
