
cwh's Introduction

Stand With Ukraine

AWS CloudWatch Logs Handler for Monolog


Handler for the PHP logging library Monolog that sends log entries to the AWS CloudWatch Logs service.

Before using this library, it's recommended to get acquainted with the pricing for AWS CloudWatch services.

Please press the ★ Star button if you find this library useful.

Disclaimer

This library uses the AWS API through the AWS PHP SDK, which has limits on concurrent requests. This means that in highly concurrent or high-load applications it may not perform at its best. Please consider an alternative solution, such as logging to stdout and shipping the logs with fluentd.

Requirements

  • PHP ^7.3
  • AWS account with proper permissions (see list of permissions below)

Features

  • Batching of up to 10,000 log entries per request to avoid Rate exceeded errors
  • Log Group creation with tags
  • Lazy loading of AWS CloudWatch Logs resources
  • Suitable for web applications and for long-living CLI daemons and workers

Installation

Install the latest version with Composer by running

$ composer require maxbanton/cwh:^2.0

Basic Usage

<?php

use Aws\CloudWatchLogs\CloudWatchLogsClient;
use Maxbanton\Cwh\Handler\CloudWatch;
use Monolog\Logger;
use Monolog\Formatter\JsonFormatter;

$sdkParams = [
    'region' => 'eu-west-1',
    'version' => 'latest',
    'credentials' => [
        'key' => 'your AWS key',
        'secret' => 'your AWS secret',
        'token' => 'your AWS session token', // token is optional
    ]
];

// Instantiate AWS SDK CloudWatch Logs Client
$client = new CloudWatchLogsClient($sdkParams);

// Log group name; it will be created if it does not exist
$groupName = 'php-logtest';

// Log stream name; it will be created if it does not exist
$streamName = 'ec2-instance-1';

// Days to keep logs, 14 by default. Set to `null` to allow indefinite retention.
$retentionDays = 30;

// Instantiate handler (tags are optional)
$handler = new CloudWatch($client, $groupName, $streamName, $retentionDays, 10000, ['my-awesome-tag' => 'tag-value']);

// Optionally set the JsonFormatter to be able to access your log messages in a structured way
$handler->setFormatter(new JsonFormatter());

// Create a log channel
$log = new Logger('name');

// Set handler
$log->pushHandler($handler);

// Add records to the log
$log->debug('Foo');
$log->warning('Bar');
$log->error('Baz');

Framework integrations

And many others

AWS IAM needed permissions

If you prefer to use a separate programmatic IAM user (recommended) or want to define a custom policy, make sure the following permissions are included:

  1. CreateLogGroup
  2. CreateLogStream
  3. PutLogEvents
  4. PutRetentionPolicy
  5. DescribeLogStreams
  6. DescribeLogGroups

When the $createGroup constructor argument is set to false, the DescribeLogGroups and CreateLogGroup permissions can be omitted.
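
As a minimal sketch (assuming the positional constructor signature $retention, $batchSize, $tags, $level, $bubble, $createGroup quoted in the keyword-arguments feature request further down this page), disabling group creation looks like this:

<?php
// Sketch: instantiate the handler against an existing log group so the
// DescribeLogGroups/CreateLogGroup permissions are not needed.

use Maxbanton\Cwh\Handler\CloudWatch;
use Monolog\Logger;

$handler = new CloudWatch(
    $client,          // CloudWatchLogsClient from the Basic Usage example
    'php-logtest',    // log group, must already exist
    'ec2-instance-1', // log stream
    14,               // $retention
    10000,            // $batchSize
    [],               // $tags
    Logger::DEBUG,    // $level
    true,             // $bubble
    false             // $createGroup
);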

AWS IAM policy full JSON example

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:DescribeLogGroups"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:DescribeLogStreams",
                "logs:PutRetentionPolicy"
            ],
            "Resource": "{LOG_GROUP_ARN}"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents"
            ],
            "Resource": [
                "{LOG_STREAM_1_ARN}",
                "{LOG_STREAM_2_ARN}"
            ]
        }
    ]
}

Issues

Feel free to report any issues

Contributing

Please check this document


Made in Ukraine 🇺🇦

cwh's People

Contributors

amandas-sukionis, bfoosness, chris-lu, guillaumesmo, holtkamp, jackwakefield, joshbernfeld, kingmar, localheinz, maxbanton, mhdhaddad, mkingbe, mstovicek, uduncanu, zaaom


cwh's Issues

Invalid sequence token

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?
    I think that it's a bug. I see that this issue was also reported in #55 and the person there says it was fixed but I'm still getting this issue.

  • What is the current behavior?
    When using this as the handler for my Monolog instance I intermittently get "invalid sequence token" errors returned from AWS. It seems like this happens when there are multiple connections open and writing to the stream at the same time.

  • What is the expected behavior?
    Should successfully write to the log.

  • Please tell about your environment:

    • PHP Version: 7.2 with cwh version 1.1.10
    • Operating system (distro): Amazon Linux
    • Application mode (web app / cli app / daemon cli app): job-running daemon in Laravel.
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)

exception 'Aws\CloudWatchLogs\Exception\CloudWatchLogsException' with message 'Error executing "PutLogEvents" on "https://logs.us-east-1.amazonaws.com"; AWS HTTP error: Client error: POST https://logs.us-east-1.amazonaws.com` resulted in a 400 Bad Request response:
{"__type":"InvalidSequenceTokenException","expectedSequenceToken":"4959143730787630021320614391808164829339315... (truncated...)
InvalidSequenceTokenException (client): The given sequenceToken is invalid. The next expected sequenceToken is: 49591437307876300213206143918081648293393151102... - {"__type":"InvalidSequenceTokenException","expectedSequenceToken":"49591437307876300213206143918081648293393151...","message":"The given sequenceToken is invalid. The next expected sequenceToken is: 4959143730787630021320614391808..."}'`

Cloudwatch logs 5 log events per second limit

Hi,
thanks for writing this Monolog handler, it's been really useful. We've hit one issue with it: CloudWatch Logs limits PutLogEvents to 5 requests per second per log stream.

When our application exceeds this limit it causes an exception to be thrown:

Exception in run: Error executing "PutLogEvents" on "https://logs.us-east-1.amazonaws.com"; AWS HTTP error: Client error: `POST https://logs.us-east-1.amazonaws.com` resulted in a `400 Bad Request` response:
{
    "__type": "ThrottlingException",
    "message": "Rate exceeded"
}
ThrottlingException (client): Rate exceeded -
{
    "__type": "ThrottlingException",
    "message": "Rate exceeded"
}
[] []

Other CloudWatch agents get around this by internally buffering log events and then periodically sending batches to the PutLogEvents API. It would be great if this handler could do the same.

Justin.
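
For context, the handler's batch size constructor argument (see Basic Usage above) now provides exactly this kind of buffering; a minimal sketch:

<?php
// Sketch: buffer up to 10,000 records and ship them in a single PutLogEvents
// call, which keeps the request rate per stream low. $client is the
// CloudWatchLogsClient from the Basic Usage example.

use Maxbanton\Cwh\Handler\CloudWatch;

$handler = new CloudWatch($client, 'php-logtest', 'ec2-instance-1', 14, 10000);
// Buffered records are flushed when the batch fills up or when the handler
// is closed at the end of the request/process.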

Getting error: The batch of log events in a single PutLogEvents request cannot span more than 24 hours

Lumen version: 5.6
package version: v1.1.14

Recently I started getting this error while running my lumen application: The batch of log events in a single PutLogEvents request cannot span more than 24 hours.

Exception 'Aws\CloudWatchLogs\Exception\CloudWatchLogsException' with message 'Error executing "PutLogEvents" on "https://logs.us-east-1.amazonaws.com"; AWS HTTP error: Client error: `POST https://logs.us-east-1.amazonaws.com` resulted in a `400 Bad Request` response:
{"__type":"InvalidParameterException","message":"The batch of log events in a single PutLogEvents request cannot span mo (truncated...)
 InvalidParameterException (client): The batch of log events in a single PutLogEvents request cannot span more than 24 hours. - {"__type":"InvalidParameterException","message":"The batch of log events in a single PutLogEvents request cannot span more than 24 hours."}'

GuzzleHttp\Exception\ClientException: Client error: `POST https://logs.us-east-1.amazonaws.com` resulted in a `400 Bad Request` response:
{"__type":"InvalidParameterException","message":"The batch of log events in a single PutLogEvents request cannot span mo (truncated...)
 in /application/path/vendor/guzzlehttp/guzzle/src/Exception/RequestException.php:113
Stack trace:
...

Is anyone familiar with this type of error? Does this error mean the log stream was pushing logs for more than 24 hours? Can this package handle log streams that old?

Thanks for your help.

[Bug] Log group created but stream is not.

  • I'm submitting a ...

    • bug report
  • What is the current behavior?

Using the Laravel example, it seems that when logging info/error/debug it creates the Log Group but no log stream. I can confirm the permissions are correct, following what you require:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:DescribeLogGroups",
                "logs:DescribeLogStreams",
                "logs:PutRetentionPolicy",
                "logs:CreateLogGroup",
                "logs:PutLogEvents"
            ],
            "Resource": [
                "arn:aws:logs:*:*:*"
            ]
        }
    ]
}
  • What is the expected behavior?

Expected to create group + stream + log to it.

  • Please tell about your environment:

    • PHP Version: 7.3
    • Operating system (distro): Ubuntu 18.04
    • Application mode (web app / cli app / daemon cli app): web app

Long wait on async call, DescribeLogGroups

  • I'm submitting a support request

I see a peculiar behaviour that may be intentional. I log to CloudWatch using Monolog and maxbanton/cwh.

Most PHP request runs as expected, and complete in a timely manner.

But a few appear to take "forever" > 60 secs when monitoring in devops tools. I have yet to see it myself in the browser, it may not affect users.

When I analyze traces they are not actually doing anything, rather they are waiting on async http request initiated by Maxbanton\Cwh\Handler\CloudWatch::send

Is this expected behaviour?


It happens like this (simplified; the times are taken from an actual example):

00.000s PHP request starts
00.183s Normal code finishes
00.183s Monolog\Handler\AbstractHandler::__destruct
00.183s Maxbanton\Cwh\Handler\CloudWatch::close
00.183s Maxbanton\Cwh\Handler\CloudWatch::flushBuffer
00.183s Maxbanton\Cwh\Handler\CloudWatch::send
00.183s Maxbanton\Cwh\Handler\CloudWatch::initialize
00.183s Aws\AwsClient::describeLogGroups
00.183s Aws\AwsClient::__call
.... GuzzleHttp working ...
0.190s GuzzleHttp\Promise\Promise::waitIfPending
65.502s Maxbanton\Cwh\Handler\CloudWatch::refreshSequenceToken
65.527s Aws\AwsClient::putLogEvents

Throttling limit exceeded on `DescribeLogGroups`

Hello,

first thanks for this cool tool!

I'm getting this error message:

Error executing "DescribeLogGroups" on "https://logs.eu-west-1.amazonaws.com"; AWS HTTP error: Client error: `POST https://logs.eu-west-1.amazonaws.com` resulted in a `400 Bad Request`

I'm just wondering how I can avoid it, since it's not possible to request a rate limit increase from AWS Support: DescribeLogGroups is not in the list of limits that can be raised.

Dynamic stream names

Hello,
is it possible to create dynamic stream names with the current version?

I have for example this config here:

    cloudwatch_handler:
        class: Maxbanton\Cwh\Handler\CloudWatch
        arguments:
            - "@cloudwatch_client"
            - "/cw/path/for/logs-%env%-%version%/var/log/prod.log" # groupName
            - "%kernel.environment%" # streamName
            - 7                      # retentionDays
            - 10000                  # logsInBatch
            - []                     # tags
            - INFO                   # logLevel

The stream name is always "prod" in my case, because it's defined like this: "%kernel.environment%". But as far as I know, it's much better to split the stream names in Cloudwatch by time or another identifier.

I'm thinking about something like this here "%kernel.environment%-@CURRENT_DATE@", which should resolve to "prod-2020-10-07".

What do you think about it?
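
One way to achieve something similar today, sketched below as an illustration rather than a library feature: compute the stream name in PHP before constructing the handler, for example by appending the current date.

<?php
// Sketch: build the stream name in PHP (here with a date suffix) before
// constructing the handler. $client is a CloudWatchLogsClient as in Basic Usage;
// the group name and retention are illustrative.

use Maxbanton\Cwh\Handler\CloudWatch;

$environment = 'prod'; // e.g. resolved from %kernel.environment%
$streamName  = sprintf('%s-%s', $environment, date('Y-m-d')); // "prod-2020-10-07"

$handler = new CloudWatch($client, '/cw/path/for/logs/var/log/prod.log', $streamName, 7);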

AWS permission requirements

Hi again,

Another issue, related to the previous one :)

Does this handler really need a permission as extensive as CloudWatchLogsFullAccess?
Or could AWSOpsWorksCloudWatchLogs be sufficient?
In particular, it allows creating groups and streams and putting log events, which seems to cover the needs of this handler.

Best regards,

Cyril

Is it possible to dispatch log entries asynchronously / fire and forget?

I am currently researching whether it is possible to asynchronously send log entries in a "send and forget" approach. The rationale for this is that during profiling with Blackfire.io, it turned out that dispatching the logs (and waiting for the response) currently takes around 25% of the time of a complete request cycle. Instead of reducing / disabling dispatching log entries, I thought maybe use a fire-and-forget approach... and just hope the entries arrive at Amazon 🤞.

Realizing this seems to be possible by using CurlMultiHandler as the HTTP handler of the CloudWatchLogsClient and manually invoking tick(), as described here:

aws/aws-sdk-php#621 (comment)

In that case instead of $this->client->putLogEvents($data);, we should invoke $this->client->putLogEventsAsync($data);

@maxbanton have you already considered this?

Whether this is completely possible is still something which seems debatable when looking at
guzzle/guzzle#1127
guzzle/guzzle#1425
guzzle/guzzle#1429 (comment)

I already tried to do this for the AwsCloudWatch client when sending over metrics. But it seems the request is still being "blocked". That might be covered by https://github.com/guzzle/guzzle/pull/1924/files

Another, 'simpler' way might be to set the response timeout really low: guzzle/guzzle#1429 (comment)
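
A sketch of that "simpler" timeout idea, assuming the AWS SDK's 'http' client option for passing Guzzle request options (note that a hit timeout raises an exception, so buffered records may be lost unless the exception is caught):

<?php
// Sketch: cap how long PutLogEvents may block by passing Guzzle request
// options through the AWS SDK client config ('http' key).

use Aws\CloudWatchLogs\CloudWatchLogsClient;

$client = new CloudWatchLogsClient([
    'region'  => 'eu-west-1',
    'version' => 'latest',
    'http'    => [
        'connect_timeout' => 1, // seconds to establish the connection
        'timeout'         => 2, // overall response timeout in seconds
    ],
]);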

BatchSize = 1 will only flush when bufferSize reaches 2

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?
    I think this is a bug.

  • What is the current behavior?
    When using a low batch size like 1 (or 0?) I would expect that a logged record is flushed immediately, which is currently not the case.

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem
    Use a batch size of 1 and log 1 record to the logger.

  • What is the expected behavior?
    That the record is flushed / sent to CloudWatch immediately.

  • What is the motivation / use case for changing the behavior?
    In a long running process a status of a device is checked, logged to Cloudwatch and the process goes to sleep for x minutes.

In the current behavior, the buffer is first flushed when the batchSize is reached and then the record is added to the buffer:

foreach ($records as $record) {
    if ($this->currentDataAmount + $this->getMessageSize($record) >= $this->dataAmountLimit ||
        count($this->buffer) >= $this->batchSize
    ) {
        $this->flushBuffer();
        $this->addToBuffer($record);
    } else {
        $this->addToBuffer($record);
    }
}

The result is that it will take at least X minutes before the first record is sent to CloudWatch. So if X = 60 minutes, it will take 60 minutes for the first status check to reach the CloudWatch log.

The current defensive approach seems to have been chosen to prevent the data amount limit from being exceeded.

I would suggest separating the check for the data amount from the check for the buffer size:

        foreach ($records as $record) {
            if ($this->currentDataAmount + $this->getMessageSize($record) >= $this->dataAmountLimit) {
                $this->flushBuffer(); //Ensure data amount is never exceeded
            }

            $this->addToBuffer($record);

            if(count($this->buffer) >= $this->batchSize){
                 $this->flushBuffer(); //Flush the buffer as soon a the indicated batch size has been reached
            }
        }
  • Please tell about your environment:

    • PHP Version: 7.3
    • Operating system (distro): OSX
    • Application mode (web app / cli app / daemon cli app): CLI
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)

I think this relates to #52

InvalidSequenceTokenException

Hi,

I am using your CloudWatch handler with Symfony. I rolled out cwh in multiple environments (dev, staging, demo, prod, ...) and am now experiencing something odd after a couple of successful pushes:

  [Aws\CloudWatchLogs\Exception\CloudWatchLogsException]
  Error executing "PutLogEvents" on "https://logs.eu-central-1.amazonaws.com"; AWS HTTP error: Client error response [url] https://logs.eu-central-
  1.amazonaws.com [status code] 400 [reason phrase] Bad Request InvalidSequenceTokenException (client): The given sequenceToken is invalid. The nex
  t expected sequenceToken is: 49578481559872967104658335510065862421249324983057912274 - {"__type":"InvalidSequenceTokenException","expectedSequen
  ceToken":"49578481559872967104658335510065862421249324983057912274","message":"The given sequenceToken is invalid. The next expected sequenceToke
  n is: 49578481559872967104658335510065862421249324983057912274"}

This is my configuration:

        class: Maxbanton\Cwh\Handler\CloudWatch
        arguments:
            - "@cloudwatch_client"
            - "%elasticbeanstalk_app%"              # groupName
            - "%kernel.environment%" # streamName
            - 30                     # retentionDays
            - 5                  # logsInBatch
            - { mytag: "tag" }       # tags
            - NOTICE                # logLevel

Is it maybe an issue that I chose a batch size of 5?

Is it necessary to use different access keys / secrets for different environments?

[BUG] Cover the case of the AWS Outage

Describe the bug
We ran into this problem during the AWS us-east-1 outage that affected CloudWatch: https://www.theverge.com/2020/11/25/21719396/amazon-web-services-aws-outage-down-internet

Expected behavior
The application should still run even if AWS CloudWatch is not responding.

Please provide the steps to reproduce and if possible a minimal demo of the problem
To reproduce: block the AWS connection locally (e.g. with a firewall) and try to run the application.

Please tell about your environment:

  • PHP Version: 7.3
  • Operating system (distro): Official PHP Docker php:7.3.11-fpm-stretch
    • Web app
    • CLI app

[BUG] I get no errors, but no data is transferred to the Cloudwatch

Describe the bug
I am trying to use this package in my laravel application.
My steps:

  1. I created new group in IAM and added CloudwatchFullAccess policy
  2. I created a user and added it to this group
  3. After this I got Access key ID and Secret Access key
  4. I installed package according to package documentation:
  • composer require maxbanton/cwh:^2.0
  • next I added the cloudwatch settings to my logging.php like this:
        'cloudwatch' => [
            'driver' => 'custom',
            'via' => \App\Logging\CloudWatchLoggerFactory::class,
            'sdk' => [
                'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
                'version' => 'latest',
                'credentials' => [
                    'key' => env('CLOUDWATCH_LOG_KEY'),
                    'secret' => env('CLOUDWATCH_LOG_SECRET')
                ]
            ],
            'retention' => env('CLOUDWATCH_LOG_RETENTION',7),
            'level' => env('CLOUDWATCH_LOG_LEVEL','info'),
            'stream' => env('CLOUDWATCH_STREAM', 'backend-log'),
        ],
  • created factory class here: App/Logging/CloudWatchLoggerFactory.php
<?php

namespace App\Logging;

use Aws\CloudWatchLogs\CloudWatchLogsClient;
use Maxbanton\Cwh\Handler\CloudWatch;
use Monolog\Logger;

class CloudWatchLoggerFactory
{
    /**
     * Create a custom Monolog instance.
     *
     * @param  array  $config
     * @return \Monolog\Logger
     */
    public function __invoke(array $config)
    {
        $sdkParams = $config["sdk"];
        $tags = $config["tags"] ?? [ ];
        $name = $config["name"] ?? 'cloudwatch';

        // Instantiate AWS SDK CloudWatch Logs Client
        $client = new CloudWatchLogsClient($sdkParams);

        // Log group name, will be created if none
        $groupName = config('app.name') . '-' . config('app.env');

        // Log stream name, will be created if none
        $streamName = config('app.hostname');

        // Days to keep logs, 14 by default. Set to `null` to allow indefinite retention.
        $retentionDays = $config["retention"];

        // Instantiate handler (tags are optional)
        $handler = new CloudWatch($client, $groupName, $streamName, $retentionDays, 10000, $tags);

        // Create a log channel
        $logger = new Logger($name);
        // Set handler
        $logger->pushHandler($handler);

        return $logger;
    }
}
  • Next I used this in my index controller: Log::info('Test CloudWatch');
  • added LOG_CHANNEL=cloudwatch to my .env
    I get no errors, but no data is transferred to CloudWatch. A log group is created on the CloudWatch side, but the log stream is empty.
    What did I do wrong?

Expected behavior

  1. create new group in IAM and added CloudwatchFullAccess policy
  2. create new user and add into this group
  3. install package according to package documentation
  4. switch to cloudwatch channel in your .env
  5. when you run Log::info() in your code logging data should be transmitted to Cloudwatch

Please tell about your environment:
Laravel Version: v7.19.1
PHP Version: v7.2.24
Ubuntu 18.04.3 LTS

  • Application mode:
    Web app

Your requirements could not be resolved to an installable set of packages.

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?

  • What is the current behavior?

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem

  • What is the expected behavior?

  • What is the motivation / use case for changing the behavior?

  • Please tell about your environment:

    • PHP Version:
    • Operating system (distro):
    • Application mode (web app / cli app / daemon cli app):
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)

Timeout error

Hi Max,

I have encountered the following error. Would you have a suggestion about what might have caused it?

Fatal error: Uncaught exception 'Aws\CloudWatchLogs\Exception\CloudWatchLogsException' with message 'Error executing "DescribeLogGroups" on "https://logs.eu-west-1.amazonaws.com"; AWS HTTP error: Error creating resource: [message] fopen(https://logs.eu-west-1.amazonaws.com): failed to open stream: Connection timed out

Thank you.

Context is not searchable in cloudwatch

Hi, I recently discovered that the context data provided as the second argument to the logger's error() (or any other) method is not searchable in CloudWatch. Only the message is searchable, not the context data.
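
A common workaround, sketched below under the assumption that CloudWatch Logs discovers fields in JSON-formatted events: attach Monolog's JsonFormatter (as in the Basic Usage section) so context keys become queryable.

<?php
// Sketch: JSON-formatted events let CloudWatch treat context keys as fields,
// e.g. via the filter pattern { $.context.user_id = 123 } or Logs Insights.
// $handler and $log are from the Basic Usage example.

use Monolog\Formatter\JsonFormatter;

$handler->setFormatter(new JsonFormatter());

$log->error('Payment failed', ['user_id' => 123, 'order_id' => 'A-42']);
// Stored roughly as: {"message":"Payment failed","context":{"user_id":123,...},...}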

Retention days not applied

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?
    Report a bug

  • What is the current behavior?
    When creating a new log group, the correct retention is not applied

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem
    I use Laravel 5.5 with Monolog

  • What is the expected behavior?
    The correct number of retention days should be applied

  • What is the motivation / use case for changing the behavior?

  • Please tell about your environment:

    • PHP Version: 7.1
    • Operating system (distro): debian
    • Application mode (web app / cli app / daemon cli app): web app
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)
    Laravel 5.5

$cwClient = App::make('aws')->createClient('CloudWatchLogs');
$cwGroupName = config('laravel-monolog-ext.drivers.cloudwatch.group');
$cwStreamNameApp = 'laravel-' . now()->toDateString() . '.log';
$cwRetentionDays = config('laravel-monolog-ext.drivers.cloudwatch.retention');
$cwLevel = config('laravel-monolog-ext.drivers.cloudwatch.level');
$this->cwHandlerApp = new CloudWatch($cwClient, $cwGroupName, $cwStreamNameApp, $cwRetentionDays, 10000, [], $cwLevel);
$monolog->pushHandler($this->cwHandlerApp);

The AWS user has full access to CloudWatch Logs.

I checked the code, and it enters this part:

if ($this->retention !== null) {
    $this
        ->client
        ->putRetentionPolicy(
            [
                'logGroupName' => $this->group,
                'retentionInDays' => $this->retention,
            ]
        );
}

But in AWS CloudWatch it didn't change; the retention is always set to "Never expire".
Thanks for your support!

[FEATURE]

How can I install this in Laravel 5.2 and above?

Update Readme to mention actions required

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug? Feature

  • What is the current behavior?
    The minimum set of required AWS IAM policy actions is not specified in the README

  • What is the expected behavior?
    Let's update ReadMe.md to mention a minimum set of CloudWatch IAM policy actions that have to be attached to a user.

  • What is the motivation / use case for changing the behavior?
    Many people prefer to create custom policies and specify only the actions that are necessary for an app to function. It took some time to figure out those that I was missing.

  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc):
    Seems like this should be a complete list:

                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:PutRetentionPolicy",
                "logs:DescribeLogStreams",
                "logs:DescribeLogGroups"

Allow Monolog 2.0

  • I'm submitting a ...

    • bug report
    • feature request
  • Do you want to request a feature or report a bug?
    Feature

  • What is the current behavior?

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem

  • What is the expected behavior?

  • What is the motivation / use case for changing the behavior?
    Allow to upgrade to Monolog 2.0

  • Please tell about your environment:

    • PHP Version:
    • Operating system (distro):
    • Application mode:
      • Web app
      • CLI app
      • Daemon worker
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)

    The upgrade path seems quite straightforward: https://github.com/Seldaek/monolog/blob/master/UPGRADE.md

[BUG] Logs are not sent in Laravel queues

Description

Logs from an asynchronous job in a Laravel queue are not sent to CloudWatch until you shutdown the queue process.

This is because php artisan queue:work does not close the process. Queue workers do not "reboot" the framework before processing each job.

How to reproduce

  1. Create a new job:
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

class TestJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        Log::info('The job has been executed');
    }
}
  1. Create a new route:
use App\Jobs\TestJob;

Route::get('dispatch-job', function () {
    TestJob::dispatch();
});
  1. Run the queue:
php artisan queue:work

Note: make sure that in your .env you do not use the sync queue driver.

  1. Call GET /dispatch-job

You will receive the logs in CloudWatch only when you interrupt the queue:work command, not before.
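
One possible workaround, sketched here as an assumption rather than an official recommendation: flush the handler after every processed job, relying on close() sending the buffered records (as the flushBuffer()/send() call chain earlier on this page suggests).

<?php
// Sketch of a workaround: flush the CloudWatch handler after each queued job,
// e.g. from a service provider's boot() method. Assumes the log channel is
// named "cloudwatch" and that close() sends the handler's buffered records.

use Illuminate\Queue\Events\JobProcessed;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Queue;

Queue::after(function (JobProcessed $event) {
    foreach (Log::channel('cloudwatch')->getLogger()->getHandlers() as $handler) {
        $handler->close(); // flushes buffered records to CloudWatch
    }
});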


[BUG] Can't install on PHP 8 (RC3)

Describe the bug
PHP version constraints are not compatible with PHP 8 (RC3). The code itself may or may not work, though.

Expected behavior
Installation succeeds without errors.

Please provide the steps to reproduce and if possible a minimal demo of the problem
Trying composer require maxbanton/cwh=^2 results in:

  [InvalidArgumentException]                                                                                                         
  Package maxbanton/cwh at version ^2 has a PHP requirement incompatible with your PHP version, PHP extensions and Composer version  

and same for master branch.

Please tell about your environment:

  • PHP Version: 8.0.0-RC3
  • Operating system (distro): *

Add a mechanism to handle SIGTERM/SIGINT/SIGHUP

  • I'm submitting a ...

    • feature request
  • Do you want to request a feature or report a bug?
    Feature

  • What is the current behavior?
    Unknown

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem

  • What is the expected behavior?
    When the script terminates, it should send the buffered records to AWS

  • What is the motivation / use case for changing the behavior?
    Long running process

  • Please tell about your environment:

    • PHP Version: 7.1

    • Operating system (distro): Linux

    • Application mode (web app / cli app / daemon cli app): Cli

      if (extension_loaded('pcntl')) {
          if (!function_exists('pcntl_signal')) {
              throw new \BadFunctionCallException("Function 'pcntl_signal' is referenced in the php.ini 'disable_functions' and can't be called.");
          }
          pcntl_signal(SIGTERM, array(&$this, 'close'));
          pcntl_signal(SIGINT, array(&$this, 'close'));
          pcntl_signal(SIGHUP, array(&$this, 'close'));
      }
      

[BUG] Log not getting written to CW

Describe the bug
I am using Symfony 5.3. The group and stream get created, but $log->debug('something') / $log->warning('something else') etc. do not write anything to CloudWatch. I have copy/pasted the AWS policy against the IAM user, but it doesn't seem permission related because there are no errors and the group/stream get created from the code being run.

Expected behavior
To see log entries in the CW stream

Please provide the steps to reproduce and if possible a minimal demo of the problem
Any instructions how to reproduce a bug:

$client = new CloudWatchLogsClient(self::sdkParams);

$groupName = 'php-logtest';

$streamName = 'ec2-instance-1';

$handler = new CloudWatch($client, $groupName, $streamName);

$handler->setFormatter(new JsonFormatter());

$log = new Logger('name');

$log->pushHandler($handler);

$log->debug('Foo');
$log->warning('Bar');
$log->error('Baz');
$log->addRecord(1, 'test');

Please tell about your environment:

  • PHP Version: 8.0
  • Operating system (distro): Ubuntu
  • Application mode:
    • CLI app

Performance

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?
    More of a question

  • What is the current behavior?
    It is doing what it is supposed to do, but it adds about 35% of latency to each operation compared to not using this library.

  • What is the expected behavior?
    Reduce the performance impact compared to not using this library.

  • Please tell about your environment:

    • PHP Version: 7.2.25
    • Operating system (distro): Debian 8 and 9
    • Application mode (web app / cli app / daemon cli app): web app + cli app
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)
    We just made some comparisons in our framework with and without cwh.

Times for 100 requests with a concurrency of 50 go from 9 seconds to 13 seconds. That's more than 33% added latency.

We ran consistent tests to confirm this behaviour.

Exception on event size limit

Hello,

I am opening a new issue regarding the event size limit of Amazon CloudWatch Logs.
The limit is 256 KB per event, and if it is exceeded AWS returns a "400 Bad Request", which interrupts execution. The exact response is:
{"__type":"InvalidParameterException","message":"Log event too large: xxxxxx bytes exceeds limit of 262144"}

So I propose to simply catch the exception in the send() method:

private function send(array $entries)
{
    try{
        if (false === $this->initialized) {
            $this->initialize();
        }

        $data = [
            'logGroupName' => $this->group,
            'logStreamName' => $this->stream,
            'logEvents' => $entries
        ];

        if (!empty($this->sequenceToken)) {
            $data['sequenceToken'] = $this->sequenceToken;
        }

        $response = $this->client->putLogEvents($data);

        $this->sequenceToken = $response->get('nextSequenceToken');

    } catch (\Exception $ex){}
}


What do you think?

Buffer gets log in Laravel Queues

I integrated your code into my Lumen project, which essentially consists of queues that do asynchronous jobs on the server. Since Lumen (or Laravel) keeps launching them repeatedly, your code isn't uploading the log buffer to AWS: nothing shows up. I downgraded to version 0.1.1 and it works. The buffer mechanism isn't working for Laravel.

Laravel 7 - Creates a Cloudwatch Log Group but no logs

Describe the bug
Laravel 7 - Creates a Cloudwatch Log Group but no logs.

Expected behavior
I have it running on EC2 and it created a Log Group, but no error logs are making it through to CloudWatch.

Please provide the steps to reproduce and if possible a minimal demo of the problem

Please tell about your environment:

  • PHP Version: 7.3
  • Operating system (distro): Ubuntu
  • Application mode:
    • Web app

[FEATURE] Use "keyword arguments" for the CloudWatch constructor

Is your feature request related to a problem? Please describe.

I recently wanted to change the $createGroup argument. This is the last argument in a long-ish list of arguments. To solve this, I had to copy and paste all the default arguments when instantiating the handler. It now looks like:

                $handler = new CloudWatch(
                    $client,
                    $log_group,
                    $stream_name,
                    14,            // $retention (default)
                    10000,         // $batchSize (default)
                    [],            // $tags (default)
                    Logger::DEBUG, // $level (default)
                    true,          // $bubble (default)
                    false,         // $createGroup
                );

If the library changes the default value, my instantiation could easily go out of sync.

The values themselves are opaque (what is 14?) requiring me to add multiple inline comments.

Describe the solution you'd like

Allow passing an array of keywords to the constructor. For example:

                $handler = new CloudWatch([
                    'client' => $client,
                    'group' => $log_group,
                    'stream' => $stream_name,
                    'createGroup' => false,
                ]);

This is self documenting and allows omitting default values.

This could be backwards compatible by checking the number and type of arguments and then acting accordingly.

AWS API limit exceeded errors

When using the log handler to send logs from queue workers to AWS, we often get a throttle error due to exceeding an API limit. AWS Support believe it is the DescribeLogGroups API that is being throttled.

The initialize method calls DescribeLogGroups every time the handler is initialized. AWS Support have recommended trying to save the log first, catching the error response if the log stream/group doesn't exist, and creating them only when needed.

I will work on creating a PR for this issue, as we need the changes implemented.

Empty tags not allowed when creating LogGroup

When a LogGroup does not exist yet, it will be created. However, when no tags are provided during instantiation of the CloudWatchLogsHandler, this results in the following error:

GuzzleHttp\Exception\ClientException: Client error: `POST https://logs.eu-west-1.amazonaws.com` resulted in a `400 Bad Request` response:
{"__type":"InvalidParameterException","message":"1 validation error detected: Value '{}' at 'tags' failed to satisfy in /app/project/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php on line 192

Two workarounds:

  • provide tags upon instantiation of the CloudWatchLogsHandler (see the sketch below)
  • comment out the tags in the client
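
A sketch of the first workaround, using the constructor from the Basic Usage section: pass at least one tag so the CreateLogGroup call never receives an empty tag map.

<?php
// Sketch of workaround 1: always pass a non-empty tag map so CreateLogGroup
// never receives '{}' for tags. $client is from the Basic Usage example.

use Maxbanton\Cwh\Handler\CloudWatch;

$handler = new CloudWatch($client, 'php-logtest', 'ec2-instance-1', 14, 10000, [
    'application' => 'my-app', // any non-empty tag map avoids the error
]);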

Question: is it possibile to print full error stack?

Good afternoon,
first of all i'd like to thank you for this useful package. After reading Laravel, Monolog and this package documentation i'm a little bit confused about the error printed in logs. It seems like i have in my logs only the first entry of Monolog error stack, but most of the time in Laravel it would be useful to have the full error stack printed. Is there a way to accomplish this?

Log events in a single PutLogEvents request must be in chronological order

Re-creating a new entry for the problem explained in #32, because the fix there had to be reverted and the exception is still occurring. I'm also investigating libraries for other languages for potential fixes.

Timestamps in my case look e.g. like this:

1568894292321.3
1568894292321.3
1568894292322.5
1568894292322.5
1568894292322.5
1568894292322.5
1568894292323.6
1568894292323.6
1568894292323.6
1568894292323.6
1568894292324.7
1568894292324.7
1568894292324.7

json formatter for symfony?

hey there,

first of all: awesome work, thank you! from zero to working in less than an hour.

second: is it possible to use a json formatter?

I tried this:
services.yml


    cloudwatch_formatter:
        class: Monolog\Formatter\JsonFormatter

config.yml


monolog:
    handlers:
        main:
            type: stream
            path: "%kernel.logs_dir%/%kernel.environment%.log"
            level: notice
            channels: [!event]
        console:
            type:   console
            channels: [!event, !doctrine]
        custom:
            type: service
            id: cloudwatch_handler
            level: notice
            formatter: cloudwatch_formatter

sadly I got this error:


  [Symfony\Component\Config\Definition\Exception\InvalidConfigurationException]
  Invalid configuration for path "monolog.handlers.custom": Service handlers can not have a formatter configured in the bundle, you must reconfigure the service itself instead

[BUG] Fails if encountering ThrottlingException from AWS for DescribeLogGroups request

Describe the bug
If the DescribeLogGroups call encounters a ThrottlingException from AWS, the logger will throw a CloudWatchLogsException:
400 Bad Request
{"__type":"ThrottlingException","message":"Rate exceeded"}

Expected behavior
If the exception is thrown, sleep 1 second and retry (at least once).

Please provide the steps to reproduce and if possible a minimal demo of the problem
Run 6 parallel processes all trying to initialize the CloudWatch handler within the same second

Please tell about your environment:

  • PHP Version: 7.2
  • Operating system (distro): AWS Linux
  • Application mode:
    • Web app
    • CLI app
    • Daemon worker

trace:
"/var/app/current/carrot/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php:100",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:203",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:174",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/RejectedPromise.php:40",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/TaskQueue.php:47",
"/var/app/current/carrot/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php:104",
"/var/app/current/carrot/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php:131",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:246",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:223",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:267",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:225",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:267",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:225",
"/var/app/current/carrot/vendor/guzzlehttp/promises/src/Promise.php:62",
"/var/app/current/carrot/vendor/aws/aws-sdk-php/src/AwsClientTrait.php:58",
"/var/app/current/carrot/vendor/aws/aws-sdk-php/src/AwsClientTrait.php:86",
"/var/app/current/carrot/vendor/maxbanton/cwh/src/Handler/CloudWatch.php:285",
"/var/app/current/carrot/vendor/maxbanton/cwh/src/Handler/CloudWatch.php:248",
"/var/app/current/carrot/vendor/maxbanton/cwh/src/Handler/CloudWatch.php:172",
"/var/app/current/carrot/vendor/maxbanton/cwh/src/Handler/CloudWatch.php:150",
"/var/app/current/carrot/vendor/monolog/monolog/src/Monolog/Handler/AbstractProcessingHandler.php:39",
"/var/app/current/carrot/vendor/monolog/monolog/src/Monolog/Logger.php:344",
"/var/app/current/carrot/vendor/monolog/monolog/src/Monolog/Logger.php:637",
...

[BUG] ERROR PutLogEvents, InvalidSequenceTokenException

Describe the bug
The package has been integrated with Laravel 7.0 to sync logs to CloudWatch, but I'm getting this error sometimes:

InvalidSequenceTokenException (client): The given sequenceToken is invalid. The next expected sequenceToken is: 49610385040921553485418540785990629652624902462896384722 - {"__type":"InvalidSequenceTokenException","expectedSequenceToken":"49610385040921553485418540785990629652624902462896384722","message":"The given sequenceToken is invalid. The next expected sequenceToken is: 49610385040921553485418540785990629652624902462896384722"}

Expected behavior
PutLogEvents should execute smoothly.

Please tell about your environment:

  • PHP Version: 7.3
  • Operating system (linux AWS ECS with Fargate type):
  • Application mode:
    • Web app
    • CLI app
    • Daemon worker

Symfony configuration

Hi there,

We would like to use this package in our Symfony project. Is there a chance that you have an example configuration for Symfony?

Many thanks!

Does not work correctly with FallbackGroupHandler

Describe the bug
Monolog's FallbackGroupHandler expects handlers to throw an error while they are handling records. The CloudWatch handler does not throw CloudWatch connection related errors during the handling call.
Only when everything is finished (in close()) does the CloudWatch handler realize that it's not able to connect etc., and then there is an exception, but by then it's already too late for FallbackGroupHandler.

Expected behavior
The CloudWatch handler should check during the handling of the record whether it is able to connect to the CloudWatch service. If it is not able to connect, it should throw a proper error.

Please provide the steps to reproduce and if possible a minimal demo of the problem

  • Define a FallbackGroupHandler with CloudWatch as first handler and file as second handler in the group.
  • Make cloudwatch service fail (wrong credentials etc.)
  • Expected: File handler is executed when CloudWatch handler fails.
  • Actual: The file handler is not executed when the CloudWatch handler fails, because the CloudWatch handler does not throw an exception during the handle() call.

Please tell about your environment:

  • PHP Version: 7.3.12
  • Operating system (distro): Alpine Linux v3.10
  • Application mode:
    • Web app
    • CLI app

Help request - Using along Laravel

Hello,

First of all thank you for this nice library.

I'm little more than a beginner at Laravel/PHP, but I am required to log to CloudWatch from my Laravel application.
I have successfully logged to CloudWatch using the "Basic Usage" sample and adding $handler->close().

Now I am trying to configure Monolog and Laravel correctly using this example. Unfortunately it does not seem to log. Could anyone explain to me when $handler->close() is supposed to happen?

Thank you,
Denis

initialize must throw exceptions if something failed

Hi,

First of all thank you for this useful handler recently discovered through this article ;)

One improvement might be that initialize could throw exceptions if something failed, especially if the client is not allowed to perform some actions.

This happened to me while trying to use it with a user that has the AWSOpsWorksCloudWatchLogs permission.
The absence of any output from the script made me put some var_dump calls (and the like) into the handler until I found what went wrong.
The solution was basically to grant the user the CloudWatchLogsFullAccess permission to make it work.

Best regards,

Cyril

Exception: Log events in a single PutLogEvents request must be in chronological order

Hello folks,

On a Symfony application, after I enabled this library in my production environment I started to get a lot of Exceptions such as the one below:

[2018-02-26 11:44:55] prod.EMERGENCY: Message: Error executing "PutLogEvents" on "https://logs.us-west-2.amazonaws.com"; AWS HTTP error: Client error: POST https://logs.us-west-2.amazonaws.com resulted in a 400 Bad Request response: {"__type":"InvalidParameterException","message":"Log events in a single PutLogEvents request must be in chronological or (truncated...) InvalidParameterException (client): Log events in a single PutLogEvents request must be in chronological order. - {"__type":"InvalidParameterException","message":"Log events in a single PutLogEvents request must be in chronological order."} []

Below is my current configuration:

services:
    monolog.formatter.api:
        class: Monolog\Formatter\LineFormatter
        arguments:
            - "[%%extra.route%%] [%%extra.server_hostname%%] [%%extra.request%%] %%level_name%%: %%message%% %%context%% %%extra%%\n"

    cloudwatchlogs.api:
        class: Maxbanton\Cwh\Handler\CloudWatch
        arguments:
            - '@aws.cloudwatchlogs'
            - '/path'
            - 'api'
            - 30
            - 10000
            - { mytag: 'api' }
            - DEBUG
        calls:
            - ['setFormatter', ['@monolog.formatter.api']]

monolog:
    handlers:
        api:
            type:         service
            id:           cloudwatchlogs.api
            formatter:    ~

Any ideas on why this would happen? One of my channels generates a lot of DEBUG messages, but I presume this wouldn't really be a problem?

Thank you in advance.

Renato.

Cloudwatch Dashboard

So I'm using the following code:
use Aws\CloudWatchLogs\CloudWatchLogsClient;
use Maxbanton\Cwh\Handler\CloudWatch;
use Monolog\Logger;

$sdkParams = [
    'region' => 'eu-west-1',
    'version' => 'latest',
    'credentials' => [
        'key' => 'my AWS key',
        'secret' => 'my AWS secret',
        'token' => '', // token is optional
    ]
];

// Instantiate AWS SDK CloudWatch Logs Client
$client = new CloudWatchLogsClient($sdkParams);

// Log group name, will be created if none
$groupName = 'php-logtest';

// Log stream name, will be created if none
$streamName = 'ec2-instance-1';

// Days to keep logs, 14 by default. Set to null to allow indefinite retention.
$retentionDays = 30;

// Instantiate handler (tags are optional)
$handler = new CloudWatch($client, $groupName, $streamName, $retentionDays, 10000, ['my-awesome-tag' => 'tag-value']);

But then how can I get the values of a specific CloudWatch dashboard that I have in my account using the library?

Regards,
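
For what it's worth, dashboards are part of the CloudWatch (metrics) API rather than CloudWatch Logs, so this handler does not cover them; a sketch using the AWS SDK's CloudWatch client and its GetDashboard operation (the dashboard name is hypothetical):

<?php
// Sketch, outside the scope of this handler: dashboards live in the CloudWatch
// (metrics) API, so they are read with the SDK's CloudWatch client.

use Aws\CloudWatch\CloudWatchClient;

$cw = new CloudWatchClient([
    'region'  => 'eu-west-1',
    'version' => 'latest',
]);

$result = $cw->getDashboard([
    'DashboardName' => 'my-dashboard', // hypothetical dashboard name
]);

// The dashboard definition is returned as a JSON string.
$widgets = json_decode($result['DashboardBody'], true)['widgets'] ?? [];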

Incompatible with most recent Monolog version? PHP Fatal Error with getDefaultFormatter()

  • I'm submitting a ...

    • bug report
    • feature request
    • support request
  • Do you want to request a feature or report a bug?
    Bug

  • What is the current behavior?
    Does not work. Gives error:
    [27-Apr-2018 21:52:17 America/Chicago] PHP Fatal error: Declaration of Maxbanton\Cwh\Handler\CloudWatch::getDefaultFormatter() must be compatible with Monolog\Handler\AbstractProcessingHandler::getDefaultFormatter(): Monolog\Formatter\FormatterInterface in /site/classes/Maxbanton/Cwh/Handler/CloudWatch.php on line 10

  • Please tell about your environment:

    • PHP Version: 7.1.12
    • Operating system (distro): Mac OS High Sierra 10.13.3
    • Application mode (web app / cli app / daemon cli app): web app
    • Monolog version 1.23.0

Extra array?

Great plugin, but I'm curious why there is an extra empty array that gets appended after submitting a log event.

For example...
Request:
$message = "$i of $j - ($x)";
$this->log->info('Done saving record', [$message]);
Result:
INFO: Done saving record ["41158 of 129700 - (14801)"] []

What is that extra empty array at the end? Can it be populated? If so, how?
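
The trailing [] is Monolog's "extra" array, which the default LineFormatter prints after the context; it is populated by processors rather than by the log call itself. A sketch (the processor choice is illustrative):

<?php
// Sketch: the trailing [] is the "extra" array from Monolog's default line
// format "%message% %context% %extra%". Processors fill it in.

use Monolog\Logger;
use Monolog\Processor\WebProcessor;

$log = new Logger('name');
$log->pushHandler($handler); // CloudWatch handler from the Basic Usage example
$log->pushProcessor(new WebProcessor()); // fills "extra" with request data

$log->info('Done saving record', ['41158 of 129700 - (14801)']);
// => INFO: Done saving record ["41158 of 129700 - (14801)"] {"url":"...","ip":"..."}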

Flush everything in the queue

  • I'm submitting a ...

    • feature request
  • Do you want to request a feature or report a bug?
    Feature

  • What is the current behavior?
    https://github.com/maxbanton/cwh/blob/master/src/Handler/CloudWatch.php#L149
    It flushes the queue and then adds the latest log record to the queue. As a result, there is always something in the queue/buffer.

  • If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem

  • What is the expected behavior?
    We should always add the new record to the buffer first and then do the check. If the buffer size reaches the limit, it should flush everything.

  • What is the motivation / use case for changing the behavior?
    We have some logs in the queue, and when the consumer dies we lose data.

  • Please tell about your environment:

    • PHP Version: 7.1
    • Operating system (distro): Linux
    • Application mode (web app / cli app / daemon cli app): Cli
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)
