
s3-uploads's Introduction

S3 Uploads
Lightweight "drop-in" for storing WordPress uploads on Amazon S3 instead of the local filesystem.
A Human Made project. Maintained by @joehoyle.

S3 Uploads is a WordPress plugin to store uploads on S3. S3 Uploads aims to be a lightweight "drop-in" for storing uploads on Amazon S3 instead of the local filesystem.

It's focused on providing a highly robust S3 interface with no "bells and whistles", WP-Admin UI or much else. It comes with some helpful WP-CLI commands for generating IAM users, listing files on S3, and migrating your existing library to S3.

Requirements

  • PHP >= 7.1
  • WordPress >= 5.3

Getting Set Up

Install Using Composer

composer require humanmade/s3-uploads

Note: Composer's autoloader must be loaded before S3 Uploads is loaded. We recommend loading it in your wp-config.php before wp-settings.php is loaded as shown below.

require_once __DIR__ . '/vendor/autoload.php';

Install Manually

If you do not use Composer to manage plugins or other dependencies, you can install the plugin manually. Download the manual-install.zip file from the Releases page and extract the ZIP file to your plugins directory.

You can also git clone this repository, and run composer install in the plugin folder to pull in its dependencies.


Once you've installed the plugin, add the following constants to your wp-config.php:

define( 'S3_UPLOADS_BUCKET', 'my-bucket' );
define( 'S3_UPLOADS_REGION', '' ); // the s3 bucket region (excluding the rest of the URL)

// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', '' );
define( 'S3_UPLOADS_SECRET', '' );

// Or if using IAM instance profiles, you can use the instance's credentials:
define( 'S3_UPLOADS_USE_INSTANCE_PROFILE', true );

Please refer to the region list at http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region for valid S3_UPLOADS_REGION values.

Use of a path prefix after the bucket name is allowed and is optional. For example, if you want to upload all files to 'my-folder' inside a bucket called 'my-bucket', you can use:

define( 'S3_UPLOADS_BUCKET', 'my-bucket/my-folder' );

You must then activate the plugin. To do this via WP-CLI, use the command:

wp plugin activate S3-Uploads

The plugin name must match the directory you have cloned S3 Uploads into. If you're using Composer, use:

wp plugin activate s3-uploads

The next thing you should do is verify your setup. You can do this using the verify command like so:

wp s3-uploads verify

You will need to create the IAM user yourself, or attach the necessary permissions to an existing user. You can output the required policy via wp s3-uploads generate-iam-policy.

Listing files on S3

S3-Uploads comes with a WP-CLI command for listing files in the S3 bucket, which is useful for debugging:

wp s3-uploads ls [<path>]

Uploading files to S3

If you have an existing media library with attachment files, use the command below to copy them all from the local disk to S3.

wp s3-uploads upload-directory <from> <to> [--verbose]

For example, to migrate your whole uploads directory to S3, you'd run:

wp s3-uploads upload-directory /path/to/uploads/ uploads

There is also an all-purpose cp command for arbitrary copying to and from S3.

wp s3-uploads cp <from> <to>

Note: as either <from> or <to> can be an S3 or local location, you must specify the full S3 location in the form s3://mybucket/mydirectory, for example cp ./test.txt s3://mybucket/test.txt.

Private Uploads

WordPress's (and therefore S3 Uploads') default behaviour is that all uploaded media files are publicly accessible. In certain cases this may not be desirable. S3 Uploads supports setting S3 objects to a private ACL and providing temporarily signed URLs for all files that are marked as private.

S3 Uploads does not make assumptions or provide UI for marking attachments as private; instead, you should use the s3_uploads_is_attachment_private WordPress filter to control the behaviour. For example, to mark all attachments as private:

add_filter( 's3_uploads_is_attachment_private', '__return_true' );
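
If only some attachments should be private, the filter can be made conditional. Below is a minimal sketch, assuming the filter passes the attachment ID as its second argument and using a hypothetical _is_private post meta flag:

// A sketch: mark only attachments flagged via a hypothetical `_is_private`
// post meta key as private. The second filter argument is assumed to be the
// attachment ID.
add_filter( 's3_uploads_is_attachment_private', function ( $private, $attachment_id ) {
	return (bool) get_post_meta( $attachment_id, '_is_private', true );
}, 10, 2 );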

Private uploads can be transitioned to public by calling S3_Uploads::set_attachment_files_acl( $id, 'public-read' ), or vice versa. For example:

S3_Uploads::get_instance()->set_attachment_files_acl( 15, 'public-read' );

The default expiry for all private file URLs is 6 hours. You can modify this by using the s3_uploads_private_attachment_url_expiry WordPress filter. The value can be any string interpreted by strtotime. For example:

add_filter( 's3_uploads_private_attachment_url_expiry', function ( $expiry ) {
	return '+1 hour';
} );

Cache Control

You can define the default HTTP Cache-Control header for uploaded media using the following constant:

define( 'S3_UPLOADS_HTTP_CACHE_CONTROL', 30 * 24 * 60 * 60 );
	// will expire in 30 days time

You can also configure the Expires header using the S3_UPLOADS_HTTP_EXPIRES constant. For instance, if you wanted an asset to effectively never expire, you could set the Expires header far in the future. For example:

define( 'S3_UPLOADS_HTTP_EXPIRES', gmdate( 'D, d M Y H:i:s', time() + (10 * 365 * 24 * 60 * 60) ) .' GMT' );
	// will expire in 10 years time

Default Behaviour

As S3 Uploads is a plug-and-play plugin, activating it will start rewriting image URLs to S3 and put new uploads on S3. Sometimes this isn't the desired behaviour, as a site owner may want to upload a large amount of media to S3 using the WP-CLI commands before S3 Uploads starts directing all upload requests to S3. In this case you can define S3_UPLOADS_AUTOENABLE as false. For example, place the following in your wp-config.php:

define( 'S3_UPLOADS_AUTOENABLE', false );

To then enable S3 Uploads' rewriting, use the WP-CLI commands wp s3-uploads enable and wp s3-uploads disable to toggle the behaviour.

URL Rewrites

By default, S3 Uploads will use the canonical S3 URIs for referencing the uploads, i.e. [bucket name].s3.amazonaws.com/uploads/[file path]. If you want to use another URL to serve the images from (for instance, if you wish to use S3 as an origin for CloudFlare), you should define S3_UPLOADS_BUCKET_URL in your wp-config.php:

// Define the base bucket URL (without trailing slash)
define( 'S3_UPLOADS_BUCKET_URL', 'https://your.origin.url.example/path' );

S3 Uploads' URL rewriting can be disabled if the current website does not require it, for example when proxying to S3 via Nginx. In this case the plugin will only upload files to the S3 bucket.

// Disable URL rewriting altogether
define( 'S3_UPLOADS_DISABLE_REPLACE_UPLOAD_URL', true );

S3 Object Permissions

The object permission of files uploaded to S3 by this plugin can be controlled by setting the S3_UPLOADS_OBJECT_ACL constant. The default setting, if not specified, is public-read, which allows objects to be read by anyone. If you don't want the uploads to be publicly readable, you can define S3_UPLOADS_OBJECT_ACL as either private or authenticated-read in your wp-config.php file:

// Set the S3 object permission to private
define('S3_UPLOADS_OBJECT_ACL', 'private');

For more information on S3 permissions please see the Amazon S3 permissions documentation.

Custom Endpoints

Depending on your requirements you may wish to use an alternative S3-compatible object storage system such as MinIO, Ceph, DigitalOcean Spaces, Scaleway, and others.

You can configure the endpoint by adding the following code to a file in the wp-content/mu-plugins/ directory, for example wp-content/mu-plugins/s3-endpoint.php:

<?php
// Filter S3 Uploads params.
add_filter( 's3_uploads_s3_client_params', function ( $params ) {
	$params['endpoint'] = 'https://your.endpoint.com';
	$params['use_path_style_endpoint'] = true;
	$params['debug'] = false; // Set to true if uploads are failing.
	return $params;
} );

Temporary Session Tokens

If your S3 access is configured to require a temporary session token in addition to the access key and secret, you should configure the credentials using the following code:

// Filter S3 Uploads params.
add_filter( 's3_uploads_s3_client_params', function ( $params ) {
	$params['credentials']['token'] = 'your session token here';
	return $params;
} );

Offline Development

While it's possible to use S3 Uploads for local development (this is actually a nice way to not have to sync all uploads from production to development), if you want to develop offline you have a couple of options.

  1. Just disable the S3 Uploads plugin in your development environment.
  2. Define the S3_UPLOADS_USE_LOCAL constant with the plugin active.

Option 2 allows you to run the S3 Uploads plugin for production parity purposes: it essentially mocks Amazon S3 with a local stream wrapper and stores the uploads in an /s3/ directory inside your WP upload dir.
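
For example, in your development wp-config.php:

// Keep S3 Uploads active, but store files locally via the mock stream wrapper
// instead of connecting to Amazon S3.
define( 'S3_UPLOADS_USE_LOCAL', true );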

Credits

Created by Human Made for high volume and large-scale sites. We run S3 Uploads on sites with millions of monthly page views, and thousands of sites.

Written and maintained by Joe Hoyle. Thanks to all our contributors.

Interested in joining in on the fun? Join us, and become human!

s3-uploads's People

Contributors

austinpray, bdurette, benmay, di-dvaness, dsawardekar, eduardoboucas, ericmann, eugene-manuilov, fklein-lu, goldenapples, iamlili, jeremyfelt, jezemery, joehoyle, kodie, kovshenin, mattheu, mikelittle, nathanielks, ocean90, phowen-cisco, rmccue, roborourke, sc0tth0lden, shadyvb, spacedmonkey, stuartshields, tillkruss, zacscott, zamoose


s3-uploads's Issues

File names with spaces in them seem to break s3 URL

If I upload the image Logo-De-overburen-wit+payoff.png (the space in the name encoded as +), it builds the URL:

http://hmn-uploads.s3.amazonaws.com/ht/uploads/sites/3449/2014/02/Logo-De-overburen-wit+payoff.png

This 404s; it needs to be encoded as http://hmn-uploads.s3.amazonaws.com/ht/uploads/sites/3449/2014/02/Logo-De-overburen-wit%2Bpayoff.png
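
One possible fix (a sketch, not necessarily how the plugin should implement it) is to encode each path segment of the object key while leaving the slashes intact, so '+' becomes '%2B':

// Encode each segment of the key; slashes between segments are preserved.
$key     = 'ht/uploads/sites/3449/2014/02/Logo-De-overburen-wit+payoff.png';
$encoded = implode( '/', array_map( 'rawurlencode', explode( '/', $key ) ) );
// => 'ht/uploads/sites/3449/2014/02/Logo-De-overburen-wit%2Bpayoff.png'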

The bucket you are attempting to access must be addressed using the specified endpoint

Getting this weird error right at the beginning, when running verify:

bitnami@ip-172-31-23-173:~/apps/wordpress/htdocs$ wp s3-uploads verify
Attempting to upload file s3://newsroom.macleay.net/uploads/1824561755.jpg
PHP Warning:  The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "newsroom.macleay.net.s3.amazonaws.com". in /opt/bitnami/apps/wordpress/htdocs/wp-content/plugins/S3-Uploads/lib/aws-sdk/Aws/S3/StreamWrapper.php on line 759
Warning: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "newsroom.macleay.net.s3.amazonaws.com". in /opt/bitnami/apps/wordpress/htdocs/wp-content/plugins/S3-Uploads/lib/aws-sdk/Aws/S3/StreamWrapper.php on line 759
PHP Warning:  copy(s3://newsroom.macleay.net/uploads/1824561755.jpg): failed to open stream: "S3_Uploads_Stream_Wrapper::stream_open" call failed in /opt/bitnami/apps/wordpress/htdocs/wp-content/plugins/S3-Uploads/inc/class-s3-uploads-wp-cli-command.php on line 26
Warning: copy(s3://newsroom.macleay.net/uploads/1824561755.jpg): failed to open stream: "S3_Uploads_Stream_Wrapper::stream_open" call failed in /opt/bitnami/apps/wordpress/htdocs/wp-content/plugins/S3-Uploads/inc/class-s3-uploads-wp-cli-command.php on line 26
Error: Failed to copy / write to S3 - check your policy?

Thoughts?

Can LS bucket but can not verify or upload

php wp-cli.phar s3-uploads verify

Deprecated: Directive 'register_long_arrays' is deprecated in PHP 5.3 and greater in Unknown on line 0

Deprecated: Directive 'magic_quotes_gpc' is deprecated in PHP 5.3 and greater in Unknown on line 0
Attempting to upload file s3://site/uploads/178792104.jpg
Fatal error: Call to a member function headObject() on a non-object in /usr/local/www/site/www/wp-content/plugins/S3-Uploads/inc/class-s3-uploads-stream-wrapper.php on line 169

Unit Tests are failing

When running unit tests, there are two errors:

FFPHP Fatal error:  Class 'S3_Uploads_Image_Editor_Imagick' not found in /srv/www/s3-uploads/htdocs/wp-content/plugins/s3-uploads/tests/test-s3-uploads-image-editor-imagick.php on line 46

Fatal error: Class 'S3_Uploads_Image_Editor_Imagick' not found in /srv/www/s3-uploads/htdocs/wp-content/plugins/s3-uploads/tests/test-s3-uploads-image-editor-imagick.php on line 46

Wrong path when deleting intermediate images

When trying to delete an image from the media library, I found that the path is wrong for intermediate images.
It is trying to delete
s3://BUCKET/uploads/s3://BUCKET/uploads/2015/11/image-150x150.jpg
instead of
s3://BUCKET/uploads/2015/11/image-150x150.jpg

so the deletion goes wrong.

What could be done?
One possibility I've seen is to add the following to WordPress's path_is_absolute() in wp-includes/functions.php, so that the path from $intermediate_file is accepted as absolute (not a good idea, though, since the change would be lost on WordPress updates):

function path_is_absolute( $path ) {
    if ( preg_match('#^s3:\/\/#', $path) ) // new
    return true; // new
/*
     * This is definitive if true but fails if $path does not exist or contains
     * a symbolic link.
     */
    if ( realpath($path) == $path )
        return true;

    if ( strlen($path) == 0 || $path[0] == '.' )
        return false;

    // Windows allows absolute paths like this.
    if ( preg_match('#^[a-zA-Z]:\\\\#', $path) )
        return true;

    // A path starting with / or \ is absolute; anything else is relative.
    return ( $path[0] == '/' || $path[0] == '\\' );
}

path_is_absolute() is called from within path_join(), which is called within wp_delete_attachment().

Second possibility in theme:

function my_wp_delete_file( $intermediate_file) {
  /* for more info check post.php wp_delete_attachment() */
  if(class_exists( 'S3_Uploads' )){
     if ( preg_match('#^s3:\/\/#', $intermediate_file) ){
       @unlink( $intermediate_file );
     }
  }

  return $intermediate_file;   
}
add_filter('wp_delete_file', 'my_wp_delete_file');

So what would the plugin-level solution be, instead of the core and theme workarounds above?

`verify` CLI command successful despite S3_ constants not defined

I'm testing S3-Uploads (master branch) on a server. I did not define S3_UPLOADS_BUCKET, S3_UPLOADS_KEY, or S3_UPLOADS_SECRET in wp-config.php yet and tried to run wp s3-uploads verify:

cptv@nj:~/www$ wp s3-uploads verify
Success: Looks like your configuration is correct.
Attempting to upload file /home/[...]/www/wp-content/uploads/484304978.jpg
File uploaded to S3 successfully
Attempting to delete file /home/[...]/www/wp-content/uploads/484304978.jpg
File deleted from S3 successfully

cptv@nj:~/www$ wp s3-uploads ls
Error: Error retrieving credentials from the instance profile metadata server. When you are not running inside of Amazon EC2, you must provide your AWS access key ID and secret access key in the "key" and "secret" options when creating a client or provide an instantiated Aws\Common\Credentials\CredentialsInterface object. ([curl] 28: Connection timed out after 5006 milliseconds [url] http://169.254.169.254/latest/meta-data/iam/security-credentials/)

Shouldn't wp s3-uploads verify be failing, as there are no credentials configured anywhere?
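
A minimal sketch of the kind of guard the verify command could run first (constant names are from this README; where exactly it would go in the command is an assumption):

if ( ! defined( 'S3_UPLOADS_BUCKET' ) ) {
	WP_CLI::error( 'S3_UPLOADS_BUCKET is not defined in wp-config.php.' );
}
// Key/secret are only required when not using an IAM instance profile.
if ( ! defined( 'S3_UPLOADS_USE_INSTANCE_PROFILE' )
	&& ( ! defined( 'S3_UPLOADS_KEY' ) || ! defined( 'S3_UPLOADS_SECRET' ) ) ) {
	WP_CLI::error( 'S3_UPLOADS_KEY and S3_UPLOADS_SECRET are not defined.' );
}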

Default Content Directory Incorrect

Hey there. I'm using Bedrock, and right now when I install and use this plugin, it automatically uploads the files into /uploads in S3, ignoring the content directory I declared previously.

define('CONTENT_DIR', '/app');
define('WP_CONTENT_DIR', $webroot_dir . CONTENT_DIR);
define('WP_CONTENT_URL', WP_HOME . CONTENT_DIR);

I need it to upload them to /app/uploads instead of just /uploads.

Right now I'm using wp s3-uploads cp /uploads/ /app/uploads/ to copy existing files, but all new images I'm uploading end up at the incorrect path.

Uploaded avatars moved before cropping

Hi,

Thank you for an awesome plugin. :)
It works almost perfectly for us. We are running it on a site with WP and BP.
The issue we have is when cropping an image after upload. The avatar gets uploaded locally just fine, but if you need to crop it, the trouble begins. It looks like the local file is removed before the cropping and saving is done. For this reason the file does not get saved and is not moved to S3.

I have tried to do some debugging (I'm not a developer, but I'll do my best) and found the following errors (the relevant lines from the mentioned file are attached) when uploading and trying to crop:

PHP Warning: closedir() expects parameter 1 to be resource, boolean given in /var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php on line 524

/var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php:
523: // Close the avatar directory.
524: closedir( $av_dir );
525:
526: // If we found a locally uploaded avatar.
527: if ( isset( $avatar_url ) ) {
528: // Support custom scheme.
529: $avatar_url = set_url_scheme( $avatar_url, $params['scheme'] );
530:
531: // Return it wrapped in an element.
532: if ( true === $params['html'] ) {

PHP Warning: opendir(/var/app/current/wp-content/uploads/avatars/6277): failed to open dir: Permission denied in /var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php on line 482
/var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php:
469: /**
470: * Look for uploaded avatar first. Use it if it exists.
471: * Set the file names to search for, to select the full size
472: * or thumbnail image.
473: */
474: $avatar_size = ( 'full' == $params['type'] ) ? '-bpfull' : '-bpthumb';
475: $legacy_user_avatar_name = ( 'full' == $params['type'] ) ? '-avatar2' : '-avatar1';
476: $legacy_group_avatar_name = ( 'full' == $params['type'] ) ? '-groupavatar-full' : '-groupavatar-thumb';
477:
478: // Check for directory.
479: if ( file_exists( $avatar_folder_dir ) ) {
480:
481: // Open directory.
482: if ( $av_dir = opendir( $avatar_folder_dir ) ) {
483:
484: // Stash files in an array once to check for one that matches.
485: $avatar_files = array();
486: while ( false !== ( $avatar_file = readdir( $av_dir ) ) ) {
487: // Only add files to the array (skip directories).
488: if ( 2 < strlen( $avatar_file ) ) {
489: $avatar_files[] = $avatar_file;
490: }
491: }

PHP Fatal error: Cannot use object of type WP_Error as array in /var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php on line 908, referer: https://www.xxxxxxx.xxx//profile/change-avatar/
/var/app/current/wp-content/plugins/buddypress/bp-core/bp-core-avatars.php:
903:// We only want to handle one image after resize.
904: if ( empty( $bp->avatar_admin->resized ) ) {
905: $bp->avatar_admin->image->file = $bp->avatar_admin->original['file'];
906: $bp->avatar_admin->image->dir = str_replace( $upload_path, '', $bp->avatar_admin->original['file'] );
907: } else {
908: $bp->avatar_admin->image->file = $bp->avatar_admin->resized['path'];
909: $bp->avatar_admin->image->dir = str_replace( $upload_path, '', $bp->avatar_admin->resized['path'] );
910: @unlink( $bp->avatar_admin->original['file'] );
911: }

I don't know if this is of any help in resolving it, but I really hope it can help a bit.
If you need an server to test it on, I can provide this via e-mail.

Thank you so much.

Best regards,
Tarjei

Wrong url for featured image

After I'd successfully migrated all attachments to S3, I created a new post, and look what happened.
After I added a featured image, it got http://virovitica-info-wp.s3-eu-west-1.amazonaws.com/wp-content/uploads/s3://virovitica-info/uploads/2015/11/30101745/DSC_0543.jpg as its URL.

Support for base path in S3

Thanks for writing this plugin.

Is it possible to add support for using a base path on the S3 key when uploading and downloading assets? Having the option to always prepend an S3 key with something like /wordpress or /staging would be useful.

For example: "/staging/uploads/some_image_name.jpg"

Region Argument Needed

I needed to set my region for S3 to work; otherwise it results in something like this in the CLI:

Error: The bucket you are attempting to access must be addressed using the specified endpoint. 
Please send all future requests to this endpoint: "bucketname.com.s3.amazonaws.com".

and this in PHP:

Warning: opendir(s3://bucketname.com/uploads/sites/2): failed to open dir: "Aws\S3\StreamWrapper::dir_opendir" call failed in /home/bucketname/public_html/bucketname.com/wp-includes/ms-functions.php on line 1688

Fatal error: Uncaught Aws\S3\Exception\PermanentRedirectException: AWS Error Code: PermanentRedirect, Status Code: 301, AWS Request ID: 61CD15796511E0A4, AWS Error Type: client, AWS Error Message: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "bucketname.com.s3.amazonaws.com"., User-Agent: aws-sdk-php2/2.4.10 Guzzle/3.7.1 curl/7.38.0 PHP/5.5.19 ITR thrown in /home/bucketname/public_html/bucketname.com/wp-content/plugins/S3-Uploads-master/inc/aws-sdk/Aws/Common/Exception/NamespaceExceptionFactory.php on line 91

adding this to wp-config.php

define( 'S3_REGION', 'us-west-1' );

and updating the s3() function in /S3-Uploads-master/inc/class-s3-uploads.php:

    public function s3() {

        require_once dirname( __FILE__ ) . '/aws-sdk/aws-autoloader.php';

        if ( ! empty( $this->s3 ) )
            return $this->s3;

        $this->s3 = Aws\Common\Aws::factory( array( 'key' => $this->key, 'secret' => $this->secret, 'region' => S3_REGION ) )->get( 's3' );

        return $this->s3;
    }

managed to fix it for me.

php warnings/notices when running tests

When running phpunit on a certain client project (based on the normal hm-base), I'm getting the following output:

PHP Notice:  Undefined variable: test_root in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21

Notice: Undefined variable: test_root in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21
PHP Warning:  require(/includes/functions.php): failed to open stream: No such file or directory in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21

Warning: require(/includes/functions.php): failed to open stream: No such file or directory in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21
PHP Fatal error:  require(): Failed opening required '/includes/functions.php' (include_path='.:/usr/share/php:/usr/share/pear') in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21

Fatal error: require(): Failed opening required '/includes/functions.php' (include_path='.:/usr/share/php:/usr/share/pear') in /content/plugins-mu/s3-uploads/tests/bootstrap.php on line 21

Don't explicitly call the setup() method of S3_Uploads

The setup() method of the S3_Uploads class is called directly multiple times in the code.

The first occurrence of such a call is in s3_uploads_init(), hooked to plugins_loaded. This means that by the time that the other calls happen, the method has already been called once at least.

Since the setup() method hooks up various other methods, it should only be called once. As S3_Uploads is a singleton, we should move the call into the constructor and mark the method as protected.
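
A sketch of the proposed shape (class and method names from the issue; the bodies are assumed):

class S3_Uploads {
	private static $instance;

	public static function get_instance() {
		if ( ! self::$instance ) {
			self::$instance = new self();
		}
		return self::$instance;
	}

	protected function __construct() {
		// Runs exactly once, because the constructor can only be reached
		// through get_instance().
		$this->setup();
	}

	protected function setup() {
		// add_action() / add_filter() registrations...
	}
}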

@joehoyle since you know the plugin best, I'd appreciate your input.

Undocumented functionality

I noticed that the S3 Uploads plugin has some functionality that is not documented in the readme.md. Is it possible to explain what the following defines do and what their impact is, be it on functionality or performance: S3_UPLOADS_USE_LOCAL and S3_UPLOADS_DISABLE_REPLACE_UPLOAD_URL?

Is it possible that there is other functionality that is not documented?

Custom Post Type Throwing HTTP Error

Post and Page post types work fine and upload successfully. When using a custom post type, however, I'm seeing an HTTP error pop up. The error logs show no issues for me to debug further.

If you need me to provide any additional information, please let me know.

Check for required PHP version

The plugin requires at least PHP 5.3.3 (edit: adapted PHP version to that of SDK V2), following the requirements of the AWS SDK. This is higher than the WordPress requirements.

We should add a check to the plugin that stops the execution when the minimum is not met, to avoid a fatal PHP error, and add a notice to the documentation.

Issue with certain uploads and dynamic CSS creation.

Joe,

I had another issue that I opened before, that you can close out. I wanted to open a new issue, so we could try to figure this out.

I was having problems with a theme that writes out a CSS file dynamically: it first checks to see that the file exists, and that check fails, causing the CSS for the whole site to be broken until I manually copy the generated CSS to the S3 directory with your copy utility.

Similarly, I have another app that was getting an error condition with respect to cover photos for a BuddyPress Profile or BuddyPress Group. I was able to fix the error condition on that one.

However, BuddyPress avatars, which use the same process, are now failing. You upload the image and it can't be found because it has already been moved.

It seems that all of these issues revolve around the same problem: WordPress, the theme, or a plugin (in one case BuddyPress) tries to look up the file right after it is created, and fails.

I have a couple of ideas for how to fix this, but wanted to get your input. This seems to be happening only during the upload process, or in the case of the dynamic CSS, when it is being generated. It seems like it could be a timing issue. For example, in all these cases, if there was already an uploaded image or CSS file, S3-Uploads and WordPress find it fine and display it. It is only the check for an error condition, like the file not having uploaded, where the plugin wants to throw an error. Or maybe this is a general problem with an absolute lookup, where the plugin looks in a directory it's expecting to find things like TMP in, and errors because it isn't there?

Your thoughts are appreciated.

plugin collisions when other plugin uses guzzle

Hi,
I just experienced an error similar to this #24:

'Guzzle\Common\Exception\RuntimeException' with message 'Guzzle\Http\CachingEntityBody supports only SEEK_SET and SEEK_CUR seek operation

although I was already on the latest version, where the hack for Guzzle was applied:
a9db109

It turned out I was also using the sendgrid-email-delivery-simplified plugin and a Composer-loaded Guzzle, which was somehow replacing your shipped Guzzle so that the hack no longer had any effect.

Just to let you know.

Avoid connecting unless needed

It takes a while to connect to S3 (can be up to 2s from the other side of the world), so we should avoid doing it unless needed.

This may require core to change, as wp_upload_dir right now calls wp_mkdir_p constantly, which is a gigantic pain.

cc @dd32

Move setting of `$original_upload_dir` in `S3_Uploads` to the constructor

The S3_Uploads class contains a $original_upload_dir member variable. Its purpose is to store the array containing the data related to the original (i.e. on the local file system) upload directory.

This variable is set twice in the class, once in the filter_upload_dir() method (to the value passed by the filter), and once in get_original_upload_dir() (to the value returned by wp_upload_dir()).

This is ambiguous. I think we should set the value once, in the constructor, to the value returned by wp_upload_dir(), before any filtering by the plugin can happen.

Temporary files aren't cleaned up if an error occurs during saving

I suspect this is the cause of our inode exhaustion: in the image editor code, we grab a temp file name and tell WP to write to it, but never clean it up if there's an error. If WP hits an error after creating the file but before finishing, then the file will stick around.

In our case, the file is getting created but not written to, so I strongly suspect fopen is returning false in WP_Image_Editor::make_image, but this value is never checked in the Imagick _save method. I'm still not entirely sure what's happening there, but we should be running the unlink just in case anyway.
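
A sketch of the defensive cleanup suggested above ($temp_file is the tempnam() path and $editor a WP_Image_Editor instance; both names are assumed):

$saved = $editor->save( $temp_file );
if ( is_wp_error( $saved ) && file_exists( $temp_file ) ) {
	// Don't leave an orphaned temp file behind if the save failed part-way.
	unlink( $temp_file );
}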

Exceptions from S3 SDK aren't caught

Fatal error: Uncaught Aws\S3\Exception\RequestTimeTooSkewedException: AWS Error Code: RequestTimeTooSkewed, Status Code: 403, AWS Request ID: 3C548F66BA12D3C0, AWS Error Type: client, AWS Error Message: The difference between the request time and the current time is too large., [...]/s3-uploads/inc/aws-sdk/Aws/Common/Exception/NamespaceExceptionFactory.php on line 91

S3 Uploads should be catching these exceptions to avoid a fatal error.
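
A sketch of what catching them could look like ($s3, $bucket and $key are assumed to be in scope; Aws\S3\Exception\S3Exception is the SDK's base S3 exception class):

try {
	$result = $s3->headObject( array( 'Bucket' => $bucket, 'Key' => $key ) );
} catch ( Aws\S3\Exception\S3Exception $e ) {
	// Log and degrade gracefully instead of letting the exception fatal.
	error_log( 'S3 Uploads: ' . $e->getMessage() );
}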

Use batch processing to migrate attachments

The WP CLI migrate-attachments command uses a WP_Query with posts_per_page set to -1.

With a larger data set, this will lead to issues. We should investigate how we can use batch processing for the migration.
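
A sketch of batching with a paged WP_Query instead of posts_per_page set to -1 (the batch size of 100 is arbitrary):

$paged = 1;
do {
	$query = new WP_Query( array(
		'post_type'      => 'attachment',
		'post_status'    => 'any',
		'fields'         => 'ids',
		'posts_per_page' => 100,
		'paged'          => $paged,
	) );
	foreach ( $query->posts as $attachment_id ) {
		// ...migrate this attachment's files to S3...
	}
	$paged++;
} while ( $paged <= $query->max_num_pages );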

Fix Image Magick

Because ImageMagick uses a library binding, it uses the local filesystem under the hood, which appears to be an issue when using S3-Uploads. We should somehow work around this, by subclassing WP_Image_Magick_Editor or otherwise.

S3-Uploads failing to create a new monthly directory

Hi,

Great work on this!

I've been using the code for about a month on a staging server. I installed the code and moved the historical files in August, and with it moved the file structure, which included a folder for August (uploads/edd/2015/08). As the month flipped from August to September, I tried uploading a file today and it is showing an error on the WordPress side that it can't move the file to uploads/edd/2015/09. When I look on S3 the directory wasn't created; however, there is a directory on the WordPress server. Permissions on the "09" directory are set to 776 and the owner/group is correct as well.

I could obviously create the folder on S3, and try it. The permissions on S3 have worked, and the "verify" command was successful (although it doesn't create a directory). I suppose I haven't seen the code create a directory before with the exception of the initial load.

Any ideas?

copy() fails to understand filenames without extensions

PHP Warning:  copy(): The second argument to copy() function cannot be a directory in /../htdocs/wp-content/plugins/S3-Uploads/inc/class-s3-uploads.php on line 187

The original implementation of that function would fire an error on my VVV because copy() couldn't understand that tempnam()-generated filenames were real files, not directories.

Using the real file name available in the $file['name'] array solved this: https://github.com/moraleida/S3-Uploads/blob/fix-php-cant-copy-to-directory/inc/class-s3-uploads.php#L184

Failing to keep a dynamic.css file in sync

The theme I'm using creates a dynamic CSS directory within uploads, and then generates a CSS file based on various settings updates and other hooks. When the CSS is generated, the directory must be deleted and then recreated. S3-Uploads deletes the file and directory from S3, but then doesn't recreate them; S3-Uploads must not be detecting the file being re-created. The result is that the browser looks for a directory and a CSS file that it can't find, and returns a 403.

It seems like the detection of the deletion occurs, because the directory and the file are deleted from S3. However, the re-creation of the file goes undetected, so the new directory and file aren't uploaded to S3.

Thoughts?

Image not renamed correctly

Whilst I remember...

  • if you upload an image a second time, it doesn't rename the image file, and the WP attachment will appear broken.
  • if you have an attachment and the source file is missing, uploading a new copy will create a new attachment with a broken image, and the image will show for the first broken attachment

Permanently deleted media files

Hey, good plugin, works well.

Quick question that may or may not be an issue:
When deleting a media file, the originally uploaded file gets left behind; only the different thumbnail variants get deleted. Is this intended behavior?

We run a multisite setup, and would prefer that all files permanently deleted through WordPress be removed completely from S3.

Thanks.

PS: I can delete the leftover file through the CLI, so at least that is working, although it's not convenient:
wp s3-uploads rm uploads/2016/04/image.jpg --debug --url=www.our-multisite-setup.com

WP-CLI verify should respect s3_uploads_enabled()

Minor quibble: wp s3-uploads verify doesn't check whether we've actually enabled uploading to S3, and happily goes on testing the local filesystem. In S3-Uploads/inc/class-s3-uploads-wp-cli-command.php one could add, on line 18:

if ( ! s3_uploads_enabled() ) {
    WP_CLI::error('Uploading to S3 disabled. Either set S3_UPLOADS_AUTOENABLE to True, or toggle via WP-CLI: wp s3-uploads enable');
    return;
}

Support for secure images

I'd like to use this to "securely" store images on S3 for a private site. What I'm thinking is that, when viewing the site, the image URLs would include the access token. The access token obviously would have a limited lifespan.

What do you think of this approach?

Upload Issue

Hi Joe,

Just went to update a site after the couple of pull requests we submitted, and I am running into a few issues.

Uploads through the browser are failing, yet the verify command is working fine.

The PHP error log:

[15-Jun-2015 00:08:21 UTC] PHP Fatal error:  Uncaught exception 'Guzzle\Common\Exception\RuntimeException' with message 'Guzzle\Http\CachingEntityBody supports only SEEK_SET and SEEK_CUR seek operations' in /Users/zacscott/httpdocs/mumbrella.local.au/wp-content/mu-plugins/S3-Uploads/lib/aws-sdk/Guzzle/Http/CachingEntityBody.php:66
Stack trace:
#0 /Users/zacscott/httpdocs/mumbrella.local.au/wp-content/mu-plugins/S3-Uploads/lib/aws-sdk/Aws/S3/StreamWrapper.php(269): Guzzle\Http\CachingEntityBody->seek(0, 2)
#1 [internal function]: Aws\S3\StreamWrapper->stream_seek(0, 2)
#2 /Users/zacscott/httpdocs/mumbrella.local.au/wp/wp-admin/includes/image.php(336): exif_read_data('s3://vum-dev/up...')
#3 /Users/zacscott/httpdocs/mumbrella.local.au/wp/wp-admin/includes/media.php(337): wp_read_image_metadata('s3://vum-dev/up...')
#4 /Users/zacscott/httpdocs/mumbrella.local.au/wp/wp-admin/includes/ajax-actions.php(1901): media_handle_upload('async-upload', NULL, Array)
#5 /Users/zacscott/httpdocs/mumbrella.local.au/wp/wp-admin/async-upload.php(43): wp_ajax_upload_attachm in /Users/zacscott/httpdocs/mumbrella.local.au/wp-content/mu-plugins/S3-Uploads/lib/aws-sdk/Guzzle/Http/CachingEntityBody.php on line 66

Any ideas?

Thanks,

  • Zac

More defensive coding for WP CLI commands

The WP-CLI commands that require arguments to be passed, for example wp s3-uploads create_iam_user, should verify that these arguments are passed before proceeding, and alert users to missing input.

Update AWS SDK to Version 3

There's a stale branch concerning the update to version 3 of the AWS SDK. We can use this ticket to track any work towards this update.

As a side benefit, this would solve #26.

Fix Unit Test Installation and Running

Currently it is very difficult to install and run the plugin's unit tests on either Salty WordPress or VVV.

With Salty WordPress, the bin/install-wp-tests.sh script needs to be run first. This is not mentioned in any documentation. After the script has finished running, you get the following error:

PHP Fatal error: Class 'WP_REST_Server' not found in /srv/www/wordpress-develop.dev/tests/phpunit/includes/spy-rest-server.php on line 3

This is due to WP CLI issue: wp-cli/wp-cli#2129, and the solution is to update the install script based on this: https://raw.githubusercontent.com/wp-cli/wp-cli/master/templates/install-wp-tests.sh

After having done this, we encounter the same issue as on VVV, which does not need any setup. This is the error you get:

PHP Notice:  Undefined variable: test_root in /srv/www/s3-uploads/htdocs/wp-content/plugins/s3-uploads/tests/bootstrap.php on line 21
PHP Warning:  require(/includes/functions.php): failed to open stream: No such file or directory in /srv/www/s3-uploads/htdocs/wp-content/plugins/s3-uploads/tests/bootstrap.php on line 21
PHP Fatal error:  require(): Failed opening required '/includes/functions.php' (include_path='/usr/local/src/composer/vendor/phpunit/php-file-iterator:/usr/local/src/composer/vendor/phpunit/phpunit:/usr/local/src/composer/vendor/symfony/yaml:.:/usr/share/php:/usr/share/pear') in /srv/www/s3-uploads/htdocs/wp-content/plugins/s3-uploads/tests/bootstrap.php on line 21

I fixed this error by replacing the tests/bootstrap.php file with the following code:

$_tests_dir = getenv('WP_TESTS_DIR');
if ( !$_tests_dir ) $_tests_dir = '/tmp/wordpress-tests-lib';
require_once $_tests_dir . '/includes/functions.php';

function _manually_load_plugin() {
        require dirname( __FILE__ ) . '/../s3-uploads.php';
}
tests_add_filter( 'muplugins_loaded', '_manually_load_plugin' );

if ( getenv( 'S3_UPLOADS_BUCKET' ) ) {
        define( 'S3_UPLOADS_BUCKET', getenv( 'S3_UPLOADS_BUCKET' ) );
}

if ( getenv( 'S3_UPLOADS_KEY' ) ) {
        define( 'S3_UPLOADS_KEY', getenv( 'S3_UPLOADS_KEY' ) );
}

if ( getenv( 'S3_UPLOADS_SECRET' ) ) {
        define( 'S3_UPLOADS_SECRET', getenv( 'S3_UPLOADS_SECRET' ) );
}

require $_tests_dir . '/includes/bootstrap.php';

This makes the unit tests run successfully.

What I think we need to do is:

  1. Find out exactly what the cause of these errors is (is it really WP-CLI?).
  2. Find out whether we can get by without having to use the Shell script to set up unit testing.
  3. Fix any issues so that the plugin can be tested on a variety of setups (Salty WordPress, VVV, maybe even MAMP).
  4. Provide documentation for setting things up correctly.

CLI / Migrating attachments / stream_wrapper_register(): Protocol s3:// is already defined

I'm running wp s3-uploads migrate-attachments and get the above-mentioned error after every 10 uploads:

$ wp s3-uploads migrate-attachments
Success: Moved file 2015/08/Slider-FULL-EPISODE.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-150x150.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-300x189.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-233x146.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-341x214.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-600x375.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-700x438.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-140x88.jpg to S3
Success: Moved file 2015/08/Slider-FULL-EPISODE-700x241.jpg to S3
PHP Warning:  stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11
Warning: stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11
Success: Moved file 2015/07/Web-Article.jpg to S3
Success: Moved file 2015/07/Web-Article-150x150.jpg to S3
Success: Moved file 2015/07/Web-Article-300x189.jpg to S3
Success: Moved file 2015/07/Web-Article-1024x644.jpg to S3
Success: Moved file 2015/07/Web-Article-233x146.jpg to S3
Success: Moved file 2015/07/Web-Article-341x214.jpg to S3
Success: Moved file 2015/07/Web-Article-600x375.jpg to S3
Success: Moved file 2015/07/Web-Article-700x438.jpg to S3
Success: Moved file 2015/07/Web-Article-140x88.jpg to S3
Success: Moved file 2015/07/Web-Article-1140x241.jpg to S3
PHP Warning:  stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11
Warning: stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11
Success: Moved file 2015/07/INTERVIEW-PIC.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-150x150.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-300x189.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-233x146.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-341x214.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-600x375.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-700x438.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-140x88.jpg to S3
Success: Moved file 2015/07/INTERVIEW-PIC-700x241.jpg to S3
PHP Warning:  stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11
Warning: stream_wrapper_register(): Protocol s3:// is already defined. in /home/www/wp-content/plugins/S3-Uploads-master/inc/class-s3-uploads-stream-wrapper.php on line 11

S3_Uploads_Stream_Wrapper::register() calls stream_wrapper_register() without checking whether the 's3' stream wrapper already exists.

You can see here that the official AWS SDK checks whether the 's3' stream wrapper exists and, if it does, unregisters it before calling stream_wrapper_register(). Do you want me to submit a pull request with the same logic for S3_Uploads_Stream_Wrapper::register()?
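
For reference, the guard could look something like this (a sketch modelled on the SDK's behaviour; the STREAM_IS_URL flag is an assumption):

// Drop any previously registered 's3' wrapper before registering ours.
if ( in_array( 's3', stream_get_wrappers(), true ) ) {
	stream_wrapper_unregister( 's3' );
}
stream_wrapper_register( 's3', 'S3_Uploads_Stream_Wrapper', STREAM_IS_URL );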

Browsing media library is quite slow

Browsing the media library is quite slow: the infinite scroll takes several seconds (between 5 and 15) to load the next set of images.

Is this a necessary consequence of offloading images to S3, or might there be something messed up in my configuration (on the server or S3)?

Sideload images not working

As used by Press This, but I've also been using it to import images when given just the src.

The problem is that it creates a temp file and then moves it to uploads using PHP's rename() function, so you get a notice telling you the file could not be moved.

wp plugin activate does not work for PHP 5.3.9

Hello,

The error I was receiving when attempting to follow the directions was

PHP Parse error: syntax error, unexpected '[' in /usr/share/www/wp/wp-content/plugins/S3-Uploads/inc/class-s3-uploads-wp-cli-command.php on line 417

The only reason I am bringing it up here is that wp-cli's docs claim to support PHP down to 5.3.2, so this may cause issues for other people attempting to try this out!

When I upgraded to PHP 5.4, it worked as expected.

I am new to PHP, so apologies in advance. Would it be helpful for me to submit a PR documenting the minimum PHP requirement somewhere, or were you expecting it to work with 5.3.x?

Enabling the plugin via CLI

The docs indicate you need to run wp s3-uploads enable to enable the plugin; however, when run, the CLI returns the error Error: 's3-uploads enable' is not a registered wp command. See 'wp help'. Uploading also didn't work or return any errors with all the usual debugging on.

Document Contribution Instructions

The unit test suite that comes with the plugin isn't functional out of the box, as it hinges on S3 credentials either set in constants or pulled through getenv(). This works on Jenkins, where the proper environment variables are set, but not locally without help.

Using the WP-CLI unit test scaffold is great for getting started, but if Jenkins is going to be a bottleneck for contributions, then all contributors should be able to easily get tests running locally. Ideally, a quick-and-dirty setup would be documented in CONTRIBUTING.md for future devs.

Cannot use with S3 in Frankfurt Region (and China)

It gives this error when you try to verify:

PHP Warning: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. in /wp-content/plugins/S3-Uploads-master/lib/aws-sdk/Aws/S3/StreamWrapper.php on line 759

It's related to "Signature Version 4" in the Frankfurt and China regions:

In the China (Beijing) and EU (Frankfurt) regions, Amazon S3 supports only Signature Version 4, and AWS SDKs use this signature version to authenticate requests.

For reference: http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingAWSSDK.html#specify-signature-version

wp-cli commands unable to include aws-sdk

Hello,

It seems I'm having some issues:

wp s3-uploads create-iam-user --admin-key='1' --admin-secret='2'

PHP Warning:  require_once(/srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/aws-sdk/aws-autoloader.php): failed to open stream: No such file or directory in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads-wp-cli-command.php on line 142
Warning: require_once(/srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/aws-sdk/aws-autoloader.php): failed to open stream: No such file or directory in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads-wp-cli-command.php on line 142
PHP Fatal error:  require_once(): Failed opening required '/srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/aws-sdk/aws-autoloader.php' (include_path='phar:///usr/local/bin/wp/vendor/phpunit/phpunit-mock-objects:phar:///usr/local/bin/wp/vendor/phpunit/php-token-stream:phar:///usr/local/bin/wp/vendor/phpunit/php-code-coverage:phar:///usr/local/bin/wp/vendor/phpunit/phpunit:phar:///usr/local/bin/wp/vendor/symfony/yaml:.:/usr/share/php:/usr/share/pear') in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads-wp-cli-command.php on line 142
Fatal error: require_once(): Failed opening required '/srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/aws-sdk/aws-autoloader.php' (include_path='phar:///usr/local/bin/wp/vendor/phpunit/phpunit-mock-objects:phar:///usr/local/bin/wp/vendor/phpunit/php-token-stream:phar:///usr/local/bin/wp/vendor/phpunit/php-code-coverage:phar:///usr/local/bin/wp/vendor/phpunit/phpunit:phar:///usr/local/bin/wp/vendor/symfony/yaml:.:/usr/share/php:/usr/share/pear') in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads-wp-cli-command.php on line 142

If I modify line 142 to look one directory up (inside lib), it works:

require_once dirname( dirname( __FILE__ ) ) . '/lib/aws-sdk/aws-autoloader.php';

Another issue I'm having is with wp s3-uploads ls uploads:

PHP Fatal error:  Class 'Aws\Common\Aws' not found in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads.php on line 158
Fatal error: Class 'Aws\Common\Aws' not found in /srv/www/vhost/web/cms/wp-content/plugins/s3_uploads/inc/class-s3-uploads.php on line 158

This goes away (and it works) if I put require_once dirname( dirname( __FILE__ ) ) . '/lib/aws-sdk/aws-autoloader.php' at the top of the s3() function inside class-s3-uploads.php.

There has to be something wrong on my end, but I'm running out of ideas. migrate-attachments works fine.

Support for cloudfront / CDNs

Quote from the CloudFront site:

Amazon CloudFront is a content delivery web service. It integrates with other Amazon Web Services products to give developers and businesses an easy way to distribute content to end users with low latency, high data transfer speeds, and no minimum usage commitments.

Basically, CloudFront is Amazon's CDN for S3 and other Amazon products. When using S3 in production or at large scale, it is always better to use a CDN for speed. At the moment, you can kind of build your own support with the S3_UPLOADS_BUCKET_URL define. This is not documented or well supported, and has no unit tests. It would be nice if it were documented and could easily be turned on and off. We could add filters in the places where the current S3 URL is output, like wp_get_attachment_url and the_content, so that the CDN functionality could be turned off on non-production environments. This CDN functionality doesn't need to be limited to CloudFront: if it filters a URL, then you could use the same filter for other CDNs.
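
A sketch of the kind of filter being proposed (CDN_HOST is a hypothetical constant holding, e.g., a CloudFront domain, and the bucket URL is an example):

add_filter( 'wp_get_attachment_url', function ( $url ) {
	// Only rewrite when a CDN host has been configured for this environment.
	if ( defined( 'CDN_HOST' ) ) {
		$url = str_replace( 'https://my-bucket.s3.amazonaws.com', CDN_HOST, $url );
	}
	return $url;
} );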

Willing to work on this, if you all think it is a good idea...
