
sagemaker-studio-image-build-cli's Introduction

SageMaker Docker Build


This is a CLI for building Docker images in SageMaker Studio using AWS CodeBuild.

Usage

Navigate to the directory containing the Dockerfile and simply do:

sm-docker build .

Any additional arguments supported by docker build are also supported:

sm-docker build . --file /path/to/Dockerfile --build-arg foo=bar

By default, the CodeBuild project will not run within a VPC; the image will be pushed to a repository named sagemakerstudio with the tag latest; and the build will use the Studio App's execution role and the default SageMaker Python SDK S3 bucket.

These can be overridden with the relevant CLI options.

sm-docker build . --repository mynewrepo:1.0 --role SampleDockerBuildRole --bucket sagemaker-us-east-1-326543455535 --vpc-id vpc-0c70e76ef1c603b94 --subnet-ids subnet-0d984f080338960bb,subnet-0ac3e96808c8092f2 --security-group-ids sg-0d31b4042f2902cd0

The CLI will take care of packaging the current directory and uploading to S3, creating a CodeBuild project, starting a build with the S3 artifacts, tailing the build logs, and uploading the built image to ECR.
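The packaging step can be sketched as follows. This is a minimal illustration, not the CLI's actual implementation; the bucket and key names in the comment are hypothetical, and the upload requires AWS credentials, so it is shown commented out:

```shell
# Package the current directory (Dockerfile plus build context) into a
# tarball, similar to what sm-docker does before handing the artifacts
# to CodeBuild.
tar -czf /tmp/build-context.tar.gz .

# Upload the artifacts for CodeBuild to consume (hypothetical bucket/key;
# requires AWS credentials, so shown for illustration only):
# aws s3 cp /tmp/build-context.tar.gz \
#   s3://sagemaker-us-east-1-123456789012/codebuild-sagemaker-container/context.tar.gz
```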

Installing

Install the CLI using pip.

pip install sagemaker-studio-image-build

Ensure the execution role has a trust policy that allows CodeBuild to assume it:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "codebuild.amazonaws.com"
        ]
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
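One way to apply this trust policy is with the AWS CLI. This is a sketch: the role name is a placeholder, and the final command requires IAM permissions and credentials, so it is shown commented out:

```shell
# Save the trust policy to a file.
cat > /tmp/trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": ["codebuild.amazonaws.com"] },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Sanity-check that the file is valid JSON.
python3 -m json.tool /tmp/trust-policy.json > /dev/null

# Attach it to the execution role (placeholder role name; needs credentials):
# aws iam update-assume-role-policy --role-name MyStudioExecutionRole \
#   --policy-document file:///tmp/trust-policy.json
```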

The following permissions are required in the execution role to run a build in CodeBuild and push the image to ECR:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "codebuild:DeleteProject",
                "codebuild:CreateProject",
                "codebuild:BatchGetBuilds",
                "codebuild:StartBuild"
            ],
            "Resource": "arn:aws:codebuild:*:*:project/sagemaker-studio*"
        },
        {
            "Effect": "Allow",
            "Action": "logs:CreateLogStream",
            "Resource": "arn:aws:logs:*:*:log-group:/aws/codebuild/sagemaker-studio*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:GetLogEvents",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:log-group:/aws/codebuild/sagemaker-studio*:log-stream:*"
        },
        {
            "Effect": "Allow",
            "Action": "logs:CreateLogGroup",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "ecr:CreateRepository",
                "ecr:BatchGetImage",
                "ecr:CompleteLayerUpload",
                "ecr:DescribeImages",
                "ecr:DescribeRepositories",
                "ecr:UploadLayerPart",
                "ecr:ListImages",
                "ecr:InitiateLayerUpload", 
                "ecr:BatchCheckLayerAvailability",
                "ecr:PutImage"
            ],
            "Resource": "arn:aws:ecr:*:*:repository/sagemaker-studio*"
        },
        {
            "Sid": "ReadAccessToPrebuiltAwsImages",
            "Effect": "Allow",
            "Action": [
                "ecr:BatchGetImage",
                "ecr:GetDownloadUrlForLayer"
            ],
            "Resource": [
                "arn:aws:ecr:*:763104351884:repository/*",
                "arn:aws:ecr:*:217643126080:repository/*",
                "arn:aws:ecr:*:727897471807:repository/*",
                "arn:aws:ecr:*:626614931356:repository/*",
                "arn:aws:ecr:*:683313688378:repository/*",
                "arn:aws:ecr:*:520713654638:repository/*",
                "arn:aws:ecr:*:462105765813:repository/*"
            ]
        },
        {
            "Sid": "EcrAuthorizationTokenRetrieval",
            "Effect": "Allow",
            "Action": [
                "ecr:GetAuthorizationToken"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
              "s3:GetObject",
              "s3:DeleteObject",
              "s3:PutObject"
              ],
            "Resource": "arn:aws:s3:::sagemaker-*/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:CreateBucket"
            ],
            "Resource": "arn:aws:s3:::sagemaker*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "iam:GetRole",
                "iam:ListRoles"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "arn:aws:iam::*:role/*",
            "Condition": {
                "StringLikeIfExists": {
                    "iam:PassedToService": "codebuild.amazonaws.com"
                }
            }
        }
    ]
}

If you need to run your CodeBuild project within a VPC, add the following statement to the execution role that the CodeBuild project will assume:

        {
            "Sid": "VpcAccessActions",
            "Effect": "Allow",
            "Action": [
                "ec2:CreateNetworkInterface",
                "ec2:CreateNetworkInterfacePermission",
                "ec2:DescribeDhcpOptions",
                "ec2:DescribeNetworkInterfaces",
                "ec2:DeleteNetworkInterface",
                "ec2:DescribeSubnets",
                "ec2:DescribeSecurityGroups",
                "ec2:DescribeVpcs"
            ],
            "Resource": "*"
        }
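As a sketch, the VPC statement above can be added to the role as an inline policy with the AWS CLI. The role and policy names are placeholders, and the final command requires credentials, so it is shown commented out:

```shell
# Save the VPC access statement as a standalone policy document.
cat > /tmp/vpc-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VpcAccessActions",
      "Effect": "Allow",
      "Action": [
        "ec2:CreateNetworkInterface",
        "ec2:CreateNetworkInterfacePermission",
        "ec2:DescribeDhcpOptions",
        "ec2:DescribeNetworkInterfaces",
        "ec2:DeleteNetworkInterface",
        "ec2:DescribeSubnets",
        "ec2:DescribeSecurityGroups",
        "ec2:DescribeVpcs"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Sanity-check that the file is valid JSON.
python3 -m json.tool /tmp/vpc-policy.json > /dev/null

# Attach it as an inline policy (placeholder names; needs credentials):
# aws iam put-role-policy --role-name MyStudioExecutionRole \
#   --policy-name VpcAccess --policy-document file:///tmp/vpc-policy.json
```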

Development

Check out the repository, then run:

make install

Testing locally

To build locally, use one of the example Dockerfiles in the examples directory:

ROLE_NAME=<<A role in your account to use in the CodeBuild build job>>
(cd examples/basic_build && sm-docker build . --role ${ROLE_NAME} )
(cd examples/build_with_args && sm-docker build . --role ${ROLE_NAME} --file Dockerfile.args --build-arg BASE_IMAGE=python:3.8 )

Testing on SageMaker Studio

To build a bundle to use on SageMaker Studio, specify an S3 path and use the s3bundle target.

export DEV_S3_PATH_PREFIX=s3://path/to/location
black .
make -k s3bundle

From a "System Terminal" in SageMaker Studio

export DEV_S3_PATH_PREFIX=s3://path/to/location
aws s3 sync ${DEV_S3_PATH_PREFIX}/sagemaker-docker-build/dist . 
pip install sagemaker_studio_image_build-x.y.z.tar.gz

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.

sagemaker-studio-image-build-cli's People

Contributors

amazon-auto, brightsparc, dleen, jaipreet-s, trenton


sagemaker-studio-image-build-cli's Issues

The Command returned non-zero code: 137

Hi, I am using sm-docker to build my custom environment image in AWS SageMaker Studio. However, while solving the environment, the build gets killed with error code 137. From my research, this can be caused by a memory limit, and memory can be specified in the docker command; I am not sure how to do that with the sm-docker command. Is there an argument for this that I am missing? Thanks.

failed when using in Beijing region or Ningxia region

Hi,

I tried to use this in the AWS China regions (BJS and ZHY) but it failed. Checking the error log, I found the domain URL is not the right one.

The image builds successfully, but pushing it to ECR fails.

[Container] 2022/05/18 04:10:55 Running command docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
The push refers to repository [MY_AWS_ACCOUNT_ID.dkr.ecr.cn-northwest-1.amazonaws.com/unipus-anti-fraud-pipelines-sagemaker-processing-container]
Get "https://MY_AWS_ACCOUNT_ID.dkr.ecr.cn-northwest-1.amazonaws.com/v2/": dial tcp: lookup MY_AWS_ACCOUNT_ID.dkr.ecr.cn-northwest-1.amazonaws.com on 10.0.0.2:53: no such host

[Container] 2022/05/18 04:10:55 Command did not exit successfully docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG exit status 1
[Container] 2022/05/18 04:10:55 Phase complete: POST_BUILD State: FAILED
[Container] 2022/05/18 04:10:55 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG. Reason: exit status 1

Actually, for the BJS and ZHY regions there is a '.cn' at the end of the domain URL, for example "MY_AWS_ACCOUNT_ID.dkr.ecr.cn-northwest-1.amazonaws.com.cn"; this is different from the other AWS global regions.

Could you fix this issue? Thanks a lot.

[Enhancement] - Ability to specify custom buildspec template (or additional parameters in buildspec)

In an environment where custom libraries are pulled in from JFrog or even Docker Hub, we need to be able to pass in credentials. The typical way of doing this is to specify Parameter Store or Secrets Manager keys that can be used for authentication. However, this is currently not possible (unless we modify the code to do something custom).

Having additional options to pass in Parameter Store keys to use for authentication, and also being able to specify a custom buildspec, would be great.

Add support for GPU build environment

Currently the build environment is hardcoded to LINUX_CONTAINER. This will cause builds that involve compiling CUDA operations to fail (for example, Docker images that include custom PyTorch or TensorFlow operations). Adding an --environment option to change the build environment to LINUX_GPU_CONTAINER fixes this. This also requires changing the --compute-type argument to BUILD_GENERAL1_LARGE.

Request to expand the tool to China regions

Thanks for providing this useful tool. I tried it in two regions: it works for me in us-west-2 but fails in cn-northwest-1. The first problem seems to be an incorrect URI: the docker push command in the buildspec uses a hardcoded URI suffix amazonaws.com, but in the China regions the suffix should be amazonaws.com.cn.

Would you fix these problems and enable the tool in the China regions?

sm-docker build does not work when dealing with private PyPI repos

If a Dockerfile installs libraries from a private PyPI repo (a repo within a VPC), the docker build fails from the CodeBuild project, because the CodeBuild project is not configured to run within a VPC. Possible fixes:

  1. Enhance sm-docker to accept additional CodeBuild parameters so that it can accept a VpcConfig; or
  2. Enhance sm-docker to accept a config file where all the command-line options can be passed in, allowing ease of passing in more options (including authentication/authorization details for the CodeBuild project); or
  3. Allow "custom" naming of CodeBuild projects (passing the CodeBuild project name as an argument or in the config file).

pip install fails with package error

/opt/conda/lib/python3.7/site-packages/secretstorage/dhcrypto.py:16: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
/opt/conda/lib/python3.7/site-packages/secretstorage/util.py:25: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
Collecting sagemaker-studio-image-build
  Using cached sagemaker_studio_image_build-0.6.0.tar.gz (13 kB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "<string>", line 36, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-6vqr1td2/sagemaker-studio-image-build_802718701392414ca444c646e5991c5c/setup.py", line 33, in <module>
          package_data={"sagemaker_studio_image_build": ["*.yml", "data/**"]},
        File "/opt/conda/lib/python3.7/site-packages/setuptools/__init__.py", line 87, in setup
          return distutils.core.setup(**attrs)
        File "/opt/conda/lib/python3.7/site-packages/setuptools/_distutils/core.py", line 109, in setup
          _setup_distribution = dist = klass(attrs)
        File "/opt/conda/lib/python3.7/site-packages/setuptools/dist.py", line 466, in __init__
          for k, v in attrs.items()
        File "/opt/conda/lib/python3.7/site-packages/setuptools/_distutils/dist.py", line 293, in __init__
          self.finalize_options()
        File "/opt/conda/lib/python3.7/site-packages/setuptools/dist.py", line 885, in finalize_options
          for ep in sorted(loaded, key=by_order):
        File "/opt/conda/lib/python3.7/site-packages/setuptools/dist.py", line 884, in <lambda>
          loaded = map(lambda e: e.load(), filtered)
        File "/opt/conda/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 196, in load
          return functools.reduce(getattr, attrs, module)
      AttributeError: type object 'Distribution' has no attribute '_finalize_feature_opts'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

Pushing to ECR times out when --repository flag is used

Building a container with the default naming works, but using a custom name via the --repository flag leads to a timeout. The repository will be created, but a new image version will not be pushed.

Steps to reproduce:
sm-docker build . --role YOUR_ROLE --repository CUSTOM_NAME:VERSION

You will get a log similar to the below:

...[Container] 2023/01/31 14:14:43 Waiting for agent ping

[Container] 2023/01/31 14:14:44 Waiting for DOWNLOAD_SOURCE
[Container] 2023/01/31 14:14:47 Phase is DOWNLOAD_SOURCE
[Container] 2023/01/31 14:14:47 CODEBUILD_SRC_DIR=/codebuild/output/src789716962/src
[Container] 2023/01/31 14:14:47 YAML location is /codebuild/output/src789716962/src/buildspec.yml
[Container] 2023/01/31 14:14:47 Setting HTTP client timeout to higher timeout for S3 source
[Container] 2023/01/31 14:14:47 Processing environment variables
[Container] 2023/01/31 14:14:47 No runtime version selected in buildspec.
[Container] 2023/01/31 14:14:47 Moving to directory /codebuild/output/src789716962/src
[Container] 2023/01/31 14:14:48 Configuring ssm agent with target id: codebuild:17ad40cb-2840-4a49-a1ec-47b176de5fe9
[Container] 2023/01/31 14:14:48 Successfully updated ssm agent configuration
[Container] 2023/01/31 14:14:48 Registering with agent
[Container] 2023/01/31 14:14:48 Phases found in YAML: 3
[Container] 2023/01/31 14:14:48  POST_BUILD: 3 commands
[Container] 2023/01/31 14:14:48  PRE_BUILD: 9 commands
[Container] 2023/01/31 14:14:48  BUILD: 4 commands
[Container] 2023/01/31 14:14:48 Phase complete: DOWNLOAD_SOURCE State: SUCCEEDED
[Container] 2023/01/31 14:14:48 Phase context status code:  Message:
[Container] 2023/01/31 14:14:48 Entering phase INSTALL
[Container] 2023/01/31 14:14:48 Phase complete: INSTALL State: SUCCEEDED
[Container] 2023/01/31 14:14:48 Phase context status code:  Message:
[Container] 2023/01/31 14:14:48 Entering phase PRE_BUILD
[Container] 2023/01/31 14:14:48 Running command echo Logging in to Amazon ECR...
Logging in to Amazon ECR...

[Container] 2023/01/31 14:14:48 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:04 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 763104351884)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:05 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 217643126080)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:05 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 727897471807)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:06 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 626614931356)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:06 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 683313688378)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:07 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 520713654638)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:07 Running command $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION --registry-ids 462105765813)
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded

[Container] 2023/01/31 14:15:08 Phase complete: PRE_BUILD State: SUCCEEDED
[Container] 2023/01/31 14:15:08 Phase context status code:  Message:
[Container] 2023/01/31 14:15:08 Entering phase BUILD
[Container] 2023/01/31 14:15:08 Running command echo Build started on `date`
Build started on Tue Jan 31 14:15:08 UTC 2023

[Container] 2023/01/31 14:15:08 Running command echo Building the Docker image...
Building the Docker image...

[Container] 2023/01/31 14:15:08 Running command docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
Sending build context to Docker daemon  6.656kB
Step 1/13 : FROM python:3.10
3.10: Pulling from library/python
bbeef03cda1f: Pulling fs layer
f049f75f014e: Pulling fs layer
56261d0e6b05: Pulling fs layer
9bd150679dbd: Pulling fs layer
5b282ee9da04: Pulling fs layer
03f027d5e312: Pulling fs layer
33acf7002bd0: Pulling fs layer
b577b9b74834: Pulling fs layer
2761e6c6b897: Pulling fs layer
03f027d5e312: Waiting
9bd150679dbd: Waiting
5b282ee9da04: Waiting
33acf7002bd0: Waiting
b577b9b74834: Waiting
2761e6c6b897: Waiting
56261d0e6b05: Verifying Checksum
56261d0e6b05: Download complete
f049f75f014e: Verifying Checksum
f049f75f014e: Download complete
bbeef03cda1f: Verifying Checksum
bbeef03cda1f: Download complete
9bd150679dbd: Verifying Checksum
9bd150679dbd: Download complete
33acf7002bd0: Verifying Checksum
33acf7002bd0: Download complete
03f027d5e312: Verifying Checksum
03f027d5e312: Download complete
b577b9b74834: Verifying Checksum
b577b9b74834: Download complete
2761e6c6b897: Verifying Checksum
2761e6c6b897: Download complete
5b282ee9da04: Verifying Checksum
5b282ee9da04: Download complete
bbeef03cda1f: Pull complete
f049f75f014e: Pull complete
56261d0e6b05: Pull complete
9bd150679dbd: Pull complete
5b282ee9da04: Pull complete
03f027d5e312: Pull complete
33acf7002bd0: Pull complete
b577b9b74834: Pull complete
2761e6c6b897: Pull complete
Digest: sha256:5ef345608493927ad12515e75ebe0004f5633dd5d7b08c13c52c3432e9a7963a
Status: Downloaded newer image for python:3.10
 ---> 13ad26b9696b
Step 2/13 : ARG NB_USER="sagemaker-user"
 ---> Running in d10ce2cc52a1
Removing intermediate container d10ce2cc52a1
 ---> 083417691092
Step 3/13 : ARG NB_UID="1000"
 ---> Running in acaea6b1ebcf
Removing intermediate container acaea6b1ebcf
 ---> 2047404a41c8
Step 4/13 : ARG NB_GID="100"
 ---> Running in dba243ad30b0
Removing intermediate container dba243ad30b0
 ---> 15678eec1de9
Step 5/13 : RUN     apt-get update &&     apt-get install -y sudo &&     useradd -m -s /bin/bash -N -u $NB_UID $NB_USER &&     chmod g+w /etc/passwd &&     echo "${NB_USER}    ALL=(ALL)    NOPASSWD:    ALL" >> /etc/sudoers &&     rm -rf /var/lib/apt/lists/*
 ---> Running in e86e2943bf13
Get:1 http://deb.debian.org/debian bullseye InRelease [116 kB]
Get:2 http://deb.debian.org/debian-security bullseye-security InRelease [48.4 kB]
Get:3 http://deb.debian.org/debian bullseye-updates InRelease [44.1 kB]
Get:4 http://deb.debian.org/debian bullseye/main amd64 Packages [8183 kB]
Get:5 http://deb.debian.org/debian-security bullseye-security/main amd64 Packages [218 kB]
Get:6 http://deb.debian.org/debian bullseye-updates/main amd64 Packages [14.6 kB]
Fetched 8624 kB in 1s (7644 kB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
The following NEW packages will be installed:
  sudo
0 upgraded, 1 newly installed, 0 to remove and 10 not upgraded.
Need to get 1061 kB of archives.
After this operation, 4699 kB of additional disk space will be used.
Get:1 http://deb.debian.org/debian-security bullseye-security/main amd64 sudo amd64 1.9.5p2-3+deb11u1 [1061 kB]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 1061 kB in 0s (51.1 MB/s)
Selecting previously unselected package sudo.
(Reading database ... 23422 files and directories currently installed.)
Preparing to unpack .../sudo_1.9.5p2-3+deb11u1_amd64.deb ...
Unpacking sudo (1.9.5p2-3+deb11u1) ...
Setting up sudo (1.9.5p2-3+deb11u1) ...
invoke-rc.d: could not determine current runlevel
invoke-rc.d: policy-rc.d denied execution of start.
Removing intermediate container e86e2943bf13
 ---> 2f2b930385c7
Step 6/13 : RUN pip install poetry
 ---> Running in 3da32ff6be18
Collecting poetry
  Downloading poetry-1.3.2-py3-none-any.whl (218 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 218.9/218.9 kB 33.9 MB/s eta 0:00:00
Collecting urllib3<2.0.0,>=1.26.0
  Downloading urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.6/140.6 kB 43.7 MB/s eta 0:00:00
Collecting keyring<24.0.0,>=23.9.0
  Downloading keyring-23.13.1-py3-none-any.whl (37 kB)
Collecting requests<3.0,>=2.18
  Downloading requests-2.28.2-py3-none-any.whl (62 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.8/62.8 kB 17.9 MB/s eta 0:00:00
Collecting requests-toolbelt<0.11.0,>=0.9.1
  Downloading requests_toolbelt-0.10.1-py2.py3-none-any.whl (54 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.5/54.5 kB 21.2 MB/s eta 0:00:00
Collecting jsonschema<5.0.0,>=4.10.0
  Downloading jsonschema-4.17.3-py3-none-any.whl (90 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.4/90.4 kB 18.0 MB/s eta 0:00:00
Collecting shellingham<2.0,>=1.5
  Downloading shellingham-1.5.0.post1-py2.py3-none-any.whl (9.4 kB)
Collecting tomli<3.0.0,>=2.0.1
  Downloading tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting poetry-core==1.4.0
  Downloading poetry_core-1.4.0-py3-none-any.whl (546 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 546.4/546.4 kB 60.5 MB/s eta 0:00:00
Collecting dulwich<0.21.0,>=0.20.46
  Downloading dulwich-0.20.50-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (499 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 500.0/500.0 kB 70.3 MB/s eta 0:00:00
Collecting trove-classifiers>=2022.5.19
  Downloading trove_classifiers-2023.1.20-py3-none-any.whl (13 kB)
Collecting cleo<3.0.0,>=2.0.0
  Downloading cleo-2.0.1-py3-none-any.whl (77 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 77.3/77.3 kB 24.0 MB/s eta 0:00:00
Collecting cachecontrol[filecache]<0.13.0,>=0.12.9
  Downloading CacheControl-0.12.11-py2.py3-none-any.whl (21 kB)
Collecting pexpect<5.0.0,>=4.7.0
  Downloading pexpect-4.8.0-py2.py3-none-any.whl (59 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.0/59.0 kB 20.3 MB/s eta 0:00:00
Collecting virtualenv!=20.4.5,!=20.4.6,<21.0.0,>=20.4.3
  Downloading virtualenv-20.17.1-py3-none-any.whl (8.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.8/8.8 MB 118.1 MB/s eta 0:00:00
Collecting html5lib<2.0,>=1.0
  Downloading html5lib-1.1-py2.py3-none-any.whl (112 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.2/112.2 kB 37.4 MB/s eta 0:00:00
Collecting poetry-plugin-export<2.0.0,>=1.2.0
  Downloading poetry_plugin_export-1.3.0-py3-none-any.whl (10 kB)
Collecting pkginfo<2.0,>=1.5
  Downloading pkginfo-1.9.6-py3-none-any.whl (30 kB)
Collecting lockfile<0.13.0,>=0.12.2
  Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB)
Collecting tomlkit!=0.11.2,!=0.11.3,<1.0.0,>=0.11.1
  Downloading tomlkit-0.11.6-py3-none-any.whl (35 kB)
Collecting platformdirs<3.0.0,>=2.5.2
  Downloading platformdirs-2.6.2-py3-none-any.whl (14 kB)
Collecting packaging>=20.4
  Downloading packaging-23.0-py3-none-any.whl (42 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.7/42.7 kB 12.8 MB/s eta 0:00:00
Collecting crashtest<0.5.0,>=0.4.1
  Downloading crashtest-0.4.1-py3-none-any.whl (7.6 kB)
Collecting filelock<4.0.0,>=3.8.0
  Downloading filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting msgpack>=0.5.2
  Downloading msgpack-1.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (316 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 317.0/317.0 kB 68.5 MB/s eta 0:00:00
Collecting rapidfuzz<3.0.0,>=2.2.0
  Downloading rapidfuzz-2.13.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 MB 114.0 MB/s eta 0:00:00
Collecting six>=1.9
  Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting webencodings
  Downloading webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0
  Downloading pyrsistent-0.19.3-py3-none-any.whl (57 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.5/57.5 kB 20.8 MB/s eta 0:00:00
Collecting attrs>=17.4.0
  Downloading attrs-22.2.0-py3-none-any.whl (60 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.0/60.0 kB 21.1 MB/s eta 0:00:00
Collecting jeepney>=0.4.2
  Downloading jeepney-0.8.0-py3-none-any.whl (48 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 48.4/48.4 kB 16.1 MB/s eta 0:00:00
Collecting jaraco.classes
  Downloading jaraco.classes-3.2.3-py3-none-any.whl (6.0 kB)
Collecting SecretStorage>=3.2
  Downloading SecretStorage-3.3.3-py3-none-any.whl (15 kB)
Collecting importlib-metadata>=4.11.4
  Downloading importlib_metadata-6.0.0-py3-none-any.whl (21 kB)
Collecting ptyprocess>=0.5
  Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
Collecting charset-normalizer<4,>=2
  Downloading charset_normalizer-3.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (198 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 198.8/198.8 kB 55.1 MB/s eta 0:00:00
Collecting idna<4,>=2.5
  Downloading idna-3.4-py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 kB 21.8 MB/s eta 0:00:00
Collecting certifi>=2017.4.17
  Downloading certifi-2022.12.7-py3-none-any.whl (155 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 kB 48.1 MB/s eta 0:00:00
Collecting distlib<1,>=0.3.6
  Downloading distlib-0.3.6-py2.py3-none-any.whl (468 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.5/468.5 kB 94.4 MB/s eta 0:00:00
Collecting zipp>=0.5
  Downloading zipp-3.12.0-py3-none-any.whl (6.6 kB)
Collecting cryptography>=2.0
  Downloading cryptography-39.0.0-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.2/4.2 MB 128.9 MB/s eta 0:00:00
Collecting more-itertools
  Downloading more_itertools-9.0.0-py3-none-any.whl (52 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.8/52.8 kB 20.1 MB/s eta 0:00:00
Collecting cffi>=1.12
  Downloading cffi-1.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (441 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 441.8/441.8 kB 77.9 MB/s eta 0:00:00
Collecting pycparser
  Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.7/118.7 kB 39.4 MB/s eta 0:00:00
Installing collected packages: webencodings, trove-classifiers, ptyprocess, msgpack, lockfile, distlib, charset-normalizer, zipp, urllib3, tomlkit, tomli, six, shellingham, rapidfuzz, pyrsistent, pycparser, poetry-core, platformdirs, pkginfo, pexpect, packaging, more-itertools, jeepney, idna, filelock, crashtest, certifi, attrs, virtualenv, requests, jsonschema, jaraco.classes, importlib-metadata, html5lib, dulwich, cleo, cffi, requests-toolbelt, cryptography, cachecontrol, SecretStorage, keyring, poetry-plugin-export, poetry
Successfully installed SecretStorage-3.3.3 attrs-22.2.0 cachecontrol-0.12.11 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cleo-2.0.1 crashtest-0.4.1 cryptography-39.0.0 distlib-0.3.6 dulwich-0.20.50 filelock-3.9.0 html5lib-1.1 idna-3.4 importlib-metadata-6.0.0 jaraco.classes-3.2.3 jeepney-0.8.0 jsonschema-4.17.3 keyring-23.13.1 lockfile-0.12.2 more-itertools-9.0.0 msgpack-1.0.4 packaging-23.0 pexpect-4.8.0 pkginfo-1.9.6 platformdirs-2.6.2 poetry-1.3.2 poetry-core-1.4.0 poetry-plugin-export-1.3.0 ptyprocess-0.7.0 pycparser-2.21 pyrsistent-0.19.3 rapidfuzz-2.13.7 requests-2.28.2 requests-toolbelt-0.10.1 shellingham-1.5.0.post1 six-1.16.0 tomli-2.0.1 tomlkit-0.11.6 trove-classifiers-2023.1.20 urllib3-1.26.14 virtualenv-20.17.1 webencodings-0.5.1 zipp-3.12.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

[notice] A new release of pip available: 22.3.1 -> 23.0
[notice] To update, run: pip install --upgrade pip
Removing intermediate container 3da32ff6be18
 ---> f407d281cd37
Step 7/13 : RUN poetry config virtualenvs.create false --local
 ---> Running in 640b22c6cbe7
Removing intermediate container 640b22c6cbe7
 ---> 50469dcd0c66
Step 8/13 : COPY pyproject.toml /
 ---> cb77f4bfead5
Step 9/13 : RUN poetry install
 ---> Running in 12842688843e
Skipping virtualenv creation, as specified in config file.
Updating dependencies
Resolving dependencies...

Writing lock file

Package operations: 25 installs, 0 updates, 0 removals

  • Installing asttokens (2.2.1)
  • Installing executing (1.2.0)
  • Installing parso (0.8.3)
  • Installing pure-eval (0.2.2)
  • Installing traitlets (5.9.0)
  • Installing wcwidth (0.2.6)
  • Installing backcall (0.2.0)
  • Installing decorator (5.1.1)
  • Installing jedi (0.18.2)
  • Installing jupyter-core (5.2.0)
  • Installing matplotlib-inline (0.1.6)
  • Installing pickleshare (0.7.5)
  • Installing prompt-toolkit (3.0.36)
  • Installing pygments (2.14.0)
  • Installing python-dateutil (2.8.2)
  • Installing pyzmq (25.0.0)
  • Installing stack-data (0.6.2)
  • Installing tornado (6.2)
  • Installing ipython (8.9.0)
  • Installing ipython-genutils (0.2.0)
  • Installing jupyter-client (8.0.2)
  • Installing pytz (2022.7.1)
  • Installing numpy (1.24.1)
  • Installing ipykernel (5.5.6)
  • Installing pandas (1.5.3)
Removing intermediate container 12842688843e
 ---> c2ad7e61633d
Step 10/13 : RUN python -m ipykernel install --sys-prefix
 ---> Running in d3dc05fde23c
Installed kernelspec python3 in /usr/local/share/jupyter/kernels/python3
Removing intermediate container d3dc05fde23c
 ---> 5996d0014174
Step 11/13 : CMD jupyter kernelspec list --json
 ---> Running in 46cebaafe848
Removing intermediate container 46cebaafe848
 ---> 6c9cb62b063a
Step 12/13 : ENV SHELL=/bin/bash
 ---> Running in 5c1f61a50603
Removing intermediate container 5c1f61a50603
 ---> 9e4e7e6a0144
Step 13/13 : USER $NB_UID
 ---> Running in a5909f39cda6
Removing intermediate container a5909f39cda6
 ---> 7f9f547132c6
Successfully built 7f9f547132c6
Successfully tagged poetry-2:2.0

[Container] 2023/01/31 14:16:07 Running command docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG

[Container] 2023/01/31 14:16:07 Phase complete: BUILD State: SUCCEEDED
[Container] 2023/01/31 14:16:07 Phase context status code:  Message:
[Container] 2023/01/31 14:16:07 Entering phase POST_BUILD
[Container] 2023/01/31 14:16:07 Running command echo Build completed on `date`
Build completed on Tue Jan 31 14:16:07 UTC 2023

[Container] 2023/01/31 14:16:07 Running command echo Pushing the Docker image...
Pushing the Docker image...

[Container] 2023/01/31 14:16:07 Running command docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
The push refers to repository [718026778991.dkr.ecr.us-east-1.amazonaws.com/poetry-2]
d2fdc4294ee4: Preparing
409a963c4bb0: Preparing
fb798a98dc82: Preparing
919b9aaa34c1: Preparing
d61ce5953464: Preparing
1598ddcd2a10: Preparing
14f8c1c57058: Preparing
7c9f55d641e2: Preparing
2a5e0ed31f5a: Preparing
dc6462f7bb8b: Preparing
a4db1a405763: Preparing
9f4f964da727: Preparing
49b333f7bad4: Preparing
a463dbda4664: Preparing
a9099c3159f5: Preparing
1598ddcd2a10: Waiting
14f8c1c57058: Waiting
7c9f55d641e2: Waiting
2a5e0ed31f5a: Waiting
dc6462f7bb8b: Waiting
a4db1a405763: Waiting
9f4f964da727: Waiting
49b333f7bad4: Waiting
a463dbda4664: Waiting
a9099c3159f5: Waiting
409a963c4bb0: Retrying in 5 seconds
fb798a98dc82: Retrying in 5 seconds
d2fdc4294ee4: Retrying in 5 seconds
d61ce5953464: Retrying in 5 seconds
919b9aaa34c1: Retrying in 5 seconds
[... identical per-layer retry countdowns repeated, with the backoff growing to 10, 15, and then 20 seconds ...]
409a963c4bb0: Retrying in 1 second
fb798a98dc82: Retrying in 1 second
d2fdc4294ee4: Retrying in 1 second
d61ce5953464: Retrying in 1 second
919b9aaa34c1: Retrying in 1 second
EOF

[Container] 2023/01/31 14:16:57 Command did not exit successfully docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG exit status 1
[Container] 2023/01/31 14:16:57 Phase complete: POST_BUILD State: FAILED
[Container] 2023/01/31 14:16:57 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG. Reason: exit status 1

Image building error in protobuf 4.x versions

I encountered the following error while building the image. As the error message suggests, downgrading protobuf to 3.20.3 resolves the issue, but the build no longer works with protobuf 4.x (4.20 and later).

Building container image and pushing to ECR
Traceback (most recent call last):
  File "/opt/conda/envs/studio/bin/sm-docker", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker_studio_image_build/cli.py", line 133, in main
    args.func(args, unknown)
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker_studio_image_build/cli.py", line 74, in build_image
    args.repository, get_role(args), args.bucket, args.compute_type,
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker_studio_image_build/cli.py", line 46, in get_role
    import sagemaker
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/__init__.py", line 18, in <module>
    from sagemaker import estimator, parameter, tuner  # noqa: F401
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/estimator.py", line 27, in <module>
    from sagemaker import git_utils, image_uris, vpc_utils
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/image_uris.py", line 24, in <module>
    from sagemaker.spark import defaults
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/spark/__init__.py", line 16, in <module>
    from sagemaker.spark.processing import PySparkProcessor, SparkJarProcessor  # noqa: F401
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/spark/processing.py", line 35, in <module>
    from sagemaker.local.image import _ecr_login_if_needed, _pull_image
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/local/__init__.py", line 16, in <module>
    from .local_session import (  # noqa: F401
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/local/local_session.py", line 23, in <module>
    from sagemaker.local.image import _SageMakerContainer
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/local/image.py", line 38, in <module>
    import sagemaker.local.data
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/local/data.py", line 26, in <module>
    import sagemaker.amazon.common
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/amazon/common.py", line 23, in <module>
    from sagemaker.amazon.record_pb2 import Record
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/sagemaker/amazon/record_pb2.py", line 36, in <module>
    _descriptor.FieldDescriptor(
  File "/opt/conda/envs/studio/lib/python3.9/site-packages/google/protobuf/descriptor.py", line 561, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:

  1. Downgrade the protobuf package to 3.20.x or lower.
  2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
make: *** [container] Error 1

Any way to specify additional ECR registries to log in to?

I'm trying to sm-docker build a container derived from SageMaker Scikit-Learn framework container in ap-southeast-1, something like the following:

base_docker_uri = sagemaker.image_uris.retrieve(
    sagemaker.sklearn.defaults.SKLEARN_NAME,
    smsess.boto_region_name,
    version="0.23-1",
    instance_type="ml.m5.xlarge",
)
# 121021644041.dkr.ecr.ap-southeast-1.amazonaws.com/sagemaker-scikit-learn:0.23-1-cpu-py3

...so Dockerfile is FROM 121021644041.dkr....etc

Seems like the CLI tool spins up successfully and logs in to a number of other ECR registries, but not 121021644041, and then fails on step 1 with:

[Container] 2021/04/20 02:54:22 Running command docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
Sending build context to Docker daemon   7.68kB
Step 1/2 : FROM 121021644041.dkr.ecr.ap-southeast-1.amazonaws.com/sagemaker-scikit-learn:0.23-1-cpu-py3
Get https://121021644041.dkr.ecr.ap-southeast-1.amazonaws.com/v2/sagemaker-scikit-learn/manifests/0.23-1-cpu-py3: no basic auth credentials

[Container] 2021/04/20 02:54:22 Command did not exit successfully docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG . exit status 1

I've since tested and on a SageMaker Notebook Instance I can build the same Dockerfile fine, so long as I log in to the 121021644041 ECR first.

From a cursory look at the job logs and #12, it looks like the current strategy is for the tool to run an ECR login against every AWS account that publishes AWS Deep Learning Containers?

...So would the correct fix be to add every account ID listed here to support SKLearn?

I was thinking it might be preferable to also add a way for users to indicate extra required account IDs through the CLI, since:

  • This list is going to start getting pretty long
  • (As in #12 and this issue) there will likely always be some gaps that introduce bugs
  • In some rarer cases, I guess users might have private cross-account ECR needs too?
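For discussion, here is a rough sketch of how required registries could be detected instead of hard-coding account IDs: parse the registry host out of each FROM image URI and log in to exactly those. The ecr_registry helper below is hypothetical, not part of the CLI:

```python
import re


def ecr_registry(image_uri):
    """Return the ECR registry host of an image URI, or None if it isn't ECR.

    Hypothetical helper; the CLI does not currently expose anything like this.
    """
    m = re.match(r"(\d{12}\.dkr\.ecr\.[a-z0-9-]+\.amazonaws\.com)/", image_uri)
    return m.group(1) if m else None


uri = "121021644041.dkr.ecr.ap-southeast-1.amazonaws.com/sagemaker-scikit-learn:0.23-1-cpu-py3"
print(ecr_registry(uri))  # 121021644041.dkr.ecr.ap-southeast-1.amazonaws.com
```

Each host found this way could then be fed to a docker login step (e.g. via aws ecr get-login-password) in the generated buildspec.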

UnicodeEncodeError in zip.write()

I'm attempting to get this sample (which builds a container image from notebook in the "Define a SageMaker Model Monitor schedule" section) running in SageMaker Studio, using the new CLI.

Essentially there is a ./docker/ folder next to my notebook containing just a Dockerfile and evaluation.py script.

However when I run:

!sm-docker build ./docker --file ./docker/Dockerfile --repository sagemaker-processing-container:latest

(Or same without specifying the --file or --repository options, or omitting the :latest tag) I get the following error:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/zipfile.py", line 432, in _encodeFilenameFlags
    return self.filename.encode('ascii'), self.flag_bits
UnicodeEncodeError: 'ascii' codec can't encode characters in position 11-31: ordinal not in range(128)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/bin/sm-docker", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.6/site-packages/sagemaker_studio_image_build/cli.py", line 92, in main
    args.func(args, unknown)
  File "/opt/conda/lib/python3.6/site-packages/sagemaker_studio_image_build/cli.py", line 53, in build_image
    args.repository, get_role(args), args.bucket, extra_args, log=not args.no_logs
  File "/opt/conda/lib/python3.6/site-packages/sagemaker_studio_image_build/builder.py", line 68, in build_image
    bucket, key = upload_zip_file(repository, bucket, " ".join(extra_args))
  File "/opt/conda/lib/python3.6/site-packages/sagemaker_studio_image_build/builder.py", line 39, in upload_zip_file
    zip.write(f"{dirname}/{file}")
  File "/opt/conda/lib/python3.6/zipfile.py", line 1622, in write
    with open(filename, "rb") as src, self.open(zinfo, 'w') as dest:
  File "/opt/conda/lib/python3.6/zipfile.py", line 1355, in open
    return self._open_to_write(zinfo, force_zip64=force_zip64)
  File "/opt/conda/lib/python3.6/zipfile.py", line 1468, in _open_to_write
    self.fp.write(zinfo.FileHeader(zip64))
  File "/opt/conda/lib/python3.6/zipfile.py", line 422, in FileHeader
    filename, flag_bits = self._encodeFilenameFlags()
  File "/opt/conda/lib/python3.6/zipfile.py", line 434, in _encodeFilenameFlags
    return self.filename.encode('utf-8'), self.flag_bits | 0x800
UnicodeEncodeError: 'utf-8' codec can't encode characters in position 11-31: surrogates not allowed

It's a weird error, so I could well be doing something stupid, but I'm wondering if there's an implicitly assumed encoding somewhere that's clashing with this kernel's environment?
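For what it's worth, the "surrogates not allowed" part of the traceback can be reproduced without any visible special characters: when Python decodes undecodable filename bytes with surrogateescape, the resulting string cannot be re-encoded as UTF-8. A minimal sketch (the byte sequence here is an assumption for illustration, not my actual filename):

```python
# A filename containing undecodable bytes, decoded the way Python decodes
# filesystem names, yields lone surrogate code points...
bad = b"evaluation-\xff\xfe.py".decode("utf-8", errors="surrogateescape")

# ...which zipfile then cannot encode back to UTF-8:
try:
    bad.encode("utf-8")
except UnicodeEncodeError as e:
    print(e.reason)  # surrogates not allowed

# One possible sanitization before zipping:
safe = bad.encode("utf-8", errors="replace").decode("utf-8")
print(safe)  # evaluation-??.py
```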

I don't have any special chars in filenames, and am running Studio kernel Python 3 (PyTorch CPU Optimized).

Any ideas or insights greatly appreciated!

Full steps to reproduce

(From the referenced public sample above)

  • Add this package to the set of pip installs at the top
  • Replace the ! unzip ... command with something like the following (since Studio kernels don't have unzip installed by default)
import zipfile
with zipfile.ZipFile('GTSRB_Final_Test_Images.zip', 'r') as zip_ref:
    print('Unzipping...')
    zip_ref.extractall()
  • Split the cell containing # Create ECR repository and push docker image: Execute just the first (Python) half and run the above sm-docker command instead of the sample's !docker build ... line.

Command did not exit successfully docker push

I'm getting the error "Command did not exit successfully docker push" after the !sm-docker build . command at the stage where it is pushing the Docker image. Any idea why?

[Container] 2021/05/04 06:57:20 Running command echo Pushing the Docker image...
Pushing the Docker image...

[Container] 2021/05/04 06:57:20 Running command docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
The push refers to repository [752731038471.dkr.ecr.eu-central-1.amazonaws.com/sagemaker-studio-d-tfbogtriaiml]
An image does not exist locally with the tag: 752731038471.dkr.ecr.eu-central-1.amazonaws.com/sagemaker-studio-d-tfbogtriaiml

[Container] 2021/05/04 06:57:20 Command did not exit successfully docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG exit status 1
[Container] 2021/05/04 06:57:20 Phase complete: POST_BUILD State: FAILED
[Container] 2021/05/04 06:57:20 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG. Reason: exit status 1

Using sm-docker too often triggers throttling on dockerhub

After running sm-docker for 3-4 times in the same hour I get the error:
toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit

[Feature Request] Add support for cache usage

It would be great if we could manipulate the cache argument of CodeBuild's create_project so that builds take significantly less time. This is especially painful when developing containers, where minor corrections/iterations can take 5-10 minutes each.

CodeBuild supports different options for cache usage that can be indicated as an argument (cache) in the create_project method of the boto3 codebuild client (https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/codebuild/client/create_project.html)

...
    cache={
        'type': 'NO_CACHE'|'S3'|'LOCAL',
        'location': 'string',
        'modes': [
            'LOCAL_DOCKER_LAYER_CACHE'|'LOCAL_SOURCE_CACHE'|'LOCAL_CUSTOM_CACHE',
        ]
    },
...

We could leverage this argument to further optimize the CodeBuild integration with SageMaker.

[Enhancement] - Ability to specify a specific bucket prefix to upload code to

As far as I can tell from the source, the user has no way to control the prefix of the uploaded code. s3.upload_fileobj() is used, but the key, which would normally carry the prefix, is hardcoded as shown in the linked line. I suggest either adding a command-line option such as --prefix, or extending --bucket so that my-bucket-name/prefix is parsed and the prefix is prepended to the key parameter.

https://github.com/aws-samples/sagemaker-studio-image-build-cli/blob/master/sagemaker_studio_image_build/builder.py#L30
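For illustration, the --bucket my-bucket-name/prefix idea could be parsed with something like the following; split_bucket_arg is a hypothetical helper and the key name here is made up:

```python
def split_bucket_arg(arg):
    """Split a hypothetical --bucket "name/prefix" value into (bucket, prefix)."""
    bucket, _, prefix = arg.partition("/")
    return bucket, prefix.strip("/")


bucket, prefix = split_bucket_arg("my-bucket-name/team/image-builds")
# Prepend the prefix (when present) to the upload key:
key = "/".join(p for p in (prefix, "docker-context.zip") if p)
print(bucket)  # my-bucket-name
print(key)     # team/image-builds/docker-context.zip
```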

[Question] sm-docker zips and uploads the current dir, not the target dir?

From my investigations into #2, I saw that sm-docker is zipping and uploading the current folder, not the target folder, when a target is given. E.g:

sm-docker build ./docker
# Uploads all of ./ (not just ./docker) to S3

Is this expected and deliberate for some reason? If you have a lot of other content in your folder, it makes for a significantly slower wait than the (also working) cd ./docker && sm-docker build .
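If the behavior is unintentional, the zip could be rooted at the target directory instead. A minimal sketch of that (not the CLI's actual upload_zip_file code):

```python
import os
import zipfile


def zip_build_context(target, zip_path):
    """Zip only the files under `target`, with archive paths relative to it."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for dirname, _, files in os.walk(target):
            for name in files:
                full = os.path.join(dirname, name)
                # relpath keeps the Dockerfile at the archive root
                zf.write(full, arcname=os.path.relpath(full, target))
```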

Set build environment type as new arg

When building a conda environment in the Dockerfile I am getting the following error message:

CondaMemoryError: The conda process ran out of memory. Increase system memory and/or try again.

Would it be possible to add an argument to set the build environment type, as described here?
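For context, CodeBuild exposes the environment size on create_project; below is a hypothetical sketch of the argument the CLI would need to surface. Field names come from the boto3 CodeBuild API, but the specific image name and compute size are assumptions, not what the CLI actually uses:

```python
# Hypothetical `environment` argument for codebuild.create_project(...);
# a larger computeType would give conda more memory during the solve.
environment = {
    "type": "LINUX_CONTAINER",
    "image": "aws/codebuild/standard:6.0",   # assumed build image
    "computeType": "BUILD_GENERAL1_LARGE",   # instead of the default size
    "privilegedMode": True,                  # required to run Docker builds
}
```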
