Comments (10)
Is the code in an S3 bucket, or is it embedded into the template? Not that I can make the math work for 41,655,897 bytes against the 4 MB limit.
from aws-cloudformation-github-deploy.
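The size limit in question can be checked locally before deploying; a minimal sketch (51,200 bytes is CloudFormation's documented cap for an inline template body; anything bigger has to be handed over as an S3 URL):

```shell
# Minimal sketch: CloudFormation rejects an inline TemplateBody over
# 51,200 bytes; larger templates must be referenced via an S3 URL.
template_too_big_for_body() {
  size=$(wc -c < "$1")
  if [ "$size" -gt 51200 ]; then echo yes; else echo no; fi
}
```

If the packaged template crosses that line, the deploy action must receive it through a `template:` URL pointing at S3 rather than as an inline body.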
I'm guessing it's embedded in the template, because I don't see where it would be uploaded. The weird thing is, in a working project, I don't see that code either. The only S3 upload or copy command I see is when we `s3 cp` our lambda authenticator, or when we upload our template.yml.
Do you see anything glaringly obvious I am missing?
template.yml:

```yaml
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: functions/myFunction
    Description: "Lambda function to process files"
    FunctionName: !Sub '${myLambdaFunctionName}${sSuffix}'
    Handler: app.lambda_handler
    Runtime: python3.9
    MemorySize: 1000
    Timeout: 300
    Role: !GetAtt IAMRole.Arn
    PermissionsBoundary: !Sub "arn:aws:iam::${AWS::AccountId}:policy/Boundary"
    Environment:
      Variables:
        BUCKET_NAME: !Sub "${s3BucketName}-${DeployEnvironment}"
        PENDING_S3_LOCATION: "PENDING"
        # PENDING AND OUTPUT WILL BE THE SAME FOR NOW
        OUTPUT_S3_LOCATION: "PENDING"
        LOG_LEVEL: !Ref "logLevel"
    AutoPublishAlias: !Ref FunctionCurrentVersionAlias
    DeploymentPreference:
      Type: !FindInMap [EnvironmentConfiguration, !Ref DeployEnvironment, FunctionDeploymentPreference]
      Role: !GetAtt "IAMRoleForCodeDeploy.Arn"
    Tracing: Active
    PackageType: Zip
    Layers:
      - !Ref MyLayer

MyLayer:
  Type: AWS::Serverless::LayerVersion
  Properties:
    ContentUri: layers/my_layer
    CompatibleRuntimes:
      - python3.9
```
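For orientation, the local layout implied by `CodeUri`, `Handler`, and `ContentUri` above would be roughly (file names are assumptions; `app.py` follows from `app.lambda_handler`):

```
.
├── template.yml
├── functions/
│   └── myFunction/
│       ├── app.py            # exposes app.lambda_handler
│       └── requirements.txt
└── layers/
    └── my_layer/
        └── python/
            └── requirements.txt
```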
deployment-actions.yml:

```yaml
- name: Setup AWS SAM
  uses: aws-actions/setup-sam@v2
- name: Installing required dependencies
  run: sh build-scripts/install.sh
- name: Run AWS SAM Build
  run: sam build --use-container --template-file template.yml
- name: sam package
  run: sam package --template-file .aws-sam/build/template.yaml --s3-bucket ${{ env.S3_BUCKET }} --output-template-file template.yml --kms-key-id alias/aws/s3
- name: Upload CloudFormation Template to S3
  run: |
    aws s3 cp ./template.yml s3://${{ env.S3_BUCKET }}/${{ github.event.repository.name }}/template.yml
- name: Deploy to AWS Cloudformation
  # FAILS HERE
  uses: aws-actions/aws-cloudformation-github-deploy@v1
  with:
    name: ${{ github.event.repository.name }}-${{ env.DEPLOYMENT_ENV }}-pipeline-stack-deployment
    template: https://s3.amazonaws.com/${{ env.S3_BUCKET }}/${{ github.event.repository.name }}/template.yml
    parameter-overrides: file://${{ github.workspace }}/params.${{ env.DEPLOYMENT_ENV }}.json
    changeset: --change-set-name
    capabilities: CAPABILITY_IAM,CAPABILITY_AUTO_EXPAND,CAPABILITY_NAMED_IAM
    no-fail-on-empty-changeset: "1"
    tags: ${{ env.DEPLOYMENT_TAGS }}
```
install.sh:

```sh
install_dependencies() {
  echo "Installing Node.js code dependencies"
  for function_directory in ${STARTING_DIR}/functions/* ; do
    cd ${function_directory}
    if [ -f "package.json" ]; then
      echo "  Installing dependencies for ${function_directory}"
      npm install
    fi
  done
  echo "Installing Python code dependencies"
  for function_directory in ${STARTING_DIR}/functions/* ; do
    cd ${function_directory}
    if [ -f "requirements.txt" ]; then
      echo "  Installing dependencies for ${function_directory}"
      pip install -r requirements.txt
    fi
  done
}

install_python_layer() {
  LAYER_PATH=${PWD}/python
  echo "  Installing Python layer in ${LAYER_PATH}"
  cd ${LAYER_PATH}
  if [ -f "requirements.txt" ]; then
    pip install -r requirements.txt -t . --upgrade
  fi
  # only prepend if LAYER_PATH is not already on PYTHONPATH
  if [ "${PYTHONPATH#*${LAYER_PATH}}" = "${PYTHONPATH}" ]; then
    if [ "${PYTHONPATH}" = "" ]; then
      export PYTHONPATH=${LAYER_PATH}
    else
      export PYTHONPATH=${LAYER_PATH}:${PYTHONPATH}
    fi
  fi
}

install_nodejs_layer() {
  LAYER_PATH=${PWD}/nodejs
  NODE_MODULES_DIR=${LAYER_PATH}/node_modules
  echo "  Installing NodeJS layer in ${LAYER_PATH}"
  cd ${LAYER_PATH}
  if [ -f "package.json" ]; then
    npm install
  fi
  # only prepend if LAYER_PATH is not already on NODE_PATH
  if [ "${NODE_PATH#*${LAYER_PATH}}" = "${NODE_PATH}" ]; then
    if [ "${NODE_PATH}" = "" ]; then
      export NODE_PATH=${NODE_MODULES_DIR}
    else
      export NODE_PATH=${NODE_MODULES_DIR}:${NODE_PATH}
    fi
  fi
}

install_layers() {
  echo "Installing Lambda layers"
  for layer_directory in ${STARTING_DIR}/layers/*; do
    cd ${layer_directory}
    if [ -d "python" ]; then
      install_python_layer
    elif [ -d "nodejs" ]; then
      install_nodejs_layer
    fi
  done
  echo "NODE_PATH=${NODE_PATH}"
  echo "PYTHONPATH=${PYTHONPATH}"
}

echo "Starting install - $(date)"
STARTING_DIR=$PWD
set -xe
install_dependencies
install_layers
cd $STARTING_DIR
unset STARTING_DIR
echo "Completed install - $(date)"
```
Not sure what's different between CodePipeline and GitHub Actions in terms of installing the lambda layers.
Now, in the working code, there is a post-deploy step that uploads the lambda layer to S3, but we are failing in CloudFormation before we even get there.
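One concrete CodeBuild/Actions difference worth noting for the `export PYTHONPATH`/`export NODE_PATH` lines in install.sh: each GitHub Actions step runs in a fresh shell, so those exports die with the install step. Whether that explains the packaging difference is speculation, but persisting a variable across steps requires appending to the `$GITHUB_ENV` file; a minimal sketch:

```shell
# Sketch: in GitHub Actions, `export FOO=...` is lost when the step ends.
# Appending "name=value" to the file at $GITHUB_ENV carries the variable
# into all subsequent steps of the job.
persist_env_var() {
  echo "$1=$2" >> "${GITHUB_ENV}"
}
```

Inside install.sh this would look like `persist_env_var PYTHONPATH "${PYTHONPATH}"` after the layer loop.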
SAM (I think) always packages it with your setup (`CodeUri` with `PackageType`), so I don't think this would be inline code.
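Concretely, `sam package` uploads each local artifact and rewrites the template to reference it, roughly like this (bucket and key here are illustrative, not from the actual output):

```yaml
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    # rewritten by `sam package`; the zipped code was uploaded to --s3-bucket
    CodeUri: s3://my-deploy-bucket/abc123def456
```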
Also note an older issue on using SAM with this action: #50
Somehow, we are using `CodeUri` with local functions, but we don't get that validation error. I wonder if it's still related, though.
Maybe it is SAM. Now I am getting a "validation error", but it starts printing out my serverless template instead of telling me what the error is, and the output gets cut off.
Interesting: it seems to be blowing up file sizes in GitHub Actions as opposed to CodeBuild.
I removed `sam build` and just kept `sam package`.
That finished the deployment, but a lambda that was previously just a few KB is now packaged as 19 MB. Have you seen this before? Should I post on sam-cli?
Thanks as always.
More context:
- It's installing our layer AND requirements.txt, practically doubling the size of the functions. It's also installing packages like boto3 that are already prepackaged in Lambda.
- So somehow in CodeBuild we WEREN'T bundling every lambda's requirements.txt with the function; it was smart enough not to include packages that were prepackaged into Lambda.
We are removing requirements.txt and it is working, but I still wish I knew why this worked with CodeBuild before.
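If removing requirements.txt wholesale feels heavy-handed, a softer variant is to filter out only the runtime-provided packages before building. A minimal sketch (treating boto3/botocore as the runtime-provided set is an assumption; `trim_requirements` is a hypothetical helper, not part of SAM):

```python
# Hypothetical helper: drop requirements that the Lambda Python runtime
# already provides, so `sam build` doesn't bundle them into every function.
RUNTIME_PROVIDED = {"boto3", "botocore"}  # assumption: adjust per runtime

def trim_requirements(lines):
    kept = []
    for line in lines:
        # crude name extraction: handles "pkg==1.2", "pkg>=1.2", "pkg"
        name = line.split("==")[0].split(">=")[0].strip().lower()
        if name in RUNTIME_PROVIDED:
            continue  # skip packages baked into the runtime image
        kept.append(line)
    return kept
```

Run over each function's requirements.txt before `sam build`, this keeps third-party packages while dropping the duplicated AWS SDK.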
I feel like this is related; I bet our internal CodeBuild process was doing this:
aws/serverless-application-model#1927