Comments (21)
We have a pretty solid solution for our monorepo, with different pipelines for dev and master. It uses multiple YAML files with different triggers, such as:
For project A on master:

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - projectA
    - sharedDependency

For project A on develop:

trigger:
  branches:
    include:
    - develop
  paths:
    include:
    - projectA
    - sharedDependency

For project B on master:

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - projectB
    - sharedDependency
It means you have multiple build pipelines with different definition files, but you can leverage the power of templates to keep code duplication low. I don't know of any other way to do it, but it's been running like that for a while and it's pretty solid.
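For example, the shared steps can live in one template that each per-project pipeline references. A sketch (the file name build-template.yml and the dotnet build step are hypothetical; substitute your own build steps):

```yaml
# build-template.yml -- hypothetical shared template
parameters:
- name: projectPath
  type: string

steps:
- script: dotnet build ${{ parameters.projectPath }}
  displayName: Build ${{ parameters.projectPath }}
```

```yaml
# azure-pipelines.projectA.yml -- project A on master
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - projectA
    - sharedDependency

steps:
- template: build-template.yml
  parameters:
    projectPath: projectA
```

Each per-project, per-branch file then only carries its trigger and a template reference, so the build logic lives in one place.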
happy pipelining!
from azure-pipelines-yaml.
I think you can define three different YAML files (make sure you name each one differently so that one does not replace another). Use a branch policy to trigger the build for PRs, and for the other two, use a trigger statement in the corresponding YAML file.
Renaming the YAML file should solve the problem.
No, I think this is a valid issue. After the UI redesign, the option to select a YAML file other than the default azure-pipelines.yml has disappeared. For reference, this used to be possible:
https://sethreid.co.nz/using-multiple-yaml-build-definitions-azure-devops/
However, now if you have an azure-pipelines.yml file in the root of your repo, it's automatically picked up by default (which is great). But when I go to create a new Build Definition, it forces me to use or modify the azure-pipelines.yml file that already exists. This is very very bad as now I'm forced to use a single yaml file, where I used to be able to select a different one. Seems a lot of functionality throughout Azure DevOps has mistakenly disappeared after the latest redesign.
The only workaround I've found is to execute a build after checking in the azure-pipelines.yml file, then click and edit the build definition. In the top right of the screen next to the "Run" button is an ellipsis ("..."). Clicking on this allows you to access "Settings" and choose a new YAML file. Then you can save and rename the Build Definition. Now you can create a new Build Definition and it will force you to use the default azure-pipelines.yml file, which is fine since you modified the first build definition.
This feels like a total hack and is such a poor user experience.
We can also leverage the condition feature for jobs and have different jobs for each branch in the same pipeline:
condition: and(eq(variables['build.sourceBranch'], 'refs/heads/master'))
So we should have an azure-pipelines.yaml with the same content in each branch? For example, if I want three pipelines from 3 branches (master, dev, QA), do I need an azure-pipelines.yaml in each branch with the following trigger:
trigger:
- master
- development
- QA
BTW, is it possible to override global env variables depending on which branch triggered the build?
variables:
  REACT_APP_BACKEND: link.net
I want 3 different links under this variable depending on which branch was triggered.
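One hedged way to do this is with compile-time template expressions, which can conditionally insert a variable value based on the triggering branch (the hostnames here are placeholders):

```yaml
variables:
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/master') }}:
    REACT_APP_BACKEND: prod.link.net
  ${{ elseif eq(variables['Build.SourceBranch'], 'refs/heads/QA') }}:
    REACT_APP_BACKEND: qa.link.net
  ${{ else }}:
    REACT_APP_BACKEND: dev.link.net
```

Note that ${{ elseif }} / ${{ else }} require a reasonably recent Azure DevOps version; on older versions you can use separate ${{ if }} blocks instead.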
There is a new feature in Azure Pipelines called templates. I use it to create one pipeline definition and then reuse it for different branches.
E.g.:
azure-pipelines.template.yml:

stages:
- stage: Build
  jobs:
  - job: Build
    steps:
    - ...
- stage: Deploy
  jobs:
  - deployment: ...

azure-pipelines.master.yml:

trigger:
- master

variables:
- name: SomeVariable
  value: MasterSpecificValue

stages:
- template: azure-pipelines.template.yml
And so on for different branches / environments etc. I also have a separate azure-pipelines.pr.yml file for PR builds.
@AndrewCraswell as you can see, it's possible to set different values of a variable for different branches.
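The PR file mentioned above might look something like this (a sketch; it assumes the same template and disables CI triggers so only PRs run it):

```yaml
# azure-pipelines.pr.yml (sketch)
trigger: none   # don't run on pushes

pr:
- master
- develop

stages:
- template: azure-pipelines.template.yml
```

Note that the pr: keyword applies to GitHub/Bitbucket repos; for Azure Repos, PR builds are configured through branch policies instead.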
Just an FYI for anyone using the condition: option.
1.) It doesn't explicitly say so in the documentation, but you can add the condition to a stage (not just a job). If the condition is attached to a stage, it will skip the entire stage if the source branch isn't master. If you add the condition to the job within a stage and have approvals enabled, it will still prompt you for the approval even though the job is just skipped after that.
2.) I believe there's a typo in the above code snippet. condition: and(eq(variables['build.sourceBranch'], 'refs/heads/master')) threw a syntax error for me, presumably because the and statement expects two arguments. The example from the link above is condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master')).
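Point 1 about stage-level conditions might look like this (a sketch; the stage and job names are made up):

```yaml
stages:
- stage: Deploy
  # Attached at stage level: the whole stage (and any approval prompt)
  # is skipped when the source branch isn't master.
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
  jobs:
  - job: DeployJob
    steps:
    - script: echo deploying
```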
Maybe the author means: "How do I define different pipelines for different branches in one YAML definition file"?
Could you please provide some examples? At the moment this is poorly documented and I don't understand how to do it.
Thanks.
It's still possible to create a pipeline by selecting a custom YAML file.
You can also change the file which defines a pipeline once it has been created, like you mentioned.
Appending onto this issue: I'm interested in the above asks, but I'm also looking to run multiple projects from a single repo using file/folder filtering.
E.g. when a PR is created and the changed files are in /project1/ versus /project2/, project 1 gets built/deployed but project 2 is not touched. Is it possible to run another yaml/import task to build out more complex pipelines?
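Path filters can be applied to PR triggers as well as CI triggers, so each project's pipeline only reacts to changes in its own folder. A sketch, using the folder names from the comment above (note that pr: filters apply to GitHub/Bitbucket repos; for Azure Repos, use path filters on the branch policy instead):

```yaml
# project1's pipeline: only runs when files under /project1/ change
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - project1/*

pr:
  branches:
    include:
    - master
  paths:
    include:
    - project1/*
```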
We have the following configuration set up which does pretty much what I think you're after:

- Build Pipeline (YAML), set to build whenever there's a commit to develop, hotfix or release branches:

  trigger:
    branches:
      include:
      - develop
      - release/*
      - hotfix/*

  This will then trigger a build whenever there's a push to develop.

- Repository - apply a Branch Policy on develop to require a successful build before accepting a PR merge. This will ensure that you have a build from your published PRs - your Build Pipeline should perform the build and any unit tests that are required (this covers all "Build and Run Tests" requirements). Once this is complete, you can complete the PR and merge into develop.

- Release Pipeline (currently Classic, but you should be able to do the same with a Multi-Step Pipeline and gates):
  - The CD trigger is set to run whenever there is an Artifact available.
  - The first stage (Dev) is triggered automatically as soon as an Artifact is available (CI was responsible for build and test).
  - The second stage (QA) is then gated, primarily with an Artifact filter set to only allow artifacts created from develop, release/* or hotfix/* branches.

Note that we still need to deploy and confirm the build through Dev and QA before we release to Prod.

You could modify those filters and policies to suit your purposes, but I'd strongly recommend that you don't do a new build from master that deploys straight into production - otherwise that would technically be the first time you've seen that specific build and codebase in an environment.
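A hedged sketch of the same flow as a multi-stage YAML pipeline (the environment names are assumptions; approvals would be configured as checks on the QA environment rather than in the YAML):

```yaml
stages:
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: echo build and run tests

- stage: Dev
  dependsOn: Build
  jobs:
  - deployment: DeployDev
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo deploy to Dev

- stage: QA
  dependsOn: Dev
  # Mirror the artifact filter: only develop/release/hotfix builds reach QA.
  condition: |
    and(succeeded(),
      or(
        eq(variables['Build.SourceBranch'], 'refs/heads/develop'),
        startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'),
        startsWith(variables['Build.SourceBranch'], 'refs/heads/hotfix/')
      ))
  jobs:
  - deployment: DeployQA
    environment: QA
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo deploy to QA
```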
I'm looking to implement something similar. Is this possible? How?
@samuel-begin this is fine for the build part, but it might become a nightmare to handle on the release pipeline side, in order to trigger it and pick up the proper artifact.
@ggirard07 not as much as you'd think. You can trigger a release from different artifacts and use any of them in the pipeline steps.
I understand your concern, since I've worked a lot with gitlab-ci, but in azdo you can do all of that multi-pipeline stuff fine. Or maybe you want to narrow down your needs, add some more examples and open a new issue? I'm not on the Microsoft team, but this looks more like a question than a feature request.
@KIRY4, if you find a good way of overriding the .ENV variables for front end projects, let me know! Currently I'm producing a new build artifact for each environment, where the only difference is the environment variables. Seems to be a very wasteful process and adds a lot of complexity, but I haven't had much time to investigate deeper into alternatives.
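One hedged alternative to per-environment artifacts is to build once and substitute the environment-specific values at deploy time, by having the app read its config from a file that the release step overwrites. This assumes the app loads window.env from an env-config.js at runtime instead of baking process.env values in at build time, which requires a small change on the app side:

```yaml
# Release-time step, run once per environment with different variable values
# (build/env-config.js is a hypothetical path the app is assumed to load).
steps:
- script: |
    echo "window.env = { REACT_APP_BACKEND: '$(REACT_APP_BACKEND)' };" > build/env-config.js
  displayName: Inject runtime config
```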
A template in this repository shows a 'reviewApp' pattern. See https://github.com/microsoft/azure-pipelines-yaml/blob/master/templates/deploy-to-existing-kubernetes-cluster.yml. It has {{#if reviewApp}} blocks in it; not sure how these work yet.
Even though I explicitly state to include the master branch and a specific path to a project, for some reason the build is being triggered by an individual CI from my teammate's branch...
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - $(solution_path)

resources:
- repo: self
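A likely cause (an assumption, not confirmed in this thread): variables such as $(solution_path) are not expanded inside trigger sections, so the path filter is effectively ignored and every push triggers the build. A literal path should behave as expected (src/MySolution is a placeholder):

```yaml
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - src/MySolution   # literal path instead of $(solution_path)

resources:
- repo: self
```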
In order to consolidate to fewer feedback channels, we've moved suggestions and issue reporting to Developer Community. Sorry for any confusion resulting from this move.
azure-pipelines.master.yml
@starkpl how do you specify whether to use azure-pipelines.master.yml vs azure-pipelines.dev.yml?
Thank you
@cb03037 You can create a separate Pipeline for each of the files/branches you want to have. When creating a pipeline, select "Existing Azure Pipelines YAML file", then choose the file. Do this for each of the master/dev YAML files. If you configured the triggers correctly inside these files, the correct pipeline runs when commits are pushed to each branch.
Just an FYI for anyone using the condition: option. 1.) It doesn't explicitly say so in the documentation, but you can add the condition to a stage (not just a job). If the condition is attached to a stage, it will skip the entire stage if the source branch isn't master. If you add the condition to the job within a stage and have approvals enabled, it will still prompt you for the approval even though the job is just skipped after that.
2.) I believe there's a typo in the above code snippet. condition: and(eq(variables['build.sourceBranch'], 'refs/heads/master')) threw a syntax error for me. I presume because the and statement is expecting two arguments. The example from the link above is condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master')).
I have built this to do ephemeral environment builds and teardowns for non-deployable branches, so devs can test their feature branches; it skips that stage when deploying to dev/uat/prod environments. The condition array works great for us:
- stage: 'Deploy_Ephemeral'
  displayName: 'Deploy To Ephemeral Environment'
  dependsOn: ['Build_Stage']
  condition: |
    and(
      not(eq(variables['build.sourceBranch'], 'refs/heads/develop')),
      not(eq(variables['build.sourceBranch'], 'refs/heads/UAT')),
      not(eq(variables['build.sourceBranch'], 'refs/heads/master'))
    )
  jobs:
  - deployment: Deploy
    etc, etc
from azure-pipelines-yaml.