
apm-pipeline-library's Introduction

CI Shared Library for the Elastic Observability projects

Badges: Build Status, Contributors, GitHub release, Automated Release Notes by gren, pre-commit, OpenSSF Scorecard

We support different CI ecosystems:

  • GitHub Actions: the currently supported CI that receives new features. 📌 Active
  • Jenkins: deprecated; apart from security fixes or major bugs, there will be no updates. 📌 Deprecated

User Documentation

Known Issues

A list of known issues is available on the GitHub issues page of this project.

How to obtain support

Feel free to open new issues for feature requests, bugs or general feedback on the GitHub issues page of this project.

Contributing

Read and understand our contribution guidelines before opening a pull request.

Resources

GitHub actions specific

Jenkins specific

See the Jenkins resources guidelines of this project.

Further details

📌 Deprecation Notice

The specific implementation we have done for the Jenkins shared library is deprecated. Everything related to Jenkins will eventually be deleted by the end of 2023.

We encourage consumers to use one of the release tags rather than the current tag when consuming this Jenkins shared library.



Made with ♥️ and ☕️ by Elastic and our community.

apm-pipeline-library's People

Contributors

amannocci, and-blk, andrewvc, apmmachine, axw, cachedout, cmacknz, dependabot[bot], efd6, endorama, fearful-symmetry, giuliohome, jonahbull, jonaskunz, jsoriano, kuisathaverat, matschaffer, mdelapenya, mrodm, nkammah, nxei, pazone, reakaleek, ruflin, simitt, tarekziade, trentm, v1v, wayneseymour


apm-pipeline-library's Issues

Support to ignore patterns in isGitRegionMatch

In order to refine matching and skip pipelines, I would like to match all files in a sub-directory that are not .asciidoc or .yml files. For example:

isGitRegionMatch(patterns: ["^libbeat/*"], ignore: ["*.asciidoc", "*.yml"])

By adding ignore patterns we can prevent CI runs for docs-only changes.
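
A minimal sketch of how a pipeline could use this, assuming the proposed ignore parameter existed (the parameter and the exact regex syntax are assumptions, not current behaviour):

    // Sketch only: `ignore` is the requested option, not an existing one.
    stage('Test libbeat') {
      if (isGitRegionMatch(patterns: ['^libbeat/.*'],
                           ignore: ['.*\\.asciidoc$', '.*\\.yml$'])) {
        sh 'make -C libbeat test'   // placeholder build command
      } else {
        echo 'Docs-only change, skipping the tests'
      }
    }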

Document pre-commit validation when creating a commit on a pipeline

Every time we work on a new or an existing pipeline and create a git commit, the pre-commit hook needs the local Jenkins instance to be running to perform the validation, which I always forget.

A workaround is to use an env var for the Jenkins URL pointing to the APM CI.

I'd suggest explaining this better in the docs, which currently say:

## Validate JJBB files
If the local jenkins instance has been enabled then it's possible to validate whether the JJBB files are healthy enough.

This suggests that the validation will happen if and only if the local instance is running, but it actually fails the validation if the server is not up.

Release process: how to identify a minor version

Initial discussion: https://elastic.slack.com/archives/CJMURHEHX/p1579175789268000

One thing I noticed with our current release process is that it always generates a patch version (the Z in a semver versioning schema), which is totally OK and I'm not against that.

The thing is, once published, the version X.Y.Z of the library tells the client about what is inside a new release:

  • if Z is changed, it's supposed to contain bug fixes with backward compatibility
  • if Y is changed, it's supposed to contain new features with backward compatibility
  • and if X is changed, it's supposed to include incompatible API changes
    Please see https://semver.org/

If we take a look at our release notes, we are adding enhancements and new features inside patches. That is totally fine in terms of consuming the library without breaking the clients, but we are not explicitly informing them about those changes unless they read the release notes.

So my point is to align the release notes with the version, which would mean: if there is any enhancement, let's bump the Y.

The comment trigger stops working for Elastic users on untrusted forks

If an Elastic user triggers a build on a PR with the comment jenkins run the tests, the build is aborted because the PR is not approved.

[2020-01-22T18:08:32.406Z] kuisathaverat commented: jenkins run the tests please
[2020-01-22T18:08:32.406Z] [Elastic] Plugin version: 1.4-SNAPSHOT
[2020-01-22T18:08:32.406Z] [Elastic] Generated the Elastic Build ID: 20200122180832-B1FFEB13
[2020-01-22T18:08:32.407Z] [Elastic] Generating Filebeat configuration.
...
[2020-01-22T18:11:16.640Z] [Elastic] Done with build 20200122180832-B1FFEB13.
[2020-01-22T18:11:17.308Z] 
[2020-01-22T18:11:17.308Z] GitHub has been notified of this commit’s build result
[2020-01-22T18:11:17.308Z] 
[2020-01-22T18:11:17.309Z] ERROR: githubPrCheckApproved: The PR is not approved yet
[2020-01-22T18:11:17.309Z] Finished: FAILURE

Build -> here

[Automation] Provide a step to know whether the changelist matches a given regexp

Already implemented in https://github.com/elastic/apm-server/blob/d628a3ee9b950ebfe3cdbee46235ac4f3b3a783a/Jenkinsfile#L61-L76

In fact, the git plugin provides a similar feature, but Pipeline doesn't honor the ignore paths yet:


Further details: https://issues.jenkins-ci.org/browse/JENKINS-36195

Let's provide a step so we can use it somewhere else. The idea is to give some flexibility over whether certain stages in the pipeline should be triggered when there are changes in certain folders/files.
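
A rough sketch of the kind of step being proposed; the step name, the use of GIT_PREVIOUS_COMMIT/GIT_COMMIT and the regex semantics are illustrative assumptions, not the library's actual implementation:

    // Return true when any changed file matches any of the given regexes.
    def changesMatch(Map args = [:]) {
      // List the files changed between the previous and the current commit.
      def changes = sh(script: "git diff --name-only ${env.GIT_PREVIOUS_COMMIT} ${env.GIT_COMMIT}",
                       returnStdout: true).trim()
      return changes.readLines().any { file ->
        args.patterns.any { pattern -> file =~ pattern }
      }
    }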

Build info is not grabbed correctly

For some reason, Blue Ocean is not returning all the info about the build for a few builds; it started happening after the latest upgrade to 2.219.

{
  "result": "FAILURE",
  "state": "FINISHED",
  "durationInMillis": 448962
}

[Screenshot taken 2020-02-13 at 10:03:44]

GIT_BASE_COMMIT seems not being updated in regular pipelines

We noticed that the GIT_BASE_COMMIT (which is populated by us) is not receiving the proper value (env.GIT_COMMIT, coming from Jenkins) in a regular pipeline.
[Screenshot taken 2020-04-01 at 20:57:36]

In the above image, shot while testing the isGitRegionMatch for a regular pipeline, the env vars are printed before and after swapping the values with:

   setEnvVar('GIT_BACKUP', "${env.GIT_PREVIOUS_COMMIT}")
   setEnvVar('GIT_PREVIOUS_COMMIT', "${env.GIT_BASE_COMMIT}")
   setEnvVar('GIT_BASE_COMMIT', "${env.GIT_BACKUP}")

We observed that the git diff was not executed with the proper values until GIT_BASE_COMMIT was set to the default GIT_COMMIT.

Relates #456
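
A minimal sketch of the kind of guard this report suggests, assuming setEnvVar behaves as in the snippet above; resetting GIT_BASE_COMMIT to the commit Jenkins actually checked out is an assumption, not the agreed fix:

    // Sketch only: fall back to the commit Jenkins actually checked out
    // before computing the diff.
    if (env.GIT_BASE_COMMIT != env.GIT_COMMIT) {
      setEnvVar('GIT_BASE_COMMIT', env.GIT_COMMIT)
    }
    sh 'git diff --name-only "${GIT_PREVIOUS_COMMIT}" "${GIT_BASE_COMMIT}"'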

[Automation] ITs for the library

This might help us add more coverage, similar to what we do in production. See https://github.com/elastic/observability-dev/issues/271

There are certain initiatives to support ITs within a shared library, which use the JenkinsRunner and the Test Framework.

Let's see whether we can add more functional/integration testing on our end, to help not just with stability but also with the documentation.

cc @elastic/observablt-robots

Local development: pulling images from Make fails with access denied

Steps to reproduce:

  • Check that the Jenkins infra Docker image is not present in your system: docker images docker.elastic.co/infra/jenkins
  • Remove the local Docker image for local Jenkins: docker rmi $(docker images -q docker.elastic.co/infra/jenkins)
  • Run LOG_LEVEL=DEBUG make build to build the image with debug log level

Expected behavior: docker compose pulls the image from the private repo
Actual behavior: build fails with error

Sending auth config ('docker.elastic.co', 'eu.gcr.io', 'https://docker.elastic.co', 'https://index.docker.io/v1/', 'asia.gcr.io', 'gcr.io', 'marketplace.gcr.io', 'staging-k8s.gcr.io', 'us.gcr.io')
http://localhost:None "POST /v1.38/build?t=local_jenkins&q=False&nocache=False&rm=True&forcerm=False&pull=False HTTP/1.1" 200 None
Step 1/3 : FROM docker.elastic.co/infra/jenkins:201911011157.d339e8ba4dc2
ERROR: Service 'jenkins' failed to build: pull access denied for docker.elastic.co/infra/jenkins, repository does not exist or may require 'docker login': denied: requested access to the resource is denied
make: *** [build] Error 1

Suggestions

Docker-compose had an issue when pulling from private repos (docker/compose#6713), which was fixed in v1.25.0-rc3 (https://github.com/docker/compose/releases/tag/1.25.0-rc3)

We suggest updating local docker-compose version to 1.25.0

Use a regular github robot user for creating release commits

Spoiler: cosmetic change

Instead of using the current worker as the commit user (i.e. jenkins@apm-ci-immutable-ubuntu-1604-1574949808307231422.c.elastic-ci-prod.internal), which changes depending on the build agent, let's use a more formal user for the commits, like elasticmachine or any other robot.

Ideas?
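
A minimal sketch of what that could look like in the release stage; the user name and email below are placeholders for whatever robot account is agreed, not an existing configuration:

    // Sketch only: set a stable robot identity before creating the release
    // commit, instead of the per-agent default user.
    sh '''
      git config user.name  "elasticmachine"
      git config user.email "elasticmachine@example.invalid"   # placeholder address
      git commit -a -m "bump version"
    '''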

[Question] GitHub comments don't work with multiple MBPs for the same repo

Description

If this is something we would like to support then we might need to think about the user experience.

For instance, Beats uses both the packaging and the PR validation pipelines, therefore two different MBPs for the same repo.

When reporting the build status as a GitHub comment, we need to support something like:


Actions

  • Agree with the team (@elastic/observablt-robots) on the best approach.
  • Share with other teams and gather their thoughts.
  • Implement if required.

pre-commit step doesn't work in other projects

13:31:48  Shellscript: lint........................................................Failed
13:31:48  - hook id: shell-lint
13:31:48  - exit code: 1
13:31:48  
13:31:48  shellcheck command not found
13:31:48  
13:31:48  Yaml: lint...........................................(no files to check)Skipped

Build - here

This project does provide a pre-commit library, but some of the scripts are hosted in this repo; it fails when running inside a Docker container because the PATH is not updated with the .ci/scripts folder.
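
A minimal sketch of the kind of workaround implied here, assuming the hooks run from this repo's workspace (the exact hook invocation is an assumption):

    // Sketch only: expose this repo's helper scripts to the hooks by
    // prepending .ci/scripts to the PATH before running pre-commit.
    withEnv(["PATH+SCRIPTS=${env.WORKSPACE}/.ci/scripts"]) {
      sh 'pre-commit run --all-files'
    }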

[Design] Send Jenkins build data to ElasticSearch

This is regarding the current approach and known issues.

Description

Every build that happens in CI is ingested into Elasticsearch (see dashboard), as long as there is a post-build step that calls notifyBuild.

The current amount of data is really massive, as the tests that were executed are also included. There are some projects with a few hundred thousand tests. As an example, the apm-agent-nodejs test data is about 300 MB.

Known issues

The current design for ingesting data into ES does not seem to scale well, for two reasons:

  • It uses an HTTP request and requires transforming the file into a data object in the CI.
  • It takes ages and affects the build feedback loop.

Proposal

For that reason, what do you think if we:

  • ingest, per build, only the test failures rather than the whole test suite.
  • ingest, per master build or with a certain periodicity, all the tests that have been executed, as another index. This will act as the authority for the tests that should be monitored as a whole.

There is also the option of ingesting the data asynchronously per build.

Affected systems

  • The flakiness algorithm.
  • Dashboards?
  • Long term, the use of ML or other ES features.

@elastic/observablt-robots what are your thoughts?

Add mvnw to SCM

The rationale for this task is to pin the proper version of the build system (Maven in this case), rather than forcing a local build to use the locally installed version, which could be incompatible with the desired one.
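
A brief illustration of the intended effect once the wrapper is committed; the Maven goals below are placeholders:

    // With mvnw in SCM, CI and local builds call the wrapper, so the Maven
    // version is pinned by .mvn/wrapper/maven-wrapper.properties in the repo.
    sh './mvnw --batch-mode clean verify'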

ITs Docker images build is failing

see https://apm-ci.elastic.co/blue/organizations/jenkins/apm-shared%2Fapm-docker-images-pipeline/detail/apm-docker-images-pipeline/207/pipeline/933

[2020-03-26T05:16:49.735Z] make: Entering directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/integration-testing-images/docker'
[2020-03-26T05:16:49.735Z] Cloning into 'bats-core'...
[2020-03-26T05:16:50.307Z] Note: checking out 'c706d1470dd1376687776bbe985ac22d09780327'.
[2020-03-26T05:16:50.307Z] 
[2020-03-26T05:16:50.307Z] You are in 'detached HEAD' state. You can look around, make experimental
[2020-03-26T05:16:50.307Z] changes and commit them, and you can discard any commits you make in this
[2020-03-26T05:16:50.307Z] state without impacting any branches by performing another checkout.
[2020-03-26T05:16:50.307Z] 
[2020-03-26T05:16:50.307Z] If you want to create a new branch to retain commits you create, you may
[2020-03-26T05:16:50.307Z] do so (now or later) by using -b with the checkout command again. Example:
[2020-03-26T05:16:50.307Z] 
[2020-03-26T05:16:50.307Z]   git checkout -b <new-branch-name>
[2020-03-26T05:16:50.307Z] 
[2020-03-26T05:16:50.877Z] 12-alpine: Pulling from library/node
[2020-03-26T05:16:50.878Z] aad63a933944: Already exists
[2020-03-26T05:16:50.878Z] edd41271d385: Pulling fs layer
[2020-03-26T05:16:50.878Z] dd731a721451: Pulling fs layer
[2020-03-26T05:16:50.878Z] 495807fcdd37: Pulling fs layer
[2020-03-26T05:16:51.138Z] 495807fcdd37: Verifying Checksum
[2020-03-26T05:16:51.138Z] 495807fcdd37: Download complete
[2020-03-26T05:16:51.138Z] dd731a721451: Verifying Checksum
[2020-03-26T05:16:51.138Z] dd731a721451: Download complete
[2020-03-26T05:16:51.398Z] edd41271d385: Verifying Checksum
[2020-03-26T05:16:51.398Z] edd41271d385: Download complete
[2020-03-26T05:16:52.853Z] edd41271d385: Pull complete
[2020-03-26T05:16:52.853Z] dd731a721451: Pull complete
[2020-03-26T05:16:52.853Z] 495807fcdd37: Pull complete
[2020-03-26T05:16:52.853Z] Digest: sha256:6b5b783c9cfe229af0bd5b0b677dd32005bb22d58465f3d0fe7fbd1c60ce068c
[2020-03-26T05:16:52.853Z] Status: Downloaded newer image for node:12-alpine
[2020-03-26T05:16:52.853Z] docker.io/library/node:12-alpine
[2020-03-26T05:16:53.120Z] Submodule 'docker/tests/test_helper/bats-assert' (https://github.com/ztombol/bats-assert) registered for path 'tests/test_helper/bats-assert'
[2020-03-26T05:16:53.120Z] Submodule 'docker/tests/test_helper/bats-support' (https://github.com/ztombol/bats-support) registered for path 'tests/test_helper/bats-support'
[2020-03-26T05:16:53.120Z] Cloning into '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/integration-testing-images/docker/tests/test_helper/bats-assert'...
[2020-03-26T05:16:53.389Z] Cloning into '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/integration-testing-images/docker/tests/test_helper/bats-support'...
[2020-03-26T05:16:53.654Z] Submodule path 'tests/test_helper/bats-assert': checked out '9f88b4207da750093baabc4e3f41bf68f0dd3630'
[2020-03-26T05:16:53.654Z] Submodule path 'tests/test_helper/bats-support': checked out '004e707638eedd62e0481e8cdc9223ad471f12ee'
[2020-03-26T05:16:53.654Z] 1..5
[2020-03-26T05:21:00.372Z] ok 1 apm-server - build image
[2020-03-26T05:21:00.372Z] ok 2 apm-server - clean test containers
[2020-03-26T05:21:00.372Z] ok 3 apm-server - create test container
[2020-03-26T05:21:00.372Z] not ok 4 apm-server - test container with 0 as exitcode
[2020-03-26T05:21:00.372Z] # (from function `assert_output' in file tests/test_helper/bats-assert/src/assert.bash, line 239,
[2020-03-26T05:21:00.372Z] #  in test file tests/tests.bats, line 35)
[2020-03-26T05:21:00.372Z] #   `assert_output '0'' failed
[2020-03-26T05:21:00.372Z] # 
[2020-03-26T05:21:00.372Z] # -- output differs --
[2020-03-26T05:21:00.372Z] # expected : 0
[2020-03-26T05:21:00.372Z] # actual   : 1
[2020-03-26T05:21:00.372Z] # --
[2020-03-26T05:21:00.372Z] # 
[2020-03-26T05:21:00.372Z] ok 5 apm-server - clean test containers afterwards
[2020-03-26T05:21:00.372Z] /usr/local/bin/tap-xunit -> /usr/local/lib/node_modules/tap-xunit/bin/tap-xunit
[2020-03-26T05:21:00.372Z] /usr/local/bin/txunit -> /usr/local/lib/node_modules/tap-xunit/bin/tap-xunit
[2020-03-26T05:21:00.372Z] + [email protected]
[2020-03-26T05:21:00.372Z] added 21 packages from 21 contributors in 1.664s
[2020-03-26T05:21:00.372Z] Makefile:19: recipe for target 'test-apm-server' failed
[2020-03-26T05:21:00.372Z] make: *** [test-apm-server] Error 1
[2020-03-26T05:21:00.372Z] make: Leaving directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/integration-testing-images/docker'
script returned exit code 2
[2020-03-26T05:21:00.478Z] Recording test results

Add integrations-ci images to packer-cache

To speed up the build for metricbeat-e2e-poc, we'd like to cache some Docker images.

The images:

  • docker.elastic.co/integrations-ci/beats-apache:2.4.12
  • docker.elastic.co/integrations-ci/beats-apache:2.4.20
  • docker.elastic.co/integrations-ci/beats-mysql:mariadb-10.2.23
  • docker.elastic.co/integrations-ci/beats-mysql:mariadb-10.3.14
  • docker.elastic.co/integrations-ci/beats-mysql:mariadb-10.4.4
  • docker.elastic.co/integrations-ci/beats-mysql:mysql-5.7.12
  • docker.elastic.co/integrations-ci/beats-mysql:mysql-8.0.13
  • docker.elastic.co/integrations-ci/beats-mysql:percona-5.7.24
  • docker.elastic.co/integrations-ci/beats-mysql:percona-8.0.13-4
  • docker.elastic.co/integrations-ci/beats-redis:3.2.12
  • docker.elastic.co/integrations-ci/beats-redis:4.0.11
  • docker.elastic.co/integrations-ci/beats-redis:5.0.5

[Automation] If no authentication then isUserTrigger fails

When using the shared library in a local Jenkins instance that doesn't have any security configured, isUserTrigger fails with the stack trace below:

hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: net.sf.json.JSONNull.trim() is applicable for argument types: () values: []
Possible solutions: wait(), grep(), wait(long), write(java.io.Writer), grep(java.lang.Object), is(java.lang.Object)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
	at org.codehaus.groovy.runtime.callsite.PojoMetaClassSite.call(PojoMetaClassSite.java:49)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
	at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.methodCall(DefaultInvoker.java:20)
	at isUserTrigger.call(isUserTrigger.groovy:27)
	at gitCheckout.call(gitCheckout.groovy:79)
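
One possible guard, shown only as a sketch and not the library's actual fix: avoid calling trim() on a JSONNull user id when the instance has no authentication configured.

    // Sketch only: treat a missing or non-string user id as "not a user trigger"
    // instead of calling trim() on a JSONNull value.
    def causes = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
    def userId = causes ? causes[0]?.userId : null
    def triggeredByUser = (userId instanceof String) && userId.trim()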


[Automation] First time contributor check cannot be reset

There is a corner case @simitt has faced recently with the apm-server: the 'First time contributor' check failed and could not be reset afterwards.

How did it happen?

  • build 4 was triggered by a timer and failed with a timeout issue on the infra side; therefore the 'First time contributor' GH check was set to failed.
  • The remaining builds were triggered either manually or with a comment, and therefore the GH check was not updated at all.

Why?

    if (!isUserTrigger() && !isCommentTrigger()) {
      try {
        githubPrCheckApproved()
        if (notify) {
          githubNotify(context: 'First time contributor', status: 'SUCCESS', targetUrl: ' ')
        }
      } catch(err) {
        if (notify) {
          githubNotify(context: 'First time contributor', description: 'It requires manual inspection', status: 'FAILURE', targetUrl: ' ')
        }
        throw err
      }
    }
This block only runs for builds that are neither user- nor comment-triggered, so those trigger types never update the check.
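
One possible direction, sketched only as an illustration and not the agreed solution: also report the check for user- and comment-triggered builds so a previous failure does not stick.

    // Sketch only: reset the check when a trusted trigger (user or
    // comment) runs the build.
    if (isUserTrigger() || isCommentTrigger()) {
      if (notify) {
        githubNotify(context: 'First time contributor', status: 'SUCCESS', targetUrl: ' ')
      }
    }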

codecov step seems to have some issues with the GH api

Let's see whether the codecov step error shown below is related to the implementation or an environmental issue

[2019-07-03T11:48:30.377Z]  at githubBranchRef.call(githubBranchRef.groovy:32)
[2019-07-03T11:48:30.377Z]  at codecov.call(codecov.groovy:58)

Full stacktrace:

[2019-07-03T11:48:30.377Z] Also:   org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.CpsBodyExecution.cancel(CpsBodyExecution.java:253)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.steps.BodyExecution.cancel(BodyExecution.java:76)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStepExecution.stop(ParallelStepExecution.java:67)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStep$ResultHandler$Callback.checkAllDone(ParallelStep.java:147)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStep$ResultHandler$Callback.onFailure(ParallelStep.java:134)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.CpsBodyExecution$FailureAdapter.receive(CpsBodyExecution.java:361)
[2019-07-03T11:48:30.377Z]      at com.cloudbees.groovy.cps.impl.ThrowBlock$1.receive(ThrowBlock.java:68)
[2019-07-03T11:48:30.377Z] Also:   org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.CpsBodyExecution.cancel(CpsBodyExecution.java:253)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.steps.BodyExecution.cancel(BodyExecution.java:76)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStepExecution.stop(ParallelStepExecution.java:67)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStep$ResultHandler$Callback.checkAllDone(ParallelStep.java:147)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.steps.ParallelStep$ResultHandler$Callback.onFailure(ParallelStep.java:134)
[2019-07-03T11:48:30.377Z]      at org.jenkinsci.plugins.workflow.cps.CpsBodyExecution$FailureAdapter.receive(CpsBodyExecution.java:361)
[2019-07-03T11:48:30.377Z]      at com.cloudbees.groovy.cps.impl.ThrowBlock$1.receive(ThrowBlock.java:68)
[2019-07-03T11:48:30.377Z] java.lang.NullPointerException: Cannot get property 'repo' on null object
[2019-07-03T11:48:30.377Z]  at org.codehaus.groovy.runtime.NullObject.getProperty(NullObject.java:60)
[2019-07-03T11:48:30.377Z]  at org.codehaus.groovy.runtime.InvokerHelper.getProperty(InvokerHelper.java:174)
[2019-07-03T11:48:30.377Z]  at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:456)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:39)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
[2019-07-03T11:48:30.377Z]  at githubBranchRef.call(githubBranchRef.groovy:32)
[2019-07-03T11:48:30.377Z]  at codecov.call(codecov.groovy:58)
[2019-07-03T11:48:30.377Z]  at ___cps.transform___(Native Method)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.get(PropertyishBlock.java:74)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.LValueBlock$GetAdapter.receive(LValueBlock.java:30)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.fixName(PropertyishBlock.java:66)
[2019-07-03T11:48:30.377Z]  at sun.reflect.GeneratedMethodAccessor856.invoke(Unknown Source)
[2019-07-03T11:48:30.377Z]  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2019-07-03T11:48:30.377Z]  at java.lang.reflect.Method.invoke(Method.java:498)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.Next.step(Next.java:83)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
[2019-07-03T11:48:30.377Z]  at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
[2019-07-03T11:48:30.377Z]  at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
[2019-07-03T11:48:30.377Z]  at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:51)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:186)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:370)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$200(CpsThreadGroup.java:93)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:282)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:270)
[2019-07-03T11:48:30.377Z]  at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
[2019-07-03T11:48:30.377Z]  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[2019-07-03T11:48:30.377Z]  at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
[2019-07-03T11:48:30.377Z]  at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
[2019-07-03T11:48:30.377Z]  at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
[2019-07-03T11:48:30.377Z]  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[2019-07-03T11:48:30.377Z]  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[2019-07-03T11:48:30.377Z]  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[2019-07-03T11:48:30.377Z]  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[2019-07-03T11:48:30.377Z]  at java.lang.Thread.run(Thread.java:748)

Which might be related to

12:47:52  [WARN] makeGithubApiCall: The REST API call https://api.github.com/repos/elastic/apm-agent-java/pulls/699 return the message : java.lang.Exception: httpRequest: Failure connecting to the service https://api.github.com/repos/elastic/apm-agent-java/pulls/699 : httpRequest: Failure connecting to the service https://api.github.com/repos/elastic/apm-agent-java/pulls/699 : 
12:47:52  Code: 502
12:47:52  Error: {
12:47:52    "message": "Server Error"
12:47:52  }

APM Server Docker images build fails

see https://apm-ci.elastic.co/blue/organizations/jenkins/apm-shared%2Fapm-docker-images-pipeline/detail/apm-docker-images-pipeline/207/pipeline/1009

[2020-03-26T05:25:12.914Z] + make -C .ci/docker all-tests
[2020-03-26T05:25:12.914Z] make: Entering directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker'
[2020-03-26T05:25:12.914Z] Cloning bats-core
[2020-03-26T05:25:12.914Z] Cloning into '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker/bats-core'...
[2020-03-26T05:25:13.174Z] Fetching origin
[2020-03-26T05:25:13.435Z] Switched to a new branch 'v1.1.0'
[2020-03-26T05:25:13.435Z] Clonning bats-assert
[2020-03-26T05:25:13.435Z] Cloning into '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker/tests/test_helper/bats-assert'...
[2020-03-26T05:25:14.008Z] Fetching origin
[2020-03-26T05:25:14.008Z] Switched to a new branch 'v0.3.0'
[2020-03-26T05:25:14.008Z] Clonning bats-support
[2020-03-26T05:25:14.008Z] Cloning into '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker/tests/test_helper/bats-support'...
[2020-03-26T05:25:14.582Z] Fetching origin
[2020-03-26T05:25:14.582Z] Switched to a new branch 'v0.3.0'
[2020-03-26T05:25:14.582Z] Pulling Alpine image
[2020-03-26T05:25:15.155Z] 12-alpine: Pulling from library/node
[2020-03-26T05:25:15.417Z] Digest: sha256:6b5b783c9cfe229af0bd5b0b677dd32005bb22d58465f3d0fe7fbd1c60ce068c
[2020-03-26T05:25:15.417Z] Status: Image is up to date for node:12-alpine
[2020-03-26T05:25:15.417Z] docker.io/library/node:12-alpine
[2020-03-26T05:25:15.417Z] 1..5
[2020-03-26T05:25:15.993Z] not ok 1 golang-mage - build image
[2020-03-26T05:25:15.993Z] # (from function `assert_success' in file tests/test_helper/bats-assert/src/assert.bash, line 114,
[2020-03-26T05:25:15.993Z] #  in test file tests/tests.bats, line 17)
[2020-03-26T05:25:15.993Z] #   `assert_success' failed
[2020-03-26T05:25:15.993Z] # 
[2020-03-26T05:25:15.993Z] # -- command failed --
[2020-03-26T05:25:15.993Z] # status : 1
[2020-03-26T05:25:15.993Z] # output (14 lines):
[2020-03-26T05:25:15.993Z] #   Sending build context to Docker daemon  2.048kB

[2020-03-26T05:25:15.993Z] #   Step 1/7 : ARG GO_VERSION=1.13.9
[2020-03-26T05:25:15.993Z] #   Step 2/7 : FROM golang:${GO_VERSION}
[2020-03-26T05:25:15.993Z] #    ---> d48f500a56fb
[2020-03-26T05:25:15.993Z] #   Step 3/7 : ENV TOOLS=/tools
[2020-03-26T05:25:15.993Z] #    ---> Running in e436e71ecbb8
[2020-03-26T05:25:15.993Z] #   Removing intermediate container e436e71ecbb8
[2020-03-26T05:25:15.993Z] #    ---> af44e6f02223
[2020-03-26T05:25:15.993Z] #   Step 4/7 : WORKDIR $TOOLS
[2020-03-26T05:25:15.993Z] #    ---> Running in 99abb19036a8
[2020-03-26T05:25:15.993Z] #   Removing intermediate container 99abb19036a8
[2020-03-26T05:25:15.993Z] #    ---> 7382e262999a
[2020-03-26T05:25:15.993Z] #   Step 5/7 : COPY go.mod .
[2020-03-26T05:25:15.993Z] #   COPY failed: stat /var/lib/docker/tmp/docker-builder638143668/go.mod: no such file or directory
[2020-03-26T05:25:15.993Z] # --
[2020-03-26T05:25:15.993Z] # 
[2020-03-26T05:25:15.993Z] ok 2 golang-mage - clean test containers
[2020-03-26T05:25:16.943Z] ok 3 golang-mage - create test container
[2020-03-26T05:25:17.893Z] ok 4 golang-mage - test container with 0 as exitcode
[2020-03-26T05:25:18.155Z] ok 5 golang-mage - clean test containers afterwards
[2020-03-26T05:25:18.155Z] make[1]: Entering directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker'
[2020-03-26T05:25:21.480Z] /usr/local/bin/txunit -> /usr/local/lib/node_modules/tap-xunit/bin/tap-xunit
[2020-03-26T05:25:21.480Z] /usr/local/bin/tap-xunit -> /usr/local/lib/node_modules/tap-xunit/bin/tap-xunit
[2020-03-26T05:25:21.480Z] + [email protected]
[2020-03-26T05:25:21.480Z] added 21 packages from 21 contributors in 1.868s
[2020-03-26T05:25:21.743Z] Makefile:41: recipe for target 'convert-tests-results' failed
[2020-03-26T05:25:21.743Z] make[1]: *** [convert-tests-results] Error 1
[2020-03-26T05:25:21.743Z] make[1]: Leaving directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker'
[2020-03-26T05:25:21.743Z] Makefile:45: recipe for target 'test-golang-mage' failed
[2020-03-26T05:25:21.743Z] make: *** [test-golang-mage] Error 2
[2020-03-26T05:25:21.743Z] make: Leaving directory '/var/lib/jenkins/workspace/apm-shared/apm-docker-images-pipeline/apm-server-images/.ci/docker'
script returned exit code 2
[2020-03-26T05:25:21.857Z] Recording test results

Include mergeRemote param when checking out Git repositories

If the caller provides mergeTarget as an input param, then the gitCheckout step does not configure the Jenkins objects properly and sets mergeRemote to null, which causes the checkout process to fail:

[2019-06-14T09:46:22.349Z] Merging Revision b0cef3414173aa3e5ec8100d1e2ec2b0d810f504 (detached) to null/master, UserMergeOptions{mergeRemote='null', mergeTarget='master', mergeStrategy='DEFAULT', fastForwardMode='FF'}
[2019-06-14T09:46:22.328Z]  > git rev-parse b0cef3414173aa3e5ec8100d1e2ec2b0d810f504^{commit} # timeout=10
[2019-06-14T09:46:22.352Z]  > git rev-parse null/master^{commit} # timeout=10
Command "git rev-parse null/master^{commit}" returned status code 128:
stdout: null/master^{commit}

stderr: fatal: ambiguous argument 'null/master^{commit}': unknown revision or path not in the working tree.
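
A minimal sketch of the desired call, assuming a mergeRemote parameter is added to gitCheckout; the parameter name, the repo URL and the credentials id below are assumptions/placeholders:

    // Sketch only: pass an explicit remote so the merge ref is not resolved as null/master.
    gitCheckout(basedir: 'src',
                branch: env.BRANCH_NAME,
                repo: 'git@github.com:elastic/apm-pipeline-library.git',   // placeholder repo
                credentialsId: 'my-ssh-credentials',                       // hypothetical credentials id
                mergeTarget: 'master',
                mergeRemote: 'origin')                                     // proposed parameter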

[Automation] docker bats fetch is failing

 make -C .ci/docker test-codecov 
Cloning bats-core
Fetching origin
Reset branch 'v1.1.0'
Clonning bats-assert
Fetching origin
Fetching upstream
Fetching kuisathaverat
fatal: reference is not a tree: 9f88b4207da750093baabc4e3f41bf68f0dd3630
make: *** [bats-assert] Error 128

Not sure whether that's related to my current environment, but I'm able to fetch without errors:

git fetch --all
Fetching origin
Fetching upstream
Fetching kuisathaverat

Not sure whether the Makefile section that prepares the environment could be simplified by using git submodules as before, even though there were genuine reasons to move away from them.

Installing nodejs version 12.16.1 fails on Windows

This is the error we see in the example pipeline and the Node.js agent:

[2020-03-26T05:56:10.718Z] Getting latest nodejs version for 12 ...
[2020-03-26T05:56:14.024Z] 
[2020-03-26T05:56:14.024Z] nodejs 12.16.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.16.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.15.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.14.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.14.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.13.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.13.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.12.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.11.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.11.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.10.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.9.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.9.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.8.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.8.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.7.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.6.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.5.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.4.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.3.1 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.3.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.2.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.1.0 [Approved]
[2020-03-26T05:56:14.024Z] nodejs 12.0.0 [Approved]
[2020-03-26T05:56:20.611Z] Installing nodejs version: 12.16.1 ...
[2020-03-26T05:56:20.611Z] Chocolatey v0.10.11
[2020-03-26T05:56:20.611Z] Installing the following packages:
[2020-03-26T05:56:20.611Z] nodejs
[2020-03-26T05:56:20.611Z] By installing you accept licenses for the packages.
[2020-03-26T05:56:21.555Z] nodejs not installed. An error occurred during installation:
[2020-03-26T05:56:21.555Z]  Unable to resolve dependency 'nodejs.install (= 12.16.1)'.
[2020-03-26T05:56:21.555Z] nodejs package files install completed. Performing other installation steps.
[2020-03-26T05:56:21.555Z] The install of nodejs was NOT successful.
[2020-03-26T05:56:21.555Z] nodejs not installed. An error occurred during installation:
[2020-03-26T05:56:21.555Z]  Unable to resolve dependency 'nodejs.install (= 12.16.1)'.
[2020-03-26T05:56:21.555Z] 
[2020-03-26T05:56:21.555Z] Chocolatey installed 0/1 packages. 1 packages failed.
[2020-03-26T05:56:21.555Z]  See the log for details (C:\ProgramData\chocolatey\logs\chocolatey.log).
[2020-03-26T05:56:21.555Z] 
[2020-03-26T05:56:21.555Z] Failures
[2020-03-26T05:56:21.555Z]  - nodejs (exited 1) - nodejs not installed. An error occurred during installation:
[2020-03-26T05:56:21.555Z]  Unable to resolve dependency 'nodejs.install (= 12.16.1)'.

[Question] Support more service accounts for reusing the comments in GitHub

If we want to keep using:

    def getLatestBuildComment() {
      // Get all the comments for the given PR.
      def comments = getComments()
      return comments
        .reverse()
        .find { (it.user.login == 'elasticmachine' || it.user.login == 'apmmachine') && it.body =~ /<!--PIPELINE/ }
    }

Then let's add support for the beats-ci service account.

Otherwise, could we just use

    def getBuildCommentFromFile() {
      copyArtifacts(filter: commentIdFileName(), flatten: true, optional: true, projectName: env.JOB_NAME, selector: lastWithArtifacts())
      if (fileExists(commentIdFileName())) {
        return readFile(commentIdFileName())?.trim()
      } else {
        return ''
      }
    }
?

getBuildCommentFromFile does not depend on GitHub, which brings certain benefits:

  • no third party connectivity issues
  • no api quota issues
  • owner agnostic for the messages.

On the other hand, the file-based approach requires a log-rotation policy that is not too aggressive.
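
A minimal sketch of how the file-based flow stores the id in the first place, assuming commentId holds whatever identifier the comment step returned (both the variable and the flow are illustrative):

    // Sketch only: persist the comment id as a build artifact so later builds can
    // recover it with getBuildCommentFromFile() instead of querying the GitHub API.
    writeFile(file: commentIdFileName(), text: "${commentId}")
    archiveArtifacts(artifacts: commentIdFileName())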
