
code-coverage's Introduction

Mozilla Code Coverage

This project has 4 parts:

  • bot is a Python script running as a Taskcluster hook that aggregates code coverage data from Mozilla repositories,
  • backend is a Python API built with Flask that serves the aggregated code coverage data efficiently,
  • frontend is a vanilla JavaScript SPA displaying code coverage data in your browser,
  • addon is a WebExtension for Firefox that extends several Mozilla websites with code coverage data. Published at https://addons.mozilla.org/firefox/addon/gecko-code-coverage/.

Help

You can reach us on our Matrix instance: #codecoverage:mozilla.org

code-coverage's People

Contributors

abpostelnicu, adahanu074, adrian-tamas, ahal, anandhkishan, anubhabsen, arshadkazmi42, av1m, calixteman, crystal-rainslide, dependabot-preview[bot], dependabot[bot], gabriel-v, garbas, imbstack, inftkm, jgraham, kevinluvian, la0, marco-c, masterwayz, melvin2016, pascalchevrel, pyup-bot, rhcu, rv404674, sangimanishrao, sbeesm, ssarmis, staktrace

code-coverage's Issues

Download supported extensions at build time

It's tricky to do in a webpack plugin using compilation hooks.
Here is as far as I got: the extensions.json file is created as an asset, but require('extensions.json') does not work...

const download = require('download');

// Webpack plugin downloading the supported extensions from the production backend
const pluginName = 'ExtensionsPlugin';
class ExtensionsPlugin {
  apply(compiler) {
    compiler.hooks.compilation.tap(pluginName, compilation => {
      compilation.hooks.additionalAssets.tapAsync(pluginName, callback => {
        console.debug('Downloading coverage extensions...');
        download('https://coverage.moz.tools/v2/extensions')
          .then(data => {
            // Expose the downloaded file as a compilation asset
            const extensions = data.toString('utf-8');
            compilation.assets['extensions.json'] = {
              source: () => extensions,
              size: () => extensions.length
            };
            callback();
          })
          // Propagate download failures to webpack instead of hanging the build
          .catch(callback);
      });
    });
  }
}

Ack pulse messages only when a triggered task successfully finishes

This would be needed to make sure we successfully run tasks.

Let's say we receive a message for a review: we trigger a static analysis task and ack the pulse message, removing it from the queue. Then the pulse listener dies for some reason and the static analysis task fails. When the pulse listener is back, the failed task will not be retriggered, as the pulse message is no longer in the queue.
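
A minimal sketch of the ack-on-success flow with pika, assuming a blocking consumer; trigger_analysis_task and wait_until_completed are hypothetical helpers standing in for the real Taskcluster calls, and the queue name is a placeholder:

import pika

def on_message(channel, method, properties, body):
    task_id = trigger_analysis_task(body)  # hypothetical: create the Taskcluster task
    if wait_until_completed(task_id):      # hypothetical: poll the queue for resolution
        # Ack only once the task finished successfully, so a crash
        # before this point leaves the message queued for a retry
        channel.basic_ack(delivery_tag=method.delivery_tag)
    else:
        # Requeue the message for another attempt on failure
        channel.basic_nack(delivery_tag=method.delivery_tag, requeue=True)

connection = pika.BlockingConnection(pika.URLParameters("amqps://pulse.mozilla.org"))
channel = connection.channel()
channel.basic_consume(queue="queue/<user>/reviews", on_message_callback=on_message)
channel.start_consuming()

Note that holding the message unacked for the task's whole runtime can run into broker ack timeouts, so persisting the delivery state elsewhere may be necessary in practice.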

Check the size of coverage artifact if it was already downloaded

According to https://github.com/marco-c/firefox-code-coverage/blob/cc386da20f389fd5159746d56729e149029de6ae/firefox_code_coverage/codecoverage.py#L81, if an artifact has already been downloaded, we don't download a new one. However, the existing file could be wrong: for example, a bad Internet connection during a previous try could have left a partial file behind. Can this be checked too (for example, by comparing the sizes)? Or can this condition not occur because of https://github.com/marco-c/firefox-code-coverage/blob/cc386da20f389fd5159746d56729e149029de6ae/firefox_code_coverage/codecoverage.py#L88?
I had the same problem: I have an artifact that weighs 147 bytes and contains an empty .info file, which seems wrong.
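
A sketch of such a check, assuming the server sends a Content-Length header; this is not the actual codecoverage.py logic, just an illustration:

import os
import requests

def artifact_needs_download(url, path):
    # Re-download when the file is missing or its size differs from the
    # server-reported Content-Length (e.g. a truncated previous download)
    if not os.path.exists(path):
        return True
    head = requests.head(url, allow_redirects=True)
    expected = head.headers.get("Content-Length")
    return expected is not None and os.path.getsize(path) != int(expected)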

Explain heuristics used for "uncovered files" in the UI

From Marco:

For JavaScript files, the top-level code is always covered if the file is loaded, so we came up with a heuristic: a "zero coverage" file is a file that has at least one function but no covered functions.
For C/C++ it's easier, as there's no top-level code and all code lives in functions. In this case, a "zero coverage" file is a file with no covered lines.
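
A minimal sketch of that heuristic, assuming a per-file report exposing function and line hit counts (the field names are illustrative, not the actual report schema):

def is_zero_coverage(file_report, language):
    functions = file_report.get("functions", [])
    if language == "js":
        # JS top-level code runs whenever the file is loaded, so require
        # at least one function and no covered functions
        return bool(functions) and all(f["hits"] == 0 for f in functions)
    # C/C++: all code lives in functions, so "no covered lines" is enough
    return all(hits == 0 for hits in file_report["lines"].values())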

It would be great to put this in the UI somewhere for new users. Maybe add a "?" icon next to the uncovered-files option that shows this explanation on hover.

Ingest fuzzing coverage

We'll need to retrieve the covdir files and upload them to GCP.

The coverage from fuzzing should only be available to staff and NDA'd Mozillians, so we'll have to implement some kind of auth.
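
For the upload half, a sketch using the google-cloud-storage client; the bucket and object names are placeholders, and the auth question above is a separate concern:

from google.cloud import storage

def upload_covdir(local_path, revision):
    client = storage.Client()
    bucket = client.bucket("fuzzing-coverage")  # placeholder bucket name
    blob = bucket.blob(f"{revision}/covdir.json")
    blob.upload_from_filename(local_path)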

Add filtering capabilities to the API

At the moment, the API endpoints all return the overall coverage.
We should add filtering capabilities so the frontend can request specific slices from the backend; a minimal endpoint sketch follows the list below.

Filters we should support:

  • Platform (Linux, Windows, Mac, etc.);
  • Test suite;
  • Language?;
  • Third-party code?
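
A minimal Flask sketch of what such filtering could look like; the endpoint path, parameter names, and the load_coverage helper are all hypothetical:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/v2/coverage")
def coverage():
    platform = request.args.get("platform")  # e.g. "linux", "windows", "mac"
    suite = request.args.get("suite")        # e.g. "mochitest"
    # load_coverage is a hypothetical helper; None means "no filter"
    return jsonify(load_coverage(platform=platform, suite=suite))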

Detect dead code

The basic idea is to write a clang plugin able to detect whether a function in an uncovered file is called (directly, or via a call to another function) by a function in a covered file.
That would avoid manual inspection and surface the files that are likely dead.
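
The clang plugin would emit the call graph; the reachability part could then be a simple traversal. A sketch with networkx (an assumed dependency), where file_of maps a function to its source file:

import networkx as nx

def probably_dead_functions(call_graph, covered_files, file_of):
    # Functions in covered files are the roots: anything they reach
    # (directly or transitively) is potentially live
    reachable = set()
    for func in call_graph.nodes:
        if file_of(func) in covered_files:
            reachable.add(func)
            reachable |= nx.descendants(call_graph, func)
    # Functions in uncovered files that no covered file ever calls
    return [f for f in call_graph.nodes
            if file_of(f) not in covered_files and f not in reachable]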

Use coverage before changeset when on tip

A coverage request for the changeset at tip (or any changeset between the last covered push and tip) will give a 404.

Right now the coverage backend only looks for coverage data after the requested changeset. But that does not work in the case of tip, when no coverage data is available yet.

For that case (when the changeset's push is newer than the last push with coverage), the backend should fall back to the previous coverage data.
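
A sketch of the fallback, assuming an ordered list of pushes where some carry coverage builds (the data model here is illustrative, not the backend's actual one):

def find_coverage_push(pushes, requested_push_id):
    # Prefer the first coverage build at or after the requested push...
    later = [p for p in pushes if p.push_id >= requested_push_id and p.has_coverage]
    if later:
        return min(later, key=lambda p: p.push_id)
    # ...but for tip (no coverage yet), fall back to the latest covered push
    earlier = [p for p in pushes if p.has_coverage]
    return max(earlier, key=lambda p: p.push_id) if earlier else None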

Improve the reporting of new uncovered code

Today, we analyze per-push mozilla-central code coverage reports to identify new code that isn't covered by any tests, and automatically send an email report to a few people, with links to the relevant commits.

This allows manually filing bugs when new code is obviously lacking tests, and asking the developers to add tests.

But the process is very manual and tedious, and the reports (email and coverage view) are lacking important information to make pertinent decisions. It would be nice to improve these reports with more useful information, and to make them actionable (e.g. with a button to auto-file a blocking bug).

Code locations:

Information to add:

Actions to automate or streamline:

  • Pre-fill a blocking bug that asks the developer to add tests (similar to Talos perf regression blocking bugs)

Write tests for generate_suite_reports

Tests should be written in src/codecoverage/bot/tests/test_suite_reports.py for the generate function from the suite_reports module (src/codecoverage/bot/code_coverage_bot/suite_reports.py).
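
A starting skeleton for such a test; the real signature of generate() may differ, so the arguments below are placeholders to adapt:

# src/codecoverage/bot/tests/test_suite_reports.py
from code_coverage_bot import suite_reports

def test_generate_creates_reports(tmp_path):
    artifacts_dir = tmp_path / "artifacts"
    artifacts_dir.mkdir()
    out_dir = tmp_path / "reports"
    # Placeholder arguments; adjust to the actual generate() signature
    suite_reports.generate(["mochitest"], str(artifacts_dir), str(out_dir))
    assert out_dir.exists()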

Dependabot couldn't fetch all your path-based dependencies

Dependabot couldn't fetch one or more of your project's path-based Python dependencies. The affected dependencies were bot/[email protected]:mozilla/code-coverage.git@85f62f11cd712e41e8a53668976af73d400cee11/setup.py.

To use path-based dependencies with Dependabot the paths must be relative and resolve to a directory in this project's source code.

You can mention @dependabot in the comments below to contact the Dependabot team.

Retry failed tasks

Most of the failures are due to transient issues, so we should implement some kind of retry mechanism.

Does it apply to static analysis too, @La0 @jankeromnes?
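
Either way, a simple retry decorator sketch for transient failures; the attempt count and delay are arbitrary:

import functools
import time

def retry(attempts=3, delay=30):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == attempts:
                        raise  # give up after the last attempt
                    time.sleep(delay)
        return wrapper
    return decorator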

Trigger code coverage jobs with a timeout

The Mercurial commits take a while to get synced to gecko-dev, so we should trigger coverage jobs after a timeout of ~30 minutes (instead of waiting in the coverage task).
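
A sketch of an in-process delay; in practice a Taskcluster hook or a scheduled task would be more robust than keeping a process alive, and trigger_coverage_task is a hypothetical helper:

import threading

SYNC_DELAY = 30 * 60  # ~30 minutes for the hg -> gecko-dev sync

def schedule_coverage_job(revision):
    # trigger_coverage_task is a hypothetical helper that creates the task
    threading.Timer(SYNC_DELAY, trigger_coverage_task, args=(revision,)).start()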

Get suite and chunk name without parsing the task name

The suite and chunk info are already in the task data; we don't need to parse them from the task name. See the snippet after this example task definition:

   {
      "status": {
        "taskId": "CkICVuBvRr--6rratJoG1A",
        "provisionerId": "aws-provisioner-v1",
        "workerType": "gecko-t-linux-large",
        "schedulerId": "gecko-level-3",
        "taskGroupId": "e4wR3JiKRmCqFXMG1sZoqw",
        "deadline": "2018-01-16T14:08:30.100Z",
        "expires": "2019-01-15T14:08:30.100Z",
        "retriesLeft": 5,
        "state": "completed",
        "runs": [
          {
            "runId": 0,
            "state": "completed",
            "reasonCreated": "scheduled",
            "reasonResolved": "completed",
            "workerGroup": "us-west-2",
            "workerId": "i-02f880a62021594ec",
            "takenUntil": "2018-01-15T15:18:41.003Z",
            "scheduled": "2018-01-15T14:58:40.472Z",
            "started": "2018-01-15T14:58:41.080Z",
            "resolved": "2018-01-15T15:04:13.243Z"
          }
        ]
      },
      "task": {
        "provisionerId": "aws-provisioner-v1",
        "workerType": "gecko-t-linux-large",
        "schedulerId": "gecko-level-3",
        "taskGroupId": "e4wR3JiKRmCqFXMG1sZoqw",
        "dependencies": [
          "HyAoNcNPRvyV6mgzFFFWSg",
          "XUDNfri0ReazT-7gyTTAYg"
        ],
        "requires": "all-completed",
        "routes": [
          "tc-treeherder.v2.mozilla-central.a00e3a3dbad43759ee9474d6c7c908fa16deaee8.33255",
          "coalesce.v1.mozilla-central.71fe57dadaef770d25ba"
        ],
        "priority": "medium",
        "retries": 5,
        "created": "2018-01-15T14:08:30.100Z",
        "deadline": "2018-01-16T14:08:30.100Z",
        "expires": "2019-01-15T14:08:30.100Z",
        "scopes": [
          "secrets:get:project/taskcluster/gecko/hgfingerprint",
          "docker-worker:feature:allowPtrace",
          "docker-worker:cache:level-3-mozilla-central-test-workspace-bc7e1a7ad01a345394f1",
          "docker-worker:cache:level-3-checkouts-bc7e1a7ad01a345394f1"
        ],
        "payload": {
          "supersederUrl": "https://coalesce.mozilla-releng.net/v1/list/3600/5/mozilla-central.71fe57dadaef770d25ba",
          "onExitStatus": {
            "retry": [
              4
            ]
          },
          "maxRunTime": 3600,
          "image": {
            "path": "public/image.tar.zst",
            "type": "task-image",
            "taskId": "XUDNfri0ReazT-7gyTTAYg"
          },
          "cache": {
            "level-3-mozilla-central-test-workspace-bc7e1a7ad01a345394f1": "/builds/worker/workspace",
            "level-3-checkouts-bc7e1a7ad01a345394f1": "/builds/worker/checkouts"
          },
          "artifacts": {
            "public/logs/": {
              "path": "/builds/worker/workspace/build/upload/logs/",
              "expires": "2019-01-15T14:08:30.100Z",
              "type": "directory"
            },
            "public/test": {
              "path": "/builds/worker/artifacts/",
              "expires": "2019-01-15T14:08:30.100Z",
              "type": "directory"
            },
            "public/test_info/": {
              "path": "/builds/worker/workspace/build/blobber_upload_dir/",
              "expires": "2019-01-15T14:08:30.100Z",
              "type": "directory"
            }
          },
          "command": [
            "/builds/worker/bin/run-task",
            "--",
            "/builds/worker/bin/test-linux.sh",
            "--no-read-buildbot-config",
            "--installer-url=https://queue.taskcluster.net/v1/task/HyAoNcNPRvyV6mgzFFFWSg/artifacts/public/build/target.tar.bz2",
            "--test-packages-url=https://queue.taskcluster.net/v1/task/HyAoNcNPRvyV6mgzFFFWSg/artifacts/public/build/target.test_packages.json",
            "--reftest-suite=reftest-no-accel",
            "--e10s",
            "--allow-software-gl-layers",
            "--total-chunk=8",
            "--this-chunk=7",
            "--download-symbols=ondemand"
          ],
          "env": {
            "SCCACHE_DISABLE": "1",
            "MOZ_NODE_PATH": "/usr/local/bin/node",
            "TASKCLUSTER_CACHES": "/builds/worker/checkouts;/builds/worker/workspace",
            "HG_STORE_PATH": "/builds/worker/checkouts/hg-store",
            "GECKO_HEAD_REV": "a00e3a3dbad43759ee9474d6c7c908fa16deaee8",
            "GECKO_HEAD_REPOSITORY": "https://hg.mozilla.org/mozilla-central",
            "TASKCLUSTER_VOLUMES": "/builds/worker/.cache;/builds/worker/checkouts;/builds/worker/tooltool-cache;/builds/worker/workspace",
            "MOZHARNESS_URL": "https://queue.taskcluster.net/v1/task/HyAoNcNPRvyV6mgzFFFWSg/artifacts/public/build/mozharness.zip",
            "NEED_PULSEAUDIO": "true",
            "MOZ_AUTOMATION": "1",
            "NEED_WINDOW_MANAGER": "true",
            "MOZHARNESS_CONFIG": "unittests/linux_unittest.py remove_executables.py",
            "ENABLE_E10S": "true",
            "MOZHARNESS_SCRIPT": "desktop_unittest.py",
            "GECKO_BASE_REPOSITORY": "https://hg.mozilla.org/mozilla-unified",
            "MOZILLA_BUILD_URL": "https://queue.taskcluster.net/v1/task/HyAoNcNPRvyV6mgzFFFWSg/artifacts/public/build/target.tar.bz2"
          },
          "features": {
            "taskclusterProxy": true,
            "allowPtrace": true
          }
        },
        "metadata": {
          "owner": "[email protected]",
          "source": "https://hg.mozilla.org/mozilla-central/file/a00e3a3dbad43759ee9474d6c7c908fa16deaee8/taskcluster/ci/test",
          "description": "Reftest not accelerated run ([Treeherder push](https://treeherder.mozilla.org/#/jobs?repo=mozilla-central&revision=a00e3a3dbad43759ee9474d6c7c908fa16deaee8))",
          "name": "test-linux32/opt-reftest-no-accel-e10s-7"
        },
        "tags": {
          "kind": "test",
          "worker-implementation": "docker-worker",
          "createdForUser": "[email protected]",
          "label": "test-linux32/opt-reftest-no-accel-e10s-7",
          "test-type": "reftest",
          "os": "linux"
        },
        "extra": {
          "chunks": {
            "current": 7,
            "total": 8
          },
          "suite": {
            "flavor": "reftest-no-accel",
            "name": "reftest"
          },
          "treeherder": {
            "jobKind": "test",
            "groupSymbol": "tc-R-e10s",
            "collection": {
              "opt": true
            },
            "machine": {
              "platform": "linux32"
            },
            "groupName": "Reftests executed by TaskCluster with e10s",
            "tier": 1,
            "symbol": "Ru7"
          },
          "parent": "e4wR3JiKRmCqFXMG1sZoqw",
          "index": {
            "rank": 1516025190
          }
        }
      }
    },
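
Given a task definition like the one above, reading the fields directly is straightforward:

# Assumes the payload shape shown above, with the whole object in `task`
extra = task["task"]["extra"]
suite = extra["suite"]["name"]           # "reftest"
flavor = extra["suite"]["flavor"]        # "reftest-no-accel"
chunk = extra["chunks"]["current"]       # 7
total_chunks = extra["chunks"]["total"]  # 8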

Figure out how to handle secure Phabricator revisions

Right now the bot fails when it encounters a secure revision:

  File "/nix/store/c94zc39953i838di2kppybjrg37k2yyn-python3.7-mozilla-codecoverage-bot-1.0.0/lib/python3.7/site-packages/code_coverage_bot/codecov.py", line 150, in go_from_trigger_mozilla_central
    phabricatorUploader.upload(json.loads(output))
  File "/nix/store/c94zc39953i838di2kppybjrg37k2yyn-python3.7-mozilla-codecoverage-bot-1.0.0/lib/python3.7/site-packages/code_coverage_bot/phabricator.py", line 137, in upload
    rev_data = phabricator.load_revision(rev_id=rev_id)
  File "/nix/store/4g87vc35ilp7ax0rn8c1zdc6d0an7fab-python3.7-python3.7-mozilla-cli-common-1.0.0/lib/python3.7/site-packages/cli_common/phabricator.py", line 152, in load_revision
    'Revision not found'
AssertionError: Revision not found
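
One possible mitigation, grounded in the traceback above (load_revision raises an AssertionError when the revision can't be loaded): skip the revision instead of crashing the whole run. Whether skipping is the right policy is exactly what this issue should decide:

import logging

logger = logging.getLogger(__name__)

try:
    rev_data = phabricator.load_revision(rev_id=rev_id)
except AssertionError:
    # Secure/inaccessible revisions are invisible to the bot's credentials;
    # log and move on rather than aborting the upload for every revision
    logger.warning("Skipping inaccessible revision %s", rev_id)
    rev_data = None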
