

dbt-checkpoint


Sponsors

Datacoves

Hosted VS Code, dbt-core, SqlFluff, and Airflow. Find out more at Datacoves.com.


Goal

dbt-checkpoint provides pre-commit hooks to ensure the quality of your dbt projects.

dbt is awesome, but as the number of models, sources, and macros in a project grows, it becomes challenging to maintain the same level of quality across developers. Users forget to update columns in property (yml) files or to add table and column descriptions. Without automation, the reviewer's workload increases and unintentional errors may be missed. dbt-checkpoint lets organizations add automated validations that improve the code review and release process.

Telemetry

dbt-checkpoint has telemetry built into some of its hooks to help the maintainers from Datacoves understand which hooks are and are not being used, so they can prioritize future development of dbt-checkpoint. We do not track credentials or details of your dbt execution such as model names. We also do not track any of the dbt hooks, such as the one for generating documentation. The one dbt-related detail we do use is the anonymous user_id generated by dbt, which helps us identify distinct projects.

Tracking is turned on by default. You can opt out of event tracking at any time by adding the following to your .dbt-checkpoint.yaml file:

version: 1
disable-tracking: true

Setting dbt project root

You can specify a dbt project root directory for all hooks. This is particularly useful when your dbt project is not located at the root of your repository but in a sub-directory of it.

In that situation, you previously had to specify a --manifest flag in each hook.

Now, you can avoid repeating yourself by adding the dbt-project-dir key to your .dbt-checkpoint.yaml config file:

version: 1
dbt-project-dir: my_dbt_project

This way, we will automatically look for the required manifest/catalog inside your my_dbt_project folder.

General exclude and per-hook excluding

Since dbt-checkpoint 1.1.0, certain hooks implement implicit logic that "discovers" the sql/yml counterpart of each changed file.

For complete background, please refer to #118.

Since the root-level exclude statement is handled by pre-commit, when those hooks discover their related sql/yml files, the root exclusion is omitted (dbt-checkpoint re-includes files that may have been excluded). To exclude files from this discovery logic, the exclude path/regex must be provided to each hook (#119); see the example below.
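For instance, a minimal sketch of a per-hook exclude in .pre-commit-config.yaml (the hook id and path here are illustrative):

  - id: check-model-has-description
    exclude: ^models/legacy/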

List of dbt-checkpoint hooks

💡 Click on hook name to view the details.

Model checks:

Script checks:

Source checks:

Macro checks:

Modifiers:

dbt commands:


If you have a suggestion for a new hook or find a bug, let us know.

Install

For detailed installation and usage instructions, see the pre-commit.com site.

pip install pre-commit

Setup

  1. Create a file named .pre-commit-config.yaml in your project root folder.
  2. Add the list of hooks you want to run before every commit. E.g.:
repos:
- repo: https://github.com/dbt-checkpoint/dbt-checkpoint
  rev: v1.2.1
  hooks:
  - id: dbt-parse
  - id: dbt-docs-generate
    args: ["--cmd-flags", "++no-compile"]
  - id: check-script-semicolon
  - id: check-script-has-no-table-name
  - id: check-model-has-all-columns
    name: Check columns - core
    files: ^models/core
  - id: check-model-has-all-columns
    name: Check columns - mart
    files: ^models/mart
  - id: check-model-columns-have-desc
    files: ^models/mart
  3. Optionally, run pre-commit install to set up the git hook scripts. With this, pre-commit will run automatically on git commit! You can also manually run pre-commit run after you stage all files you want to check, or pre-commit run --all-files to run the hooks against all of the files (not only staged ones). See the command summary below.
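For reference, a recap of the commands described in step 3:

pre-commit install          # install the git hook; hooks then run on every git commit
pre-commit run              # run the hooks against currently staged files
pre-commit run --all-files  # run the hooks against all files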

Run in CI/CD

Unfortunately, you cannot natively use dbt-checkpoint if you are using dbt Cloud, but you can run checks after you push changes to GitHub.

Most dbt-checkpoint hooks need manifest.json (see the requirements section in each hook's documentation), which lives in the target folder. Since this target folder is usually in .gitignore, you need to generate it by running the dbt-parse command. To be able to parse dbt, you also need a profiles.yml file with your credentials. To provide passwords and secrets, use GitHub Secrets (see example).

Say you want to check that each model has at least two tests; you would use this configuration:

repos:
- repo: https://github.com/dbt-checkpoint/dbt-checkpoint
  rev: v1.2.1
  hooks:
  - id: check-model-has-tests
    args: ["--test-cnt", "2", "--"]

To be able to run this in GitHub CI, you need to modify it to:

repos:
- repo: https://github.com/dbt-checkpoint/dbt-checkpoint
  rev: v1.2.1
  hooks:
  - id: dbt-parse
  - id: check-model-has-tests
    args: ["--test-cnt", "2", "--"]

Create profiles.yml

The first step is to create a profiles.yml file, e.g.:

# example profiles.yml file
jaffle_shop:
  target: dev
  outputs:
    dev:
      type: snowflake
      threads: 8
      client_session_keep_alive: true
      account: "{{ env_var('ACCOUNT') }}"
      database: "{{ env_var('DATABASE') }}"
      schema: "{{ env_var('SCHEMA') }}"
      user: "{{ env_var('USER') }}"
      password: "{{ env_var('PASSWORD') }}"
      role: "{{ env_var('ROLE') }}"
      warehouse: "{{ env_var('WAREHOUSE') }}"

and store this file in the project root as ./profiles.yml.

Create new workflow

  • Inside your GitHub repository, create the folder .github/workflows (unless it already exists).
  • Create a new file, e.g. pr.yml.
  • Specify your workflow, e.g.:
name: dbt-checkpoint

on:
  push:
  pull_request:
    branches:
      - main

jobs:
  dbt-checkpoint:
    runs-on: ubuntu-latest
    env:
      ACCOUNT: ${{ vars.ACCOUNT }}
      DATABASE: ${{ vars.DATABASE }}
      SCHEMA: ${{ vars.SCHEMA }}
      USER: ${{ vars.USER }}
      PASSWORD: ${{ secrets.PASSWORD }}
      ROLE: ${{ vars.ROLE }}
      WAREHOUSE: ${{ vars.WAREHOUSE }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Setup Python
        uses: actions/setup-python@v2

      - name: Get file changes
        id: get_file_changes
        uses: trilom/[email protected]
        with:
          output: " "

      - name: Run dbt checkpoint
        uses: dbt-checkpoint/[email protected]
        with:
          extra_args: --files ${{ steps.get_file_changes.outputs.files}}
          dbt_version: 1.6.3
          dbt_adapter: dbt-snowflake

Acknowledgements

Thank you to Radek Tomšej for initial development and maintenance of this great package, and for sharing your work with the community!


dbt-checkpoint's Issues

hook `check-model-has-meta-keys` not correctly identifying defined `meta` key

Describe the bug
We are using the check-model-has-meta-keys hook to check if our models have owners defined, but even with the key correctly defined (I also checked manifest.json to be sure) the hook raises an error for the models saying the owner is not defined.

It seems it's not seeing the meta keys defined at folder level, since we are defining them in our dbt_project.yml.

from manifest.json:

        "model.analytics.analytics_ci_runs": {
            "raw_sql": "...",
            "compiled": true,
           ...
            "config": {
                "enabled": true,
                "alias": "ci_runs",
                "schema": "...",
                "database": null,
                "tags": [
                    "daily"
                ],
                "meta": {
                    "is_gold": true,
                    "owner": "[email protected]"
                },

And the response from the hook:
(screenshot of the hook output omitted)

If this is actually an error I'd be glad to contribute if you could point me in the right direction

To Reproduce
Steps to reproduce the behavior:

  1. Add owner to group of models (folder) in dbt_project.yml file
  2. Add following config to the .pre-commit-config.yaml
      - id: check-source-has-meta-keys
        args: ['--meta-keys', 'owner', "--"]
  3. Execute pre-commit

Expected behavior
A clear and concise description of what you expected to happen.

Version:
v.1.0.1 of dbt-gloss

Additional context

Feature request: Add `check-source-has-tests-by-group`

Describe the feature you'd like
Add a hook that has the same functionality as check-model-has-tests-by-group but for sources.

Additional context
Was going to implement check-model-has-tests-by-name and check-source-has-tests-by-name but realized ultimately that we wanted OR functionality. Found that check-model-has-tests-by-group is exactly what we need, but we also need this functionality for sources.

unify-column-description should check all ymls, not just the changed ones

Describe the bug
In order to unify column descriptions across all models unify-column-description should check all ymls, not just the changed ones. This is because there may be other instances of that column in other ymls.

To Reproduce
Steps to reproduce the behavior:

  1. Have 2 .yml files containing the same column name
  2. Change only 1 .yml file to have a different description for the column
  3. Commit changes

Expected behavior
I expect descriptions to be unified across the two .yml files, even if only one is modified during the commit.

Version:
v0.1.0

Additional context
Loving this repo! I'm not sure if this is just a limitation of pre-commit hooks, if so do you recommend another method of running this?

Regex Parsing Leads to Hanging Test

Describe the bug

When attempting to run check-script-has-no-table-name, the test can hang, potentially due to catastrophic backtracking.

To Reproduce

I have yet to determine exact triggers but have spot-checked the regex and a test string both on regex101.com and in my dbt codebase.

String 1:

{#
test
#}

WITH source AS (
  SELECT *
  FROM {{ source('schema','table') }}
)
SELECT
, a
, b
FROM source

causes "catastrophic backtracking" and the test to hang when executed in my codebase


but a string

{#
test
#}

WITH source AS (
  SELECT *
  FROM {{ source('schema','table') }}
)
SELECT
, a
FROM source

does not in both my codebase and regex101.com


Expected behavior

The regex should parse in a manner that doesn't cause the test to hang.

Version:
v1.0.0

Additional context

This bug was introduced when the regex here was updated in this commit

Ignore a file or some sentences in a file?

Hi, pre-commit-dbt is super useful for most of our files. But there are several files, or rather certain lines within files, on which we do not want to run pre-commit-dbt.
Is there a way to go about this? Has anyone tried it already?
Thanks

`Check model name contract` hook problem with version

When using the following pre commit config file

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: v1.0.0
  hooks:
  - id: check-model-name-contract
    args: [--pattern, "(rep__).*"]
    files: models/reporting

I get the following response
[ERROR] check-model-name-contract is not present in repository https://github.com/offbi/pre-commit-dbt. Typo? Perhaps it is introduced in a newer version? Often pre-commit autoupdate fixes this.

Even if I run the autoupdate command the error message is the same. Am I missing something?

Improved ReadMe Documentation

I'm relatively new to Github Actions so my issue here could totally be my lack of understanding of Actions in general. If that's the case, I apologise. However, I find the current ReadMe difficult to understand. I would benefit from improved documentation on what this Action is doing.

Additional context
Based on my current understanding, I believe that I have to first create a .pre-commit-config.yml file (modified to first run dbt-compile as I'm running this as a Github Action) and then save this in my project's root directory. For example, if I wished to build the documentation for my project I must define the following:

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: v1.0.0
  hooks:
  - id: dbt-compile
    args: ["--cmd-flags", "++profiles-dir", "."]
  - id: dbt-docs-generate

Presumably this would build the documentation in a folder called ~/target as would be the case in default dbt? (I can't see any arguments in the hook's documentation to suggest it's possible to change the directory.)

From here, things get fuzzy for me. In the current ReadMe under the create new workflow section we then see the following:

name: pre-commit

on:
  pull_request:
  push:
    branches: [main]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: actions/setup-python@v2
    - id: file_changes
      uses: trilom/[email protected]
      with:
        output: ' '
    - uses: offbi/[email protected]
      env:
        DB_PASSWORD: ${{ secrets.SuperSecret }}
      with:
        args: run --files ${{ steps.file_changes.outputs.files}}

I don't at this point understand:

  • What is - id: file_changes? What is trilom/[email protected] and why is it required when using pre-commit-dbt?
  • How does offbi/pre-commit-dbt use .pre-commit-config.yml? Is the pre-commit-config file parsed with run --files ${{ steps.file_changes.outputs.files}}? Am I correct in thinking that somewhere in the above .yml, the Action will call .pre-commit-config.yml and run whatever is specified in this file?
  • Is it necessary to create a profiles.yml file and keep this in the project's root dir when using GitHub Actions, or can all environment variables be stored in GitHub Secrets? It's not clear when this file is required (dbt recommends that profiles.yml be kept locally in ~/.dbt)

Apologies for all the questions and thanks in advance for helping a newbie out!

Support for check-model-has-one-of-tests-by-name

Describe the feature you'd like
Many dbt projects require at least one unique test per model. However, this test may be one of several flavors, e.g.:

  • test_unique
  • test_unique_where
  • test_unique_threshold
  • test_unique_combination_of_columns

Initial thought: We could leverage some sort of "OR" functionality, e.g.

test_unique | test_unique_where = 1

Unable to load manifest file ([Errno 2] No such file or directory: 'target/manifest.json')

Describe the bug
Any hook that tries to read target/manifest.json results in Unable to load manifest file ([Errno 2] No such file or directory: 'target/manifest.json') if dbt project directory (contains target/) != git project root.

To Reproduce
Steps to reproduce the behavior:

  1. Create the following project structure
my_project/
├── .git/
└── dbt_project/
     ├── dbt_project.yml
     ├── target/
     ├── models/
     └── ...
  2. Run any hook that requires the manifest

Expected behavior
I guess the current behavior is expected but there could be an option to specify the dbt project directory.

Workaround: cp -r dbt_project/target target
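Alternatively, most manifest-reading hooks accept the --manifest flag mentioned in the README, so a sketch of a per-hook workaround (the hook id here is illustrative) could look like:

  - id: check-model-has-description
    args: ["--manifest", "dbt_project/target/manifest.json", "--"]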

Version:
v1.0.0

Github action example fails to run

Describe the bug
Trying to use the GitHub action following the example in the README, but it fails with an invalid argument error.

To Reproduce
In the .github/workflows/action.yaml

uses: offbi/pre-commit-dbt@ea9c6bafbca375250baa9c21b8ddf9207d9c0160
with:
  args: run --files ${{ steps.file_changes.outputs.files }}

and I get the following errors when the GitHub action runs:

usage: pre-commit [-h] [-V]
                  {autoupdate,clean,hook-impl,gc,init-templatedir,install,install-hooks,migrate-config,run,sample-config,try-repo,uninstall,help}
                  ...
pre-commit: error: argument command: invalid choice: 'run --files ' (choose from 'autoupdate', 'clean', 'hook-impl', 'gc', 'init-templatedir', 'install', 'install-hooks', 'migrate-config', 'run', 'sample-config', 'try-repo', 'uninstall', 'help')

Additional context
I am using the latest commit hash: ea9c6ba

Issue with the check-column-name-contract hook

Hello folks,

first I noticed that on this page when clicking on check-column-name-contract you get a 404 page. However, today I've decided to actually use this hook in my project and got the following error:

[ERROR] `check-column-name-contract` is not present in repository https://github.com/offbi/pre-commit-dbt. Typo? Perhaps it is introduced in a newer version? Often `pre-commit autoupdate` fixes this.

To Reproduce
Steps to reproduce the behavior:

  1. add check-column-name-contract to your config file
  2. run pre-commit

Version:
v1.0.0

Empty YML files result in AttributeError: 'NoneType' object has no attribute 'get'

Describe the bug
Empty YML files result in AttributeError: 'NoneType' object has no attribute 'get' when running check-source-table-has-description hook.

Traceback (most recent call last):
  File "/home/simo/.cache/pre-commit/repohg6wkw_k/py_env-python3.8/bin/check-source-table-has-description", line 8, in <module>
    sys.exit(main())
  File "/home/simo/.cache/pre-commit/repohg6wkw_k/py_env-python3.8/lib/python3.8/site-packages/pre_commit_dbt/check_source_table_has_description.py", line 33, in main
    return has_description(paths=args.filenames)
  File "/home/simo/.cache/pre-commit/repohg6wkw_k/py_env-python3.8/lib/python3.8/site-packages/pre_commit_dbt/check_source_table_has_description.py", line 17, in has_description
    for schema in schemas:
  File "/home/simo/.cache/pre-commit/repohg6wkw_k/py_env-python3.8/lib/python3.8/site-packages/pre_commit_dbt/utils.py", line 210, in get_source_schemas
    for source in schema.get("sources", []):
AttributeError: 'NoneType' object has no attribute 'get'

To Reproduce
Steps to reproduce the behavior:

  1. Create an empty .yml file in your dbt project
  2. Run check-source-table-has-description hook

Expected behavior
Empty/invalid YML files are skipped.

Version:
v1.0.0

Cannot use the action in a workflow

Describe the bug
Hi there, first of all thanks for the efforts in developing this library. We are trying to include this step in our workflow, but we are facing an issue related to the Docker image: it cannot be pulled.


Might the issue be related to the actual image defined in action.yaml?

To Reproduce
Steps to reproduce the behavior:

  1. follow either the readme example or the snippet as reported in the marketplace

Example configuration:

    - id: dbt_checks
      uses: offbi/[email protected]
      env:
        [ ... ]
      with:
        args: run --files ${{ steps.file_changes.outputs.files }}

Version:
v1.0.0

Feature Request: `check-model-has-column`

Describe the feature you'd like
Add a hook which asserts that a given column with a given type exists in a model.

Additional context
We generally require that every model has an audit timestamp column called _updated_at and it would be nice to be able to enforce that.

Move documentation to gitbook

There are too many hooks; they are hard to maintain. It is probably a good time to move this documentation to a static site.

Possibility to change the dbt root for all hooks

I work in a monorepo and dbt is not at the root of the repo. A lot of checks fail because of this and have to be adapted with args, but that's not possible for all of them. E.g. the dbt-deps hook does not allow you to change where it is executed. Maybe there is an easier way to apply this and enable the feature for all hooks.

check-script-has-no-table-name gives false errors with certain keywords

Describe the bug
check-script-has-no-table-name gives false errors in places where SQL uses FROM but what follows is not a table.
For example, BigQuery has EXTRACT(date FROM datetime_column).

To Reproduce

WITH date_spine AS (
  {{ dbt_utils.date_spine(
      start_date="cast(parse_date('%Y/%m/%d', '2014/11/01') AS datetime)",
      datepart="day",
      end_date="cast(date_add(current_date(), INTERVAL 40 YEAR) AS datetime)"
     )
  }}
)

SELECT
    EXTRACT(YEAR FROM date_day) AS year,
    EXTRACT(QUARTER FROM date_day) AS quarter,
    EXTRACT(MONTH FROM date_day) AS month,
    EXTRACT(DAY FROM date_day) AS day

FROM date_spine

Run the check-script-has-no-table-name and it will fail with:

models/reference/calendar.sql: does not use source() or ref() macros for tables:
- date_day

Expected behavior
I would expect this to be ignored

Version:
v1.0.0

Additional context
Nothing else I can think of

Docker image pull down image action fails

Describe the bug
When hooking to the docker file, getting error invalid reference format when the build runs the step to pull the action image.

Pull down action image 'offbi:pre-commit-dbt:v1.0.0'
/usr/bin/docker pull offbi:pre-commit-dbt:v1.0.0
invalid reference format
Warning: Docker pull failed with exit code 1, back off 3.275 seconds before retry.

To Reproduce
Steps to reproduce the behavior:

  1. Add a new file .github/workflows/pre-commit-dbt.yml:
name: pre-commit

on:
  pull_request:
    branches: [master]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: actions/setup-python@v2
    - id: file_changes
      uses: trilom/[email protected]
      with:
        output: ' '
    - uses: offbi/[email protected]
      env:
        DB_PASSWORD: ${{ secrets.SuperSecret }}
      with:
        args: run --files ${{ steps.file_changes.outputs.files}}
  2. Add the following code to an existing (or new) .pre-commit-config.yaml:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: v1.0.0
  hooks:
  - id: check-script-has-no-table-name
    files: ^dbt
  3. Commit changes and create a pull request to trigger the test build.

Expected behavior
Build test completes successfully.

Version:
v1.0.0
(dbt version 0.21.0)

Additional context
I think it's odd that the yml file clearly shows offbi/[email protected] for the step action, but when the docker pull command is passed the syntax is changed to offbi:pre-commit-dbt:v1.0.0 (notice colons in place of slash & @). Nowhere in the documentation does it list how to avoid this.
Also note: The "test" completes successfully if the following block is removed (all other steps run correctly):

    - uses: offbi/[email protected]
      env:
        DB_PASSWORD: ${{ secrets.SuperSecret }}
      with:
        args: run --files ${{ steps.file_changes.outputs.files}}

So it seems that it's this specific docker image causing the issue.

Feature Request: `check-model-database-identifier-name-contract`

Describe the feature you'd like
check-model-name-contract checks that a model name abides by a certain 'contract'. I would like an equivalent hook for the metadata.name value (the 'database identifier') that exists within the catalog.json file, to ensure that the database identifier abides by a contract, e.g. does not contain a prefix such as project_prefix_.

Additional context
Example extract from catalog.json:

"model.example.project_prefix_example_table": {
    "metadata": {
        "type": "table",
        "schema": "example",
        "name": "example_table",
        "database": null,
        "comment": null,
        "owner": "root"
    },

`check-script-has-no-table-name` fails incorrectly due to CTE aliases

Describe the bug
If you set an alias for a CTE and use it later on, the check-script-has-no-table-name fails.

To Reproduce

If you alias a CTE, like so:

{% macro source_cte(source_name, tuple_list) -%}
WITH{% for cte_ref in tuple_list %} {{cte_ref[0]}} AS (
    SELECT * FROM {{ source(source_name, cte_ref[1]) }}
),
    {%- endfor %} final as (
{%- endmacro %}
{{ source_cte('my_source', [('s', 'some_table'), ('o', 'other_table')]) }}
select * from s
inner join o on s.other_id = o.id
)
select * from final

You will get a failure:

Check the script has not table name......................................Failed
- hook id: check-script-has-no-table-name
- exit code: 1

models/your_model.sql: does not use source() or ref() macros for tables:
- s
- o

Expected behavior
CTE aliases wouldn't cause the check to fail. Ideally we'd check the guts of the CTE itself.

Version:
v1.0.0

Additional context
Add any other context about the problem here.

check-script-has-no-table-name doesn't ignore text within Jinja comment blocks

Describe the bug
If we use the word "from" or "join" in a Jinja comment block, the parser thinks that the next word is the name of a table or CTE and raises an error.

sql:

{# This is a test of the check-script-has-no-table-name hook, from pre-commit-dbt

We would expect the hook to ignore this text because it is in a jinja comment block
and not actually a join to any other table.

#}

output from pre-commit run

Check the script has not table name......................................Failed
- hook id: check-script-has-no-table-name
- exit code: 1

test.sql: does not use source() or ref() macros for tables:
- pre-commit-dbt
- to

To Reproduce
Steps to reproduce the behavior:

  1. Create a new sql file
  2. Add the following into the file:
{# This is a test of the check-script-has-no-table-name hook, from pre-commit-dbt

We would expect the hook to ignore this text because it is in a jinja comment block
and not actually a join to any other table.

#}
  3. Run pre-commit with the check-script-has-no-table-name hook enabled

Expected behavior
I'd expect the parser to ignore comment blocks since it's not important if they don't use ref() or source()

Version:
v0.1.1

`check_macro_arguments_have_desc` hook fails to parse arguments

Describe the bug

check_macro_arguments_have_desc hook raises the following error even though the content of the files is ok.
Other hooks are working correctly, including check_macro_has_description.

Traceback (most recent call last):
  File "/home/mache/.cache/pre-commit/repo0ldja9vt/py_env-python3/bin/check-macro-arguments-have-desc", line 8, in <module>
    sys.exit(main())
  File "/home/mache/.cache/pre-commit/repo0ldja9vt/py_env-python3/lib/python3.10/site-packages/pre_commit_dbt/check_macro_arguments_have_desc.py", line 90, in main
    status_code, _ = check_argument_desc(paths=args.filenames, manifest=manifest)
  File "/home/mache/.cache/pre-commit/repo0ldja9vt/py_env-python3/lib/python3.10/site-packages/pre_commit_dbt/check_macro_arguments_have_desc.py", line 52, in check_argument_desc
    for key, value in item.macro.get("arguments", {}).items()
AttributeError: 'list' object has no attribute 'items'

To Reproduce

Steps to reproduce the behavior using getdbt examples:

  1. macros/schema.yml
version: 2

macros:
  - name: cents_to_dollars
    description: A macro to convert cents to dollars
    arguments:
      - name: column_name
        type: string
        description: The name of the column you want to convert
      - name: precision
        type: integer
        description: Number of decimal places. Defaults to 2.
  2. macros/cents_to_dollars.sql
{% macro cents_to_dollars(column_name, precision=3) %}
    COALESCE (TRUNC(CAST({{ column_name }}/100 AS numeric), {{ precision }}), 0)
{% endmacro %}
  3. Execute the following command after dbt deps, dbt compile and dbt docs generate:
    pre-commit run check-macro-arguments-have-desc --files macros/cents_to_dollars.sql

Expected behavior
The hook should pass successfully.

Version:
commit 34a2341 (current latest commit)

Additional context
It looks like the problem is here: for key, value in item.macro.get("arguments", {}).items()
The hook assumes the arguments key contains a dictionary, while it is actually a list of dictionaries.
https://github.com/offbi/pre-commit-dbt/blob/main/pre_commit_dbt/check_macro_arguments_have_desc.py#L52

Tried couple things but getting below error

I have placed the .pre-commit-config.yaml file at the same level as the dbt_project.yml file.

Executing cmd: dbt docs generate
Running with dbt=0.18.2
Encountered an error:
Runtime Error
fatal: Not a dbt project (or any of the parent directories). Missing dbt_project.yml file

check-model-parents-and-childs for zero child check never runs

Attempting to use the check-model-parents-and-childs hook to ensure that our data consumption layer models do not have any children: the hook does not fail when models DO have children.

Steps to reproduce the behavior:

  1. dbt project with a parent and child models
  2. Add check-model-parents-and-childs with --max-child-cnt of zero
  - id: check-model-parents-and-childs
    name: Check for child models in data consumption layers
    # manifest.json required
    args: ["--manifest","./pipelines/target/manifest.json","--max-child-cnt","0","--"]
    files: models/self_service/

Expected outcome:

Model that has a child is raised as failure

Actual outcome:

Hook passes

Version:

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: 34a2341234675d7a6b61766b2c33bdd5c33d090b

Additional info:

The offending code appears to check --max-child-cnt against its default (None) in a way that returns false for a zero value, i.e.:

 if req_cnt and req_operator(real_value, req_cnt):
     status_code = 1
     print(
...

It may need to explicitly check for None:

 if req_cnt is not None and req_operator(real_value, req_cnt):
     status_code = 1
     print(
...

Add support for Slim CI

Describe the feature you'd like
Currently, the dbt-run hook will pass all staged models, possibly causing a re-run of the project that is not necessary, which can be problematic on large projects with a very long runtime.

It would be ideal to be able to support dbt's built-in Slim CI. This way, if the developer has already run and tested the project and made no further changes, the hook will detect this and skip the run. If, however, the developer did not run the project, the hook would run the changed models automatically, ensuring that nothing is committed without successfully running.

check-script-has-no-table-name fails with subqueries referencing CTEs

Describe the bug
check-script-has-no-table-name fails with subqueries referencing CTEs if there is no space between the CTE name and the closing parenthesis.

To Reproduce
The following fails with the error models/staging/intermediate/assets_enriched.sql: does not use source() or ref() macros for tables: - asset_category)

with assets as (
    select * from {{ ref('stg_rse__assets') }}
),

asset_category as (
    select * from {{ ref('data_asset_category') }}
),

final as (
    select 
        assets.*,
        case
            when assets.category = 'cars' then assets.category
            when assets.category = 'wine-spirits' then assets.category
            when assets.ticker in (select ticker from asset_category) then asset_category.asset_category
            else 'unknown'
        end as asset_category
        
    from assets

    left join asset_category using (ticker)
)

select * from final

Changing the subquery on line 15 to (select ticker from asset_category ), essentially adding a space between the CTE name and the closing parenthesis, fixes the issue but breaks the SQL style guide.

Version:
v0.1.1 bleeding edge

`check-script-has-no-table-name` fails incorrectly due to `EXTRACT` function

Describe the bug
Using an EXTRACT date function causes a column reference to be recognized as a table reference.

Check the script has not table name......................................Failed
- hook id: check-script-has-no-table-name
- exit code: 1

models/example.sql: does not use source() or ref() macros for tables:
- order_date

To Reproduce
Based on the jaffle shop dbt example, create a model with the following content:

SELECT
    *,
    EXTRACT(YEAR FROM ORDER_DATE) as ORDER_YEAR
FROM source('jaffle_shop', 'orders')

Expected behavior
Using an EXTRACT date function should not make the check-script-has-no-table-name fail.

Version:
v1.0.0

Additional context
Link to EXTRACT function documentation for different warehouses:

Jinja comments cause check-script-has-no-table-name to hang

Describe the bug
Changes to dbt models containing Jinja comments {# #} cause the check-script-has-no-table-name hook to hang.

To Reproduce
Steps to reproduce the behavior:

  1. Add a Jinja comment {# #} to a dbt model
  2. Run the check-script-has-no-table-name hook
-- my_model.sql
{# adding this comment will cause the hook to hang #}

Expected behavior
I expected the hook to run normally, completing within a few seconds.

Version:
v0.1.0

Additional context
We noticed this issue when our GitHub action began to time out on certain pull requests, and confirmed that replacing Jinja comment syntax {# #} with -- resolved it. The problem seems similar to #46, but I don't know regex well enough to pinpoint the change needed.

Feature request: Add a check-exposure-has-models

Describe the feature you'd like
Add a hook that checks that each exposure depends on at least one model

Additional context
We're a team of developers who are interested in contributing to your tool. We have already implemented the hook on a fork of your project. I would like to open a PR referencing our work so that you can review and integrate it.

1.0.0 release is almost a year old

Seems like the latest release is over 11 months old and doesn't include many of the newer tests, such as check-model-name-contract, resulting in

[ERROR] `check-model-name-contract` is not present in repository https://github.com/offbi/pre-commit-dbt.  Typo? Perhaps it is introduced in a newer version?

The current fix is to pin to main, which pre-commit only sort of supports:

The 'rev' field of repo 'https://github.com/offbi/pre-commit-dbt' appears to be a mutable reference (moving tag / branch).  Mutable references are never updated after first install and are not supported.

Any plans for a future release?

Feature request: Add a check-exposure-has-owner hook

Describe the feature you'd like
Add a hook that checks that exposures each have at least an owner with the name field defined.

Additional context
We're a team of developers who are interested in contributing to your tool since it would help us in our day-to-day lives. We already have a set of new policies/hooks, such as check-exposure-has-owner, that we would like to add to your project. It is also worth mentioning that we were able to implement the hook in question on a fork of your project.

I am willing to open a new PR referencing our work so that you can review and maybe integrate it into your project. Also, if you'd like to know more about the remaining hooks we intend to add, we are more than willing to discuss them at your convenience.

Pass custom manifest and catalog path

How do I pass a custom manifest.json and catalog.json path to the hooks? My dbt project is in a subfolder, and I need to provide that info because the hook gives me Unable to load manifest file ([Errno 2] No such file or directory: 'target/manifest.json') errors.

Thanks!

check-script-has-no-table-name is failing when using lateral flatten

Describe the bug
See updated description of the bug.

The check-script-has-no-table-name pre-commit hook is confusing CTEs with tables, and fails with code like this:

with source as (

    select * from {{ source('stripe', 'payments') }}

),

renamed as (

    select
        id as payment_id,
        order_id,
        payment_method,

        --`amount` is currently stored in cents, so we convert it to dollars
        amount / 100 as amount

    from source

)

select * from renamed

It reports that "source" and "renamed" are tables even though they are not; from a code perspective they look the same as real table references.

I think this hook should perhaps fail only in the presence of schema.table or database.schema.table references, unless we can make it smarter by being aware of the CTEs defined in the model (see the sketch below).
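
A minimal sketch of the CTE-aware idea (the regexes and function name are illustrative, not the hook's actual implementation):

import re

# Names introduced as CTEs, e.g. "with source as (" or "renamed as ("
CTE_RE = re.compile(r"(\w+)\s+as\s*\(", re.IGNORECASE)
# Names referenced after FROM/JOIN
REF_RE = re.compile(r"\b(?:from|join)\s+(\w+)", re.IGNORECASE)

def suspicious_tables(sql: str) -> set:
    ctes = {name.lower() for name in CTE_RE.findall(sql)}
    referenced = {name.lower() for name in REF_RE.findall(sql)}
    # Flag only names that are not CTEs defined in the same script
    return referenced - ctes

For the model above, suspicious_tables would return an empty set, since source and renamed are both defined as CTEs.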

Version:
v0.1.1

Source test results in parser error when using sqlfluff

Describe the bug
When trying to run any test on a source, in this case the loader test (but it's the same for any other), I get a parser error with the following:

yaml.parser.ParserError: expected '<document start>', but found '<scalar>'
  in ".sqlfluff", line 2, column 1

I'm using sqlfluff as my linter, and it appears the source test is trying to compile this file as yaml.

To Reproduce
Steps to reproduce the behavior:

  1. Create a project using dbt v1 and install sqlfluff v0.9.*
  2. Create Source Yamls
  3. Run any source test

Expected behavior
The source test should succeed or fail, rather than error.

Version:
v1.0.0

Additional context
I've had to switch my .pre-commit-config.yaml from rev to sha to get around the issue highlighted in #26

repos:
  - repo: https://github.com/offbi/pre-commit-dbt
    sha: v1.0.0
    hooks:
#      dbt commands
      - id: dbt-deps
      - id: dbt-compile
        args: [ "--cmd-flags", "++profiles-dir", "." , "--"]
      - id: dbt-docs-generate
        args: [ "--cmd-flags", "++profiles-dir", "."]

#     Script Checks
      - id: check-script-semicolon
      - id: check-script-ref-and-source

#     Model Checks
      - id: check-model-has-properties-file


#     Source Checks
      - id: check-source-has-loader

All other checks pass except for check-source-has-loader

Add a test for nonexistent columns in documentation

Describe the feature you'd like
If a column in the properties file is not in the related model, this will not be caught by dbt, and may cause the documentation to be out of sync with the model exposed in the database. It would be super helpful to have a pre-commit check that ensures all columns in the documentation are in the model.

Additional context
In catalog.json, dbt stores all the existing columns of a model, which could help to identify which columns in the properties file are not in the model; see the sketch below.
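
A minimal sketch of such a check, assuming the catalog.json layout dbt produces (the function name and arguments are illustrative):

import json

def undocumented_columns(catalog_path: str, model_id: str, documented: set) -> set:
    # catalog.json is produced by `dbt docs generate`
    with open(catalog_path) as f:
        catalog = json.load(f)
    # Columns that actually exist in the materialized model
    actual = {c.lower() for c in catalog["nodes"][model_id]["columns"]}
    # Documented columns with no matching column in the database
    return {c for c in documented if c.lower() not in actual}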

Add integration tests

Describe the feature you'd like
I would like to add integration tests so that pre-commit-dbt is tested whenever a new version of dbt is released.

check-script-has-no-table-name fails with Jinja-generated CTEs

Describe the bug
check-script-has-no-table-name fails with Jinja-generated CTEs

To Reproduce
Generate a series of CTEs through a Jinja loop such as:

{% set positions = ['left', 'right', 'center'] %}

with
{% for p in positions %}
    
    cte_{{ p }} as (
        select '{{ p }}' as position
    ),

{% endfor %}

unioned as (
    select * from cte_left
    union all
    select * from cte_right
    union all
    select * from cte_center
)

select * from unioned

The hook will complain that cte_left, cte_right and cte_center do not use source() or ref() macros for tables.

Expected behavior
I assume that this hook should inspect the compiled version rather than the raw version?

Version:
v0.1.1 bleeding edge

Missing dbt_project.yml file

When using the following pre commit config file

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: ea9c6bafbca375250baa9c21b8ddf9207d9c0160
  hooks:
  - id: dbt-run

I get the following response

Executing cmd: `dbt run -m customers`
13:12:46  Encountered an error:
Runtime Error
  fatal: Not a dbt project (or any of the parent directories). Missing dbt_project.yml file

If I run dbt run -m customers on the command line in the same directory (dbt/), it works fine, but when using the dbt-run hook it seems to run the command in another directory?

Add custom Github Action

Describe the feature you'd like
The default pre-commit GitHub Action does not have the dbt dependency inside. It would be great to create a custom action able to run dbt commands.

Use environment variables like $PWD

Give hooks access to environment variables like:

  • PWD
  • DBT_PROFILES_DIR
  • DBT_PROJECT_DIR
  • DBT_ARTIFACT_STATE_PATH

I am not saying that all environment variables need to be supported, but those listed above are good candidates. Alternatively, we could use plain Python to turn file paths into absolute ones, but I'm not sure if there are edge cases that might suffer as a result.

The reason for the above idea is that I tried running this hook:

  - repo: https://github.com/offbi/pre-commit-dbt
    rev: v1.0.0
    hooks:
      - id: dbt-compile
        files: "^dbt/.*(sql|yaml|yml)$"
        pass_filenames: false
        args:
          - "--cmd-flags"
          - "++project-dir"
          - "./dbt"
          - "++state"
          - "./dbt/target/HEAD"
          - "--models"
          - "state:modified"

dbt compile always fails, saying it cannot find a comparison manifest.
So I tried to run the same command in a regular shell rather than through pre-commit (dbt compile -m state:modified --project-dir ./dbt --state=dbt/target/HEAD) and I still get the same error.
After some trial and error, it looks like --state only works if I give it an absolute path.

I find that having to specify an absolute path is a real bummer because the repo will probably be cloned to a different directory on each machine.

What I am hoping to see is that I can either get rid of the --state arg and use DBT_ARTIFACT_STATE_PATH, or expand PWD in --state=$PWD/dbt/target/HEAD (see the sketch below).
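
A minimal sketch of the plain-Python expansion mentioned above (the function name is illustrative):

import os

def expand(path: str) -> str:
    # Expand $PWD, $DBT_ARTIFACT_STATE_PATH, etc., then normalize to an absolute path
    return os.path.abspath(os.path.expandvars(path))

With this, --state=$PWD/dbt/target/HEAD would resolve to an absolute path on each machine.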

I am happy to tackle this myself, but I would like to make sure first that this idea won't get pushback when I make the PR (assuming my implementation is reasonable, of course).

pre-commit already seems to have some mechanisms to pass environment variables. I found an issue where a few examples of what the solution might look like were referenced (I believe this already works, it is just not documented well, or at all).

check-column-desc-are-same fails with a Python error

Describe the bug
When trying to use the check-column-desc-are-same hook, I'm getting the following Python error. Other hooks I've tried so far are working:

Check column descriptions are same.......................................Failed
- hook id: check-column-desc-are-same
- exit code: 1

Traceback (most recent call last):
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/bin/check-column-desc-are-same", line 8, in <module>
    sys.exit(main())
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/lib/python3.7/site-packages/pre_commit_dbt/check_column_desc_are_same.py", line 80, in main
    return check_column_desc(paths=args.filenames, ignore=args.ignore)
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/lib/python3.7/site-packages/pre_commit_dbt/check_column_desc_are_same.py", line 55, in check_column_desc
    grouped = get_grouped(paths, ignore)
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/lib/python3.7/site-packages/pre_commit_dbt/check_column_desc_are_same.py", line 48, in get_grouped
    sorted(columns, key=lambda x: x.column_name), lambda x: x.column_name
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/lib/python3.7/site-packages/pre_commit_dbt/check_column_desc_are_same.py", line 29, in get_all_columns
    for item in schemas:
  File "/Users/Martin/.cache/pre-commit/repoewj3j5ob/py_env-python3.7/lib/python3.7/site-packages/pre_commit_dbt/utils.py", line 134, in get_model_schemas
    model_name = model.get("name")
AttributeError: 'str' object has no attribute 'get'

Version:
v0.1.1
Python 3.7.9

check-source-childs is not available and seems to be broken in main

Describe the bug
check-source-childs is not available in v1.0.0 and also seems to be broken in main.

To Reproduce
Steps to reproduce the behavior:

  1. Configure in .pre-commit-config.yaml
[...]
      - id: check-source-childs
        args: ["--min-child-cnt", "1", "--"]
  2. Run pre-commit:
   pre-commit run --all-files
[ERROR] `check-source-childs` is not present in repository https://github.com/offbi/pre-commit-dbt.  Typo? Perhaps it is introduced in a newer version?  Often `pre-commit autoupdate` fixes this.
  3. Run autoupdate:
   pre-commit autoupdate
Updating https://github.com/offbi/pre-commit-dbt ... Cannot update because the update target is missing these hooks:
check-source-childs
Updating https://github.com/sqlfluff/sqlfluff ... already up to date.
  4. Fork and tag, then update the .pre-commit-config.yaml to point to the fork's latest tag
  5. Run pre-commit again
[...]
  File "[...]/.dots/cache/pre-commit/repop3elpxm3/py_env-python3.9/lib/python3.9/site-packages/pre_commit_dbt/utils.py", line 209, in get_source_schemas
    for source in schema.get("sources", []):
AttributeError: 'str' object has no attribute 'get'

Expected behavior
check-source-childs hook to be available and working in mainstream version

Version:
v1.0.0

check-model-has-description fails with macros with description

Describe the bug
I get the following error when running the pre-commit hook check-model-has-description:

Check the model has description...............................................Failed
- hook id: check-model-has-description
- exit code: 1

macros/generate_schema_name.sql: does not have defined description or properties file is missing.

my .pre-commit-config.yaml file looks like:

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: 607cb07a1918442f5963662a9aa19da8984931e6
  hooks:
    - id: check-model-has-description

the macros yaml file (macros/macros.yml) has all the descriptions:

version: 2

macros:
  - name: generate_schema_name
    description: >
      A macro to generate the custom schema names based on the environment:

      * In `prod`:
        - If a custom schema is provided, a model's schema name should match the custom schema, rather than being concatenated to the target schema.
        - If no custom schema is provided, a model's schema name should match the target schema.

      * In other environments (e.g. `dev` or `ci`):
        - Build all models in the target schema, as in, ignore custom schema configurations.
    arguments:
      - name: custom_schema_name
        type: string
        description: The configured value of schema in the specified node, or none if a value is not supplied
      - name: node
        type: string
        description: The node that is currently being processed by dbt

and the manifest.json has the macro's description as well:

"macro.dwh.generate_schema_name": {
      "unique_id": "macro.dwh.generate_schema_name",
      "package_name": "dwh",
      "root_path": "/path/to/dbt/project/dwh",
      "path": "macros/generate_schema_name.sql",
      "original_file_path": "macros/generate_schema_name.sql",
      "name": "generate_schema_name",
      "macro_sql": "{% macro generate_schema_name(custom_schema_name, node) -%}\n    {{ generate_schema_name_for_env(custom_schema_name, node) }}\n{%- endmacro %}",
      "resource_type": "macro",
      "tags": [],
      "depends_on": {
        "macros": [
          "macro.dbt.generate_schema_name_for_env"
        ]
      },
      "description": "A macro to generate the custom schema names based on the environment:\n* In `prod`:\n  - If a custom schema is provided, a model's schema name should match the custom schema, rather than being concatenated to the target schema.\n  - If no custom schema is provided, a model's schema name should match the target schema.\n\n* In other environments (e.g. `dev` or `ci`):\n  - Build all models in the target schema, as in, ignore custom schema configurations.\n",
      "meta": {},
      "docs": {
        "show": true
      },
      "patch_path": "macros/macros.yml",
      "arguments": [
        {
          "name": "custom_schema_name",
          "type": "string",
          "description": "The configured value of schema in the specified node, or none if a value is not supplied"
        },
        {
          "name": "node",
          "type": "string",
          "description": "The node that is currently being processed by dbt"
        }
      ]
    }

Version:
pre-commit-dbt 607cb07
dbt v0.19.0
python 3.8.6

check-model-has-properties-file fails on macro with a valid properties yml

Describe the bug
When running the test check-model-has-properties-file with a macro, the test fails with the following error.

Check the model has properties file......................................Failed
- hook id: check-model-has-properties-file
- exit code: 1

macros/grant_select_on_schemas.sql: does not have model properties defined in any .yml file.

The .pre-commit-config.yaml includes the rule:

repos:
- repo: https://github.com/offbi/pre-commit-dbt
  rev: 607cb07a1918442f5963662a9aa19da8984931e6
  hooks:
  - id: check-model-has-properties-file

And the macro has the following .yml file (the filename is the same as the macro name and is stored within the macros folder):


macros:
  - name: grant_select_on_schemas
    description: "Grants privileges to groups after dbt run"
    docs:
      show: false

Hope we can get this fixed soon, as this is a really useful test to include.

check-script-has-no-table fails with missing whitespaces

Describe the bug
check-script-has-no-table fails when the source or ref macro calls have no preceding whitespace; e.g. {{source('schema','table')}} will cause the hook to fail.
Given that the dbt docs do not seem to favor one style over the other, I think both versions (with and without whitespace) should be supported.

To Reproduce

import check_script_has_no_table_name

script = """
{{
    config(
        sort=['country_id'],
        alias='countries'
    )
}}

SELECT

    /* IDS*/
    id AS country_id,
   
    /* DIMENSIONS */
    description AS country

FROM {{ source('data', 'country') }}
"""

status, tables = check_script_has_no_table_name.has_table_name(script, '')
print(status)

Expected behavior
The status should be 0

Version:
v0.1.0

Additional context
Will open a PR in a bit.

Replace check_script_semicolon

Describe the bug
https://github.com/offbi/pre-commit-dbt/blob/main/pre_commit_dbt/check_script_semicolon.py

  1. In check_script_semicolon.py, the function check_semicolon has a parameter replace: bool = False, but the main function does not pass it when calling the function, nor does it seem to parse an argument for it. Thus the script is unable to replace the semicolon at the end of the file.
  2. Comments at the end of the file are not accounted for: if the SQL ends with a semicolon followed by a comment, the semicolon is ignored and the test passes.

To Reproduce
Run check_script_semicolon.py

Expected behavior

  1. Replace the semicolons when the argument replace=True is passed
  2. Recognise semicolons at the end of the SQL and not just at the end of the file

Version:
v0.1.0

Snapshots trigger check-model-has-properties-file

Describe the bug
Snapshot tables located in the /snapshots folder trigger the check-model-has-properties-file check, even with a property file defined per dbt documentation. Changing the "snapshots" header to "model" does not change the outcome.

To Reproduce
Steps to reproduce the behavior:

  1. Create a snapshot SQL file.
{% snapshot <snapshot name> %}

{{
    config(
        target_database=<snapshot db>,
        target_schema=<snapshot schema>,
        unique_key=<snapshot key>,

        strategy='timestamp',
        updated_at=<snapshot updated_at>,
        invalidate_hard_deletes=True,
    )
}}

select * from {{ ref(<snapshot model>) }}

{% endsnapshot %}
  2. Create a snapshot properties YAML file.
version: 2

snapshots:
  - name: <snapshot name>
    description: <markdown_string>
    config:
      target_schema: <snapshot schema>
      target_database: <snapshot db>
      unique_key: <snapshot key>
      strategy: timestamp
      updated_at: <snapshot updated_at>

Expected behavior
The check-model-has-properties-file check should pass.

Version:
v0.1.0

Feature request: check-model-has-at-least-meta-keys

Hi! I just found this library today and it looks so promising! We plan to start using it quite soon at my team. Thanks for the work!

Describe the feature you'd like
We would like a check to ensure that our models have at least a set of meta keys.

Additional context
We have a data mesh architecture, and there are certain meta keys that are required for all our models, but we also allow individual teams to define their own meta keys. The current check-source-has-meta-keys requires that the model has exactly the same keys as defined in the check, but we need something that checks it has at least those keys while also allowing others.

I must also admit that when I first tried check-source-has-meta-keys, it was not that clear to me from its documentation that it didn't allow extra keys.

Implementation
I'm willing to open a PR to implement this feature and I would love to hear your thoughts on it. What would you prefer?

1) Add an extra argument to check-source-has-meta-keys to allow extra keys

Keep the same behavior as today, and add an extra --allow-extra-keys argument. No breaking change.

2) Create a new hook check-model-has-at-least-meta-keys

Instead of having an extra argument, have a completely new check. I haven't checked the whole code base. Do you allow one check to import functions from another check? Or would you rather have me move similar functions to utils or another common file?

Or some other idea?
