
airflow-exporter's Introduction

Airflow prometheus exporter

Exposes dag and task based metrics from Airflow to a Prometheus compatible endpoint.

Compatibility with Airflow versions

>=2.0

Current version is compatible with Airflow 2.0+

<=1.10.14, >=1.10.3

Version v1.3.2 is compatible

Note for users of Airflow 1.10.14 with Python 3.8

You should install the importlib-metadata package in order for the plugin to be loaded. See #85 for details.

<1.10.3

Version v0.5.4 is compatible

Install

pip install airflow-exporter

That's it. You're done.

Exporting extra labels to Prometheus

It is possible to add extra labels to DAG-related metrics by providing a labels dict in the DAG params.

Example

from datetime import timedelta

from airflow import DAG

# default_args is assumed to be defined elsewhere
dag = DAG(
    'dummy_dag',
    schedule_interval=timedelta(hours=5),
    default_args=default_args,
    catchup=False,
    params={
        'labels': {
            'env': 'test'
        }
    }
)

A label env with value test will be added to all metrics related to dummy_dag:

airflow_dag_status{dag_id="dummy_dag",env="test",owner="owner",status="running"} 12.0
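For illustration, a minimal sketch (not the plugin's actual code) of how per-DAG labels from params could be merged into a metric's base label set; the merge_dag_labels helper and the base label dict are hypothetical:

```python
def merge_dag_labels(base_labels, dag_params):
    """Merge extra labels from a DAG's params into the base metric labels.

    Hypothetical helper; the 'labels' key layout follows the example above.
    """
    extra = dag_params.get('labels') or {}
    if not isinstance(extra, dict):
        return dict(base_labels)  # ignore malformed values defensively
    merged = dict(base_labels)
    merged.update(extra)
    return merged

labels = merge_dag_labels(
    {'dag_id': 'dummy_dag', 'owner': 'owner', 'status': 'running'},
    {'labels': {'env': 'test'}},
)
# labels now carries env="test" in addition to the base labels
```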

Metrics

Metrics will be available at

http://<your_airflow_host_and_port>/admin/metrics/

airflow_task_status

Labels:

  • dag_id
  • task_id
  • owner
  • status

Value: number of tasks in a specific status.

airflow_dag_status

Labels:

  • dag_id
  • owner
  • status

Value: number of dags in a specific status.

airflow_dag_run_duration

Labels:

  • dag_id: unique identifier for a given DAG

Value: duration in seconds of the longest DAG Run for the given DAG. This metric is not available for DAGs that have already finished.
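A rough sketch of the computation implied here: take the earliest start time among the DAG's still-running runs and measure against now. The data shapes and the helper name are hypothetical, not the plugin's query:

```python
from datetime import datetime, timezone

def longest_run_duration_seconds(running_start_times, now=None):
    """Duration in seconds of the longest currently running DAG Run.

    running_start_times: start datetimes of runs in the 'running' state.
    Returns None when the DAG has no active runs (metric not reported).
    """
    if not running_start_times:
        return None
    now = now or datetime.now(timezone.utc)
    return (now - min(running_start_times)).total_seconds()

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
starts = [
    datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc),
    datetime(2024, 1, 1, 11, 30, tzinfo=timezone.utc),
]
duration = longest_run_duration_seconds(starts, now=now)  # 3600.0
```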

airflow_dag_last_status

Labels:

  • dag_id
  • owner
  • status

Value: 0 or 1 depending on whether the current state of each dag_id is status.
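Conceptually, the encoding can be sketched like this (hypothetical helper; the real metric is emitted per dag_id/owner/status combination):

```python
KNOWN_STATUSES = ('success', 'running', 'failed')

def last_status_gauges(current_state, statuses=KNOWN_STATUSES):
    """Return one 0/1 gauge per status; exactly one of them is 1."""
    return {status: 1 if current_state == status else 0 for status in statuses}

gauges = last_status_gauges('failed')
# {'success': 0, 'running': 0, 'failed': 1}
```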

License

Distributed under the BSD license. See LICENSE for more information.

airflow-exporter's People

Contributors

avdushkin, cansjt, clevercat, dimon222, ebartels, elephantum, hydrosquall, jmcarp, maxbrunet, msumit, nvn01234, phanindhra876, ryan-carlson, samarius, sawaca96, slash-cyberpunk, sockeye44, szyn

airflow-exporter's Issues

No module named prometheus_exporter.db.store

I was setting up the exporter as per the README. I got the following error while executing airflow scheduler:

osboxes@osboxes:~$ airflow webserver -p 8080
[2018-08-28 07:11:24,962] {__init__.py:57} INFO - Using executor SequentialExecutor
[2018-08-28 07:11:25,259] {driver.py:123} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
[2018-08-28 07:11:25,301] {driver.py:123} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[2018-08-28 07:11:25,690] {plugins_manager.py:90} ERROR - No module named prometheus_exporter.db.store
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/airflow/plugins_manager.py", line 79, in
m = imp.load_source(namespace, filepath)
File "/home/osboxes/airflow/plugins/prometheus-exporter/prometheus_exporter.py", line 8, in
from prometheus_exporter.db.store import get_context
ImportError: No module named prometheus_exporter.db.store
[2018-08-28 07:11:25,692] {plugins_manager.py:91} ERROR - Failed to import plugin /home/osboxes/airflow/plugins/prometheus-exporter/prometheus_exporter.py
[2018-08-28 07:11:25,728] {plugins_manager.py:90} ERROR - invalid syntax (store.py, line 32)
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/airflow/plugins_manager.py", line 79, in
m = imp.load_source(namespace, filepath)
File "/home/osboxes/airflow/plugins/prometheus-exporter/db/store.py", line 32
return function(*args, **kwargs, connection=session)
^
SyntaxError: invalid syntax
[2018-08-28 07:11:25,728] {plugins_manager.py:91} ERROR - Failed to import plugin /home/osboxes/airflow/plugins/prometheus-exporter/db/store.py


But on checking the contents of airflow-exporter folder found:

osboxes@osboxes:~/airflow/plugins/prometheus-exporter$ ls
db LICENSE prometheus_exporter.py prometheus_exporter.pyc README.md
osboxes@osboxes:~/airflow/plugins/prometheus-exporter$ cd db/
osboxes@osboxes:~/airflow/plugins/prometheus-exporter/db$ ls
__init__.py __init__.pyc store.py

Not sure if this is an error on my side or not. I am a newbie to Airflow. Any pointers on how to debug, or where else to seek help, would be very helpful.

Unsafe code: label «run_id» of «airflow_dag_run_duration» has unlimited cardinality

The label and metric naming guide states the following about labels:

CAUTION: Remember that every unique combination of key-value label pairs represents a new time series, which can dramatically increase the amount of data stored. Do not use labels to store dimensions with high cardinality (many different label values), such as user IDs, email addresses, or other unbounded sets of values.

Given that a typical run_id includes the date of the run, it generates a new time series for each individual DAG run, which might eventually blow up Prometheus.
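The effect is easy to demonstrate: each distinct combination of label values is its own time series, so a run_id label mints a new series on every run while bounded labels do not (illustrative sketch, not the exporter's code):

```python
def series_count(label_sets):
    """Count distinct time series: one per unique label combination."""
    return len({tuple(sorted(labels.items())) for labels in label_sets})

# Bounded label: 1000 scrapes of the same dag/status stay at one series.
bounded = [{'dag_id': 'etl', 'status': 'success'} for _ in range(1000)]

# Unbounded label: every run_id creates a brand-new series.
unbounded = [
    {'dag_id': 'etl', 'run_id': f'scheduled__2024-01-{day:02d}'}
    for day in range(1, 31)
]
```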

bug: plugin gets loaded multiple times

I added logging to the module. Then I realized that the plugin is being loaded on every task.

The issue is that exporter.py should include only classes and functions; __init__.py should then use those classes and functions, and that would happen only once. Currently, it is not implemented like that.

Exporter fails with MySQL: get_dag_duration_info

Airflow 2.0.1 with MySQL db
airflow exporter 1.5.0

Python version: 3.8.6
Airflow version: 2.0.1
Node: redacted
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 305, in list
    return Response(generate_latest(), mimetype='text')
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/prometheus_client/exposition.py", line 177, in generate_latest
    for metric in registry.collect():
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/prometheus_client/registry.py", line 83, in collect
    for metric in collector.collect():
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 259, in collect
    for dag_duration in get_dag_duration_info():
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 147, in get_dag_duration_info
    sql_res = Session.query( # pylint: disable=no-member
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3373, in all
    return list(self)
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/orm/loading.py", line 100, in instances
    cursor.close()
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__
    compat.raise_(
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/orm/loading.py", line 82, in instances
    rows = [
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/orm/loading.py", line 83, in <listcomp>
    keyed_tuple([proc(row) for proc in process])
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/orm/loading.py", line 83, in <listcomp>
    keyed_tuple([proc(row) for proc in process])
  File "/data/home/osint/airflow/.venv/lib/python3.8/site-packages/sqlalchemy/sql/sqltypes.py", line 1968, in process
    return value - epoch
TypeError: unsupported operand type(s) for -: 'decimal.Decimal' and 'datetime.datetime'
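The traceback suggests that MySQL hands the aggregated start time back as a decimal.Decimal (epoch seconds) where SQLAlchemy expects a datetime, so the value - epoch subtraction fails. A defensive normalization sketch (hypothetical helper, not the exporter's actual fix):

```python
from datetime import datetime, timezone
from decimal import Decimal

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def to_epoch_seconds(value):
    """Coerce a DB-returned start time to float seconds since the epoch.

    MySQL may return a Decimal timestamp; other backends a datetime.
    """
    if isinstance(value, Decimal):
        return float(value)
    if isinstance(value, datetime):
        if value.tzinfo is None:
            value = value.replace(tzinfo=timezone.utc)
        return (value - EPOCH).total_seconds()
    raise TypeError(f'unexpected start-time type: {type(value)!r}')
```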

DAG not found in serialized_dag table

Step by step to reproduce the bug:

  • Step 1: Delete a DAG in code
  • Step 2: Do not delete the DAG in the UI; wait for it to disappear from the UI
  • Step 3: Go to admin/metrics; you will get the error shown in the attached screenshot

Currently I have to run a CLI command, airflow dags delete <dag_id>, to completely remove all records related to a deleted DAG.

A possible solution: ignore deleted DAGs (do not show them in metrics).

/label ~bug

airflow exporter is failing on /admin/metrics page

Ooops!
Something bad has happened.

Airflow is used by many users, and it is very likely that others had similar problems and you can easily find
a solution to your problem.

Consider following these steps:

  * gather the relevant information (detailed logs with errors, reproduction steps, details of your deployment)

  * find similar issues using:
     * GitHub Discussions
     * GitHub Issues
     * Stack Overflow
     * the usual search engine you use on a daily basis

  * if you run Airflow on a Managed Service, consider opening an issue using the service support channels

  * if you tried and have difficulty with diagnosing and fixing the problem yourself, consider creating a bug report.
    Make sure however, to include all relevant details and results of your investigation so far.

Python version: 3.8.12
Airflow version: 2.2.2
Node: airflow-web-7f59787854-87msz
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 326, in list
    return Response(generate_latest(), mimetype='text')
  File "/home/airflow/.local/lib/python3.8/site-packages/prometheus_client/exposition.py", line 176, in generate_latest
    for metric in registry.collect():
  File "/home/airflow/.local/lib/python3.8/site-packages/prometheus_client/registry.py", line 83, in collect
    for metric in collector.collect():
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 233, in collect
    labels = get_dag_labels(dag.dag_id)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 201, in get_dag_labels
    labels = labels.get('__var', {})
AttributeError: 'Param' object has no attribute 'get'
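Airflow 2.2 wraps params values in Param objects, so labels is no longer a plain dict at this point. A defensive unwrap sketch modeled on the traceback (DummyParam only mimics the Param interface for illustration; this is not the plugin's actual fix):

```python
def unwrap_labels(raw):
    """Return a labels dict whether raw is a dict or a Param-like wrapper."""
    if raw is None:
        return {}
    if hasattr(raw, 'value'):  # Param-like wrapper exposing .value
        raw = raw.value
    if isinstance(raw, dict):
        # Serialized DAGs may nest the actual dict under '__var'.
        return raw.get('__var', raw)
    return {}

class DummyParam:
    """Stand-in for Airflow's Param class, for illustration only."""
    def __init__(self, value):
        self.value = value

labels = unwrap_labels(DummyParam({'env': 'test'}))  # {'env': 'test'}
```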

Exporter fails on sql datatypes with mysql based airflow DB

When deploying the airflow exporter on my Airflow 1.10 installation, I face the following error:

webserver_1  | [2018-11-13 00:14:01,448] ERROR - unsupported operand type(s) for -: 'decimal.Decimal' and 'datetime.datetime'
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/plugins_manager.py", line 86, in <module>
webserver_1  |     m = imp.load_source(namespace, filepath)
webserver_1  |   File "/usr/local/lib/python3.6/imp.py", line 172, in load_source
webserver_1  |     module = _load(spec)
webserver_1  |   File "<frozen importlib._bootstrap>", line 684, in _load
webserver_1  |   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
webserver_1  |   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
webserver_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
webserver_1  |   File "/usr/local/airflow/plugins/airflow-exporter/prometheus_exporter.py", line 119, in <module>
webserver_1  |     REGISTRY.register(MetricsCollector())
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/prometheus_client/core.py", line 96, in register
webserver_1  |     names = self._get_names(collector)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/prometheus_client/core.py", line 136, in _get_names
webserver_1  |     for metric in desc_func():
webserver_1  |   File "/usr/local/airflow/plugins/airflow-exporter/prometheus_exporter.py", line 114, in collect
webserver_1  |     for dag in get_dag_duration_info():
webserver_1  |   File "/usr/local/airflow/plugins/airflow-exporter/prometheus_exporter.py", line 77, in get_dag_duration_info
webserver_1  |     DagRun.state == State.RUNNING
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 2703, in all
webserver_1  |     return list(self)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 90, in instances
webserver_1  |     util.raise_from_cause(err)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
webserver_1  |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 187, in reraise
webserver_1  |     raise value
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 78, in instances
webserver_1  |     for row in fetch]
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 78, in <listcomp>
webserver_1  |     for row in fetch]
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 77, in <listcomp>
webserver_1  |     rows = [keyed_tuple([proc(row) for proc in process])
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/sqltypes.py", line 1717, in process
webserver_1  |     return value - epoch
webserver_1  | TypeError: unsupported operand type(s) for -: 'decimal.Decimal' and 'datetime.datetime'
webserver_1  | [2018-11-13 00:14:01,567] ERROR - Failed to import plugin /usr/local/airflow/plugins/airflow-exporter/prometheus_exporter.py

UnknownFormatError: Unsupported content-type provided: text

Hi, I get the following error when using Datadog with the Prometheus integration.

prometheus (3.0.0)
------------------
    Instance ID: prometheus:99fffad4f445db41 [ERROR]
    Total Runs: 2
    Metric Samples: 0, Total: 0
    Events: 0, Total: 0
    Service Checks: 1, Total: 2
    Average Execution Time : 17ms
    Error: Unsupported content-type provided: text
    Traceback (most recent call last):
    File "/opt/datadog-agent/embedded/lib/python2.7/site-packages/datadog_checks/base/checks/base.py", line 366, in run
        self.check(copy.deepcopy(self.instances[0]))
    File "/opt/datadog-agent/embedded/lib/python2.7/site-packages/datadog_checks/base/checks/prometheus/base_check.py", line 108, in check
        ignore_unmapped=True
    File "/opt/datadog-agent/embedded/lib/python2.7/site-packages/datadog_checks/base/checks/prometheus/mixins.py", line 392, in process
        for metric in self.scrape_metrics(endpoint):
    File "/opt/datadog-agent/embedded/lib/python2.7/site-packages/datadog_checks/base/checks/prometheus/mixins.py", line 366, in scrape_metrics
        for metric in self.parse_metric_family(response):
    File "/opt/datadog-agent/embedded/lib/python2.7/site-packages/datadog_checks/base/checks/prometheus/mixins.py", line 235, in parse_metric_family
        response.headers['Content-Type']))
    UnknownFormatError: Unsupported content-type provided: text

The cause of the error is mimetype='text'.
https://github.com/epoch8/airflow-exporter/blob/master/prometheus_exporter.py#L125

As in prometheus/client_python, it would be better to use text/plain instead of text.
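prometheus_client exports the correct header value as CONTENT_TYPE_LATEST, so the endpoint does not need to hard-code it. A minimal sketch of the pattern (the metrics_response helper is illustrative, not the plugin's code):

```python
from prometheus_client import CONTENT_TYPE_LATEST, generate_latest

def metrics_response():
    """Body and content type for a Prometheus scrape endpoint."""
    # CONTENT_TYPE_LATEST is 'text/plain; version=0.0.4; charset=utf-8',
    # which Prometheus-compatible scrapers (including Datadog) accept.
    return generate_latest(), CONTENT_TYPE_LATEST

body, content_type = metrics_response()
```

In the plugin's Flask view this would amount to returning Response(generate_latest(), content_type=CONTENT_TYPE_LATEST) instead of mimetype='text'.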

How to get the latest status of a particular DAG?

I saw an example of using airflow-exporter metrics, but I am not quite clear on the use case behind it:
airflow_dag_status{dag_id="dummy_dag",env="test",owner="owner",status="running"} 12.0
What is needed in most cases is the status of the last run, especially when the status is other than "success", in order to build alerts and notifications on top of it. So my question is: how could I construct a PromQL query that tells me the latest status of my dummy_dag? At the end of the day I would like to build a matrix of panels in Grafana with the statuses (and other features) of all the DAGs. Thanks!
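The airflow_dag_last_status metric described above encodes exactly this: for each DAG, one status gauge is 1 and the rest are 0, so a PromQL expression like airflow_dag_last_status{status="failed"} == 1 selects DAGs whose last run failed. An offline sketch of the same selection in Python, using prometheus_client's text parser on made-up scrape output:

```python
from prometheus_client.parser import text_string_to_metric_families

# Made-up scrape output in the exporter's text format.
SCRAPE = '''\
# TYPE airflow_dag_last_status gauge
airflow_dag_last_status{dag_id="dummy_dag",owner="owner",status="success"} 0.0
airflow_dag_last_status{dag_id="dummy_dag",owner="owner",status="failed"} 1.0
'''

def last_status(scrape_text, dag_id):
    """Status whose airflow_dag_last_status gauge is 1 for the given DAG."""
    for family in text_string_to_metric_families(scrape_text):
        if family.name != 'airflow_dag_last_status':
            continue
        for sample in family.samples:
            if sample.labels.get('dag_id') == dag_id and sample.value == 1.0:
                return sample.labels['status']
    return None

status = last_status(SCRAPE, 'dummy_dag')  # 'failed'
```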

Plugin loaded multiple times in webserver

We're experimenting with the airflow-exporter and noticed that the plugin seems to be loaded more than once. Using airflow 2.0.1 and airflow-exporter 1.4.1 installed via pip. The plugin appears twice in the list under Admin | Plugins, as does the Metrics menu item. This using python 3.6.6 on Centos 7. See attached screenshot.


installation

Hi All,

Can someone let me know how to use this repo?

I cloned the code on my target machine, on which the Airflow container is present, and did pip install, but I could not get metrics, as there was no endpoint created when I ran the netstat -tulpn command.

Thanks.

Not getting metrics after pip install in container

Hi Team,

I ran pip install in my Airflow container and it reports:
Building wheel for prometheus-client (setup.py) ... done
Created wheel for prometheus-client: filename=prometheus_client-0.7.1-cp36-none-any.whl size=41406 sha256=b9f03f1d9e47ac656a9b8b790ffe6fcb76de5570f8cb070d972378424c8eefcd
Stored in directory: /root/.cache/pip/wheels/1c/54/34/fd47cd9b308826cc4292b54449c1899a30251ef3b506bc91ea
Successfully built prometheus-client
Installing collected packages: prometheus-client, airflow-exporter
Successfully installed airflow-exporter-1.1.0 prometheus-client-0.7.1

but I still cannot view metrics at
http://<your_airflow_host_and_port>/admin/metrics/

Instead, I am getting the Airflow 404 page (lots of circles).

Can someone please help us?

Also, let me know if it supports Airflow 1.10.3, or whether the Airflow version should be greater than 1.10.3.

Thanks.

PyPI source package

It looks like source packages stopped being uploaded to PyPI as of version 1.4.2.

Could this be reinstated? This is useful for building a conda package through conda skeleton.

Documentation on the current use case and some thoughts

This is really interesting - I saw your blog post linked on reddit.

What is your current Airflow setup? 1.9.0? What executor are you running?

If you aren't using the LocalExecutor, does this mean you would scrape metrics from the "web" service? Have you given any thought as to how you would scrape metrics from all components of a CeleryExecutor setup? (web, scheduler, worker pools) or the WIP KubernetesExecutor?

What are you using to visualize the metrics you report for prometheus? Grafana, WeaveScope, StackDriver, something else?

If you did have a shareable dashboard or promql for dashboards it could be a cool resource to evolve with this plugin?

We use prometheus +grafana for monitoring for most things but at the moment have only tapped mysql plugin for grafana and a custom metrics plugin for data based metrics creation. Part of the motivation for this was probably that reporting on metrics in airflow db tied to dags immediately exposed a lot more dag and task information (like execution date, SLAs, plugin specific data) and there were fewer unknowns (for example, I didn't know how simple it was to create new flask endpoints with airflow views - cool to see. I still don't know how to create endpoints on other airflow service components that don't host a flask endpoint). But seeing this does remind me there maybe is a route to simple and effective prometheus monitoring with airflow.

A couple thoughts around prometheus for airflow and this plugin.

Imo a complete prometheus monitoring solution for airflow probably has at least 2 aspects: "service" monitoring (is x airflow component running, what is its timing, has it restarted, etc.; this may differ depending on the component and your executor setup) and airflow dag/task monitoring (are dags/tasks successful, do the tasks meet SLAs, how long do they take? Task retries? What is the timing of actual task execution relative to execution_date? When are tasks running?). More or less what airflow has tried to do internally, but, you know, in prometheus.

Given what I have seen with prometheus, I think the former - "service" metrics is the most typical and strongest prometheus use case - more or less what prometheus was originally intended for. Also could be difficult to implement in Airflow (there are many executors, how would you hook a metrics endpoint into each component?).

The later - dag/task metrics - or perhaps more generally systemic metrics about the completion and timing of scheduled processes - I think is a less well documented use case, yet more intriguing if there is an elegant solution that integrates well with prometheus. Imo this is probably best (not most simply) done with a prometheus push gateway where metrics from the task (can be ephemeral) or dag scheduler gets pushed to prometheus gateway instead of scraped. Similar to recommendations for monitoring cron jobs with prometheus (e.g. https://zpjiang.me/2017/07/10/Push-based-monitoring-for-Prometheus/). I also think it would be really valuable to share and understand visualizations based on this type of prometheus metrics - and in with the specific case of Airflow + prometheus, how would you visually integrate workflow + task information.

A nitpick - I don't think your task and dag status metrics are 100% correct: multiple task instances can exist for the same task, and multiple dag runs can exist for the same dag. Depending on your airflow configuration, you may have multiple copies of the same dag / task running simultaneously, so this doesn't necessarily generalize very well. Since to my knowledge airflow doesn't natively define task and dag status, only task instance and dag run status, it might make sense to define what it means for a task or dag to be healthy and report on that.

Not compatible with 2.0.0

The current airflow exporter is not compatible with Airflow 2.0.0.

[2021-01-21 11:00:32,756] {plugins_manager.py:199} ERROR - Failed to import plugin AirflowPrometheus
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/airflow/plugins_manager.py", line 189, in load_entrypoint_plugins
    plugin_class = entry_point.load()
  File "/usr/local/lib/python3.6/dist-packages/importlib_metadata/__init__.py", line 105, in load
    module = import_module(match.group('module'))
  File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/usr/local/lib/python3.6/dist-packages/airflow_exporter/prometheus_exporter.py", line 152, in <module>
    if settings.RBAC:
AttributeError: module 'airflow.settings' has no attribute 'RBAC'

Is there a plan to support it?

Plugin broken with Airflow 1.10.14

Hey,

this plugin does not seem to be compatible with Airflow 1.10.14. As far as I can tell, they changed the way plugins are loaded: https://github.com/apache/airflow/pull/12859/files#diff-7bfb481064cd476fafe05437d31ba297c9255c13cac38d6f2d2aaeed64af7e13R160

On startup this message is generated:

[2020-12-11 14:12:29,757] {plugins_manager.py:159} ERROR - Failed to import plugin AirflowPrometheus
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/plugins_manager.py", line 150, in load_entrypoint_plugins
    plugin_obj.__usable_import_name = entry_point.module
AttributeError: 'EntryPoint' object has no attribute 'module'

container

Hi Team,

I have used pip install in my Airflow container and tried to view metrics at IP:port/admin/metrics, but I am getting the error below:

Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2446, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1951, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1820, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1949, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1935, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 69, in inner
return self._run_view(f, *args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 368, in _run_view
return fn(self, *args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/airflow_prometheus_exporter/prometheus_exporter.py", line 377, in index
return Response(generate_latest(), mimetype='text/plain')
File "/usr/local/lib/python3.6/site-packages/prometheus_client/exposition.py", line 90, in generate_latest
for metric in registry.collect():
File "/usr/local/lib/python3.6/site-packages/prometheus_client/registry.py", line 75, in collect
for metric in collector.collect():
File "/usr/local/lib/python3.6/site-packages/airflow_prometheus_exporter/prometheus_exporter.py", line 285, in collect
task_duration.add_metric(
AttributeError: 'float' object has no attribute 'add_metric'

Can you please help me with this?

Thanks.

Not getting metrics after pip install in container

I have Airflow 1.10.12 and exporter 1.3.2, and I still get a 404. I saw a previously closed issue that says this was fixed in 1.10.4.

airflow@workflow-web-d8fcb7cc8-s8c55:~$ curl http://localhost:8080/admin/metrics -v

  • Expire in 0 ms for 6 (transfer 0x55eb15ac4f50)
  • Uses proxy env variable NO_PROXY == '10.240.0.1, iad-c-staging-data.adp-elasticsearch'
  • [repeated "Expire in … ms" lines trimmed]
  • Trying 127.0.0.1...
  • TCP_NODELAY set
  • Expire in 150000 ms for 3 (transfer 0x55eb15ac4f50)
  • Expire in 200 ms for 4 (transfer 0x55eb15ac4f50)
  • Connected to localhost (127.0.0.1) port 8080 (#0)

GET /admin/metrics HTTP/1.1
Host: localhost:8080
User-Agent: curl/7.64.0
Accept: */*

< HTTP/1.1 404 Not Found
< Server: gunicorn/20.0.4
< Date: Sun, 20 Dec 2020 21:31:34 GMT
< Connection: close
< Transfer-Encoding: chunked
< Content-Type: text/plain
<

  • Closing connection 0
    Apache Airflow is not at this location

Exporter fails with Sqlite DB used in development environment

When using the Sqlite DB, the following error occurs when attempting to open the metrics page.

Traceback (most recent call last):
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/app.py", line 1982, in wsgi_app
response = self.full_dispatch_request()
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/app.py", line 1614, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/app.py", line 1517, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/app.py", line 1612, in full_dispatch_request
rv = self.dispatch_request()
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask/app.py", line 1598, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask_admin/base.py", line 69, in inner
return self._run_view(f, *args, **kwargs)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/flask_admin/base.py", line 368, in _run_view
return fn(self, *args, **kwargs)
File "/Users/ryan.carlson/airflow/plugins/prometheus_exporter/prometheus_exporter.py", line 164, in index
return Response(generate_latest(), mimetype='text/plain')
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/prometheus_client/exposition.py", line 88, in generate_latest
for metric in registry.collect():
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/prometheus_client/core.py", line 147, in collect
for metric in collector.collect():
File "/Users/ryan.carlson/airflow/plugins/prometheus_exporter/prometheus_exporter.py", line 150, in collect
for dag in get_dag_duration_info():
File "/Users/ryan.carlson/airflow/plugins/prometheus_exporter/prometheus_exporter.py", line 108, in get_dag_duration_info
DagRun.state == State.RUNNING
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 2703, in all
return list(self)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 90, in instances
util.raise_from_cause(err)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 187, in reraise
raise value
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 78, in instances
for row in fetch]
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 78, in <listcomp>
for row in fetch]
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 77, in <listcomp>
rows = [keyed_tuple([proc(row) for proc in process])
File "/Users/ryan.carlson/.virtualenvs/spothero-dataflows/lib/python3.6/site-packages/sqlalchemy/sql/sqltypes.py", line 1709, in process
value = impl_processor(value)
ValueError: Couldn't parse datetime string '0' - value is not a string.
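The traceback points at a dag_run row whose datetime column holds the literal string '0', which SQLAlchemy's SQLite datetime type cannot parse. A hypothetical diagnostic for finding such rows, run here against an in-memory stand-in table (the schema is simplified; the real dag_run table has more columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (dag_id TEXT, end_date TEXT)")
conn.executemany(
    "INSERT INTO dag_run VALUES (?, ?)",
    [("good_dag", "2020-12-20 21:31:34"), ("bad_dag", "0")],
)

# Flag rows whose end_date does not look like an ISO datetime; rows like
# these are what make SQLAlchemy raise "Couldn't parse datetime string '0'".
bad = conn.execute(
    "SELECT dag_id, end_date FROM dag_run "
    "WHERE end_date IS NOT NULL AND end_date NOT LIKE '____-__-__%'"
).fetchall()
# -> [('bad_dag', '0')]
```

Running the equivalent query against the real metadata DB (and fixing or nulling the offending rows) should let the metrics page render again.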

Breaking change in Airflow 1.10.3(rc2) - removal of DagStat class

Unfortunately, a class that the exporter depends on, DagStat, has been removed in the latest version of Airflow (tested with v1.10.3rc2).

The error as reported in the UI is: "Broken plugin: [/usr/local/airflow/plugins/airflow-exporter-0.5.4/prometheus_exporter.py] cannot import name 'DagStat'"

From the changelog (https://github.com/apache/airflow/blob/master/CHANGELOG.txt), the relevant line is: [AIRFLOW-3573] Remove DagStat table (#4378). That Jira ticket is available at https://issues.apache.org/jira/browse/AIRFLOW-3573 and contains the commit log, etc.

Looking at the code, the exporter uses three fields from DagStat (id, state and count). Judging by Airflow's www/views.py (and the modification made to it for [AIRFLOW-3573], tagged [AIRFLOW-3561], commit 0cc078a86d00c0f798b2b26d828126be99af9c6c), it seems like DagRun would work instead: https://github.com/apache/airflow/blob/6970b233964ee254bbb343ed8bdc906c2f7bd974/airflow/www/views.py#L278

Do you guys have any thoughts?
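The DagRun-based replacement suggested above can be sketched as a plain GROUP BY over dag_run. Demonstrated here against an in-memory SQLite stand-in with a simplified schema (a real implementation would go through Airflow's SQLAlchemy session and the DagRun model instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (dag_id TEXT, state TEXT)")
conn.executemany("INSERT INTO dag_run VALUES (?, ?)", [
    ("dag_a", "success"), ("dag_a", "success"),
    ("dag_a", "failed"), ("dag_b", "running"),
])

# DagStat exposed (dag_id, state, count); the same triples can be
# derived directly from DagRun with an aggregate query.
rows = conn.execute(
    "SELECT dag_id, state, COUNT(*) FROM dag_run "
    "GROUP BY dag_id, state ORDER BY dag_id, state"
).fetchall()
# -> [('dag_a', 'failed', 1), ('dag_a', 'success', 2), ('dag_b', 'running', 1)]
```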

Airflow database CPU usage

After adding the plugin to our installation, the Airflow Postgres DB reported ~100% CPU usage until the plugin was removed. This made it impossible for the scheduler to schedule new tasks; Airflow basically stopped functioning.
Our Airflow instance contains around 100 DAGs, some with a few thousand DAG runs - I think re-reading all of this on every scrape was too much for the DB. Do you know any workaround for this issue?

Dag and task metrics should be initialized to zero at startup

The exporter's metrics are not initialized at startup; a series only appears once the corresponding state has actually occurred. This led to some unexpected PromQL responses when querying over ranges with missing data.

For example, a task's 'failed' state count is first set to 1 at the task's first failure; before that failure, no data exists at all for the task in the 'failed' state. A PromQL query that uses the 'increase' function to check whether the task executed at least once over a time period (based on either the 'success' or 'failed' state count increasing over that period) then responds as if neither state changed, because 'increase' extrapolates from whatever values are available when part of the range has no data.

The Prometheus documentation discusses this issue.

A potential fix for this issue is to initialize all DAG and task metrics to zero at startup.
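A minimal sketch of that fix, using plain dictionaries instead of the exporter's actual prometheus_client collectors; the DAG/task inventory and state list here are hypothetical (a real implementation would enumerate them from Airflow's metadata DB):

```python
from itertools import product

# Hypothetical inventory of DAGs, tasks, and task states.
TASK_STATES = ["success", "failed", "running"]
DAG_TASKS = {"dummy_dag": ["extract", "load"]}

def init_task_status_series():
    """Pre-create every (dag_id, task_id, status) series at 0 so that
    PromQL functions like increase() see a real 0 -> 1 transition
    instead of a series that first appears mid-window."""
    return {
        (dag_id, task_id, status): 0
        for dag_id, tasks in DAG_TASKS.items()
        for task_id, status in product(tasks, TASK_STATES)
    }

series = init_task_status_series()
# Every label combination now exists with value 0, e.g.:
assert series[("dummy_dag", "extract", "failed")] == 0
```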

Tag 0.5.4 doesn't have a setup.py

Can you please tag a 1.10.2-safe version that has a setup.py, to allow pip install, i.e.

Command

✗ pip install --verbose git+git://github.com/epoch8/[email protected]

Throws an error

Created temporary directory: /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-ephem-wheel-cache-7szgnebb
Created temporary directory: /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-tracker-h7n0vflg
Created requirements tracker '/private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-tracker-h7n0vflg'
Created temporary directory: /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-install-md5gak29
Collecting git+git://github.com/epoch8/[email protected]
  Created temporary directory: /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0
  Cloning git://github.com/epoch8/airflow-exporter.git (to revision v0.5.4) to /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0
  Running command git clone -q git://github.com/epoch8/airflow-exporter.git /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0
  Running command git show-ref v0.5.4
  b39c345a57765d768777da60ffe9ded69f0d4e3d refs/tags/v0.5.4
  Running command git rev-parse HEAD
  ebaf76348f1bf0e47c9b22b534342428da57e3bb
  Running command git checkout -q b39c345a57765d768777da60ffe9ded69f0d4e3d
  Added git+git://github.com/epoch8/[email protected] to build tracker '/private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-tracker-h7n0vflg'
    Running setup.py (path:/private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0/setup.py) egg_info for package from git+git://github.com/epoch8/[email protected]
    Running command python setup.py egg_info
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/Users/ccash3/Code/frankcash/airflow_compose/venv/lib/python3.7/tokenize.py", line 447, in open
        buffer = _builtin_open(filename, 'rb')
    FileNotFoundError: [Errno 2] No such file or directory: '/private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0/setup.py'
Cleaning up...
  Removing source in /private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-build-mvlzmuh0
Removed git+git://github.com/epoch8/[email protected] from build tracker '/private/var/folders/7v/7q7v2wtd7nz3b8g_0d4fbt980000gp/T/pip-req-tracker-h7n0vflg'

When I download the .zip from the release and run ls -la, there is no setup.py. I see that on the master branch it was introduced for v1.0, but that version doesn't support 1.10.2.

[Feature Request] Report DAG Duration

Would it be possible to report the current duration of currently active DAG runs as an additional metric?

I'm going to look into this too, but was wondering if it's something that the epoch8 team had thought about already.
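For what it's worth, a minimal sketch of what "current duration of active DAG runs" could compute, using a hypothetical in-memory map of running runs rather than Airflow's DagRun model (the real exporter would read runs in State.RUNNING from the metadata DB):

```python
from datetime import datetime, timezone

# Hypothetical view of currently running DAG runs: dag_id -> start time.
running = {"dummy_dag": datetime(2020, 12, 20, 21, 0, tzinfo=timezone.utc)}

def dag_run_durations(now):
    """Seconds each running DAG run has been active so far."""
    return {dag_id: (now - start).total_seconds()
            for dag_id, start in running.items()}

now = datetime(2020, 12, 20, 21, 30, tzinfo=timezone.utc)
print(dag_run_durations(now))  # -> {'dummy_dag': 1800.0}
```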

Add a label to indicate if DAG is paused or not

As the title suggests, it would be great to have some way to know whether a DAG is paused, especially in the airflow_dag_run_duration metric: if a particular DAG is paused mid-run, that metric becomes useless, since the run duration just balloons for that DAG.

The metric airflow_dag_status would also benefit from a paused="true"/"false" label, enabling easy filtering of DAGs in dashboards.
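A rough sketch of the idea, reading is_paused from an in-memory SQLite stand-in for Airflow's dag table (the real table has many more columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag (dag_id TEXT PRIMARY KEY, is_paused INTEGER)")
conn.executemany("INSERT INTO dag VALUES (?, ?)",
                 [("dummy_dag", 0), ("old_dag", 1)])

# Read is_paused alongside each DAG so it can be emitted as a
# paused="true"/"false" label on the existing metrics.
paused_label = {
    dag_id: "true" if paused else "false"
    for dag_id, paused in conn.execute("SELECT dag_id, is_paused FROM dag")
}
# -> {'dummy_dag': 'false', 'old_dag': 'true'}
```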

Exception thrown in prometheus exporter when trying to scrape

Stacktrace below.

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1988, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1641, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1544, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1639, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1625, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 69, in inner
    return self._run_view(f, *args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 368, in _run_view
    return fn(self, *args, **kwargs)
  File "/usr/local/airflow/plugins/prometheus_exporter/prometheus_exporter.py", line 136, in index
    return Response(generate_latest(), mimetype='text')
  File "/usr/local/airflow/plugins/prometheus_exporter/prometheus_exporter.py", line 114, in generate_latest
    for name, labels, value in metric.samples:
ValueError: too many values to unpack (expected 3)
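The "expected 3" failure is characteristic of newer prometheus_client releases, where metric samples grew extra fields (timestamp, and later exemplar) beyond the original (name, labels, value) triple, so a 3-way unpack breaks. A sketch of a version-tolerant iteration, using a stand-in namedtuple instead of the real library:

```python
from collections import namedtuple

# Stand-in for prometheus_client's Sample: newer releases carry extra
# fields beyond the original 3-tuple, which is what makes the
# 3-way unpack in the traceback above blow up.
Sample = namedtuple("Sample", ["name", "labels", "value", "timestamp", "exemplar"])
samples = [Sample("airflow_dag_status", {"dag_id": "dummy_dag"}, 12.0, None, None)]

# Version-tolerant: take only the first three fields of each sample.
for name, labels, value in (s[:3] for s in samples):
    print(name, labels, value)
```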

plugin installation on latest puckel/docker-airflow causes exceptions on startup

This is a weird issue, but I'll do my best to describe it and how I concluded the fault is in the plugin.

I am building a custom Airflow image based on puckel/docker-airflow:latest.

my Dockerfile looks like this:

FROM puckel/docker-airflow:latest

ARG AIRFLOW_HOME=/usr/local/airflow
ARG PYTHON_DEPS="boto3 prometheus_client ec2-metadata apache-airflow[s3]"

ENV PATH $AIRFLOW_HOME/utils:$PATH
ENV PYTHONPATH $AIRFLOW_HOME/utils:$PYTHONPATH

USER root

RUN apt-get update && \
    apt-get install -y git && \
    pip3 install ${PYTHON_DEPS} && \
    rm -rf /var/lib/apt/lists/*

ADD scripts/entrypoint.sh /entrypoint.sh
ADD airflow.cfg ${AIRFLOW_HOME}/airflow.cfg

RUN chown airflow. /entrypoint.sh ${AIRFLOW_HOME}/airflow.cfg && \
    chmod +x /entrypoint.sh 

# Install custom plugins
RUN mkdir -p ${AIRFLOW_HOME}/plugins && \    
    git clone https://github.com/epoch8/airflow-exporter plugins/prometheus_exporter

USER airflow
WORKDIR ${AIRFLOW_HOME} 

Lately, after building the image, running it produces the following error and causes the container to exit:

airflow@1cb23e140e81:~$ airflow worker
usage: airflow [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
   or: airflow --help [cmd1 cmd2 ...]
   or: airflow --help-commands
   or: airflow cmd --help

After some debugging with no results, I tried commenting out everything in my Dockerfile and rebuilding the image, each time uncommenting one line and executing airflow. I finally got to the last lines (the plugin installation) and behold: even when building the image with the plugin, just deleting the plugin directory inside the container fixed the issue.

This is a real issue, mainly because no informative errors even get printed. I still have no clue how or why this happens...

Sqlalchemy error with Airflow 2.0.1

I just installed airflow-exporter on my Airflow setup on Kubernetes:

airflow@airflow-uat-test2-web-6db768d5d5-rwjzd:/opt/airflow$ airflow version
2.0.1
airflow@airflow-uat-test2-web-6db768d5d5-rwjzd:/opt/airflow$ pip freeze | grep airflow
airflow-exporter==1.5.1
apache-airflow @ file:///opt/airflow
apache-airflow-providers-apache-cassandra==1.0.1
apache-airflow-providers-apache-hdfs==1.0.1

I'm using MySQL 5.7 (Percona Server).

When I hit /admin/metrics/ :

Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.

Python version: 3.8.10
Airflow version: 2.0.1
Node: airflow-uat-test2-web-6db768d5d5-rwjzd
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
    self.dialect.do_execute(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 255, in execute
    self.errorhandler(self, exc, value)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
    raise errorvalue
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 252, in execute
    res = self._query(query)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 378, in _query
    db.query(q)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query
    _mysql.connection.query(self, query)
_mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '(PARTITION BY dag_run.dag_id ORDER BY dag_run.execution_date DESC) AS `row_numbe' at line 2")

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 306, in list
    return Response(generate_latest(), mimetype='text')
  File "/home/airflow/.local/lib/python3.8/site-packages/prometheus_client/exposition.py", line 106, in generate_latest
    for metric in registry.collect():
  File "/home/airflow/.local/lib/python3.8/site-packages/prometheus_client/registry.py", line 82, in collect
    for metric in collector.collect():
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 229, in collect
    last_dagrun_info = get_last_dagrun_info()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow_exporter/prometheus_exporter.py", line 72, in get_last_dagrun_info
    sql_res = Session.query(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3373, in all
    return list(self)
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
    return self._execute_and_instances(context)
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3560, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
    self._handle_dbapi_exception(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
    util.raise_(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
    self.dialect.do_execute(
  File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 255, in execute
    self.errorhandler(self, exc, value)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
    raise errorvalue
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 252, in execute
    res = self._query(query)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 378, in _query
    db.query(q)
  File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query
    _mysql.connection.query(self, query)
sqlalchemy.exc.ProgrammingError: (_mysql_exceptions.ProgrammingError) (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '(PARTITION BY dag_run.dag_id ORDER BY dag_run.execution_date DESC) AS `row_numbe' at line 2")
[SQL: SELECT anon_1.dag_id AS anon_1_dag_id, anon_1.state AS anon_1_state, anon_1.`row_number` AS anon_1_row_number, dag.owners AS dag_owners 
FROM (SELECT dag_run.dag_id AS dag_id, dag_run.state AS state, row_number() OVER (PARTITION BY dag_run.dag_id ORDER BY dag_run.execution_date DESC) AS `row_number` 
FROM dag_run) AS anon_1 INNER JOIN dag ON dag.dag_id = anon_1.dag_id 
WHERE anon_1.`row_number` = %s]
[parameters: (1,)]
(Background on this error at: http://sqlalche.me/e/13/f405)
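row_number() OVER (...) is a window function, and MySQL supports window functions only from 8.0, so MySQL 5.7 rejects the generated SQL with exactly this 1064 syntax error. A rough sketch of a window-function-free "latest run per DAG" query, demonstrated against an in-memory SQLite stand-in with a simplified schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (dag_id TEXT, state TEXT, execution_date TEXT)")
conn.executemany("INSERT INTO dag_run VALUES (?, ?, ?)", [
    ("dag_a", "success", "2021-01-01"),
    ("dag_a", "failed",  "2021-01-02"),
    ("dag_b", "running", "2021-01-01"),
])

# Correlated-subquery equivalent of
#   row_number() OVER (PARTITION BY dag_id ORDER BY execution_date DESC) = 1
# which MySQL 5.7 can execute.
rows = conn.execute("""
    SELECT dr.dag_id, dr.state
    FROM dag_run AS dr
    WHERE dr.execution_date = (
        SELECT MAX(execution_date) FROM dag_run WHERE dag_id = dr.dag_id
    )
    ORDER BY dr.dag_id
""").fetchall()
# -> [('dag_a', 'failed'), ('dag_b', 'running')]
```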

airflow v2.1.3 webserver failing to start after installing latest airflow-exporter

After installing airflow-exporter, pip's resolver shifted the Airflow version from 2.1.3 down to below 2.1.0.

+ kubectl logs airflow-web-85bc665c56-5jjjr -n processing -c install-pip-packages
Requirement already satisfied: boto3 in /home/airflow/.local/lib/python3.8/site-packages (1.17.112)
Requirement already satisfied: kubernetes in /home/airflow/.local/lib/python3.8/site-packages (11.0.0)
Collecting airflow-kubernetes-job-operator
  Downloading airflow_kubernetes_job_operator-2.0.3-py2.py3-none-any.whl (47 kB)
Collecting SQLAlchemy==1.3.23
  Downloading SQLAlchemy-1.3.23-cp38-cp38-manylinux2010_x86_64.whl (1.3 MB)
Collecting airflow-exporter
  Downloading airflow_exporter-1.5.2-py2.py3-none-any.whl (6.3 kB)
Collecting authlib
  Downloading Authlib-0.15.5-py2.py3-none-any.whl (203 kB)
Collecting Flask-AppBuilder==3.1.1
  Downloading Flask_AppBuilder-3.1.1-py3-none-any.whl (1.7 MB)
Requirement already satisfied: python-dateutil<3,>=2.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (2.8.2)
Requirement already satisfied: Flask-SQLAlchemy<3,>=2.4 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (2.5.1)
Requirement already satisfied: Flask-Login<0.5,>=0.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.4.1)
Requirement already satisfied: marshmallow-enum<2,>=1.5.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.5.1)
Requirement already satisfied: marshmallow-sqlalchemy<0.24.0,>=0.22.0 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.23.1)
Requirement already satisfied: PyJWT>=1.7.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.7.1)
Requirement already satisfied: sqlalchemy-utils<1,>=0.32.21 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.37.8)
Requirement already satisfied: Flask-JWT-Extended<4,>=3.18 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.25.1)
Requirement already satisfied: Flask-OpenID<2,>=1.2.5 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.3.0)
Requirement already satisfied: Flask-Babel<2,>=1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.0.0)
Requirement already satisfied: prison<1.0.0,>=0.1.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.1.3)
Requirement already satisfied: colorama<1,>=0.3.9 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.4.4)
Requirement already satisfied: apispec[yaml]<4,>=3.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.3.2)
Requirement already satisfied: marshmallow<4,>=3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.13.0)
Requirement already satisfied: Flask<2,>=0.12 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.1.4)
Requirement already satisfied: email-validator<2,>=1.0.5 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.1.3)
Requirement already satisfied: click<8,>=6.7 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (7.1.2)
Requirement already satisfied: jsonschema<4,>=3.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.2.0)
Requirement already satisfied: Flask-WTF<0.15.0,>=0.14.2 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.14.3)
Requirement already satisfied: s3transfer<0.5.0,>=0.4.0 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (0.4.2)
Requirement already satisfied: botocore<1.21.0,>=1.20.112 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (1.20.112)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (0.10.0)
Requirement already satisfied: urllib3>=1.24.2 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.26.6)
Requirement already satisfied: setuptools>=21.0.0 in /usr/local/lib/python3.8/site-packages (from kubernetes) (57.5.0)
Requirement already satisfied: pyyaml>=3.12 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (5.4.1)
Requirement already satisfied: six>=1.9.0 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.16.0)
Requirement already satisfied: websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.2.1)
Requirement already satisfied: requests-oauthlib in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.3.0)
Requirement already satisfied: google-auth>=1.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.35.0)
Requirement already satisfied: certifi>=14.05.14 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (2020.12.5)
Requirement already satisfied: requests in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (2.26.0)
Collecting zthreading>=0.1.13
  Downloading zthreading-0.1.17-py2.py3-none-any.whl (21 kB)
Requirement already satisfied: apache-airflow>=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from airflow-exporter) (2.1.3)
Requirement already satisfied: prometheus-client>=0.4.2 in /home/airflow/.local/lib/python3.8/site-packages (from airflow-exporter) (0.8.0)
Requirement already satisfied: cryptography in /home/airflow/.local/lib/python3.8/site-packages (from authlib) (3.4.7)
Requirement already satisfied: markdown<4.0,>=2.5.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (3.3.4)
Requirement already satisfied: apache-airflow-providers-sqlite in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.0.0)
Requirement already satisfied: colorlog>=4.0.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (4.8.0)
Requirement already satisfied: psutil<6.0.0,>=4.2.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (5.8.0)
Requirement already satisfied: pandas<2.0,>=0.17.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.3.2)
Requirement already satisfied: argcomplete~=1.10 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.12.3)
Requirement already satisfied: pendulum~=2.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.1.2)
Requirement already satisfied: tabulate<0.9,>=0.7.5 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.8.9)
Requirement already satisfied: swagger-ui-bundle>=0.0.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.0.8)
Requirement already satisfied: unicodecsv>=0.14.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.14.1)
Requirement already satisfied: docutils<0.17 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.16)
Requirement already satisfied: iso8601>=0.1.12 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.1.16)
Requirement already satisfied: markupsafe<2.0,>=1.1.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.1.1)
Requirement already satisfied: flask-caching<2.0.0,>=1.5.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.10.1)
Requirement already satisfied: inflection>=0.3.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.5.1)
Requirement already satisfied: clickclick>=1.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (20.10.2)
Requirement already satisfied: importlib-resources~=1.4 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.5.0)
Requirement already satisfied: croniter<1.1,>=0.3.17 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.0.15)
Requirement already satisfied: termcolor>=1.1.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.1.0)
Requirement already satisfied: marshmallow-oneofschema>=2.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (3.0.1)
Requirement already satisfied: sqlalchemy-jsonfield~=1.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.0.0)
Requirement already satisfied: apache-airflow-providers-ftp in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.0.0)
Requirement already satisfied: python-nvd3~=0.15.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.15.0)
Requirement already satisfied: pygments<3.0,>=2.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.10.0)
Requirement already satisfied: importlib-metadata>=1.7 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (4.6.4)
Requirement already satisfied: lazy-object-proxy in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.4.3)
Requirement already satisfied: apache-airflow-providers-imap in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.0.0)
Requirement already satisfied: werkzeug>=1.0.1,~=1.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.0.1)
Requirement already satisfied: rich>=9.2.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (10.7.0)
Requirement already satisfied: alembic<2.0,>=1.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.6.5)
Requirement already satisfied: lockfile>=0.12.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.12.2)
Requirement already satisfied: setproctitle<2,>=1.1.8 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.2.2)
Requirement already satisfied: python3-openid~=3.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (3.2.0)
Collecting apache-airflow>=2.0.0
  Downloading apache_airflow-2.2.2-py3-none-any.whl (5.3 MB)
Requirement already satisfied: tenacity>=6.2.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (6.2.0)
Requirement already satisfied: packaging>=14.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (20.9)
Requirement already satisfied: wtforms<3.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.3.3)
Requirement already satisfied: gunicorn>=20.1.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (20.1.0)
  Downloading apache_airflow-2.2.1-py3-none-any.whl (5.3 MB)
  Downloading apache_airflow-2.2.0-py3-none-any.whl (5.3 MB)
  Downloading apache_airflow-2.1.4-py3-none-any.whl (5.3 MB)
  Downloading apache_airflow-2.1.2-py3-none-any.whl (5.2 MB)
  Downloading apache_airflow-2.1.1-py3-none-any.whl (5.2 MB)
  Downloading apache_airflow-2.1.0-py3-none-any.whl (5.3 MB)
Requirement already satisfied: cattrs~=1.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.5.0)
  Downloading apache_airflow-2.0.2-py3-none-any.whl (4.6 MB)
Collecting connexion[flask,swagger-ui]<3,>=2.6.0
  Downloading connexion-2.9.0-py2.py3-none-any.whl (84 kB)
Collecting gunicorn<20.0,>=19.5.0
  Downloading gunicorn-19.10.0-py2.py3-none-any.whl (113 kB)
Collecting croniter<0.4,>=0.3.17
  Downloading croniter-0.3.37-py2.py3-none-any.whl (13 kB)
Requirement already satisfied: itsdangerous>=1.1.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.1.0)
Collecting rich==9.2.0
  Downloading rich-9.2.0-py3-none-any.whl (164 kB)
Requirement already satisfied: python-slugify<5.0,>=3.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (4.0.1)
Requirement already satisfied: python-daemon>=2.2.4 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.3.0)
Collecting importlib-metadata~=1.7
  Downloading importlib_metadata-1.7.0-py2.py3-none-any.whl (31 kB)
Requirement already satisfied: graphviz>=0.12 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.17)
Requirement already satisfied: numpy in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.20.3)
Requirement already satisfied: jinja2<2.12.0,>=2.10.1 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.11.3)
Requirement already satisfied: blinker in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (1.4)
Requirement already satisfied: attrs<21.0,>=20.0 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (20.3.0)
Requirement already satisfied: apache-airflow-providers-http in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (2.0.0)
Collecting cached-property~=1.5
  Downloading cached_property-1.5.2-py2.py3-none-any.whl (7.6 kB)
Requirement already satisfied: dill<0.4,>=0.2.2 in /home/airflow/.local/lib/python3.8/site-packages (from apache-airflow>=2.0.0->airflow-exporter) (0.3.1.1)
Requirement already satisfied: typing-extensions<4.0.0,>=3.7.4 in /home/airflow/.local/lib/python3.8/site-packages (from rich==9.2.0->apache-airflow>=2.0.0->airflow-exporter) (3.7.4.3)
Requirement already satisfied: commonmark<0.10.0,>=0.9.0 in /home/airflow/.local/lib/python3.8/site-packages (from rich==9.2.0->apache-airflow>=2.0.0->airflow-exporter) (0.9.1)
Requirement already satisfied: python-editor>=0.3 in /home/airflow/.local/lib/python3.8/site-packages (from alembic<2.0,>=1.2->apache-airflow>=2.0.0->airflow-exporter) (1.0.4)
Requirement already satisfied: Mako in /home/airflow/.local/lib/python3.8/site-packages (from alembic<2.0,>=1.2->apache-airflow>=2.0.0->airflow-exporter) (1.1.4)
Requirement already satisfied: openapi-spec-validator<0.4,>=0.2.4 in /home/airflow/.local/lib/python3.8/site-packages (from connexion[flask,swagger-ui]<3,>=2.6.0->apache-airflow>=2.0.0->airflow-exporter) (0.3.1)
Collecting natsort
  Downloading natsort-8.0.0-py3-none-any.whl (37 kB)
Requirement already satisfied: cffi>=1.12 in /home/airflow/.local/lib/python3.8/site-packages (from cryptography->authlib) (1.14.6)
Requirement already satisfied: pycparser in /home/airflow/.local/lib/python3.8/site-packages (from cffi>=1.12->cryptography->authlib) (2.20)
Requirement already satisfied: dnspython>=1.15.0 in /home/airflow/.local/lib/python3.8/site-packages (from email-validator<2,>=1.0.5->Flask-AppBuilder==3.1.1) (1.16.0)
Requirement already satisfied: idna>=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from email-validator<2,>=1.0.5->Flask-AppBuilder==3.1.1) (3.2)
Requirement already satisfied: pytz in /home/airflow/.local/lib/python3.8/site-packages (from Flask-Babel<2,>=1->Flask-AppBuilder==3.1.1) (2021.1)
Requirement already satisfied: Babel>=2.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-Babel<2,>=1->Flask-AppBuilder==3.1.1) (2.9.1)
Requirement already satisfied: rsa<5,>=3.1.4 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (4.7.2)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (4.2.2)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (0.2.8)
Requirement already satisfied: zipp>=0.5 in /home/airflow/.local/lib/python3.8/site-packages (from importlib-metadata~=1.7->apache-airflow>=2.0.0->airflow-exporter) (3.5.0)
Requirement already satisfied: pyrsistent>=0.14.0 in /home/airflow/.local/lib/python3.8/site-packages (from jsonschema<4,>=3.0.1->Flask-AppBuilder==3.1.1) (0.18.0)
Requirement already satisfied: openapi-schema-validator in /home/airflow/.local/lib/python3.8/site-packages (from openapi-spec-validator<0.4,>=0.2.4->connexion[flask,swagger-ui]<3,>=2.6.0->apache-airflow>=2.0.0->airflow-exporter) (0.1.5)
Requirement already satisfied: pytzdata>=2020.1 in /home/airflow/.local/lib/python3.8/site-packages (from pendulum~=2.0->apache-airflow>=2.0.0->airflow-exporter) (2020.1)
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /home/airflow/.local/lib/python3.8/site-packages (from pyasn1-modules>=0.2.1->google-auth>=1.0.1->kubernetes) (0.4.8)
Requirement already satisfied: text-unidecode>=1.3 in /home/airflow/.local/lib/python3.8/site-packages (from python-slugify<5.0,>=3.0.0->apache-airflow>=2.0.0->airflow-exporter) (1.3)
Requirement already satisfied: defusedxml in /home/airflow/.local/lib/python3.8/site-packages (from python3-openid~=3.2->apache-airflow>=2.0.0->airflow-exporter) (0.7.1)
Requirement already satisfied: charset-normalizer~=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from requests->kubernetes) (2.0.4)
INFO: pip is looking at multiple versions of apache-airflow-providers-http to determine which version is compatible with other requirements. This could take a while.
Collecting apache-airflow-providers-http
  Downloading apache_airflow_providers_http-2.0.1-py3-none-any.whl (21 kB)
Requirement already satisfied: isodate in /home/airflow/.local/lib/python3.8/site-packages (from openapi-schema-validator->openapi-spec-validator<0.4,>=0.2.4->connexion[flask,swagger-ui]<3,>=2.6.0->apache-airflow>=2.0.0->airflow-exporter) (0.6.0)
Requirement already satisfied: oauthlib>=3.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from requests-oauthlib->kubernetes) (3.1.1)
Installing collected packages: SQLAlchemy, natsort, connexion, rich, importlib-metadata, gunicorn, Flask-AppBuilder, croniter, cached-property, apache-airflow-providers-http, zthreading, apache-airflow, authlib, airflow-kubernetes-job-operator, airflow-exporter
  Attempting uninstall: SQLAlchemy
    Found existing installation: SQLAlchemy 1.3.24
    Uninstalling SQLAlchemy-1.3.24:
      Successfully uninstalled SQLAlchemy-1.3.24
  Attempting uninstall: rich
    Found existing installation: rich 10.7.0
    Uninstalling rich-10.7.0:
      Successfully uninstalled rich-10.7.0
  Attempting uninstall: importlib-metadata
    Found existing installation: importlib-metadata 4.6.4
    Uninstalling importlib-metadata-4.6.4:
      Successfully uninstalled importlib-metadata-4.6.4
  Attempting uninstall: gunicorn
    Found existing installation: gunicorn 20.1.0
    Uninstalling gunicorn-20.1.0:
      Successfully uninstalled gunicorn-20.1.0
  Attempting uninstall: Flask-AppBuilder
    Found existing installation: Flask-AppBuilder 3.3.2
    Uninstalling Flask-AppBuilder-3.3.2:
      Successfully uninstalled Flask-AppBuilder-3.3.2
  Attempting uninstall: croniter
    Found existing installation: croniter 1.0.15
    Uninstalling croniter-1.0.15:
      Successfully uninstalled croniter-1.0.15
  Attempting uninstall: apache-airflow-providers-http
    Found existing installation: apache-airflow-providers-http 2.0.0
    Uninstalling apache-airflow-providers-http-2.0.0:
      Successfully uninstalled apache-airflow-providers-http-2.0.0
  Attempting uninstall: apache-airflow
    Found existing installation: apache-airflow 2.1.3
    Uninstalling apache-airflow-2.1.3:
      Successfully uninstalled apache-airflow-2.1.3
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
apache-airflow-providers-ssh 2.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-slack 4.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-sftp 2.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-redis 2.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-postgres 2.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-mysql 2.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-microsoft-azure 3.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-hashicorp 2.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-grpc 2.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-google 5.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-elasticsearch 2.0.2 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-docker 2.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-cncf-kubernetes 2.0.2 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-celery 2.0.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
apache-airflow-providers-amazon 2.1.0 requires apache-airflow>=2.1.0, but you have apache-airflow 2.0.2 which is incompatible.
Successfully installed Flask-AppBuilder-3.1.1 SQLAlchemy-1.3.23 airflow-exporter-1.5.2 airflow-kubernetes-job-operator-2.0.3 apache-airflow-2.0.2 apache-airflow-providers-http-2.0.1 authlib-0.15.5 cached-property-1.5.2 connexion-2.9.0 croniter-0.3.37 gunicorn-19.10.0 importlib-metadata-1.7.0 natsort-8.0.0 rich-9.2.0 zthreading-0.1.17
WARNING: You are using pip version 21.2.4; however, version 21.3.1 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
copying '/home/airflow/.local/*' to '/opt/home-airflow-local'...

For comparison, below is the log without airflow-exporter:

(base) ubuntu@dea-dev-eks:~/datakube-apps$ k logs airflow-web-58d9cc69f8-mgjt4 -n processing -c install-pip-packages
+ kubectl logs airflow-web-58d9cc69f8-mgjt4 -n processing -c install-pip-packages
Requirement already satisfied: boto3 in /home/airflow/.local/lib/python3.8/site-packages (1.17.112)
Requirement already satisfied: kubernetes in /home/airflow/.local/lib/python3.8/site-packages (11.0.0)
Collecting airflow-kubernetes-job-operator
  Downloading airflow_kubernetes_job_operator-2.0.3-py2.py3-none-any.whl (47 kB)
Collecting SQLAlchemy==1.3.23
  Downloading SQLAlchemy-1.3.23-cp38-cp38-manylinux2010_x86_64.whl (1.3 MB)
Collecting authlib
  Downloading Authlib-0.15.5-py2.py3-none-any.whl (203 kB)
Collecting Flask-AppBuilder==3.1.1
  Downloading Flask_AppBuilder-3.1.1-py3-none-any.whl (1.7 MB)
Requirement already satisfied: click<8,>=6.7 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (7.1.2)
Requirement already satisfied: Flask-JWT-Extended<4,>=3.18 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.25.1)
Requirement already satisfied: python-dateutil<3,>=2.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (2.8.2)
Requirement already satisfied: Flask-SQLAlchemy<3,>=2.4 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (2.5.1)
Requirement already satisfied: prison<1.0.0,>=0.1.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.1.3)
Requirement already satisfied: jsonschema<4,>=3.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.2.0)
Requirement already satisfied: marshmallow-sqlalchemy<0.24.0,>=0.22.0 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.23.1)
Requirement already satisfied: Flask-Babel<2,>=1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.0.0)
Requirement already satisfied: PyJWT>=1.7.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.7.1)
Requirement already satisfied: Flask<2,>=0.12 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.1.4)
Requirement already satisfied: email-validator<2,>=1.0.5 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.1.3)
Requirement already satisfied: apispec[yaml]<4,>=3.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.3.2)
Requirement already satisfied: Flask-Login<0.5,>=0.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.4.1)
Requirement already satisfied: colorama<1,>=0.3.9 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.4.4)
Requirement already satisfied: sqlalchemy-utils<1,>=0.32.21 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.37.8)
Requirement already satisfied: Flask-WTF<0.15.0,>=0.14.2 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (0.14.3)
Requirement already satisfied: marshmallow-enum<2,>=1.5.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.5.1)
Requirement already satisfied: marshmallow<4,>=3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (3.13.0)
Requirement already satisfied: Flask-OpenID<2,>=1.2.5 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-AppBuilder==3.1.1) (1.3.0)
Requirement already satisfied: botocore<1.21.0,>=1.20.112 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (1.20.112)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (0.10.0)
Requirement already satisfied: s3transfer<0.5.0,>=0.4.0 in /home/airflow/.local/lib/python3.8/site-packages (from boto3) (0.4.2)
Requirement already satisfied: urllib3>=1.24.2 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.26.6)
Requirement already satisfied: google-auth>=1.0.1 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.35.0)
Requirement already satisfied: requests in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (2.26.0)
Requirement already satisfied: six>=1.9.0 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.16.0)
Requirement already satisfied: requests-oauthlib in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.3.0)
Requirement already satisfied: certifi>=14.05.14 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (2020.12.5)
Requirement already satisfied: pyyaml>=3.12 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (5.4.1)
Requirement already satisfied: setuptools>=21.0.0 in /usr/local/lib/python3.8/site-packages (from kubernetes) (57.5.0)
Requirement already satisfied: websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0 in /home/airflow/.local/lib/python3.8/site-packages (from kubernetes) (1.2.1)
Collecting zthreading>=0.1.13
  Downloading zthreading-0.1.17-py2.py3-none-any.whl (21 kB)
Requirement already satisfied: cryptography in /home/airflow/.local/lib/python3.8/site-packages (from authlib) (3.4.7)
Requirement already satisfied: dnspython>=1.15.0 in /home/airflow/.local/lib/python3.8/site-packages (from email-validator<2,>=1.0.5->Flask-AppBuilder==3.1.1) (1.16.0)
Requirement already satisfied: idna>=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from email-validator<2,>=1.0.5->Flask-AppBuilder==3.1.1) (3.2)
Requirement already satisfied: itsdangerous<2.0,>=0.24 in /home/airflow/.local/lib/python3.8/site-packages (from Flask<2,>=0.12->Flask-AppBuilder==3.1.1) (1.1.0)
Requirement already satisfied: Jinja2<3.0,>=2.10.1 in /home/airflow/.local/lib/python3.8/site-packages (from Flask<2,>=0.12->Flask-AppBuilder==3.1.1) (2.11.3)
Requirement already satisfied: Werkzeug<2.0,>=0.15 in /home/airflow/.local/lib/python3.8/site-packages (from Flask<2,>=0.12->Flask-AppBuilder==3.1.1) (1.0.1)
Requirement already satisfied: Babel>=2.3 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-Babel<2,>=1->Flask-AppBuilder==3.1.1) (2.9.1)
Requirement already satisfied: pytz in /home/airflow/.local/lib/python3.8/site-packages (from Flask-Babel<2,>=1->Flask-AppBuilder==3.1.1) (2021.1)
Requirement already satisfied: python3-openid>=2.0 in /home/airflow/.local/lib/python3.8/site-packages (from Flask-OpenID<2,>=1.2.5->Flask-AppBuilder==3.1.1) (3.2.0)
Requirement already satisfied: WTForms in /home/airflow/.local/lib/python3.8/site-packages (from Flask-WTF<0.15.0,>=0.14.2->Flask-AppBuilder==3.1.1) (2.3.3)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (0.2.8)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (4.2.2)
Requirement already satisfied: rsa<5,>=3.1.4 in /home/airflow/.local/lib/python3.8/site-packages (from google-auth>=1.0.1->kubernetes) (4.7.2)
Requirement already satisfied: MarkupSafe>=0.23 in /home/airflow/.local/lib/python3.8/site-packages (from Jinja2<3.0,>=2.10.1->Flask<2,>=0.12->Flask-AppBuilder==3.1.1) (1.1.1)
Requirement already satisfied: pyrsistent>=0.14.0 in /home/airflow/.local/lib/python3.8/site-packages (from jsonschema<4,>=3.0.1->Flask-AppBuilder==3.1.1) (0.18.0)
Requirement already satisfied: attrs>=17.4.0 in /home/airflow/.local/lib/python3.8/site-packages (from jsonschema<4,>=3.0.1->Flask-AppBuilder==3.1.1) (20.3.0)
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /home/airflow/.local/lib/python3.8/site-packages (from pyasn1-modules>=0.2.1->google-auth>=1.0.1->kubernetes) (0.4.8)
Requirement already satisfied: defusedxml in /home/airflow/.local/lib/python3.8/site-packages (from python3-openid>=2.0->Flask-OpenID<2,>=1.2.5->Flask-AppBuilder==3.1.1) (0.7.1)
Requirement already satisfied: cffi>=1.12 in /home/airflow/.local/lib/python3.8/site-packages (from cryptography->authlib) (1.14.6)
Requirement already satisfied: pycparser in /home/airflow/.local/lib/python3.8/site-packages (from cffi>=1.12->cryptography->authlib) (2.20)
Requirement already satisfied: charset-normalizer~=2.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from requests->kubernetes) (2.0.4)
Requirement already satisfied: oauthlib>=3.0.0 in /home/airflow/.local/lib/python3.8/site-packages (from requests-oauthlib->kubernetes) (3.1.1)
Installing collected packages: SQLAlchemy, zthreading, Flask-AppBuilder, authlib, airflow-kubernetes-job-operator
  Attempting uninstall: SQLAlchemy
    Found existing installation: SQLAlchemy 1.3.24
    Uninstalling SQLAlchemy-1.3.24:
      Successfully uninstalled SQLAlchemy-1.3.24
  Attempting uninstall: Flask-AppBuilder
    Found existing installation: Flask-AppBuilder 3.3.2
    Uninstalling Flask-AppBuilder-3.3.2:
      Successfully uninstalled Flask-AppBuilder-3.3.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
apache-airflow 2.1.3 requires flask-appbuilder<4.0.0,>=3.3.2, but you have flask-appbuilder 3.1.1 which is incompatible.
Successfully installed Flask-AppBuilder-3.1.1 SQLAlchemy-1.3.23 airflow-kubernetes-job-operator-2.0.3 authlib-0.15.5 zthreading-0.1.17
WARNING: You are using pip version 21.2.4; however, version 21.3.1 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
copying '/home/airflow/.local/*' to '/opt/home-airflow-local'...

Installation Clarification

To get this to work, I had to move the prometheus_exporter files directly into the /plugins directory, because my Python path was having trouble finding the prometheus_exporter.db module. In other words, my directory looked like

$AIRFLOW_HOME/plugins/prometheus_exporter/prometheus_exporter.py
$AIRFLOW_HOME/plugins/prometheus_exporter/db/store.py
...

rather than what would happen if I followed the exact install instructions

$AIRFLOW_HOME/plugins/prometheus_exporter/prometheus_exporter/prometheus_exporter.py
$AIRFLOW_HOME/plugins/prometheus_exporter/prometheus_exporter/db/store.py
...

I wanted to check whether this was the experience of other people too, or if this is just a quirk of my particular Airflow setup.

If it's the first case, I can make a PR later to clarify the installation directions. Alternatively, I would recommend putting the plugin directly in the git project root rather than in a subfolder, similar to what is done with many other published Airflow plugins (https://github.com/airflow-plugins/salesforce_plugin, etc.)

Service level metrics

Implement technical metrics for the web UI, scheduler, and executors, such as uptime, average time to re-read the DagBag, etc.
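As an illustration, an uptime gauge in the Prometheus text exposition format could look like the sketch below. The metric name and the component label are hypothetical — the exporter does not currently emit them — and a real implementation would likely use the prometheus_client library instead of hand-rolled formatting.

```python
import time

START_TIME = time.time()

def render_uptime_metric(component, now=None):
    """Render a hypothetical uptime gauge for one Airflow component
    in the Prometheus text exposition format."""
    now = time.time() if now is None else now
    return (
        '# HELP airflow_component_uptime_seconds Seconds since start\n'
        '# TYPE airflow_component_uptime_seconds gauge\n'
        f'airflow_component_uptime_seconds{{component="{component}"}} '
        f'{now - START_TIME:.1f}\n'
    )

print(render_uptime_metric('scheduler'))
```

The same shape would work for the other proposed metrics (e.g. DagBag re-read time), with a different metric name and help text.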

[question] - airflow-exporter vs. StatsD

Hi All,

I'm quite new to Airflow on Kubernetes and I don't understand the difference between airflow-exporter and StatsD. I searched a lot on the internet, but I didn't find any usable material. From an integration perspective, airflow-exporter is much simpler, and the airflow-helm chart supports it out of the box. My goal is to send all the metrics to Prometheus.

Could someone please explain the differences from an Airflow-on-Kubernetes perspective? Which one should be used in which use case?

Thanks,
Andor

feature: add airflow_exporter_info metric

I propose adding an airflow_exporter_info metric, similar to python_info:

# HELP airflow_exporter_info Airflow Exporter information
# TYPE airflow_exporter_info gauge
airflow_exporter_info{major="1",minor="0",version="1.0"} 1.0
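An info-style metric like this is just a constant gauge of 1.0 with the version encoded in labels. A stdlib-only sketch of the exposition format follows; in practice, prometheus_client's `Info` helper would normally produce these lines.

```python
def render_info_metric(name, help_text, labels):
    """Render an info-style metric (constant gauge of 1.0) in the
    Prometheus text exposition format."""
    label_str = ','.join(f'{k}="{v}"' for k, v in sorted(labels.items()))
    return (
        f'# HELP {name} {help_text}\n'
        f'# TYPE {name} gauge\n'
        f'{name}{{{label_str}}} 1.0\n'
    )

print(render_info_metric(
    'airflow_exporter_info',
    'Airflow Exporter information',
    {'major': '1', 'minor': '0', 'version': '1.0'},
))
```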

Set up continuous integration

Choose between CircleCI and TravisCI.

Create a pipeline that

  • performs docker-compose up for each tests/airflow1.8-py2 and test/airflow1.9-py3
  • waits for Airflow to start
  • checks that localhost:8080/admin/metrics returns 200 OK with correctly formatted metrics
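The "correctly formatted metrics" check could be as simple as validating each response line against the Prometheus exposition grammar. A rough stdlib sketch (the regex is a simplification of the full format, not a complete validator):

```python
import re

# Simplified pattern for one Prometheus sample line: metric name,
# optional {labels}, a numeric value, and an optional timestamp.
SAMPLE_RE = re.compile(
    r'^[a-zA-Z_:][a-zA-Z0-9_:]*'      # metric name
    r'(\{[^}]*\})?'                   # optional label set
    r' -?[0-9.eE+\-]+(\s+\d+)?$'      # value, optional timestamp
)

def looks_like_prometheus(text):
    """Return True if every non-comment, non-blank line matches the
    simplified sample pattern. Comments (# HELP / # TYPE) pass through."""
    for line in text.splitlines():
        if not line or line.startswith('#'):
            continue
        if not SAMPLE_RE.match(line):
            return False
    return True

body = 'airflow_dag_status{dag_id="d",status="running"} 12.0\n'
print(looks_like_prometheus(body))  # True
```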

Feature: Last status timestamp metrics

With the current metrics layout, it becomes challenging to create alerts for a scenario where a DAG has failed for several periods. The failed gauge will always read non-zero if there are historical gaps, even if subsequent runs succeeded. In our case, these gaps might be acceptable. Here's a DAG history that illustrates this:

State	Dag Id	Execution Date	Run Id
success	mydag-daily_with-gap-in-success	2019-01-15T00:00:00	scheduled__2019-01-15T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-14T00:00:00	scheduled__2019-01-14T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-13T00:00:00	scheduled__2019-01-13T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-12T00:00:00	scheduled__2019-01-12T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-11T00:00:00	scheduled__2019-01-11T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-10T00:00:00	scheduled__2019-01-10T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-09T00:00:00	scheduled__2019-01-09T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-08T00:00:00	scheduled__2019-01-08T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-07T00:00:00	scheduled__2019-01-07T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-06T00:00:00	scheduled__2019-01-06T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-05T00:00:00	scheduled__2019-01-05T00:00:00
success	mydag-daily_with-gap-in-success	2019-01-01T00:00:00	scheduled__2019-01-01T00:00:00
success	mydag-daily_with-gap-in-success	2018-12-31T00:00:00	scheduled__2018-12-31T00:00:00
success	mydag-daily_with-gap-in-success	2018-12-30T00:00:00	scheduled__2018-12-30T00:00:00
success	mydag-daily_with-gap-in-success	2018-12-29T00:00:00

The resulting metrics are as follows:

airflow_dag_status{dag_id="mydag-daily_with-gap-in-success",owner="airflow",status="failed"} 15.0
airflow_dag_status{dag_id="mydag-daily_with-gap-in-success",owner="airflow",status="running"} 0.0
airflow_dag_status{dag_id="mydag-daily_with-gap-in-success",owner="airflow",status="success"} 365.0

(a) I'm open to suggestions on how to handle this type of situation correctly. A simplistic check for failed count will alert once and never again. Checking for rate feels a bit fragile.

(b) A common pattern in batch monitoring is to also expose the last time a job was in state X, so that you can tell whether a success has occurred since the last failure, and how long it has been since the last success (regardless of failures in between). This can catch a variety of scenarios, including Airflow being stuck for whatever reason. I'm fairly certain exposing this as a metric would also help the concrete scenario I've described here.

Is this possible? A suggestion might be

airflow_dag_status_time{dag_id="mydag-daily_with-gap-in-success",owner="airflow",status="success"} 1547639402
airflow_dag_status_time{dag_id="mydag-daily_with-gap-in-success",owner="airflow",status="failed"} 1547639102

This could be extended for tasks too though I'm not sure that is useful.

Open to feedback, of course. Thanks.
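Deriving the proposed metric is straightforward given run history. The sketch below works from plain (state, end_timestamp) tuples rather than the actual Airflow ORM — a real implementation would group DagRun rows by state and take the latest end date per state.

```python
from collections import defaultdict

def last_status_times(runs):
    """Given (state, unix_timestamp) pairs for one DAG's runs,
    return the most recent timestamp per state."""
    latest = defaultdict(float)
    for state, ts in runs:
        latest[state] = max(latest[state], ts)
    return dict(latest)

runs = [
    ('success', 1547553002),
    ('failed', 1547639102),
    ('success', 1547639402),
]
print(last_status_times(runs))
# {'success': 1547639402, 'failed': 1547639102}
```

Each (dag_id, status) pair would then be emitted as one airflow_dag_status_time sample, as in the example above.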
