dillon-giacoppo / rules_python_external
Bazel rules to resolve and fetch artifacts transitively from the Python Package Index (PyPI)
License: Apache License 2.0
Good morning,
I'm trying to use rules_python_external and define black==19.10b0 as a dependency. There doesn't seem to be support in Bazel for handling wheels that rely on entry_point definitions for their CLI interface, so I created my own wrapper. However, when trying to call off to patched_main, it seems as though the import tree for black is incorrect.
BUILD file
py_binary(
    name = "black",
    srcs = ["black.py"],
    deps = [
        requirement("black"),
    ],
)
black.py
import black

if __name__ == "__main__":
    # print(dir(black.black.black))
    black.patched_main()
The black module seems to be the only importable attribute in the py_library.
$ bazel run //tools:black
INFO: Analyzed target //tools:black (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
Target //tools:black up-to-date:
bazel-bin/tools/black
INFO: Elapsed time: 0.144s, Critical Path: 0.00s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
INFO: Build completed successfully, 1 total action
['__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', 'black']
Traceback (most recent call last):
File "/tmp/bazel/bryce/execroot/com_github_foobar/bazel-out/k8-fastbuild/bin/tools/black.runfiles/com_github_foobar/tools/black.py", line 10, in <module>
black.patched_main()
AttributeError: module 'black' has no attribute 'patched_main'
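For context, a wheel's CLI comes from a console_scripts entry point of the form name = module:attr (the call to black.patched_main above suggests black 19.10b0 targets black:patched_main). A minimal sketch of how such a spec resolves to a callable, using a stdlib target since black may not be importable here:

```python
import importlib


def load_entry_point(spec):
    """Resolve a console_scripts-style "name = module:attr" spec to a callable."""
    _, _, target = spec.partition("=")
    module_path, _, attr = target.strip().partition(":")
    mod = importlib.import_module(module_path)
    return getattr(mod, attr)


# Stdlib stand-in spec, since black may not be installed here:
dumps = load_entry_point("tool = json:dumps")
print(dumps([1, 2]))  # [1, 2]
```

If the real package resolves correctly on sys.path, `load_entry_point("black = black:patched_main")()` would be the equivalent of running the wheel's console script.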
# WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive", "http_file")

http_archive(
    name = "rules_python",
    sha256 = "b5668cde8bb6e3515057ef465a35ad712214962f0b3a314e551204266c7be90c",
    strip_prefix = "rules_python-0.0.2",
    url = "https://github.com/bazelbuild/rules_python/releases/download/0.0.2/rules_python-0.0.2.tar.gz",
)

load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

git_repository(
    name = "rules_python_external",
    commit = "2c78da5b5beb78c4a96b8b4d84e9c34de8178efb",
    remote = "https://github.com/dillon-giacoppo/rules_python_external",
    shallow_since = "1594277101 +1000",
)

load("@rules_python_external//:repositories.bzl", "rules_python_external_dependencies")

rules_python_external_dependencies()

load("@rules_python_external//:defs.bzl", "pip_install")

pip_install(
    name = "jiko_runtime_requirements",
    requirements = "//:requirements.txt",
)
# requirements.txt
ruamel.yaml==0.16.10
ruamel.yaml.clib==0.2.0
# BUILD.bazel
load("@rules_python//python:python.bzl", "py_binary")
load("@jiko_runtime_requirements//:requirements.bzl", runtime_requirement = "requirement")

py_binary(
    name = "example",
    srcs = [
        "main.py",
    ],
    main = "main.py",
    visibility = ["//visibility:public"],
    deps = [
        runtime_requirement("ruamel.yaml"),
    ],
)
# main.py
from ruamel import yaml
print(yaml.SafeDumper)
Run:
bazel run //:example
Output:
AttributeError: module 'ruamel.yaml' has no attribute 'SafeDumper'
rules_python_external creates an empty directory for ruamel_yaml_clib, which is imported instead of ruamel_yaml.
We looked through the source code and saw that add_pkgutil_style_namespace_pkg_init in extract_wheels/lib/namespace_pkgs.py creates a directory for each entry in the clib's namespace_packages.txt (here ruamel and ruamel.yaml) and then places an __init__.py there.
We solved our problem by returning early instead of creating an empty directory.
We were wondering: does the __init__.py file need to be generated if the directory did not already exist?
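The change described above might look like the following sketch (the function name is taken from the report; the guard returns early instead of materializing a missing directory that would shadow the real package):

```python
import os


def add_pkgutil_style_namespace_pkg_init(dir_path):
    # Proposed guard: if the namespace directory was not created by the wheel
    # itself, return instead of creating an empty package that shadows the
    # real one (e.g. ruamel_yaml_clib shadowing ruamel_yaml).
    if not os.path.isdir(dir_path):
        return
    init_path = os.path.join(dir_path, "__init__.py")
    if not os.path.exists(init_path):
        with open(init_path, "w") as f:
            f.write(
                "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n"
            )
```

This is only a sketch of the reporter's workaround, not necessarily the upstream fix.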
I am trying to include the Modin project as a dependency. This project uses Ray, which in turn uses Redis.
When executing the py_test rule that exercises this code, I get the following permissions error:
PermissionError: [Errno 13] Permission denied: '/private/var/tmp/_bazel_joshuaharrison/29b3465cefdffc832f5dd334d22b0ea7/execroot/Crestone/bazel-out/darwin-fastbuild/bin/strategies/example/test_strategy_test.runfiles/pip/pypi__ray/ray/core/src/ray/thirdparty/redis/src/redis-server'
This file path is a symlink to /private/var/tmp/_bazel_joshuaharrison/29b3465cefdffc832f5dd334d22b0ea7/external/pip/pypi__ray/ray/core/src/ray/thirdparty/redis/src/redis-server which has the following permissions:
-rw-r--r-- 1 joshuaharrison wheel 1343096 Jun 9 16:10 /private/var/tmp/_bazel_joshuaharrison/29b3465cefdffc832f5dd334d22b0ea7/external/pip/pypi__ray/ray/core/src/ray/thirdparty/redis/src/redis-server
If I manually chmod this file into being executable, I get a separate error about another non-executable file. If I keep fixing these errors in the same manner, the test will eventually run and pass.
I understand that this is not an issue specific to your project, but was hoping for some help in figuring out a possible solution. For example, I suppose I could make all files executable in extract_wheel?
Thoughts?
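The workaround floated above (making everything executable in extract_wheel) could be sketched as a post-extraction pass; whether a blanket execute bit is acceptable for all wheel contents is an open question:

```python
import os
import stat


def make_tree_executable(root):
    # Restore the user/group/other execute bits on every extracted file,
    # mirroring the manual chmod workaround described above (which had to be
    # repeated for redis-server and each subsequent non-executable file).
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
```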
Installing extras
generate requirements.txt
pip-compile --generate-hashes requirements.in
# requirements.in
apache_beam[gcp]
apache-beam should be installed with the gcp extras, but instead the build aborts:
' failed; build aborted: no such package '@my_deps//pypi__apache_beam[gcp]': BUILD file not found in directory 'pypi__apache_beam[gcp]' of external repository @my_deps. Add a BUILD file to a directory to mark it as a package.
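The label in the error suggests the extras suffix is leaking into the generated repository directory name. A hypothetical sanitizer (illustrative only, not the rules' actual code) would strip [gcp] and any version specifier before deriving pypi__apache_beam:

```python
import re


def repo_dir_for_requirement(req):
    # Drop any "[extras]" suffix and version specifier, then normalize the
    # distribution name into the pypi__ directory form used by the repository.
    name = re.split(r"[\[<>=~!]", req.strip(), maxsplit=1)[0]
    return "pypi__" + re.sub(r"[-.]+", "_", name.strip())


print(repo_dir_for_requirement("apache_beam[gcp]"))  # pypi__apache_beam
print(repo_dir_for_requirement("ruamel.yaml==0.16.10"))  # pypi__ruamel_yaml
```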
At the moment we pass the requirements.txt directly into pip wheel. This automatically resolves transitive dependencies, which we then install.
This behavior is not ideal, as pip has a naive "first found" resolution mechanism. This can lead to unintended version bumps that break the hermetic promise of Bazel. Dependency resolution is also an expensive operation, especially for PyPI packages that need to be built.
It should be considered whether dependency resolution should be opt-in rather than the default, for performance and hermeticity. This perhaps goes hand in hand with a "pin" concept similar to rules_jvm_external.
Note, this is not a proposal to remove the automatic dependency linking of pip packages once they are installed, i.e. there should be no need to specify transitive dependencies in the deps of py_* targets.
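A toy model of the "first found" behavior described above (not pip's actual resolver): the first constraint seen for a package wins, so a transitive requirement encountered early can silently override a later direct pin:

```python
def resolve_first_found(requirements):
    # requirements: list of (name, version) pairs in discovery order.
    resolved = {}
    for name, version in requirements:
        resolved.setdefault(name, version)  # later constraints are ignored
    return resolved


# A transitive dep asks for six==1.12.0 before the direct pin six==1.15.0:
print(resolve_first_found([("six", "1.12.0"), ("six", "1.15.0")]))
# {'six': '1.12.0'} -- the direct pin is lost
```

This is why pinning the fully resolved set (the rules_jvm_external-style "pin" mentioned above) restores hermeticity: resolution order stops mattering once every version is explicit.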
The distutils package for Python 3 may not always be installed on the build system; for example, this happens on my Debian 10. This causes the pip_repository target to fail with an error:
INFO: Call stack for the definition of repository 'sphinx_deps' which is a pip_repository (rule definition at /home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/exte
rnal/rules_python_external/defs.bzl:46:18):
- /home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/rules_python_external/defs.bzl:59:5
- /home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/rules_python_external/defs.bzl:59:5
- /home/asmirnov/devel/scm.infinet.ru/nms/next-master/WORKSPACE:243:1
ERROR: An error occurred during the fetch of repository 'sphinx_deps':
rules_python_external failed: (Traceback (most recent call last):
File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/pypi__pip/pip/__main__.py", line 16, in <module>
from pip._internal.main import main as _main # isort:skip # noqa
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/pypi__pip/pip/_internal/main.py", line 13, in <module>
from pip._internal.cli.autocompletion import autocomplete
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/pypi__pip/pip/_internal/cli/autocompletion.py", line 11, in <module>
from pip._internal.cli.main_parser import create_main_parser
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/pypi__pip/pip/_internal/cli/main_parser.py", line 7, in <module>
from pip._internal.cli import cmdoptions
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/pypi__pip/pip/_internal/cli/cmdoptions.py", line 19, in <module>
from distutils.util import strtobool
ModuleNotFoundError: No module named 'distutils.util'
Traceback (most recent call last):
File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/rules_python_external/extract_wheels/__main__.py", line 5, in <module>
extract_wheels.main()
File "/home/asmirnov/.cache/bazel/_bazel_asmirnov/21e836bd6a9d91d4ee2184dc83b17371/external/rules_python_external/extract_wheels/__init__.py", line 70, in main
[sys.executable, "-m", "pip", "wheel", "-r", args.requirements]
File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
**kwargs).stdout
File "/usr/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/usr/bin/python3', '-m', 'pip', 'wheel', '-r', '/home/asmirnov/devel/scm.infinet.ru/nms/next-master/docs/requirements.txt']' returned non-zero exit status 1.
)
Perhaps distutils should be added as a dependency of rules_python_external, or the prerequisites should be clarified in the documentation.
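A prerequisite check along these lines (a sketch, not the rules' actual code) could fail fast with an actionable message on systems where Debian's python3-distutils package is missing:

```python
import importlib.util


def has_distutils():
    # On Debian, distutils ships in the separate python3-distutils package
    # and may be absent even though python3 itself is installed.
    try:
        return importlib.util.find_spec("distutils.util") is not None
    except ModuleNotFoundError:
        return False


def check_distutils():
    if not has_distutils():
        raise SystemExit(
            "pip needs distutils; install the python3-distutils package"
        )
```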
Essentially a copy-over of apt-itude/rules_pip#13.
Mentioned as nice-to-have here: #17 (comment)
I recently started using a toolchain, as I try to get deterministic behavior when running my codebase on CI. I then found that the pip_install performed by this repo still uses the system's /usr/bin/python3, even if I configured the toolchain to use the py binary in the virtualenv. I think this rule should get the interpreter from the toolchain.
Even if I try to work around this issue by specifying python_interpreter for pip_install, it only accepts an absolute path. In other words, if I pass the same relative path to the virtualenv that I passed to the py_runtime rule's interpreter attribute, it errors with:
INFO: Call stack for the definition of repository 'py_deps' which is a pip_repository (rule definition at /root/.cache/bazel/_bazel_root/9fe7f4f07a7276c7a0aa3365ae93522e/external/rules_python_external/defs.bzl:46:18):
- <builtin>
- /root/.cache/bazel/_bazel_root/9fe7f4f07a7276c7a0aa3365ae93522e/external/rules_python_external/defs.bzl:59:5
- /root/xlab/WORKSPACE:78:1
ERROR: An error occurred during the fetch of repository 'py_deps':
rules_python_external failed: (src/main/tools/process-wrapper-legacy.cc:58: "execvp(.venv/bin/python3.8, ...)": No such file or directory
)
ERROR: no such package '@py_deps//': rules_python_external failed: (src/main/tools/process-wrapper-legacy.cc:58: "execvp(.venv/bin/python3.8, ...)": No such file or directory
)
My commit: dayfine/xlab@1097784
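One possible direction (purely a sketch of the idea, not the rule's API): resolve a relative python_interpreter path against the workspace root before exec'ing, so .venv/bin/python3.8 would work the same way it does for py_runtime:

```python
import os


def resolve_interpreter(path, workspace_root):
    # execvp fails on ".venv/bin/python3.8" because the repository rule's
    # working directory is not the workspace root; anchoring relative paths
    # there first would make the two attributes accept the same value.
    return path if os.path.isabs(path) else os.path.join(workspace_root, path)


print(resolve_interpreter(".venv/bin/python3.8", "/root/xlab"))
# /root/xlab/.venv/bin/python3.8
```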
Setup rules_python_external with:
# requirements.txt
tensorflow==2.0.0
Then run bazel sync.
Expected: sync works fine and the TensorFlow package is available to use with requirement. Actual output:
INFO: Call stack for the definition of repository 'pypi' which is a pip_repository (rule definition at /private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/defs.bzl:25:18):
- /Users/jbelotti/work/data-science/WORKSPACE:100:1
ERROR: An error occurred during the fetch of repository 'pypi':
failed to create pip repository: (Traceback (most recent call last):
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/tools/wheel_wrapper.py", line 67, in <module>
main()
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/tools/wheel_wrapper.py", line 63, in main
whl.main()
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/extract_wheels.py", line 127, in main
extract_wheel(wheel, whl_label, [])
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/extract_wheels.py", line 83, in extract_wheel
purelib.spread_purelib_into_root(directory)
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/purelib.py", line 32, in spread_purelib_into_root
dst=extracted_whl_directory,
File "/nix/store/i9zg3n6dgi7jdw37nxyfi272pszkk6nd-python3-3.6.9/lib/python3.6/shutil.py", line 548, in move
raise Error("Destination path '%s' already exists" % real_dst)
shutil.Error: Destination path 'pypi__tensorflow/tensorflow_core' already exists
)
ERROR: no such package '@pypi//': failed to create pip repository: (Traceback (most recent call last):
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/tools/wheel_wrapper.py", line 67, in <module>
main()
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/tools/wheel_wrapper.py", line 63, in main
whl.main()
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/extract_wheels.py", line 127, in main
extract_wheel(wheel, whl_label, [])
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/extract_wheels.py", line 83, in extract_wheel
purelib.spread_purelib_into_root(directory)
File "/private/var/tmp/_bazel_jbelotti/a8a1f81a1422a72c79c6c81641a8c816/external/rules_python_external/./src/purelib.py", line 32, in spread_purelib_into_root
dst=extracted_whl_directory,
File "/nix/store/i9zg3n6dgi7jdw37nxyfi272pszkk6nd-python3-3.6.9/lib/python3.6/shutil.py", line 548, in move
raise Error("Destination path '%s' already exists" % real_dst)
shutil.Error: Destination path 'pypi__tensorflow/tensorflow_core' already exists
)
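The traceback points at shutil.move raising when the destination already exists. A merge-style move (a sketch; the real fix in spread_purelib_into_root may differ) recurses into directories present on both sides instead of failing:

```python
import os
import shutil


def move_merge(src, dst):
    # Like shutil.move, but merges into an existing destination directory
    # instead of raising "Destination path ... already exists" -- the error
    # hit for pypi__tensorflow/tensorflow_core above.
    os.makedirs(dst, exist_ok=True)
    for entry in os.listdir(src):
        s, d = os.path.join(src, entry), os.path.join(dst, entry)
        if os.path.isdir(s) and os.path.isdir(d):
            move_merge(s, d)
        else:
            shutil.move(s, d)
    os.rmdir(src)
```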
Hello,
I'm trying to depend on a torch-nightly build by specifying the URL of the wheel in requirements.txt. Is this behavior supported?
From the torch docs (which recommend using the --find-links option with pip), I'd like a requirements.txt with the following URL.
Equivalent pip install command:
pip install --pre torch torchvision -f https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html
Contents of the requirements.txt file:
https://download.pytorch.org/whl/nightly/cu102/torch-1.7.0.dev20200723-cp37-cp37m-linux_x86_64.whl
Locally this would install the following torch version:
import torch
print(torch.__version__)  # 1.7.0.dev20200723
However, this seems to break on the pip_install rule.
Maybe there's a workaround?
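If the bare URL line is what trips the rule, one avenue worth trying (an untested assumption about the vendored pip's PEP 508 direct-reference support) is the "name @ URL" form, which gives the requirement a distribution name the rules can turn into a repository directory:

```
# requirements.txt (sketch)
torch @ https://download.pytorch.org/whl/nightly/cu102/torch-1.7.0.dev20200723-cp37-cp37m-linux_x86_64.whl
```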
I have been using rules_python_external to build Python targets, which has worked quite well. Recently I hit a new case where I also want to use the installed packages directly. For example:
I am installing notebook to use the jupyter CLI binary. I need notebook both as a build dependency and as a regular pip package for the CLI binary. If I only use rules_python_external, the latter is not available: rules_python_external puts all packages under /bazel-mypackage/external/py_deps, which I suppose is necessary for Bazel discovery. As a result, right now I still need to pip install notebook in the virtual environment, which feels redundant.
Moreover, I think this actually prevents some things from working: e.g. for jupyter, I need to enable some extensions from the CLI, which are applied to the configs under .venv/lib/python3.8/site-packages/, but when I run the Bazel-built binary, it does not know about the .venv config and doesn't work as expected. (I think I can get it working by putting the jupyter config under /usr/local/, but that doesn't feel very right.)
Is there a recommended way of handling this?
Our code uses f-strings, which are only available in Python >= 3.6. We should remove them, or at least specify in the README that we don't support Python <= 3.5.
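The mechanical replacement, for reference, is str.format, which behaves the same on 3.5 (variable names here are illustrative):

```python
name = "requirements.txt"
# Python >= 3.6 only:
# msg = f"parsing {name}"
# 3.5-compatible equivalent:
msg = "parsing {}".format(name)
print(msg)  # parsing requirements.txt
```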
I am not sure whether the problem should be fixed in the Azure package design or can be fixed in this project.
I have a problem with azure.eventhub.extensions.checkpointstoreblob. An attempt to import it ends with an error:
$ bazel run :main
INFO: Analyzed target //:main (18 packages loaded, 672 targets configured).
INFO: Found 1 target...
Target //:main up-to-date:
bazel-bin/main
INFO: Elapsed time: 5.008s, Critical Path: 0.01s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
INFO: Build completed successfully, 1 total action
Traceback (most recent call last):
File "/home/pjachowi/.cache/bazel/_bazel_pjachowi/0be77f5cbe63ebeec7251d4b492b30cb/execroot/example_repo/bazel-out/k8-fastbuild/bin/main.runfiles/example_repo/main.py", line 1, in <module>
from azure.eventhub.extensions.checkpointstoreblob import BlobCheckpointStore
ModuleNotFoundError: No module named 'azure.eventhub.extensions.checkpointstoreblob'
What I suspect, having limited knowledge of Python packaging machinery: azure-eventhub-checkpointstoreblob has a dependency on azure-eventhub, and azure-eventhub contains a directory azure/eventhub/extensions holding only an __init__.py, which terminates the search for the azure.eventhub.extensions package:
$ ls ./bazel-bin/main.runfiles/example_repo/external/pip/pypi__azure_eventhub/azure/eventhub/extensions
__init__.py
$ ls ./bazel-bin/main.runfiles/example_repo/external/pip/pypi__azure_eventhub_checkpointstoreblob/azure/eventhub/extensions
checkpointstoreblob __init__.py
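To illustrate why the plain __init__.py terminates the search: with pkgutil-style __init__.py files in both copies, Python extends the package's __path__ across roots, so sibling subpackages in different site-packages directories stay importable. A self-contained demonstration with hypothetical package names (ns, suba, subb standing in for azure/eventhub/extensions and its two providers):

```python
import os
import sys
import tempfile

root_a, root_b = tempfile.mkdtemp(), tempfile.mkdtemp()
NS_INIT = "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n"
for root, sub in ((root_a, "suba"), (root_b, "subb")):
    pkg = os.path.join(root, "ns")
    os.makedirs(os.path.join(pkg, sub))
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write(NS_INIT)  # pkgutil-style: keeps the namespace open
    open(os.path.join(pkg, sub, "__init__.py"), "w").close()

sys.path[:0] = [root_a, root_b]
import ns.suba   # found under root_a
import ns.subb   # still found: extend_path appended root_b/ns to ns.__path__
print("both subpackages import")
```

If the first root's ns/__init__.py were empty (as in the azure-eventhub wheel above), the second import would fail with the same ModuleNotFoundError reported here.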
To reproduce the problem one needs four files in the same directory: WORKSPACE, BUILD, requirements.txt, and main.py:
WORKSPACE (same as in the example directory):
workspace(name = "example_repo")

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "rules_python",
    url = "https://github.com/bazelbuild/rules_python/releases/download/0.0.2/rules_python-0.0.2.tar.gz",
    strip_prefix = "rules_python-0.0.2",
    sha256 = "b5668cde8bb6e3515057ef465a35ad712214962f0b3a314e551204266c7be90c",
)

load("@rules_python//python:repositories.bzl", "py_repositories")

py_repositories()

local_repository(
    name = "rules_python_external",
    path = "../",
)

load("@rules_python_external//:repositories.bzl", "rules_python_external_dependencies")

rules_python_external_dependencies()

load("@rules_python_external//:defs.bzl", "pip_install")

pip_install(
    requirements = "//:requirements.txt",
)
BUILD:
load("@pip//:requirements.bzl", "requirement")

py_binary(
    name = "main",
    srcs = ["main.py"],
    deps = [
        requirement("azure-eventhub-checkpointstoreblob"),
    ],
)
requirements.txt:
azure-eventhub-checkpointstoreblob==1.1.0
main.py:
from azure.eventhub.extensions.checkpointstoreblob import BlobCheckpointStore

if __name__ == "__main__":
    pass
Hello.
This bug occurs while processing requirements.txt if any of the affected libraries are in the file.
Builds of existing, unmodified repos work as before but any update to requirements.txt seems to trigger bazel to download a new (i.e. version 50) setuptools and start failing. The box I build on has setuptools 40 so this is something which is happening inside the bazel sandbox. I use both rules_python 0.0.1 and a commit-locked version of rules_python_external from March.
I see what looks like earlier setuptools versions in your code (and in rules_python's internal requirements.txt) so I'm not sure why the bug in version 50 is showing up.
I assume I should be filing here rather than at rules_python because we use rules_python_external's pip_install rule to process our requirements files.
Any workarounds or fixes welcome. Please let me know if you need more information
Thanks.
We have a project that relies on daphne, django and channels, all in python 3.6.
py_binary(
    name = "manage",
    srcs = ["manage.py"],
    deps = [
        pip_install("channels"),
        pip_install("daphne"),
        pip_install("django"),
    ],
)
We're trying out this set of rules since the official rules are still in early stages and getting this error:
File ".../start-dev-django.runfiles/py_pip_deps/pypi__django/django/core/management/__init__.py", line 401, in execute_from_command_line
utility.execute()
File ".../start-dev-django.runfiles/py_pip_deps/pypi__django/django/core/management/__init__.py", line 377, in execute
django.setup()
File ".../start-dev-django.runfiles/py_pip_deps/pypi__django/django/__init__.py", line 24, in setup
apps.populate(settings.INSTALLED_APPS)
File ".../start-dev-django.runfiles/py_pip_deps/pypi__django/django/apps/registry.py", line 91, in populate
app_config = AppConfig.create(entry)
File ".../start-dev-django.runfiles/py_pip_deps/pypi__django/django/apps/config.py", line 116, in create
mod = import_module(mod_path)
File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File ".../start-dev-django.runfiles/py_pip_deps/pypi__channels/channels/apps.py", line 6, in <module>
import daphne.server
File ".../start-dev-django.runfiles/py_pip_deps/pypi__daphne/daphne/server.py", line 32, in <module>
from .ws_protocol import WebSocketFactory
File ".../start-dev-django.runfiles/py_pip_deps/pypi__daphne/daphne/ws_protocol.py", line 6, in <module>
from autobahn.twisted.websocket import (
File ".../start-dev-django.runfiles/py_pip_deps/pypi__autobahn/autobahn/twisted/__init__.py", line 82, in <module>
__ident__ = 'Autobahn/{}-Twisted/{}-{}/{}'.format(autobahn.__version__, twisted.__version__, platform.python_implementation(), platform.python_version())
AttributeError: module 'twisted' has no attribute '__version__'
It seems to have pulled everything it needs, including pypi__twisted. Any help would be appreciated. Using release 0.1.5
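As a diagnostic aside (an assumption about the failure mode, not a confirmed fix): twisted normally gains __version__ when its package __init__ runs, so the AttributeError suggests the extracted wheel's top-level package didn't initialize fully. Code under your own control can read the version from distribution metadata instead of a module attribute, which sidesteps that class of problem (Python 3.8+):

```python
from importlib.metadata import PackageNotFoundError, version  # Python 3.8+


def dist_version(dist_name, default="unknown"):
    # Read the installed distribution's version from its metadata rather
    # than relying on a module-level __version__ attribute being set.
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return default


print(dist_version("no-such-dist-xyz"))  # unknown
```

This doesn't help inside autobahn itself, but it is a useful pattern when pinning down which package in a runfiles tree is incompletely extracted.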
This has now been upstreamed to rules_python and released in 0.1.0.
I think, to prevent confusion, this repo should be archived with a message pointing to the release notes:
https://github.com/bazelbuild/rules_python/releases/tag/0.1.0
Also, maybe leave a note on the three outstanding PRs that they should be ported, and do something with the two open issues?
Thanks for this awesome work that is now canonical :)
I'm trying to install https://pypi.org/project/docker/#description (the version is unpinned) for my py_test rule, but it gives this error:
bazel build vail:test
ERROR: /home/vadmin/plops/vail/BUILD:41:1: no such package '@vail_deps//pypi__docker ': BUILD file not found in directory 'pypi__docker ' of external repository @vail_deps. Add a BUILD file to a directory to mark it as a package. and referenced by '//vail:test'
ERROR: Analysis of target '//vail:test' failed; build aborted: no such package '@vail_deps//pypi__docker ': BUILD file not found in directory 'pypi__docker ' of external repository @vail_deps. Add a BUILD file to a directory to mark it as a package.
INFO: Elapsed time: 0.652s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (14 packages loaded, 13 targets configured)
I have been able to build and run the cli which it is testing (but does not depend on the docker package). Weirdly, when I go into bazel-plops/external/vail_deps/pypi__docker there is a BUILD file right there and the package looks identical to all the other packages I'm pulling in.
Additionally, running bazel build @vail_deps//pypi__docker passes.
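Note that the quoted package name 'pypi__docker ' in the error contains a trailing space, which hints that whitespace from the requirements line survived into the generated label, while the directory on disk (and the directly buildable target) has no space. A defensive parse (hypothetical helper, not the rules' actual code) would strip it:

```python
def requirement_to_dir(line):
    # Strip comments, surrounding whitespace, and version specifiers before
    # deriving the external-repository directory name, so "docker \n" and
    # "docker==4.2" both map to "pypi__docker" with no stray space.
    name = line.split("#", 1)[0]
    for sep in ("==", ">=", "<=", "~=", "!=", "<", ">"):
        name = name.split(sep, 1)[0]
    return "pypi__" + name.strip().replace("-", "_")


print(requirement_to_dir("docker \n"))    # pypi__docker
print(requirement_to_dir("docker==4.2"))  # pypi__docker
```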