projectriff / node-function-invoker
License: Apache License 2.0
The node function is consistently entering a crash loop when running on PKS. It works on every other k8s runtime that we test.
To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:
.travis.yml
If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.
Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.
- engines was only updated if it defined a single version, not a range.
- .nvmrc was updated to Node.js 10.
- .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.
For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖
There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
Currently the invoker only supports json, plain text, bytes and form encoded data. It may be desirable for a function author to use other standard content types or custom content types for their business domain.
The function author should be able to provide new marshalers on the exported function, similar to the built-in marshalers and to how $init is attached. At runtime, the invoker should merge the built-in marshalers with the function-provided marshalers.
Seems like this is what is happening (not sure what the incorrect input is/was, as this ran "behind my back"). Maybe a carriage return sent from liiklus-client?
cnb@repeater-processor-f2qxx-795967487d-7c99k:/workspace$ NODE_DEBUG=riff node /layers/io.projectriff.node/riff-invoker-node/server.js
RIFF 102: Ready to process signals
RIFF 102: New invocation started
RIFF 102: Streaming pipeline initialized
RIFF 102: Input signal received
RIFF 102: Start signal received: application/json
RIFF 102: Wiring 2 input stream(s)
RIFF 102: Wiring 1 output stream(s)
RIFF 102: Ready to process data
RIFF 102: Input signal received
RIFF 102: Forwarding data for input #0
RIFF 102: Input signal received
RIFF 102: error from unmarshaller
RIFF 102: Error occurred - ending pipeline now
events.js:174
throw er; // Unhandled 'error' event
^
Error
at InputUnmarshaller._transform (/layers/io.projectriff.node/riff-invoker-node/lib/input-unmarshaller.js:43:22)
at InputUnmarshaller.Transform._read (_stream_transform.js:190:10)
at InputUnmarshaller.Transform._write (_stream_transform.js:178:12)
at doWrite (_stream_writable.js:415:12)
at writeOrBuffer (_stream_writable.js:399:5)
at InputUnmarshaller.Writable.write (_stream_writable.js:299:11)
at StreamingPipeline._write (/layers/io.projectriff.node/riff-invoker-node/lib/streaming-pipeline.js:146:31)
at doWrite (_stream_writable.js:415:12)
at writeOrBuffer (_stream_writable.js:399:5)
at StreamingPipeline.Writable.write (_stream_writable.js:299:11)
Emitted 'error' event at:
at Stream._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:998:18)
at Stream.write (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1658:18)
at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:686:15
at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:3588:17
at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1604:9
at Stream.s._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1560:9)
at Stream.write (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1658:18)
at Stream._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:984:26)
at push (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1526:19)
at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:2212:13
I built a function/core deployer like this:
riff function create square \
--git-repo https://github.com/projectriff-samples/node-square \
--artifact square.js \
--tail
riff core deployer create square --function-ref square --tail
Also port-forward like this:
kubectl port-forward svc/square-deployer 8080:80
So the first invocation succeeds but the second one fails:
Aruba:~ trisberg$ curl -v 127.0.0.1:8080 -H 'Content-Type: application/json' -H 'Accept: application/json' -d 11 ; echo
* Rebuilt URL to: 127.0.0.1:8080/
* Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to 127.0.0.1 (127.0.0.1) port 8080 (#0)
> POST / HTTP/1.1
> Host: 127.0.0.1:8080
> User-Agent: curl/7.54.0
> Content-Type: application/json
> Accept: application/json
> Content-Length: 2
>
* upload completely sent off: 2 out of 2 bytes
< HTTP/1.1 200 OK
< Date: Thu, 03 Oct 2019 19:17:28 GMT
< Content-Length: 3
< Content-Type: text/plain; charset=utf-8
<
* Connection #0 to host 127.0.0.1 left intact
121
Aruba:~ trisberg$ curl -v 127.0.0.1:8080 -H 'Content-Type: application/json' -H 'Accept: application/json' -d 12 ; echo
* Rebuilt URL to: 127.0.0.1:8080/
* Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to 127.0.0.1 (127.0.0.1) port 8080 (#0)
> POST / HTTP/1.1
> Host: 127.0.0.1:8080
> User-Agent: curl/7.54.0
> Content-Type: application/json
> Accept: application/json
> Content-Length: 2
>
* upload completely sent off: 2 out of 2 bytes
< HTTP/1.1 500 Internal Server Error
< Date: Thu, 03 Oct 2019 19:17:33 GMT
< Content-Length: 3
< Content-Type: text/plain; charset=utf-8
<
* Connection #0 to host 127.0.0.1 left intact
EOF
I don't see anything in the deployer logs for the second invocation.
Invoking a Node function without -H 'Content-Type: application/json' makes it hang.
curl http://knative-square.default.svc.cluster.local -d 2
hangs with no response.
In contrast, the Command Runtime returns normally.
curl http://knative-greet.default.svc.cluster.local -d 2
From @jldec in #22 (comment):
Could we add a note in the function-proto readme explaining how we publish the function.proto to @projectriff/function-proto on npm, and consume it dynamically from the invoker?
I would also suggest adding a short note to the main readme here, documenting the 2 protocols and how they work (sync, async, streams, lifecycle hooks) with a few examples.
The version of the invoker in node-invoker.yaml should always match the version in package.json. There should be a test to make sure these values stay in sync.
The node function invoker currently only supports request/reply semantics via the HTTP interface. With gRPC we can explore streaming bi-directional messages.
Looks like the artifact path is used in the FUNCTION_URI instead of in the ADD Docker command.
This is noticeable when the artifact is not in the root dir of the function:
https://github.com/projectriff/node-function-invoker/blob/master/node-invoker.yaml#L23-L24
The variable use should be flipped - see the Java invoker: https://github.com/projectriff/java-function-invoker/blob/master/java-invoker.yaml#L21-L23
The invoker specification details error conditions here.
What is currently missing:
When an invoker receives the error stream completion signal on its input stream, it MUST propagate that error termination to all of the function input streams.
When a function emits an error completion signal on one of its several output streams, an invoker MUST immediately forward that signal back to the main output stream.
What is currently implemented and should be removed:
What is currently unclear:
inputStream
  .pipe(map(() => { throw new Error('yolo'); }))
  .pipe(outputStream)
🚨 You need to enable Continuous Integration on all branches of this repository. 🚨
To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.
Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet. We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.
If you have already set up CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.
Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please delete the greenkeeper/initial branch in this repository, and then remove and re-add this repository to the Greenkeeper App’s white list on GitHub. You'll find this list on your repo or organization’s settings page, under Installed GitHub Apps.
The encoding is currently ignored and we assume UTF-8 on the way in and out.
Following this doc, the current documented command to create a square function is:
riff function create square \
--git-repo https://github.com/projectriff-samples/node-square \
--artifact package.json \
--image gcr.io/$GCP_PROJECT/square \
--wait --verbose
However, when run with the documented invoke command:
riff service invoke square --text -- -w '\n' -d 8
It results in a failure:
curl 35.204.90.231/ -H 'Host: square.default.example.com' -H 'Content-Type: text/plain' -w '\n' -d 8
TypeError: fn is not a function
at DestroyableTransform._transform (/workspace/io.projectriff.riff/riff-invoker-node/lib/interaction-models/request-reply.js:36:34)
at DestroyableTransform.Transform._read (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_transform.js:184:10)
at DestroyableTransform.Transform._write (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_transform.js:172:83)
at doWrite (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:428:64)
at writeOrBuffer (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:417:5)
at DestroyableTransform.Writable.write (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:334:11)
at Duplexify.ondata (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:619:20)
at Duplexify.emit (events.js:182:13)
at Duplexify.Readable.read (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:469:26)
at flow (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:813:34)
When creating the function without any --artifact, it runs well:
riff function create square \
--git-repo https://github.com/projectriff-samples/node-square \
--image gcr.io/$GCP_PROJECT/square-fbiville \
--wait --verbose
[...]
riff function create completed successfully
riff service invoke square --text -- -w '\n' -d 8
curl 35.204.90.231/ -H 'Host: square.default.example.com' -H 'Content-Type: text/plain' -w '\n' -d 8
64
The invoker currently uses debug logging as its only form of logging. While debug logging is great for diagnostics that most users don't care about, a user always cares about errors to understand why the process faulted, especially before it terminates with an abnormal code.
After some time, the function logs something like:
default/upper-processor-b478766d5-xpdg5[function]: (node:20) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 unpipe listeners added. Use emitter.setMaxListeners() to increase limit
In addition to #1, we should marshal the function result into something that the client can Accept:
The src/main/* pattern is common in the Java space but not in the Node world. The most common reason to use this structure is to separate test and main code; however, the Jasmine tests currently live in src/main/node/spec. If we strictly applied this pattern, the requires in the test code would look like require('../../main/node/lib/app') instead of the current require('../lib/app').
Unless there's a good reason to keep the structure, we should flatten that directory into the root of the repository.
PORT is a poor choice and is already taken in the Knative runtime for other purposes.
The $init and $destroy hooks should only be called once per process. Currently they are called once per invocation; there's no point to this, since it duplicates simply calling the actual function.
The original intent of the hooks was to perform setup and teardown logic that should not block the function invocation. For example, setting up a database connection pool is work that should happen once for the whole process and not per invocation.
Once we have a buildpack based build, functions will be able to express which version of Node they want to run with. Until then, we should track the latest stable.
#85 passed although it initially contained code that assumed TextDecoder and TextEncoder were globally available. This is true for Node >= 11.
However, the Node buildpack currently pulls Node 10.15.3, where these classes need an explicit import, so the buildpack build fails.
Making sure the Node version used in CI here is the same as the buildpack's would avoid this kind of late error.
Looks like grpc-tools is missing from package.json.
Handily, this invoker already supports CloudEvents' binary content mode. Since the binary mode leaves the Content-Type header unchanged, this invoker will proceed to deserialize based on that header, meaning application/json-encoded CloudEvents will be turned into JS object payloads.
However, for folks using CloudEvents' structured content mode, this invoker will reject the request. That's because structured mode uses a Content-Type header of application/cloudevents+json. Given that Knative eventing is choosing to adopt CloudEvents, there may be good reason to support this mode in riff.
As a super simple implementation, application/cloudevents+json could be treated the exact same way as application/json (meaning use JSON.parse and JSON.stringify), which would allow the user to read the CloudEvent envelope and the data contained within.
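That simple implementation amounts to registering the JSON marshaller under a second media type. The marshaller-map shape below is an assumption for illustration:

```javascript
// Sketch: alias the structured-mode media type to the JSON marshaller,
// so the function receives the parsed CloudEvent envelope. The
// marshaller-map shape is an assumption, not the invoker's actual code.
const jsonMarshaller = {
  unmarshall: (buffer) => JSON.parse(buffer.toString('utf8')),
  marshall: (value) => Buffer.from(JSON.stringify(value), 'utf8')
};

const marshallers = {
  'application/json': jsonMarshaller,
  'application/cloudevents+json': jsonMarshaller // same treatment
};

const event = marshallers['application/cloudevents+json'].unmarshall(
  Buffer.from('{"specversion":"1.0","type":"demo","data":{"n":2}}')
);
// event.data is now a plain JS object the function can read
```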
Knative does not yet support gRPC. Including it in the invoker only slows down npm install and startup times. Once Knative has support for gRPC, the protobuf format will likely be different, so we'll need to update much of the gRPC support anyway.
While HTTP will become the only supported protocol for now, we do intend to add other protocols in the future, so maintaining the protocol decoupling in the invoker is important.
Refs knative/serving#813
Most JS functions will require dependencies installed via npm. These dependencies are commonly expressed in a package.json file, which also defines the entry point for the package. Instead of riff create -f pointing at a script, it should point at a package.json file, install the declared dependencies, and invoke the function defined as the entry point.
Dependencies for the invoker must not collide with dependencies for the function.
On top of #130, it seems that some tests receive the error event after the tests complete (with versions of Node < 13).
$argumentTransformers are defined as an array of functions that transform messages based on the index of the stream. The usage is awkward, as we're moving away from index-based references to streams.
We should remove the argument transformers for node-streams and always provide the full message to the function. The function author may decompose messages from the stream as they desire.
request-reply functions should continue to operate on the message payload by default, and may opt-in to the whole message.
A message is a payload with headers. Currently only the payload is exposed to the function. A function should be able to indicate that it would like to receive a message containing headers and a payload.
Likewise, a function should be able to produce a message with custom headers.
The StreamingPipeline class should always assume streaming functions.
The function promotion must happen at every gRPC invocation and not before, as explained here.
The best approach is probably to promote the function just before the streaming pipeline is instantiated.
If jq is not installed, running ./build.sh produces the following error:
Successfully tagged projectriff/node-function-invoker:latest
Error parsing reference: "projectriff/node-function-invoker:" is not a valid repository/tag: invalid reference format
In some scenarios, functions may want to maintain database connections or other handles which should be closed before shutdown in order to avoid leaks.
Is there a way to provide functions with an event hook to do this when function containers are shut down?
e.g. this function opens a redis client connection:
https://github.com/markfisher/s1p2017-faas-demo/blob/master/functions/vote-counter/vote-counter.js
Just about any non-trivial function in Node will involve an asynchronous API call. Currently the node-function-invoker only supports synchronous function invocations, meaning no asynchronous methods may be used. There are two different models for handling asynchronicity which are not easily interoperable: callbacks and async/promises.
Callbacks are a classic Node pattern and in very widespread use. Async methods are new in ES2017 and well supported in Node 8. An async method is syntactic sugar for a function that returns a Promise.
A callback method would look like:
module.exports = (name, callback) => callback(null, `Hello ${name}!`);
A huge advantage of async functions is that the same invocation signature can also support sync methods transparently, whereas supporting callback functions and sync functions simultaneously is awkward and fragile.
// sync
module.exports = name => `Hello ${name}!`;
// promise
module.exports = name => Promise.resolve(`Hello ${name}!`);
// async
module.exports = async name => `Hello ${name}!`;
The current invoker code is:
var resultx = fn(req.body);
// send response
Callbacks would look like:
fn(req.body, (err, resultx) => {
// send response
});
Sync/async/promised functions would look like:
var resultx = await fn(req.body);
// send response
I'd lean towards not supporting callbacks by default, and only add support as an opt-in if there is strong demand.
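The argument above can be made concrete: wrapping the call site in Promise.resolve lets one invocation path serve sync, promise-returning, and async functions alike. The invoke helper below is illustrative, not the invoker's actual code.

```javascript
// Illustrative helper: one await-based call site covers all three
// styles. Because invoke is async, a synchronous throw inside fn also
// surfaces as a rejected promise rather than crashing the caller.
async function invoke(fn, input) {
  return Promise.resolve(fn(input)); // sync value, Promise, or async fn
}

// All three resolve to the same result:
//   invoke(name => `Hello ${name}!`, 'riff')
//   invoke(name => Promise.resolve(`Hello ${name}!`), 'riff')
//   invoke(async name => `Hello ${name}!`, 'riff')
```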
If users define a function such as:
module.exports = (x) => {
if (x === 0) {
throw new Error('Division by zero');
}
return 100 / x;
}
deploy it and request it as follows:
curl http://XXX -H'Content-Type:application/json' -d'0' -H'Accept:text/plain' -v
The request will hang forever:
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8080 (#0)
> POST / HTTP/1.1
> Host: XXX
> User-Agent: curl/7.64.1
> Content-Type:application/json
> Accept:text/plain
> Content-Length: 1
>
* upload completely sent off: 1 out of 1 bytes
The invoker logs are as follows:
RIFF 13426: Ready to process signals
RIFF 13426: New invocation started
RIFF 13426: Promoting request-reply function to streaming function
RIFF 13426: Streaming pipeline initialized
RIFF 13426: Input signal received
RIFF 13426: Start signal with: output content types: text/plain, input names: [in], output names: [out]
RIFF 13426: Wiring 1 input stream(s)
RIFF 13426: Wiring 1 output stream(s)
RIFF 13426: Ready to process data
RIFF 13426: Input signal received
RIFF 13426: Forwarding data for input #0
(node:13426) UnhandledPromiseRejectionWarning: Error
at MappingTransform._handleError (/Users/fbiville/workspace/node-function-invoker/lib/mapping-transform.js:33:28)
at processTicksAndRejections (internal/process/task_queues.js:97:5)
(node:13426) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:13426) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
RIFF 13426: Ending input streams
The 0.0.x riff-cli defaulted the Content-Type to text/plain. The new 0.1 riff-cli no longer sets a Content-Type by default.
The invoker will error out for any request that does not contain a Content-Type header. Instead, we can default the type to text/plain.
application/octet-stream would also be a valid default, but it does not preserve the experience of the old CLI.
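A defaulting step along these lines could sit in front of the unmarshalling logic. The Express-style middleware shape below is an illustration, not the invoker's actual code:

```javascript
// Sketch: default a missing Content-Type to text/plain before
// unmarshalling, preserving the old CLI's behavior. Assumes an
// Express-style request object with lowercased header names.
function defaultContentType(req, res, next) {
  if (!req.headers['content-type']) {
    req.headers['content-type'] = 'text/plain';
  }
  next();
}
```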
They already are internally but this is not exposed to the end user.
Running the SpringOne demo, I noticed that function container Pods stick around in k8s for about a minute after they are scaled away by the function controller. Besides the visual sluggishness, I worry that this is actually costing resources, since functions take a second to spin up but much longer than that to spin down.
I'm not sure if this issue is specific to the invoker (base image), but I thought we could start the investigation here and file a separate issue for the function-sidecar or other riff modules if that's where the problem is coming from.
The gRPC invoker currently ignores the charset defined in the Content-Type header. We should attempt to use the declared charset when converting the Node Buffer to a string. We'll likely need to limit the acceptable charsets since the Buffer type only supports a few encodings.
See https://nodejs.org/api/buffer.html#buffer_buffers_and_character_encodings
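A decoding step honoring the declared charset could look like the sketch below; the charset-to-encoding table covers only names Buffer supports (see the link above), and unknown charsets are rejected rather than silently falling back to UTF-8.

```javascript
// Sketch: honor the declared charset while restricting it to encodings
// Buffer actually supports; reject unknown charsets explicitly.
const SUPPORTED = new Map([
  ['utf-8', 'utf8'],
  ['utf-16le', 'utf16le'],
  ['us-ascii', 'ascii'],
  ['iso-8859-1', 'latin1']
]);

function decode(buffer, charset = 'utf-8') {
  const encoding = SUPPORTED.get(charset.toLowerCase());
  if (!encoding) {
    throw new Error(`Unsupported charset: ${charset}`);
  }
  return buffer.toString(encoding);
}
```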
The gRPC protocol handler is rejecting requests with Content-Type: text/plain; charset=utf-8:
$ curl -H "Content-Type:text/plain; charset=utf-8" -d "should work" http://192.168.99.100:32300/requests/t1 -v
* Trying 192.168.99.100...
* TCP_NODELAY set
* Connected to 192.168.99.100 (192.168.99.100) port 32300 (#0)
> POST /requests/t1 HTTP/1.1
> Host: 192.168.99.100:32300
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type:text/plain; charset=utf-8
> Content-Length: 11
>
* upload completely sent off: 11 out of 11 bytes
< HTTP/1.1 500 Internal Server Error
< Date: Mon, 19 Feb 2018 18:59:29 GMT
< Content-Length: 0
< Content-Type: text/plain; charset=utf-8
[sidecar]: 2018/02/19 18:59:28 Sidecar for function 't1' (t1->replies) using grpc dispatcher starting
[sidecar]: 2018/02/19 18:59:28 Rebalanced: &{Type:rebalance start Claimed:map[] Released:map[] Current:map[]}
[sidecar]: 2018/02/19 18:59:28 Rebalanced: &{Type:rebalance OK Claimed:map[t1:[0]] Released:map[] Current:map[t1:[0]]}
[main]: Node started in 76ms
[main]: gRPC loaded in 94ms
[main]: HTTP loaded in 98ms
[main]: gRPC running on localhost:10382
[main]: HTTP running on http://localhost:8080
[main]: Function invoker started in 276ms
[sidecar]: 2018/02/19 18:59:29 <<< Message{, map[error:[error-client-content-type-unsupported] correlationId:[e2e941b9-68fa-4dd8-860e-ae399a55b974]]}
The same message sent with the http protocol is handled ok.
$ curl -H "Content-Type:text/plain; charset=utf-8" -d "should work" http://192.168.99.100:32300/requests/t1 -v
* Trying 192.168.99.100...
* TCP_NODELAY set
* Connected to 192.168.99.100 (192.168.99.100) port 32300 (#0)
> POST /requests/t1 HTTP/1.1
> Host: 192.168.99.100:32300
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type:text/plain; charset=utf-8
> Content-Length: 11
>
* upload completely sent off: 11 out of 11 bytes
< HTTP/1.1 200 OK
< Content-Type: text/plain; charset=utf-8
< Date: Mon, 19 Feb 2018 19:06:15 GMT
< Content-Length: 22
<
[sidecar]: 2018/02/19 19:06:15 Sidecar for function 't1' (t1->replies) using http dispatcher starting
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Rebalanced: &{Type:rebalance start Claimed:map[] Released:map[] Current:map[]}
[sidecar]: 2018/02/19 19:06:15 Rebalanced: &{Type:rebalance OK Claimed:map[t1:[0]] Released:map[] Current:map[t1:[0]]}
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Wrapper received Message{should work, map[Content-Type:[text/plain; charset=utf-8] Accept:[*/*] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
[sidecar]: 2018/02/19 19:06:15 Wrapper about to forward Message{should workshould work, map[Date:[Mon, 19 Feb 2018 19:06:15 GMT] Connection:[keep-alive] X-Powered-By:[Express] Content-Type:[text/plain; charset=utf-8] Content-Length:[22] Etag:[W/"16-weTPpPtGiPf2MRJe2B4n0af2bac"] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
[sidecar]: 2018/02/19 19:06:15 <<< Message{should workshould work, map[Date:[Mon, 19 Feb 2018 19:06:15 GMT] Connection:[keep-alive] X-Powered-By:[Express] Content-Type:[text/plain; charset=utf-8] Content-Length:[22] Etag:[W/"16-weTPpPtGiPf2MRJe2B4n0af2bac"] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
We can drop the invoker support in favor of buildpacks once a node and npm cloud native buildpack is available. Buildpacks provide a more robust and efficient build experience.