node-function-invoker's People

Contributors

aelmehdi, cz9779, dependabot-preview[bot], dependabot[bot], ekcasey, elbandito, ericbottard, fbiville, greenkeeper[bot], greenkeeperio-bot, jabrown85, jldec, kehrlann, markfisher, nael-fridhi, scothis, trisberg


node-function-invoker's Issues

Upgrade node invoker to node.js v10

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Support user extensibility of content marshalers

Currently the invoker only supports JSON, plain text, bytes, and form-encoded data. It may be desirable for a function author to use other standard content types, or custom content types for their business domain.

The function author should be able to attach new marshalers, shaped like the built-in ones, to the exported function (similar to $init). At runtime, the invoker should merge the built-in marshalers with the function-provided ones.
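A minimal sketch of what this could look like, assuming a hypothetical `$marshalers` property on the exported function (the property name and codec shape are assumptions, mirroring the `$init` convention):

```javascript
// Hypothetical: function author attaches custom marshalers to the export
const fn = (values) => values.length;

fn.$marshalers = {
    'text/csv': {
        unmarshal: (buffer) => buffer.toString('utf8').split(','),
        marshal: (values) => Buffer.from(values.join(','))
    }
};

// invoker side: merge built-ins with the function's own, letting the
// function add or override content-type support
const builtIn = {
    'application/json': {
        unmarshal: (buffer) => JSON.parse(buffer.toString('utf8')),
        marshal: (value) => Buffer.from(JSON.stringify(value))
    }
};
const marshalers = Object.assign({}, builtIn, fn.$marshalers);
```

With this merge order, function-provided marshalers could also override a built-in content type if desired.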

(Un)marshalling error should not crash whole process

It seems like this is what is happening (I'm not sure what the incorrect input was, as this ran "behind my back"). Maybe a carriage return sent from liiklus-client?

cnb@repeater-processor-f2qxx-795967487d-7c99k:/workspace$ NODE_DEBUG=riff node /layers/io.projectriff.node/riff-invoker-node/server.js
RIFF 102: Ready to process signals
RIFF 102: New invocation started
RIFF 102: Streaming pipeline initialized
RIFF 102: Input signal received
RIFF 102: Start signal received: application/json
RIFF 102: Wiring 2 input stream(s)
RIFF 102: Wiring 1 output stream(s)
RIFF 102: Ready to process data
RIFF 102: Input signal received
RIFF 102: Forwarding data for input #0
RIFF 102: Input signal received
RIFF 102: error from unmarshaller
RIFF 102: Error occurred - ending pipeline now
events.js:174
      throw er; // Unhandled 'error' event
      ^

Error
    at InputUnmarshaller._transform (/layers/io.projectriff.node/riff-invoker-node/lib/input-unmarshaller.js:43:22)
    at InputUnmarshaller.Transform._read (_stream_transform.js:190:10)
    at InputUnmarshaller.Transform._write (_stream_transform.js:178:12)
    at doWrite (_stream_writable.js:415:12)
    at writeOrBuffer (_stream_writable.js:399:5)
    at InputUnmarshaller.Writable.write (_stream_writable.js:299:11)
    at StreamingPipeline._write (/layers/io.projectriff.node/riff-invoker-node/lib/streaming-pipeline.js:146:31)
    at doWrite (_stream_writable.js:415:12)
    at writeOrBuffer (_stream_writable.js:399:5)
    at StreamingPipeline.Writable.write (_stream_writable.js:299:11)
Emitted 'error' event at:
    at Stream._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:998:18)
    at Stream.write (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1658:18)
    at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:686:15
    at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:3588:17
    at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1604:9
    at Stream.s._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1560:9)
    at Stream.write (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1658:18)
    at Stream._send (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:984:26)
    at push (/layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:1526:19)
    at /layers/org.cloudfoundry.npm/node_modules/node_modules/highland/lib/index.js:2212:13

Node function http invocation fails on second attempt with latest invoker

I built a function/core deployer like this:

riff function create square \
   --git-repo https://github.com/projectriff-samples/node-square \
   --artifact square.js \
   --tail
riff core deployer create square --function-ref square --tail

Also port-forward like this:

kubectl port-forward svc/square-deployer 8080:80

The first invocation succeeds but the second one fails:

Aruba:~ trisberg$ curl -v 127.0.0.1:8080 -H 'Content-Type: application/json' -H 'Accept: application/json' -d 11 ; echo
* Rebuilt URL to: 127.0.0.1:8080/
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to 127.0.0.1 (127.0.0.1) port 8080 (#0)
> POST / HTTP/1.1
> Host: 127.0.0.1:8080
> User-Agent: curl/7.54.0
> Content-Type: application/json
> Accept: application/json
> Content-Length: 2
> 
* upload completely sent off: 2 out of 2 bytes
< HTTP/1.1 200 OK
< Date: Thu, 03 Oct 2019 19:17:28 GMT
< Content-Length: 3
< Content-Type: text/plain; charset=utf-8
< 
* Connection #0 to host 127.0.0.1 left intact
121
Aruba:~ trisberg$ curl -v 127.0.0.1:8080 -H 'Content-Type: application/json' -H 'Accept: application/json' -d 12 ; echo
* Rebuilt URL to: 127.0.0.1:8080/
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to 127.0.0.1 (127.0.0.1) port 8080 (#0)
> POST / HTTP/1.1
> Host: 127.0.0.1:8080
> User-Agent: curl/7.54.0
> Content-Type: application/json
> Accept: application/json
> Content-Length: 2
> 
* upload completely sent off: 2 out of 2 bytes
< HTTP/1.1 500 Internal Server Error
< Date: Thu, 03 Oct 2019 19:17:33 GMT
< Content-Length: 3
< Content-Type: text/plain; charset=utf-8
< 
* Connection #0 to host 127.0.0.1 left intact
EOF

I don't see anything in the deployer logs for the second invocation.

Missing header makes invocation hang

Invoking a Node function without -H 'Content-Type: application/json' makes it hang.

curl http://knative-square.default.svc.cluster.local -d 2

hangs with no response.

In contrast, the Command Runtime returns normally.

curl http://knative-greet.default.svc.cluster.local -d 2

Write a real readme

From @jldec in #22 (comment):

Could we add a note in the function-proto readme explaining how we publish the function.proto to @projectriff/function-proto on npm, and consume it dynamically from the invoker.

I would also suggest adding a short note to the main readme here, documenting the 2 protocols and how they work (sync, async, streams, lifecycle hooks) with a few examples.

Implement gRPC for node

The node function invoker currently only supports request/reply semantics via the HTTP interface. With gRPC we can explore streaming bi-directional messages.

Error handling does not currently comply to the invoker specification

The invoker specification details error conditions here.

What is currently missing:

  • When an invoker receives the error stream completion signal on its input stream, it MUST propagate that error termination to all of the function input streams.
  • When a function emits an error completion signal on one of its several output streams, an invoker MUST immediately forward that signal back to the main output stream.
  • When a function throws when invoked (as opposed to when receiving data [see this point below]), the error should be propagated to the gRPC output stream (this is not yet part of the spec, but should be)

What is currently implemented and should be removed:

  • the streaming pipeline currently terminates if any of the input unmarshallers fails (see this and this)

What is currently unclear:

  • how to handle functions doing something like (with fictional map operator):
inputStream
    .pipe(map(() => { throw new Error('yolo'); }))
    .pipe(outputStream)

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on all branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.

Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet. We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.

If you have already set up a CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.

Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please delete the greenkeeper/initial branch in this repository, and then remove and re-add this repository to the Greenkeeper App’s white list on Github. You'll find this list on your repo or organization’s settings page, under Installed GitHub Apps.

Function fails to be invoked when the artifact is `package.json`

Following this doc, the current documented command to create a square function is:

riff function create square \
  --git-repo https://github.com/projectriff-samples/node-square \
  --artifact package.json \
  --image gcr.io/$GCP_PROJECT/square \
  --wait --verbose

However, when invoked with the documented command:

riff service invoke square --text -- -w '\n' -d 8

It results in a failure:

curl 35.204.90.231/ -H 'Host: square.default.example.com' -H 'Content-Type: text/plain' -w '\n' -d 8
TypeError: fn is not a function
    at DestroyableTransform._transform (/workspace/io.projectriff.riff/riff-invoker-node/lib/interaction-models/request-reply.js:36:34)
    at DestroyableTransform.Transform._read (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_transform.js:184:10)
    at DestroyableTransform.Transform._write (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_transform.js:172:83)
    at doWrite (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:428:64)
    at writeOrBuffer (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:417:5)
    at DestroyableTransform.Writable.write (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_writable.js:334:11)
    at Duplexify.ondata (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:619:20)
    at Duplexify.emit (events.js:182:13)
    at Duplexify.Readable.read (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:469:26)
    at flow (/workspace/io.projectriff.riff/riff-invoker-node/node_modules/readable-stream/lib/_stream_readable.js:813:34)

When creating the function without any --artifact, it runs well:

riff function create square \
  --git-repo https://github.com/projectriff-samples/node-square \
  --image gcr.io/$GCP_PROJECT/square-fbiville \
  --wait --verbose
[...]
riff function create completed successfully
riff service invoke square --text -- -w '\n' -d 8
curl 35.204.90.231/ -H 'Host: square.default.example.com' -H 'Content-Type: text/plain' -w '\n' -d 8
64

Always log errors

The invoker currently uses debug logging as its only form of logging. While debug logging is great for diagnostics that most users don't care about, users always care about errors: they need to understand why the process faulted, especially before it terminates with an abnormal exit code.

Investigate possible memory leak

After some time, the function logs something like:

default/upper-processor-b478766d5-xpdg5[function]: (node:20) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 unpipe listeners added. Use emitter.setMaxListeners() to increase limit

Flatten src/main/node

The src/main/* pattern is common in the Java space but is not common in the Node world. The most common reason to use this structure is to separate test and main code; however, the Jasmine tests currently live in src/main/node/spec. If we strictly applied this pattern, the requires in the test code would look like require('../../main/node/lib/app') instead of the current require('../lib/app').

Unless there's a good reason to keep the structure, we should flatten that directory into the root of the repository.

$init and $destroy hooks are called per invocation

The $init and $destroy hooks should only be called once per process. Currently they are called once per invocation, which is pointless: per-invocation work can simply be done in the function itself.

The original intent of hooks was to perform setup and teardown logic that should not block the function invocation. For example, setting up a database connection pool is work that should happen once for the whole process and not per invocation.
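A sketch of the intended per-process lifecycle, as a simplified invoker loop (the loop itself is illustrative; only the `$init`/`$destroy` hook names come from the invoker):

```javascript
const fn = (x) => x * x;
let pool = null;                       // e.g. a database connection pool

fn.$init = async () => {
    pool = { connected: true };        // hypothetical one-time setup
};
fn.$destroy = async () => {
    pool = null;                       // hypothetical one-time teardown
};

// simplified invoker loop: hooks run once per process, not per invocation
async function run(inputs) {
    if (fn.$init) await fn.$init();            // once, before first invocation
    const results = inputs.map((i) => fn(i));  // many invocations
    if (fn.$destroy) await fn.$destroy();      // once, at shutdown
    return results;
}
```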

Update to Node 10.x

Once we have a buildpack based build, functions will be able to express which version of Node they want to run with. Until then, we should track the latest stable.

The node version used in CI should be aligned to the one used by the Node buildpack

#85 passed although it initially contained code that assumed TextDecoder and TextEncoder were globally available.
This is only true for Node versions >= 11.

However, the Node buildpack currently pulls Node 10.15.3, where these classes need an explicit import; without it, the buildpack build fails.

Making sure the Node version used in CI matches the one used by the buildpack would avoid this kind of late error.

Support for application/cloudevents+json

Handily, this invoker already supports CloudEvents' binary content mode. Since binary mode leaves the Content-Type header unchanged, this invoker will proceed to deserialize based on that header, meaning application/json-encoded CloudEvents will be turned into JS object payloads.

However, for folks using CloudEvents' structured content mode, this invoker will reject the request. That's because it uses a Content-Type header of application/cloudevents+json. Given that Knative eventing is choosing to adopt CloudEvents, there may be good reason to support this mode in Riff.

As a super simple implementation, application/cloudevents+json could be treated exactly the same way as application/json (meaning use JSON.parse and JSON.stringify), which would allow the user to read the CloudEvent envelope and the data contained within.
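The "same treatment" idea can be sketched as a simple codec alias (the codec shape is an assumption, not the invoker's actual marshaller API):

```javascript
// alias application/cloudevents+json to the JSON codec so a structured-mode
// CloudEvent is parsed into a plain object
const jsonCodec = {
    unmarshal: (buffer) => JSON.parse(buffer.toString('utf8')),
    marshal: (value) => Buffer.from(JSON.stringify(value))
};
const codecs = {
    'application/json': jsonCodec,
    'application/cloudevents+json': jsonCodec   // treated exactly like JSON
};

const event = codecs['application/cloudevents+json'].unmarshal(
    Buffer.from('{"specversion":"1.0","type":"demo","data":8}')
);
// the function can now read both the envelope fields and event.data
```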

Drop gRPC support (for now)

Knative does not yet support gRPC. Including it in the invoker only slows down npm install and startup times. Once Knative supports gRPC, the protobuf format will likely be different, so we'll need to update much of the gRPC support anyway.

While http will become the only supported protocol, we do intend to add other protocols in the future, so maintaining the protocol decoupling in the invoker is important.

Refs knative/serving#813

Support function dependencies via package.json

Most JS functions will require dependencies installed via npm. These dependencies are commonly expressed in a package.json file. In addition to the dependencies, the entry point for the package is also defined. Instead of riff create -f pointing at a script, it should point at a package.json file, install the declared dependencies and invoke the function defined as the entry point.

Dependencies for the invoker must not collide with dependencies for the function.

Review asynchronous tests

On top of #130, it seems that some tests receive the error event after the tests complete (with versions of Node < 13).

Remove $argumentTransformers support

$argumentTransformers are defined as an array of functions that transform messages based on the index of the stream. This usage is awkward, as we're moving away from index-based references to streams.

We should remove the argument transformers for the node-stream and always provide the full message to the function. The function author may decompose messages from the stream as they desire.

request-reply functions should continue to operate on the message payload by default, and may opt-in to the whole message.

Expose message headers to functions

A message is a payload with headers. Currently only the payload is exposed to the function. A function should be able to indicate that it would like to receive a message containing headers and a payload.

Likewise, a function should be able to produce a message with custom headers.
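One possible shape for the opt-in (the `$argumentType` marker and the `invoke` dispatch are assumptions, not the invoker's actual API):

```javascript
// function opts in to receiving the full message and producing custom headers
const fn = (message) => ({
    headers: { 'X-Processed': 'true' },        // custom output headers
    payload: message.payload.toUpperCase()
});
fn.$argumentType = 'message';                  // hypothetical opt-in flag

// invoker-side dispatch: payload-only by default, full message on opt-in
function invoke(fn, message) {
    const input = fn.$argumentType === 'message' ? message : message.payload;
    return fn(input);
}

const out = invoke(fn, { headers: { 'Content-Type': 'text/plain' }, payload: 'hi' });
```

Functions that don't opt in keep the current payload-only behavior unchanged.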

build.sh assumes that jq is installed

If jq is not installed, running ./build.sh produces the following error.

Successfully tagged projectriff/node-function-invoker:latest
Error parsing reference: "projectriff/node-function-invoker:" is not a valid repository/tag: invalid reference format

Async functions

Just about any non-trivial function in Node will involve an asynchronous API call. Currently the node-function-invoker only supports synchronous function invocations, meaning no asynchronous methods may be used. There are two different models for handling asynchronicity which are not easily interoperable: callbacks and async/promises.

Callbacks are a classic Node pattern in very widespread use. Async functions are new in ES2017 and well supported in Node 8. An async function is syntactic sugar for a function that returns a Promise.

A callback method would look like:

module.exports = (name, callback) => callback(null, `Hello ${name}!`);

A huge advantage of async functions is that the same invocation signature can also support sync methods transparently, whereas supporting callback functions and sync functions simultaneously is awkward and fragile.

// sync
module.exports = name => `Hello ${name}!`;

// promise
module.exports = name => Promise.resolve(`Hello ${name}!`);

// async
module.exports = async name => `Hello ${name}!`;

The current invoker code is:

var resultx = fn(req.body);
// send response

Callbacks would look like:

fn(req.body, (err, resultx) => {
  // send response
});

Sync/async/promised functions would look like:

var resultx = await fn(req.body);
// send response

I'd lean towards not supporting callbacks by default, and only adding support as an opt-in if there is strong demand.
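A quick check that a single `await fn(...)` call site really does handle all three styles, since awaiting a non-Promise value simply passes it through:

```javascript
const syncFn = name => `Hello ${name}!`;
const promiseFn = name => Promise.resolve(`Hello ${name}!`);
const asyncFn = async name => `Hello ${name}!`;

// one invocation path, three function styles
async function invokeAll(input) {
    const results = [];
    for (const fn of [syncFn, promiseFn, asyncFn]) {
        results.push(await fn(input));   // await unwraps Promises, passes plain values through
    }
    return results;
}
```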

Requesting functions hangs forever if the (request-reply) function fails (UnhandledPromiseRejectionWarning)

If users define a function such as:

module.exports = (x) => {
        if (x === 0) {
                throw new Error('Division by zero');
        }
        return 100 / x;
}

deploy it and request it as follows:

curl http://XXX -H'Content-Type:application/json' -d'0' -H'Accept:text/plain' -v

The request will hang forever:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8080 (#0)
> POST / HTTP/1.1
> Host: XXX
> User-Agent: curl/7.64.1
> Content-Type:application/json
> Accept:text/plain
> Content-Length: 1
>
* upload completely sent off: 1 out of 1 bytes

The invoker logs are as follows:

RIFF 13426: Ready to process signals
RIFF 13426: New invocation started
RIFF 13426: Promoting request-reply function to streaming function
RIFF 13426: Streaming pipeline initialized
RIFF 13426: Input signal received
RIFF 13426: Start signal with: output content types: text/plain, input names: [in], output names: [out]
RIFF 13426: Wiring 1 input stream(s)
RIFF 13426: Wiring 1 output stream(s)
RIFF 13426: Ready to process data
RIFF 13426: Input signal received
RIFF 13426: Forwarding data for input #0
(node:13426) UnhandledPromiseRejectionWarning: Error
    at MappingTransform._handleError (/Users/fbiville/workspace/node-function-invoker/lib/mapping-transform.js:33:28)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
(node:13426) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:13426) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
RIFF 13426: Ending input streams
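A sketch of guarding the invocation so a synchronous throw or a rejected Promise becomes an error response instead of an unhandled rejection (`safeInvoke` is illustrative, not the invoker's actual code):

```javascript
// wrap the user function so both sync throws and async rejections are caught
async function safeInvoke(fn, input) {
    try {
        return { ok: true, value: await fn(input) };
    } catch (err) {
        // surfaced back to the caller (e.g. as a 500) rather than hanging
        return { ok: false, error: err.message };
    }
}
```

With this guard, the division function above would produce an error reply for input 0 rather than leaving the request hanging.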

Gracefully handle requests without a Content-Type

The 0.0.x riff-cli defaulted the Content-Type to text/plain. The new 0.1 riff-cli no longer sets a Content-Type by default.

The invoker will error out for any request that does not contain a Content-Type header. Instead we can default the type to text/plain.

application/octet-stream would also be a valid default, but does not preserve the experience of the old cli.
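The proposed defaulting is a one-liner (`normalizeContentType` is illustrative, not the invoker's actual code):

```javascript
// fall back to text/plain when the request carries no Content-Type header
function normalizeContentType(headers) {
    return headers['content-type'] || 'text/plain';
}
```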

Reduce function pod termination delay

Running the SpringOne demo, I noticed that function container Pods stick around in k8s for about a minute after they are scaled away by the function controller. Besides the visual sluggishness, I worry that this is actually costing resources, since functions take a second to spin up but much longer than that to spin down.

I'm not sure if this issue is specific to the invoker (base image), but I thought we could start the investigation here and file a separate issue for the function-sidecar or other riff modules if that's where the problem is coming from.

grpc only: error-client-content-type-unsupported on "Content-Type:text/plain; charset=utf-8"

The grpc protocol handler is rejecting requests with Content-Type:text/plain; charset=utf-8

$ curl -H "Content-Type:text/plain; charset=utf-8" -d "should work" http://192.168.99.100:32300/requests/t1 -v

*   Trying 192.168.99.100...
* TCP_NODELAY set
* Connected to 192.168.99.100 (192.168.99.100) port 32300 (#0)
> POST /requests/t1 HTTP/1.1
> Host: 192.168.99.100:32300
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type:text/plain; charset=utf-8
> Content-Length: 11
> 
* upload completely sent off: 11 out of 11 bytes
< HTTP/1.1 500 Internal Server Error
< Date: Mon, 19 Feb 2018 18:59:29 GMT
< Content-Length: 0
< Content-Type: text/plain; charset=utf-8
[sidecar]: 2018/02/19 18:59:28 Sidecar for function 't1' (t1->replies) using grpc dispatcher starting
[sidecar]: 2018/02/19 18:59:28 Rebalanced: &{Type:rebalance start Claimed:map[] Released:map[] Current:map[]}
[sidecar]: 2018/02/19 18:59:28 Rebalanced: &{Type:rebalance OK Claimed:map[t1:[0]] Released:map[] Current:map[t1:[0]]}
[main]: Node started in 76ms
[main]: gRPC loaded in 94ms
[main]: HTTP loaded in 98ms
[main]: gRPC running on localhost:10382
[main]: HTTP running on http://localhost:8080
[main]: Function invoker started in 276ms
[sidecar]: 2018/02/19 18:59:29 <<< Message{, map[error:[error-client-content-type-unsupported] correlationId:[e2e941b9-68fa-4dd8-860e-ae399a55b974]]}

The same message sent with the http protocol is handled ok.

$ curl -H "Content-Type:text/plain; charset=utf-8" -d "should work" http://192.168.99.100:32300/requests/t1 -v
*   Trying 192.168.99.100...
* TCP_NODELAY set
* Connected to 192.168.99.100 (192.168.99.100) port 32300 (#0)
> POST /requests/t1 HTTP/1.1
> Host: 192.168.99.100:32300
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type:text/plain; charset=utf-8
> Content-Length: 11
> 
* upload completely sent off: 11 out of 11 bytes
< HTTP/1.1 200 OK
< Content-Type: text/plain; charset=utf-8
< Date: Mon, 19 Feb 2018 19:06:15 GMT
< Content-Length: 22
< 
[sidecar]: 2018/02/19 19:06:15 Sidecar for function 't1' (t1->replies) using http dispatcher starting
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Rebalanced: &{Type:rebalance start Claimed:map[] Released:map[] Current:map[]}
[sidecar]: 2018/02/19 19:06:15 Rebalanced: &{Type:rebalance OK Claimed:map[t1:[0]] Released:map[] Current:map[t1:[0]]}
[sidecar]: 2018/02/19 19:06:15 Waiting for function to accept connection on localhost:8080
[sidecar]: 2018/02/19 19:06:15 Wrapper received Message{should work, map[Content-Type:[text/plain; charset=utf-8] Accept:[*/*] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
[sidecar]: 2018/02/19 19:06:15 Wrapper about to forward Message{should workshould work, map[Date:[Mon, 19 Feb 2018 19:06:15 GMT] Connection:[keep-alive] X-Powered-By:[Express] Content-Type:[text/plain; charset=utf-8] Content-Length:[22] Etag:[W/"16-weTPpPtGiPf2MRJe2B4n0af2bac"] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
[sidecar]: 2018/02/19 19:06:15 <<< Message{should workshould work, map[Date:[Mon, 19 Feb 2018 19:06:15 GMT] Connection:[keep-alive] X-Powered-By:[Express] Content-Type:[text/plain; charset=utf-8] Content-Length:[22] Etag:[W/"16-weTPpPtGiPf2MRJe2B4n0af2bac"] correlationId:[0f4f98eb-55f1-47ca-92e3-5dd457e8fe90]]}
