caolan / async
Async utilities for node and the browser
Home Page: http://caolan.github.io/async/
License: MIT License
I'm having issues where flow-control results that equal (but are not identical to) false end up as null in the results array/object. The following code is a minimal demonstration:
var util = require('util'),
    async = require('./async/lib/async');

function taskFalse(callback) {
    process.nextTick(function() {
        callback(null, false);
    });
}

function taskUndefined(callback) {
    process.nextTick(function() {
        callback(null, undefined);
    });
}

function taskEmpty(callback) {
    process.nextTick(function() {
        callback(null);
    });
}

function taskNull(callback) {
    process.nextTick(function() {
        callback(null, null);
    });
}

async.series([taskFalse, taskUndefined, taskEmpty, taskNull], function(err, results) {
    util.puts(util.inspect(results));
});
Which outputs
[ null, null, null, null ]
whereas in my mind it should be
[ false, undefined, undefined, null ]
I've tested in v0.1.0, git HEAD (a10cdae) and npm latest ([email protected]) so I'm guessing it affects all versions.
I was told I need to refactor my code that uses async.waterfall. I used it primarily because I had several chained functions that were nesting too deeply, and I wanted to reuse a function. At one point down the chain there is a condition that is not strictly an error, but whose result I don't want to pass down the rest of the waterfall because it's useless: say I ran a query and there are no results. That's not an error, but I don't want to propagate empty results all the way down. If I were doing this without async, I would have a "final" callback that could be called to break out of the chain without explicitly throwing an error. This functionality doesn't seem to be present in waterfall. Is there a way to do this cleanly? The only way I can see is to throw a special error and then handle it in the final callback so as to return a non-error.
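The sentinel-error pattern described at the end is the usual workaround. A minimal self-contained sketch: waterfall here is a tiny stand-in for async.waterfall, and STOP is an arbitrary sentinel object, both assumptions for illustration.

```javascript
var STOP = {};  // sentinel meaning "stop the chain, but it's not an error"

// tiny stand-in for async.waterfall, so the sketch runs on its own
function waterfall(tasks, done) {
  var i = 0;
  function next(err) {
    if (err) return done.apply(null, arguments);
    var args = Array.prototype.slice.call(arguments, 1);
    if (i === tasks.length) return done.apply(null, [null].concat(args));
    args.push(next);
    tasks[i++].apply(null, args);
  }
  next(null);
}

waterfall([
  function (cb) { cb(null, []); },            // e.g. a query with no results
  function (rows, cb) {
    if (rows.length === 0) return cb(STOP);   // break out of the chain
    cb(null, rows);
  },
  function (rows, cb) { cb(null, rows.map(String)); }  // never reached
], function (err, result) {
  if (err === STOP) err = null;               // translate sentinel back to "no error"
  console.log(err, result);                   // null undefined
});
```

The final callback is the only place that knows about the sentinel, so the intermediate tasks stay clean.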
It would be nice to have a generalized way of controlling groups of async tasks, possibly using a return value from async api functions:
var parallelController = async.parallel([
function(){ ... },
function(){ ... }
], callback);
var seriesController = async.series([
function(){ ... },
function(){ ... }
]);
var someEmitter = new SomeEmitter();
someEmitter.on('stopEverything', function () {
    parallelController.abort(); // if finished, does nothing
    seriesController.abort();
});
someEmitter.on('oneMoreThing', function () {
    seriesController.append([function(){...}, function(){...}]); // if finished, start new tasks
});
Might also be nice for such a value to be queueable like a function, like this:
seriesController = async.series([
function(){ ... },
function(){ ... }
]);
seriesController.pause();
someEmitter.on('okGo', function() {
    async.parallel([
        function(){ ... },
        seriesController
    ], callback);
    seriesController.resume();
});
I realize this is all pretty vague, and appending etc. wouldn't work for some things like waterfall, but I've found these kinds of things useful in similar async frameworks in other environments.
I tried the parallel example:
async.parallel([
function(callback){
setTimeout(function(){
callback(null, 'one');
}, 200);
},
function(callback){
setTimeout(function(){
callback(null, 'two');
}, 100);
},
],
// optional callback
function(err, results){
// in this case, the results array will equal ['two','one']
// because the functions were run in parallel and the second
// function had a shorter timeout before calling the callback.
});
But it returns ['one', 'two'] instead of ['two', 'one']
Any ideas why?
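The README comment quoted above is misleading: async.parallel stores each result at the slot of the task that produced it, so the results array always follows task order, not completion order. ['one', 'two'] is the correct behavior. A minimal stand-in (not async's actual code) shows why:

```javascript
// Each task writes its result into the array at its own index,
// regardless of when it finishes.
function parallel(tasks, done) {
  var results = new Array(tasks.length);
  var pending = tasks.length;
  tasks.forEach(function (task, i) {
    task(function (err, result) {
      if (err) return done(err);
      results[i] = result;              // slot chosen by task index, not finish time
      if (--pending === 0) done(null, results);
    });
  });
}

parallel([
  function (cb) { setTimeout(function () { cb(null, 'one'); }, 200); },
  function (cb) { setTimeout(function () { cb(null, 'two'); }, 100); }
], function (err, results) {
  console.log(results);  // ['one', 'two'] even though 'two' finished first
});
```

Only the timing is parallel; the ordering of results is by design deterministic.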
How can I do this? I can't find it in the source.
I have this error when I use mapSeries with a large array:
RangeError: Maximum call stack size exceeded
I was editing some code I had written synchronously with underscore.js (aren't we all :), and wanted to swap the sync stuff out for some async goodness. But I found a conceptual mismatch: Underscore's _.each() has a function(value, key, list) iterator format; async's only has (value, callback): it's 'missing' the key + list.
Having the hash, key and value inside the iterator is very useful if you need a bit more than just a basic map()-alike. Right now the iterator has no way of knowing where in the input collection the current value fits or what its key was. That's fine when you just transform the same collection (map() etc.), but when you need the key for something besides the looping, it gets awkward.
Now I do a _.keys() and pull that through some other async stuff. It's nearly the same and works, but it's not in the spirit of things, no? :)
(edit: if I gather enough courage I might attempt a fork/edit/pull but no promises :)
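The _.keys() workaround can be spelled out like this; eachSeries is a tiny stand-in driver, not async's API, used so the sketch is self-contained:

```javascript
// minimal serial driver: calls the iterator on each item in order
function eachSeries(arr, iterator, done) {
  var i = 0;
  (function next(err) {
    if (err || i === arr.length) return done(err || null);
    iterator(arr[i++], next);
  })();
}

var hash = { a: 1, b: 2 };
var seen = [];

// iterate the keys, look the values up: the iterator now sees key AND value
eachSeries(Object.keys(hash), function (key, cb) {
  seen.push(key + '=' + hash[key]);
  cb(null);
}, function (err) {
  console.log(seen);  // ['a=1', 'b=2']
});
```

It is indeed a detour, which is why a (value, key, callback) iterator signature would feel more natural.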
I did some performance testing and noticed that waterfall has quite a bit more overhead than series (about 5x). I would guess it's because it uses nextTick between each step in the waterfall.
Is nextTick there to prevent the call stack from growing out of control? I think you could call the next step directly but keep track of how many calls have been made in the waterfall, and once it reaches a certain point (around 50?), dispatch the next call using nextTick. That should reduce the overhead associated with nextTick while still preventing the call stack from growing out of control.
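The batching idea above can be sketched like this. mapSeries here is a stand-in, not async's internals, and the batch size of 1000 is an arbitrary assumption; setImmediate is used in place of nextTick since it is the friendlier deferral primitive in modern Node:

```javascript
function mapSeries(arr, iterator, done) {
  var results = [];
  var calls = 0;
  function step(i) {
    if (i === arr.length) return done(null, results);
    iterator(arr[i], function (err, result) {
      if (err) return done(err);
      results.push(result);
      if (++calls % 1000 === 0) {
        // every 1000th step, bounce through the event loop to unwind the stack
        setImmediate(function () { step(i + 1); });
      } else {
        step(i + 1);                     // direct call: no deferral overhead
      }
    });
  }
  step(0);
}

var big = new Array(200000).fill(1);
mapSeries(big, function (x, cb) { cb(null, x * 2); }, function (err, out) {
  console.log(out.length);  // 200000, with no RangeError
});
```

This also addresses the mapSeries RangeError above: with a synchronous iterator and no deferral at all, every item adds stack frames until the stack overflows.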
Hi caolan,
Just curious: What is line 121 doing? It seems redundant:
https://github.com/caolan/async/blob/master/lib/async.js#L121
whilst & until would be much more useful if the test callback could access the result of the last fn call (it would receive it as an argument).
Peter
For testability it is sometimes required to be able to unmemoize a memoized function. My pull request (#57) does just that.
Maybe this will be a good addition to the async module:
http://blog.optimalbits.com/post/13423133948/asynchronous-debounce
Hi!
Is it possible to teach .waterfall() to take functions which simply return the result on success (instead of cb(null, result)) and throw an error (instead of cb(err)) on an error condition? That way it would be possible to extract the logic of actions into vanilla functions which could be both sync/async, thus making .waterfall() neutral to asynchronicity. Also, that could simplify exception handling, since either a programmatic throw or a real exception would be equivalent.
What do you think?
TIA,
--Vladimir
I was writing a small library for working with directories when I ran into this problem:
I tried to remove a bunch of files, no matter if they still exist or not. Like this:
async.map(['file1', 'file2', ..], fs.unlink, callback);
But .map aborts, as described in your docs, after the first error. Sure, you can fix this by wrapping the callback and just ignoring the error, but I don't like additional wrapper functions if they're not really necessary.
How about this:
async.map(['file1', 'file2', ..], fs.unlink, callback, true);
The last parameter is optional and stands for resumeOnErrors or something like that. I appreciate the clean API of async, but I think this would be a nice feature.
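For reference, the wrapper mentioned above is small. ignoreErrors is a hypothetical helper name, not part of async's API:

```javascript
// Wrap an (item, callback) worker so per-item errors are swallowed:
// the surrounding map/forEach keeps visiting every item.
function ignoreErrors(fn) {
  return function (item, callback) {
    fn(item, function (err, result) {
      callback(null, err ? null : result);  // drop the error, keep going
    });
  };
}
```

With it, the unlink example becomes async.map(files, ignoreErrors(fs.unlink), callback), so a resumeOnErrors flag is a convenience rather than a necessity.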
Function reads:
async.waterfall = function (tasks, callback) {
if (!tasks.length) {
return callback();
}
callback = callback || function () {};
....
should be:
async.waterfall = function (tasks, callback) {
callback = callback || function () {};
if (!tasks.length) {
return callback();
}
....
Just in case the tasks are empty and the callback is null
Better yet, you could check to make sure tasks is defined as well:
async.waterfall = function (tasks, callback) {
callback = callback || function () {};
if (!tasks || !tasks.length) {
return callback();
}
....
I am using async.parallel and I am running two db queries in each of the tasks.
I find that the main callback gets called immediately when the first task finishes.
I have pasted a somewhat stripped-down version of the code I am using.
exports.validate = function (req, res, next) {
    async.parallel({
        domain: function (cb) {
            logr.info('domain');
            // this method makes a query to the db
            checkdomain(email, function (err, det) {
                logr.info('domain callback');
                if (err) { return cb(err); }
                cb(null, det);
            });
        },
        checkmail: function (cb) {
            logr.info('checkmail');
            // another call to the db
            getuser(email, function (err, det) {
                logr.info('checkmail callback');
                if (err) { return cb(err); }
                cb(null, det);
            });
        }
    }, function (err, results) {
        if (err) { return next(err); }
        logr.info('parallel callback', err, results);
        res.send({ mail: results.checkmail, domain: results.domain });
    });
};
logs:
[2011-05-03 16:27:06.390] [INFO] app - domain
[2011-05-03 16:27:06.449] [INFO] app - checkmail
[2011-05-03 16:27:06.459] [INFO] app - domain callback
[2011-05-03 16:27:06.463] [INFO] app - parallel callback, undefined, { domain: '/register/' }
[2011-05-03 16:27:06.483] [INFO] app - checkmail callback
Am I missing anything? Isn't the main callback supposed to be called after all the tasks are finished?
Thanks.
Use case: I have some parallel tasks that writes files to a temporary directory, and I want a cleanup function to delete the temporary directory after all tasks complete. If one task fails, I'd like to delete the directory immediately. async.parallel
fits the bill perfectly, except that I need to wait to delete the directory until all currently running parallel tasks complete, lest the other running tasks try to write files to a nonexistent directory.
I imagine this is a pretty common scenario, and it's rather tricky to do without resorting to running the tasks serially. (Perhaps there's an easy way to do it that I haven't thought of -- would love to hear that.)
It would be useful if, for any parallel tasks, async
waited to call the callback function in the case of errors until all concurrently running tasks complete. That is, once a task calls its callback with an error, async
should stop starting new tasks and run the final callback once the number of running tasks reaches zero.
We are having an issue while running an async.series within another async.series.
E.g. we have 13 documents, and for each document we want to run an existence check or some other task (the second, inner async call).
So we run 10 documents through async first, and on completion of those ten we run the other three,
and for each document we call another async function for the check.
While doing this it stopped after the 11th document.
Can you suggest what we need to take care of when running async within async?
You have provided superior documentation for this library.
Modifying the README in the following ways could save some people a bit of time:
Please insert into queue method list:
Please modify 'drain()' and 'empty()' as follows:
This clarifies that the empty and drain methods function more like events - they will be called whenever the conditions are met, not just once.
Hi,
I have the following code. It works as expected if I use setTimeout. If I simulate the delay instead of using setTimeout, parallelization does not work.
var async = require('async');
function getTasks(useTimeout){
return [
    function(callback){
        task("task 1", 500, callback, useTimeout)
    },
    function(callback){
        task("task 2", 400, callback, useTimeout)
    }
];
}
function task(name, delay, callback, useTimeout){
if(useTimeout){
console.log(name + "(" + delay +")");
setTimeout(function(){
callback(null, name);
},delay);
}else{
console.log(name + "(" + pause(delay) +")");
callback(null, name);
}
}
function callback(err, results){
console.log("async.parallel : "+ (new Date()-start) + "\n");
}
function pause(delay){
var start = new Date();
var len = 300000*delay;
for(var i=0;i<len;i++){};
return (new Date()-start);
}
var start = new Date();
//async.parallel(getTasks(false),callback);
async.parallel(getTasks(true),callback);
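The likely explanation: JavaScript user code runs on a single thread, so the busy-wait loop in pause() blocks the event loop, and the second task cannot even start until the first returns. async.parallel interleaves waiting (timers, I/O); it does not run computation simultaneously. A small demonstration:

```javascript
var order = [];

// a timer due in 10ms...
setTimeout(function () { order.push('timer'); }, 10);

// ...cannot fire while this synchronous loop holds the thread for ~50ms
var start = Date.now();
while (Date.now() - start < 50) {}   // busy-wait: blocks the event loop
order.push('busy-wait done');

setTimeout(function () {
  console.log(order);  // ['busy-wait done', 'timer']: the timer had to wait
}, 0);
```

So with setTimeout the 500ms and 400ms waits overlap, but two CPU-bound pause() calls can only run back to back.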
I often find myself turning my data objects ({a: 1, b: 2, ...}) into arrays so that I can .map or .forEach them. It would be great if these functions could accept objects directly.
async.nextTick checks for the existence of process.nextTick on every call, which is somewhat inefficient considering how often it's called. Why not define it once, since the execution environment will never change between node.js and a browser?
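A sketch of the suggestion: resolve the environment check once at load time and reuse the chosen implementation. deferFn is a hypothetical name, and the setTimeout fallback mirrors what a browser shim typically does:

```javascript
// pick the deferral primitive ONCE, at module load time
var deferFn = (typeof process !== 'undefined' && process.nextTick)
  ? process.nextTick
  : function (fn) { setTimeout(fn, 0); };  // browser fallback

// every subsequent call skips the environment check entirely
function nextTick(fn) { deferFn(fn); }
```

Since the environment cannot change between Node and a browser mid-run, the per-call check buys nothing.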
I notice in package.json that you are at version 0.1.25 but newest version tagged and therefore available for well-structured download is version 0.1.6.
Please tag releases.
My interest specifically is that I intend to package async for Debian officially, and want to ease tracking upstream progress and fetching newer releases.
forEach(arr, iterator, callback)
My script did an HTTP GET of 2 items.
If the 2nd returned before the 1st, the callback was called anyway,
even though the data of the 1st had not been returned yet.
If I used filter(arr, iterator, callback) instead of forEach, it worked OK.
When I do "make all" I get an error since uglify isn't installed.
By passing the actual result to each function in async series you would be able to use a fantastic code flow as followed:
// this would be any library function
function someFunctionReturningDatabaseObject(callback) {
callback(null, dbResource);
}
async.series({
    db: someFunctionReturningDatabaseObject,
    query1: function(callback, results) {
        results.db.query('SELECT ....', callback);
    },
    query2: function(callback, results) {
        results.db.query('UPDATE .... WHERE .... = ?', results.query1.columnXYZ, callback);
    }
});
With the current async code this is not possible; you have to use variables in the scope where async.series is called. The proposed backwards-compatible change would allow much cleaner control flow.
Usually when I want an asynchronous queue, I'm adding tasks to it from a loop. For instance, I'm looping through the files in a directory; I want to read all of them simultaneously, then proceed to the next step of my application from the drain
callback.
The problem is that if the loop is zero-length, q.drain
never gets invoked. That is, the queue currently starts running only after the first task has been added.
I propose that in addition to the concurrency
option, a minTasks
option be added. minTasks = 1
would be the current behavior; minTasks = 0
would provide the behavior I want. Either would be a sensible default. One can imagine larger numbers being useful in rare cases.
In my work I came up with the following:
# Group
# Easily provide a completion event for a group of async functions
#
# Usage:
#
# # Create tasks list with a completion callback
# tasks = new util.Group (err) -> next err
#
# # Specify we have a new task to wait for
# ++tasks.total
#
# # Add our new task
# tasks.push someAsyncFunction arg1, arg2, (err) ->
# tasks.complete err
#
# # Or add our new task this way
# tasks.push someAsyncFunction arg1, arg2, tasks.completer()
#
# # Or add our new task this way
# tasks.push (complete) ->
#     someAsyncFunction arg1, arg2, complete
#
Group: class
# How many tasks do we have
total: 0
# How many tasks have completed?
completed: 0
# Have we already exited?
exited: false
# What to do next?
next: ->
throw new Error 'Groups require a completion callback'
# Construct our group
constructor: (@next) ->
# A task has completed
complete: (err=false) ->
if @exited is false
if err
return @exit err
else
++@completed
if @completed is @total
return @exit false
# Alias for complete
completer: ->
return (err) => @complete err
# The group has finished
exit: (err=false) ->
if @exited is false
@exited = true
@next err
else
@next new Error 'Group has already exited'
# Push a new task to the group
push: (task) ->
task (err) =>
@complete err
Here is an example use case:
# Write tree
# next(err)
writetree: (dstPath,tree,next) ->
# Group
tasks = new @Group (err) ->
next err
# Ensure Destination
util.ensurePath dstPath, (err) ->
# Checks
if err
return tasks.exit err
# Cycle
for own fileRelativePath, value of tree
++tasks.total
fileFullPath = dstPath+'/'+fileRelativePath.replace(/^\/+/,'')
#console.log 'bal-util.writetree: handling:', fileFullPath, typeof value
if typeof value is 'object'
util.writetree fileFullPath, value, tasks.completer()
else
fs.writeFile fileFullPath, value, (err) ->
if err
console.log 'bal-util.writetree: writeFile failed on:',fileFullPath
return tasks.complete err
# Empty?
if tasks.total is 0
tasks.exit false
# Return
return
# Return
return
Right now it's implemented in https://github.com/balupton/bal-util.npm, but if it could make it into async.js I think that would be a more suitable home for it.
Hi,
I really like your module because it makes my life much easier :)
But consider the following example:

var object = {"a": "fooA", "b": "fooB"};
async.forEach(object, function(item, callback){
    // why is there no way to access the key of the item?!
}, callback);
Posted this on stackexchange but maybe this is a bug?
var async = require('async');
var redis = require('redis');
var keys = ['key1', 'key2', 'key3'];
var client = redis.createClient();
var multi = client.multi();
for (var key in keys) {
multi.hmset(key, {'some': 'value'});
}
multi.exec(function(err, res) {
if (err) throw err;
console.dir(res);
var myCallback = function(err, res) {
console.log('in myCallback');
console.dir(res);
client.quit();
process.exit();
};
async.concat(keys, client.hgetall, myCallback);
});
output:
$ node redis_test.js
[ 'OK', 'OK', 'OK' ]
node.js:134
throw e; // process.nextTick error, or 'error' event on first tick
^
TypeError: Object #<Object> has no method 'send_command'
at /home/project/node_modules/redis/index.js:666:25
at /home/project/node_modules/async/lib/async.js:508:13
at /home/project/node_modules/async/lib/async.js:97:13
at Array.forEach (native)
at /home/project/node_modules/async/lib/async.js:26:24
at /home/project/node_modules/async/lib/async.js:96:9
at /home/project/node_modules/async/lib/async.js:507:9
at Object.concat (/home/project/node_modules/async/lib/async.js:141:23)
at /home/project/redis_test.js:21:9
at Command.callback (/home/project/node_modules/redis/index.js:827:13)
So a task with two dependencies would get two data parameters.
Apologies if this already exists, but the documentation seems to suggest it doesn't.
Might be a stupid question, but "finished" is never displayed in the following example
var async = require("async");
var sys = require("sys");
var a = ["foo","bar","baz"];
var f = function(arg) {
sys.log(arg);
}
async.forEach(a, f, function(err) {
sys.log("finished");
});
I'm testing the same code in Chrome (Stable, latest), IE9 and Firefox 8.0.1
The following code loads in the gallery items as normal on every other browser apart from firefox.
async.forEachSeries(this.elements, function(item, cb) {
item.css({
display: 'none',
left: pane_pos + 'px'
});
self.positions.push(pane_pos);
$('#panes').append(item);
pane_pos += item.width();
self.panewidth += item.width();
self.update_page_count();
item.slideDown('slow', "easeOutExpo");
setTimeout(cb, 100);
index++;
}, function() {
callback();
});
If you know of a fix this would help me greatly but if not I will attempt to fix it myself and submit a pull request if I manage to!
Thanks a lot for all your work on Async.js - I use it on many projects and love it, not that you need me to tell you that with over 1k watchers!
Regards,
Robin Duckett
The detect method fires the callback for every valid result, not only the first one.
detectSeries works as expected.
Example:
$ touch a b
$ node test.js
a
b
where test.js is:
var path = require('path')
, async = require('async');
async.detect(
['a','b']
, path.exists
, function(results) {
console.log(results);
}
);
I want to run 2 functions in waterfall then pass the value from the last one to 3 functions in parallel.
I have been struggling with waterfall + parallel without any results.
Is this possible with async somehow?
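It is possible; the shape is: finish the waterfall, then fan its final value out to the parallel tasks, each of which closes over that value. Sketched with plain callbacks so it is self-contained; step1, step2 and the three fan-out tasks are hypothetical stand-ins:

```javascript
var finalResults;

function step1(cb) { cb(null, 20); }
function step2(x, cb) { cb(null, x + 1); }

step1(function (err, a) {
  if (err) throw err;
  step2(a, function (err, value) {       // value === 21: end of the "waterfall"
    if (err) throw err;
    var results = [], pending = 3;
    [ function (v, cb) { cb(null, v * 2); },
      function (v, cb) { cb(null, v + 100); },
      function (v, cb) { cb(null, String(v)); }
    ].forEach(function (task, i) {        // the "parallel" part
      task(value, function (err, r) {     // every task closes over `value`
        results[i] = r;
        if (--pending === 0) finalResults = results;
      });
    });
  });
});

console.log(finalResults);  // [42, 121, '21'] (all callbacks here are synchronous)
```

With async itself the same shape is roughly async.waterfall([...], function (err, value) { async.parallel([tasks closing over value], done); }).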
Hello,
for my needs, it would be nice if a queue could emit an event or run a callback when it has processed its last task.
Is the following code decent enough to accomplish this?
diff --git a/lib/async.js b/lib/async.js
index 103e2bf..b51834d 100644
--- a/lib/async.js
+++ b/lib/async.js
@@ -544,7 +544,7 @@
}
};
- async.queue = function (worker, concurrency) {
+ async.queue = function (worker, concurrency, final_cb) {
var workers = 0;
var tasks = [];
var q = {
@@ -562,7 +562,11 @@
if (task.callback) {
task.callback.apply(task, arguments);
}
- q.process();
+ if (tasks.length === 0) {
+ if (final_cb) final_cb();
+ } else {
+ q.process();
+ }
});
}
},
Cheers,
Giacomo
Hi,
I think this might just be a matter of me being new to javascript and having so many callbacks etc., however I thought I would see if you can see what I am missing.
I am trying to have a flow similar to this:
async.parallel([
function(callback){
FetchData1(callback);
},
function(callback){
async.waterfall([
function(callback2){
FetchData2(callback2)
},
function(arg1, arg2, callback2){
FetchData3(arg1, arg2,callback2);
}
],function(err, results){
callback(null, results);
});
}],function(err, results){
// run export methods after all fetch calls are done
});
Each of the "FetchData" calls query some webservice and parse the data from it using xml2js. The issue is definitely that there are things happening asynchronously within these calls, but I cannot seem to get it to work correctly no matter what I do.
Is there a way to force waterfall to wait until everything is completely done?
I'm pretty sure the issue is in the second fetch that looks something like:
function FetchData2(arg1, arg2, callback2) {
db.collection('collec', function (err, collection) {
var stream = collection.find({"Week": weekNumber}).streamRecords();
stream.on('data', function(doc){
var parser = new xml2js.Parser({trim: false,normalize: false,emptyTag: '', explicitRoot:true});
parser.addListener('end', function(result) {
for(var i = 0; i < max; i++) {
// save to db
}
});
//
// get data and then call parser.parseString here
//
});
stream.on('end',function(){
callback2(null,'done');
});
});
}
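A guess at the race, based on the snippet: the stream's 'end' event fires as soon as the last raw record arrives, while the xml2js 'end' listeners (and the db saves inside them) are still pending, so callback2 runs too early. A common fix is to count outstanding records and finish only when the stream has ended AND the count is back to zero. A minimal sketch; drainThenFinish is a hypothetical helper, not part of any library:

```javascript
function drainThenFinish() {
  var pending = 0;
  var streamEnded = false;
  var finish = null;

  function maybeFinish() {
    // only done when the stream is over AND every record is fully processed
    if (streamEnded && pending === 0 && finish) finish(null, 'done');
  }

  return {
    itemStarted: function () { pending++; },
    itemDone:    function () { pending--; maybeFinish(); },
    streamEnd:   function (cb) { streamEnded = true; finish = cb; maybeFinish(); }
  };
}
```

Usage shape for FetchData2: call itemStarted() in the 'data' handler, itemDone() when that record's parse and save complete, and streamEnd(callback2) in the 'end' handler.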
@caolan <3 this library, so thanks again for being awesome. Here's the line in question: https://github.com/caolan/async/blob/master/lib/async.js#L501
Trying to migrate files between two Rackspace Cloudfiles environments using my (nodejitsu) [node-cloudfiles](http://github.com/nodejitsu/node-cloudfiles) library:
var async = require('async'),
cloudfiles = require('cloudfiles');
var source = new cloudfiles.createClient(sourceConfig),
target = new cloudfiles.createClient(targetConfig);
function migrateFiles () {
console.dir(arguments);
}
async.parallel([
async.apply(source.setAuth),
async.apply(target.setAuth)
], migrateFiles);
I tend to write "classy" JavaScript, and it would be nice if I could pass a thisArg to async.apply somehow. I definitely see the value in having a simple case for functions that don't use this. Maybe a new function async.thisApply?
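For what it's worth, plain Function.prototype.bind already covers both halves of what a hypothetical async.thisApply would do: it pins the receiver AND pre-applies leading arguments. Illustration with a stand-in object:

```javascript
var source = {
  token: 'abc',
  setAuth: function (cb) { cb(null, this.token); }  // relies on `this`
};

// bind keeps `this === source` and would also accept pre-applied
// leading arguments, like async.apply does:
var task = source.setAuth.bind(source);
task(function (err, token) {
  console.log(token);  // 'abc'
});
```

So the parallel example above can be written as async.parallel([source.setAuth.bind(source), target.setAuth.bind(target)], migrateFiles), which also explains why it currently fails: async.apply(source.setAuth) passes the method without its receiver.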
I think it would be nice to include a "break" option inside an iterator to emulate the 'break' statement used in loops.
Actually I created my own by sending an extra parameter to the iterator called 'break'. It can be done just by calling the callback manually, but it's not so practical to create a reference to the end callback before calling async.forEach.
If you think it is useful enough I can send a pull request.
From the readme:
you may like to take a look at the new implementation of the old node Promise objects
Where can I find this? Thank you.
It would be nice to have a central library containing some solid core async functionality.
Although the current async.parallel function is useful, it fails to respect function order, giving useless results.
async.parallel([
function(callback){
setTimeout(function(){
callback(null, 'one');
}, 200);
},
function(callback){
setTimeout(function(){
callback(null, 'two');
}, 100);
},
],
// optional callback
function(err, results){
// in this case, the results array will equal ['two','one']
// because the functions were run in parallel and the second
// function had a shorter timeout before calling the callback
// because the results depend on
});
I would suggest merging similar functionality of asyn into async.
It provides a very clean syntax for creating async
code that runs in parallel.
asyn
(fs.writeFile)('foo','bar',asyn)// func 0
(fs.readFile)('server.js',asyn) // func 1
.end(function(asyn){
/*
* Log the second argument (a buffer) of the 2nd func
*/
console.log(String(asyn[1][1]));
})
If it doesn't seem fit for async
, it could remain separate, but it adds missing and beneficial functionality.
Hi,
when using web services usually we must respect some rate limits.
Examples:
It would be nice to have a queue that is configurable with these rules, then one would just add jobs being sure that the web service's limits will be respected.
Does it make sense?
Is this something that would be nice to have in async or it would be better implemented in a separate module?
Cheers,
Giacomo
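One possible shape for such a queue, sketched minimally: process at most one job per fixed interval. The simple-spacing semantics (no burst allowances, no per-endpoint rules) are an assumption for illustration, not an async API:

```javascript
// rateLimitedQueue(worker, interval): run `worker` on at most one queued
// job every `interval` milliseconds; the timer stops when the queue drains.
function rateLimitedQueue(worker, interval) {
  var jobs = [], timer = null;

  function tick() {
    if (jobs.length === 0) {           // drained: stop ticking
      clearInterval(timer);
      timer = null;
      return;
    }
    worker(jobs.shift());
  }

  return {
    push: function (job) {
      jobs.push(job);
      if (!timer) {                    // idle queue: run now, then on a timer
        tick();
        timer = setInterval(tick, interval);
      }
    }
  };
}
```

Real web-service limits (N requests per minute, burst windows) would need a sliding window on top of this, which is an argument for a separate module with async's queue as the backbone.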
Was wondering when you were planning on releasing a next version?
Recently I ran async.map on a large array and was surprised to discover that it breaks down somewhere between 2048 and 4096 entries.
My code:
var requests = [];
for (var i = 0; i < 2048; requests.push(i++));
async.map(requests, benchmark, function (err, result) {
    async.reduce(result, 0, function (memo, item, callback) {
        callback(null, memo + item);
    }, function (err, sum) {
        console.log(sum / result.length);
    });
});
This is the error that I get:
/usr/local/lib/node/.npm/async/0.1.8/package/lib/async.js:173
async.forEachSeries(arr, function (x, callback) {
^
undefined
I have an async function that executes some shell commands in parallel:

require("fs").readdir("./", function (error, folders) { // asynched
    require("underscore")._(folders).each(function (folder, folderKey, folderList) { // asynched
        require("child_process").exec("ls ./" + folder, function (error, stdout, stderr) {
            console.log("Can't put it here") // runs after the first execution completes
        })
        console.log("Can't put it here either") // runs immediately, before any execution completes
    })
    console.log("Can't put it here either") // runs immediately, before any execution completes
})

I want to do something after those shell commands have executed, but I can't figure out how to do this with the async library. The shell commands run in parallel, so there seems to be no way to register a handler that runs after all of them have finished.
Any ideas?
The current order is very awkward to use, especially in CoffeeScript.
It doesn't seem to be possible to pass arguments to the first function in waterfall.
Could this be done some other way? I really need the first function to receive some variables.
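The usual answers are a wrapping closure or async.apply, which partially applies the leading arguments so the resulting function only expects the callback. A sketch with a stand-in apply; firstTask is a hypothetical task:

```javascript
// stand-in for async.apply: partial application of leading arguments
function apply(fn /*, args... */) {
  var args = Array.prototype.slice.call(arguments, 1);
  return function () {
    return fn.apply(null, args.concat(Array.prototype.slice.call(arguments)));
  };
}

function firstTask(a, b, callback) { callback(null, a + b); }

// ready to sit at the head of a waterfall: it only expects the callback now
var task = apply(firstTask, 3, 4);
task(function (err, sum) {
  console.log(sum);  // 7
});
```

Equivalently, a plain closure works: function (callback) { firstTask(3, 4, callback); }.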
Obviously this shouldn't happen under normal circumstances, but this caused a very difficult to track down bug for me recently.
Ideally, calling the iteration callback twice from the same loop should raise an exception.
The case i was hitting can be reduced to:
var async = require('async');
async.parallel({
one: function(cb) {
setTimeout(function() {
cb(null, 1);
}, 1000);
},
two: function(cb) {
cb(null, 2);
cb(null, 3);
}
}, function(err, result) {
console.log(result);
});
Which outputs { two: 3 } (one is not in the object).
I had an issue with the value of this inside the chained functions while using async. Probably there's some obvious javascript concept that I'm missing here. Anyways, I stripped down the problem to the following sample:
var async = require('async');
var MyTest = function(){};
MyTest.prototype.test2 = function(cb){
console.log('test2');
console.log(this);
setTimeout(cb,200);
}
var myTest = new MyTest();
myTest.foo = "bar";
function test1(cb){
console.log('test1');
console.log(this);
setTimeout(cb,200);
}
function normalCallbacks () {
test1(function(err){
myTest.test2();
});
}
normalCallbacks();
the output is:
test1
{}
test2
{ foo: 'bar' }
When I try the same with async.series, foo: bar is not visible:
async.series([
test1
, myTest.test2
], function(err,result){
if ( err ) console.log(err);
});
the output becomes:
test1
{}
test2
{}
Is this the expected behavior?
I would like to do that, but the problem is that there is no "init" method. I need to be able to start the queue whenever I want. Is that possible?