transducers-js's People

Contributors

dchelimsky, ddeaguiar, puredanger, swannodette


transducers-js's Issues

Is it possible to write a transducer to perform zip?

I wasn't sure where best to ask this question. Please point me in the right direction if this is not the best place.

I was wondering if it is possible to create a transducer that acts like Rx's zip. For example, given a collection of collections [[a, b, c], [1, 2, 3]], I would like it to be transformed into [[a, 1], [b, 2], [c, 3]].

Taking this a step further, it would be great to be able to write it in a more reusable way; for example, a combinator transducer that takes a function that will be called with the nth element of each nested collection:

var zip2 = function (x, y) {
  return [x, y];
};
var apush = function (arr, x) {
  arr.push(x);
  return arr;
};
var xf = t.combinator(zip2);
transduce(xf, apush, [], [[a, b, c], [1, 2, 3]]); // [[a,1], [b,2], [c,3]]
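
Setting the transducer form aside for a moment, the underlying transform itself is a transpose, which can be sketched in plain JavaScript (zipAll is a throwaway illustration, not a library function):

```javascript
// Plain-JS sketch (not a transducer): zipping a collection of
// equal-length collections is a transpose.
function zipAll(colls) {
  return colls[0].map(function (_, i) {
    return colls.map(function (coll) { return coll[i]; });
  });
}

zipAll([['a', 'b', 'c'], [1, 2, 3]]); // [['a', 1], ['b', 2], ['c', 3]]
```

The open question in the issue is whether this shape can be expressed as a composable transducer at all, since zip needs to see all the collections at once rather than one input at a time.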

Browser version throws exception

When I try to run the example from the documentation:

var t = transducers;
var inc = function(n) { return n+1; };
var isEven = function(n) { return n % 2 == 0; };
var apush = function(arr,x) { arr.push(x); return arr; };
var xf = t.comp(t.map(inc),t.filter(isEven));

console.log(
  t.transduce(xf, apush, [], [1,2,3,4])
);

I get this exception:
TypeError: t.comp is not a function

If I change t.comp to t.compose, I get:

Error: don't know how to iterate collection: function (r) { var value = r; for(var i=funcs.length-1; i>=0; i--) { value = funcs[i](value); } return value; }

I've installed transducers-js via bower (version 0.4.174)

What am I missing?

Anybody home?

I was so excited that @puredanger put this out there, but judging from the open issues and the last update to the code, I'm wondering: has the library been abandoned?

Official spec for transformer protocol

@swannodette, if you recall, this is something I asked about before; I don't remember the exact reason for not having a transformer protocol implemented in transducers-js, but I do remember there was one.

I've been thinking about it a bit lately, and I do think it's worthwhile to define one, particularly for JavaScript, where it's much simpler to bake this into the prototype than to define a lookup map of handlers as in transit-js.

As more libraries in JavaScript begin implementing this protocol (this was prompted by @kevinbeaty's excellent transducer PR on the ramda.js project), I wanted to see if there could be an agreed upon spec for the transformer before things get too far along.

This is the implementation kicked off by @jlongster

var t = require('./transducers');
Immutable.Vector.prototype[t.protocols.transformer] = {
  init: function() {
    return Immutable.Vector().asMutable();
  },
  result: function(vec) {
    return vec.asImmutable();
  },
  step: function(vec, x) {
    return vec.push(x);
  }
};

For starters - the use of a Symbol('transformer') is problematic as Symbol() creates a unique value (Symbol('transformer') !== Symbol('transformer')), and so you lose any interop when defined independently in multiple libraries.

I'd propose all transformer protocols be implemented as a @@transformer string (similar to @@iterator) until if/when the transformer is officially recognized in the well known symbols list.

I'd also propose that the spec behave similarly to an iterator, in that it is a function which returns the transformer, rather than just an object as proposed above:

Immutable.Vector.prototype['@@transformer'] = function() {
  return {
    init: function() {
      return new Immutable.Vector().asMutable();
    },
    result: function(vec) {
      return vec.asImmutable();
    },
    step: function(vec, x) {
      return vec.push(x);
    }
  };
}

This makes the implementation more useful, as it can refer to the current object in init to determine what value to create, should the object be subclassed:

SomeObject.prototype['@@transformer'] = function() {
  var obj = this;
  return {
    init: function() {
      return new obj.constructor()
    },
    step: function(result, arr) {
      return result.set(arr[0], arr[1]);
    },
    result: function(obj) {
      return obj;
    }    
  };
}
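
To make the proposal concrete, here is how a consuming library might dispatch on the proposed string key. This is a hedged sketch: getTransformer, Box, and the fallback behavior are all hypothetical, not part of transducers-js.

```javascript
// Hypothetical consumer-side dispatch for the proposed '@@transformer'
// string protocol: call the protocol function if present.
function getTransformer(coll) {
  if (coll != null && typeof coll['@@transformer'] === 'function') {
    return coll['@@transformer']();
  }
  return null; // caller falls back to built-in handling (arrays, strings, ...)
}

// Example: a toy type implementing the proposed protocol.
function Box() { this.items = []; }
Box.prototype['@@transformer'] = function () {
  return {
    init: function () { return new Box(); },
    result: function (box) { return box; },
    step: function (box, x) { box.items.push(x); return box; }
  };
};

var xf = getTransformer(new Box());
var acc = xf.step(xf.init(), 42); // acc.items is now [42]
```

Because the key is a plain string, two independent libraries that both define '@@transformer' on the same prototype interoperate, which is exactly what Symbol('transformer') would break.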

Currying the "into" method duplicates list elements

Hi, I'm using a transducer to transform an array from A to B. The source data comes from a Promise, so, to keep the code simple, I curried the "into" method.
The problem is that when I call the curried function more than once, the resulting list grows with duplicated elements.

This is my code:

function Model(){
    //first i define the curried function
    var mapResults = curry(t.into)([], xf); 
    return{
         load:()=>asyncFetch(request).then(mapResults)
    }
}
//then later I call the load method
var m=Model();
m.load()

Looking at the transducers code, I saw that the into method contains this piece of code:

if (com.cognitect.transducers.isArray(a)) {
    return com.cognitect.transducers.transduce(b, com.cognitect.transducers.arrayPush, a, c);
  }

The a variable is the initial array, but in the curried version it is always the same array. So I changed a to a.slice(0) in order to create a copy of it.

Sorry for my English; I hope this helps.
marco
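
Marco's diagnosis can be reproduced without the library at all: the curried call captures one array, and every invocation keeps appending to it. A plain-JS sketch of the pitfall and a caller-side workaround (intoCurried/intoFresh are illustrative names, not library functions):

```javascript
// Pitfall: currying a reduce-into-array call captures ONE target array,
// so every invocation appends to the same accumulator.
function intoCurried(target) {
  return function (coll) {
    return coll.reduce(function (arr, x) { arr.push(x + 1); return arr; }, target);
  };
}

var shared = intoCurried([]);
shared([1, 2]); // [2, 3]
shared([1, 2]); // [2, 3, 2, 3]  <- duplicated elements

// Workaround: defer creating the target until each call.
function intoFresh() {
  return function (coll) {
    return coll.reduce(function (arr, x) { arr.push(x + 1); return arr; }, []);
  };
}

var fresh = intoFresh();
fresh([1, 2]); // [2, 3]
fresh([1, 2]); // [2, 3]
```

So instead of patching into with a.slice(0), an equivalent fix is to wrap the call in a function that passes a fresh [] each time, e.g. then(coll => t.into([], xf, coll)).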

Change to DepsGenerator breaks bin/test

A change to DepsGenerator means that running bin/test throws:

Exception in thread "main" java.lang.IllegalArgumentException: No matching ctor found for class com.google.javascript.jscomp.deps.DepsGenerator, compiling:(/Users/jasonr/Projects/github/kainoa21/transducers-js/bin/closure_deps_graph.clj:18:5)

An easy fix can be seen here. This would break for anyone who has an older version of closure-compiler.

Also, the zip file referenced in bin/deps (https://github.com/kainoa21/transducers-js/blob/TRANSDUCERS-39/bin/closure_deps_graph.clj) now includes closure-compiler-version.jar instead of just compiler.jar, which breaks the reference in bin/make_deps_js.

Creating transformers ... "@@"??

Does the use of "@@" in the protocol mean that in es6 code I should actually do something like the following to define a transformer?

const init = Symbol.for('transducer/init');
const result = Symbol.for('transducer/result');
const step = Symbol.for('transducer/step');
function MapTransformer() {
  return {
    [init]() { return xf[init](); },
    [result](res) { return xf[result](res); },
    [step](res, input) { return xf[step](res, f(input)); }
  };
}

I'm confused about what the "@@" is supposed to represent, as it doesn't look like you have ES6 support -- I don't suppose the above is actually interoperable with the library. But isn't "@@" the notation for "well-known symbols" in ES6?

Proposal: @@transducer/init should always be called and should create an accumulator object

Cf. jlongster/transducers.js#46 -- I don't know which of these projects is more or less active.

Consider the following implementation of a "zip" transformer:

function zip() {
  return xf => Zip(xf);
}
const sub = Symbol('sub');
function Zip(xf) {
  return {
    ['@@transducer/init']() {
      const result = { [sub]: [] };
      // if init is not implemented in wrapped, ignore
      try {
        result.wrapped = xf['@@transducer/init']();
      }
      catch(err) { }
      return result;
    },
    ['@@transducer/result'](result) {
      if(result[sub] == null || result[sub].length === 0) {
        return result.wrapped || result;  
      }
      const wrappedResult = result[sub][0].reduce((acc, input, i)=>
        xf['@@transducer/step'](acc, result[sub].map((a)=>a[i]))
      , result.wrapped);
      return xf['@@transducer/result'](wrappedResult);
    },
    ['@@transducer/step'](result, input) {
      if(result[sub] == null) {
        // "into" - init not called
        const acc = this['@@transducer/init']();
        // pass the evil on to the wrapped accumulator
        acc.wrapped = result;
        result = acc;
      }
      result[sub].push(input);
      return result;
    }
  };
}

It "works", but only by hackery. What it should do is create an accumulator object in init, then accumulate the subcollections. On finalization, it can feed the final objects to the downstream transformer.

I propose that @@transducer/init be responsible for creating and returning an accumulator object which wraps the downstream transformer's accumulator. It could have the signature:

['@@transducer/init'](finalAccumulator) { }

With default implementation:

['@@transducer/init'](finalAccumulator) { return xf['@@transducer/init'](finalAccumulator); }

(Here xf is the downstream transformer -- could be this.xf depending on implementation.)

As in zip, we could use the accumulator to store state; e.g. we could have a transducer that calculates a histogram, then forwards the bins onwards (etc.).

If an accumulator did wrap downstream state in its own state, it is then responsible for unwrapping the downstream state in step and result.

finalAccumulator is the thing into which the whole pipeline is writing. Normally we ignore it,
but special end of chain "output aware" transformers could use it (provided by the library).

Is it allowed to call step() inside init()?

In other words, is the following transducer correct?

function prependWith(x) {
  return function(xf) {
    return {
      init: function() {
        var res = xf.init();
        if (isReduced(res)) {
          return res;
        } else {
          return xf.step(res, x);
        }
      },
      step: function(res, input) {
        return xf.step(res, input);
      },
      result: function(res) {
        return xf.result(res);
      }
    };
  }
}
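
Whatever the spec answer turns out to be, one way to sidestep the question entirely is to emit the prepended value on the first step instead of inside init. This is a sketch, not a statement about what the library guarantees, and the isReduced helper here is a minimal stand-in for the real protocol check:

```javascript
// Minimal stand-in for the reduced check used below.
function isReduced(x) { return !!(x && x['@@transducer/reduced']); }

// Alternative prependWith that never calls step() inside init():
// the prepended value is injected lazily on the first step.
function prependWith(x) {
  return function (xf) {
    var started = false;
    return {
      init: function () { return xf.init(); },
      step: function (res, input) {
        if (!started) {
          started = true;
          res = xf.step(res, x);
          if (isReduced(res)) return res; // honor early termination
        }
        return xf.step(res, input);
      },
      result: function (res) { return xf.result(res); }
    };
  };
}
```

The trade-off: if the source collection is empty, step is never called, so the prepended value is never emitted; the init-based version does not have that quirk, which is exactly why the spec question matters.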

If I am adding transducers support to my library, should I consider this case?

Keep should return results of transform

I was reading through the documentation, and noticed that keep was basically the same as filter. Looking into it a bit more, I believe keep is supposed to return the results of the transform that are not null as opposed to the original values, as laid out at https://clojuredocs.org/clojure.core/keep .

var t = transducers;
var xf = t.keep(function(x) { if(typeof x == "string") return "cool"; });
t.into([], xf, [0,1,"foo",3,4,"bar"]); // ["foo","bar"] -> should return ['cool', 'cool']
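
The Clojure semantics the issue describes can be shown in plain JavaScript (keepArr is a throwaway illustration, not the library function): apply f, drop null/undefined results, and keep the transformed value rather than the original.

```javascript
// Expected `keep` semantics: keep f(x) when it is neither null nor
// undefined -- unlike filter, which keeps the original x.
function keepArr(f, arr) {
  var out = [];
  for (var i = 0; i < arr.length; i++) {
    var v = f(arr[i]);
    if (v != null) out.push(v); // != null drops both null and undefined
  }
  return out;
}

keepArr(function (x) { return typeof x === 'string' ? 'cool' : undefined; },
        [0, 1, 'foo', 3, 4, 'bar']); // ['cool', 'cool']
```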

Direction of functional composition with toFn?

The function comp for functional composition is applied using reverse ordering when used with the toFn() function.

I would expect that comp(f,g) would result in a function f(g(x)), as confirmed in the example given for the function:

var inc = function(n) { return n + 1 };
var double = function(n) { return n * 2 };
var incDouble = t.comp(double, inc);
incDouble(3); // 8

However, looking at the toFn function, the doc example (and experimentation) provide a result as g(f(x)):

var arr = [0,1,2,3,4,5];
var apush = function(arr, x) { arr.push(x); return arr; };
var xf = t.comp(t.map(inc),t.filter(isEven));
arr.reduce(t.toFn(xf, apush), []); // [2,4,6]

Were the composition consistent with the first example, the expected result of composing the mapping of increment onto the even values of the input array should always be an array of exclusively odd values. Instead, it appears that we are filtering the even values of the incremented input array, i.e. it behaves like what we would expect from t.comp(t.filter(isEven), t.map(inc)).
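
The apparent reversal falls out of what comp is composing here. comp(f, g)(x) is still f(g(x)), but f and g transform step functions, so the leftmost transducer ends up as the outermost wrapper and therefore sees each input first. Hand-rolled one-liner versions of map and filter (illustrative sketches, not the library internals) make this visible:

```javascript
// Toy transducers: each takes a step function and returns a new one.
var mapT = function (f) {
  return function (step) {
    return function (acc, x) { return step(acc, f(x)); };
  };
};
var filterT = function (pred) {
  return function (step) {
    return function (acc, x) { return pred(x) ? step(acc, x) : acc; };
  };
};
var comp = function (f, g) { return function (x) { return f(g(x)); }; };

var apush = function (arr, x) { arr.push(x); return arr; };
var inc = function (n) { return n + 1; };
var isEven = function (n) { return n % 2 === 0; };

// comp(mapT(inc), filterT(isEven))(apush)
//   = mapT(inc)(filterT(isEven)(apush))
// so mapT(inc) wraps OUTERMOST and inc runs before isEven.
var xf = comp(mapT(inc), filterT(isEven));
[1, 2, 3, 4].reduce(xf(apush), []); // [2, 4]
```

So both examples apply comp the same way; the inputs just flow through the composed step function outside-in, which is why the data appears to move left-to-right through the transducer chain.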

LazyTransformer fails when multiple values produced per step

Running

toIter([[1, 2], [3]], cat)

produces the error

TypeError: Cannot read property 'items' of undefined
    at Object.stepper.@@transducer/step (./node_modules/transducers.js/transducers.js:868:5)
    at Object.newxform.@@transducer/step (./node_modules/transducers.js/transducers.js:743:41)
    at reduce (./node_modules/transducers.js/transducers.js:149:42)
    at Cat.@@transducer/step (./node_modules/transducers.js/transducers.js:747:10)
    at Stepper.@@transducer/step (./node_modules/transducers.js/transducers.js:888:36)
    at LazyTransformer.@@transducer/step (./node_modules/transducers.js/transducers.js:918:38)
    at LazyTransformer.next (./node_modules/transducers.js/transducers.js:903:28)
    at reduce (./node_modules/transducers.js/transducers.js:160:20)
    at toArray (./node_modules/transducers.js/transducers.js:802:12)

where we would expect to get an iterable for the values [1, 2, 3]. For comparison,

toArray([[1, 2], [3]], cat)

produces [1, 2, 3] as expected.

In general, this occurs any time the transducer provided to the constructor of LazyTransformer causes @@transducers/step to be called more than once in its input transformer for a single call in its output transformer. With the transducers provided in this library, this is possible when using the cat and mapcat transducers.

How do I create a transformer with state?

I assume something like the following?

var MapWithState = function(f, xf) {
    return {
       "@@transducer/init": function() { 
           return {wraps: xf["@@transducer/init"](), mystate: {...}}; 
       },
       "@@transducer/result": function(result) { 
           return xf["@@transducer/result"](result.wraps); 
       },
       "@@transducer/step": function(result, input) {
           return xf["@@transducer/step"](result.wraps, f(result.mystate, input)); 
       }
    };
};
I.e., in result and step, am I guaranteed to be passed back whatever I created, and do I guarantee that I pass the wrapped state back to the next transformer? However this protocol is supposed to work, it should be documented.
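
An alternative that avoids threading state through the accumulator entirely is to keep per-use state in a closure over the returned transformer. This is a sketch, not the library's sanctioned pattern; mapWithState and its [nextState, output] convention are hypothetical:

```javascript
// Hypothetical stateful transducer: f takes (state, input) and returns
// [nextState, output]; the state lives in the closure, so the wrapped
// transformer's accumulator passes through untouched.
var mapWithState = function (f, seed) {
  return function (xf) {
    var state = seed;
    return {
      '@@transducer/init': function () { return xf['@@transducer/init'](); },
      '@@transducer/result': function (res) { return xf['@@transducer/result'](res); },
      '@@transducer/step': function (res, input) {
        var out = f(state, input);
        state = out[0];
        return xf['@@transducer/step'](res, out[1]);
      }
    };
  };
};
```

One caveat with closure state: the transducer-returning function must be applied freshly for each reduction, since reusing the same transformer object would leak state across runs.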

Clarifying behavior of transducer/init

Over on transduce, @kevinbeaty and I have been trying to determine the responsibility of @@transducer/init: discussion

The purpose of init as a place for defining the initial value of a reduction is clear, but the question is whether it should account for any state or context the object might already retain. In short, does init return:

  1. a new empty value:
return new this.constructor()
  2. the value itself:
return this
  3. a new value retaining some notion of state:
return new this.constructor(into({}, this.data))

It seems like the cleanest/most useful of those would be number 3, and applied to the example of Immutable.js List, it'd just be a matter of making the current value mutable/immutable:

List.prototype['@@transducer/init'] = function() {
  return this.asMutable();
};

List.prototype['@@transducer/step'] = function(result, arr) {
  return result.set(arr[0], arr[1]);
};

List.prototype['@@transducer/result'] = function(obj) {
  return obj.asImmutable();
};

Thoughts?

Does the protocol support object keys at all?

var t = require('transducers-js');
t.into({}, t.map(x => x + 1), {foo: 1});

Expected: {foo:2} — Actual: {f: 'o'}

The "issue" is here:

transducers.objectReduce = function(xf, init, obj) {
    var acc = init;
    for(var p in obj) {
        if(obj.hasOwnProperty(p)) {
            acc = xf["@@transducer/step"](acc, [p, obj[p]]);
//                                             ^^^^^^^^^^^
//                                             The mapping function `x => x + 1` is applied to `['foo', 1]`
//                                             which returns `'foo,11'`
            if(transducers.isReduced(acc)) {
                acc = transducers.deref(acc);
                break;
            }
        }
    }
    return xf["@@transducer/result"](acc);
};

Then:

transducers.addEntry = function(obj, entry) {
  // entry: 'foo,11'
  // entry[0] = 'f'
  // entry[1] = 'o'
  obj[entry[0]] = entry[1];
  return obj;
};

This could be fixed by having the @@transducer/step function take three arguments:

  1. An accumulation
  2. A value
  3. A key (optional)

However, I can't seem to find much support for keys in the implementation. Just wondering whether this is by design or simply an oversight? Or just me not understanding something (most likely!)?
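
Under the current design, the practical workaround is to map over the [key, value] entry pairs rather than the raw values. The same convention can be demonstrated with plain Object.entries, no library involved (incEntry is a throwaway illustration):

```javascript
// Object values arrive as [key, value] pairs, so the mapping function
// must accept a pair and return a pair.
var incEntry = function (entry) { return [entry[0], entry[1] + 1]; };

Object.entries({foo: 1, bar: 2})
  .map(incEntry)
  .reduce(function (obj, e) { obj[e[0]] = e[1]; return obj; }, {});
// { foo: 2, bar: 3 }
```

With t.map(x => x + 1), the pair gets coerced to the string 'foo,1' before + 1 applies, producing 'foo,11', whose entry[0]/entry[1] are 'f' and 'o' -- exactly the observed {f: 'o'}.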

`into` seems to test the wrong argument

into is testing the type of the coll argument, but it seems like it should be looking at the type of the empty argument, since that's the thing the reducer (stringAppend/arrayPush/addEntry -- I think that's the correct terminology?) is operating on.

I'd be happy to file a PR if this is correct.

How to implement the protocol?

I have the following object:

function Identity(x) {
  if (!(this instanceof Identity)) {
    return new Identity(x);
  };

  this.runIdentity = function() {
    return x;
  };
}

Identity.prototype.map = function(f) {
  return Identity(f(this.runIdentity()));
};

Identity.prototype.toString = function() {
  return 'Identity(' + this.runIdentity().toString() + ')';
};

And I cannot figure out how to implement the transducers protocol.

I've tried the following:

Identity.prototype['@@transducer/init'] = function() {
  return this;
};

Identity.prototype['@@transducer/result'] = function(x) {
  return x;
};

Identity.prototype['@@transducer/step'] = function(_, y) {
  return Identity(y);
};

But then when I try to use it, it doesn't produce the output I'd expect:

var inc = function(n) { return n + 1; };

console.log('%s', t.into([], t.map(inc), Identity(3)));
➜  ~  node identity.js 
runIdentity,function () {
    return x;
  }1

It looks like it's creating the array, but putting the wrong values into it.

If I try to use an Identity as the thing to build stuff into, then I wind up with more unexpected behavior:

console.log('%s', t.into(Identity(0), t.map(inc), Identity(3)));
➜  ~  node identity.js
Identity(0)

It looks like it's ignoring the input, and just returning the thing to build into.

Can anyone provide some insight as to what is going on here? I feel like I cannot understand what the protocol wants, and that's what is causing my issues here.
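
One guess, offered without having confirmed it against the library internals: into needs to iterate the source collection, and Identity implements neither @@iterator nor any of the shapes the library special-cases, so a for-in style fallback walks its properties as [key, value] entries -- which would explain the "runIdentity,function..." output. If that's right, making Identity iterable might look like this (hypothetical fix):

```javascript
function Identity(x) {
  if (!(this instanceof Identity)) return new Identity(x);
  this.runIdentity = function () { return x; };
}

// Hypothetical fix: expose Identity's single value through an
// iterator so generic reduction code can walk it.
Identity.prototype['@@iterator'] = function () {
  var done = false, self = this;
  return {
    next: function () {
      if (done) return { done: true, value: undefined };
      done = true;
      return { done: false, value: self.runIdentity() };
    }
  };
};

// Consuming the iterator by hand yields exactly one value:
var it = Identity(3)['@@iterator']();
it.next(); // { done: false, value: 3 }
it.next(); // { done: true, value: undefined }
```

The second observation (into(Identity(0), ...) returning Identity(0) unchanged) is consistent with init returning this and step's result being discarded by a code path that never consults the protocol; that part really needs an answer from the maintainers.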

collaboration

I'm trying to figure out the future of my lib (https://github.com/jlongster/transducers.js), the timing of our releases somewhat collided :) It's funny because I also converged on the same implementation internally.

The only thing I like in my lib is the integration points: https://github.com/jlongster/transducers.js#applying-transformations. What do you think about those methods, particularly how seq works? I think it's neat that if it's passed an iterator it will return a new iterator that lazily transforms.

If we continue to have 2 different libs, I may rename mine. Additionally, it'd be nice to figure out how get other libs to integrate transducers without specifically depending on one. I changed my objects-with-3-methods to have the same methods as yours (I only had to change finalize to result), but Reduced is a problem. In this issue (jlongster/transducers.js#2), someone proposed a way to detect a "reduced" object without specifically depending on a runtime instance.

How is takeWhile different from filter?

takeWhile seems to just check the predicate, but I assumed that once the predicate returned non-true, it would forever ignore subsequent values. With input like [0, 1, 2, 3, 4, 3, 2, 1] and a predicate of return n < 3, you instead get [0, 1, 2, 2, 1].

Is takeWhile meant to be an alias for filter?
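
The intended difference can be shown in plain JavaScript (takeWhileArr is a throwaway sketch): filter tests every element independently, while takeWhile stops permanently at the first failing one.

```javascript
var lt3 = function (n) { return n < 3; };

// filter keeps every passing element, even after failures:
[0, 1, 2, 3, 4, 3, 2, 1].filter(lt3); // [0, 1, 2, 2, 1]

// takeWhile stops at the first failure and ignores the rest:
function takeWhileArr(pred, arr) {
  var out = [];
  for (var i = 0; i < arr.length; i++) {
    if (!pred(arr[i])) break;
    out.push(arr[i]);
  }
  return out;
}
takeWhileArr(lt3, [0, 1, 2, 3, 4, 3, 2, 1]); // [0, 1, 2]
```

So if the library's takeWhile produces [0, 1, 2, 2, 1] here, it is behaving like filter, and the expected takeWhile output would be [0, 1, 2].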

asynchrony

Can I return a promise from my step function whose value resolves to the result returned by the wrapped step function? ... hmm... but then in @@transducer/result at least I need to return an unwrapped value, I guess. Any suggestions on how to write an asynchronous step function?

Use case -- I am trying to transduce a stream of streams.

partitionBy bug

If you use a transform after partitionBy that can early terminate, it looks like partitionBy doesn't properly unwrap from the Reduced object:

var t = require('transducers-js');

console.log(
  t.into([], t.comp(t.partitionBy(function(x) { return x; }),
                    t.take(2)),
         [1, 1, 1, 2, 2, 3, 3, 3, 3])
)

Output:

{ __transducers_reduced__: true,
  value: [ [ 1, 1, 1 ], [ 2, 2 ] ] }
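
The fix the report implies, sketched with the flag visible in the output above (the helper names here are illustrative, not necessarily the library's own): a stateful transducer like partitionBy must check for and unwrap the Reduced sentinel whenever a downstream step may have short-circuited.

```javascript
// Illustrative helpers mirroring the wrapper shown in the output.
function reduced(value) { return { __transducers_reduced__: true, value: value }; }
function isReduced(x) { return !!(x && x.__transducers_reduced__); }
function ensureUnreduced(x) { return isReduced(x) ? x.value : x; }

// When the downstream take(2) says "stop", the accumulator comes back
// wrapped; partitionBy's result arity must unwrap before returning it.
var acc = reduced([[1, 1, 1], [2, 2]]);
ensureUnreduced(acc); // [ [ 1, 1, 1 ], [ 2, 2 ] ] -- what into should return
```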

what is transducers

The readme should contain at least a couple of paragraphs about what this project actually is and what it actually does.

Just for the curious.......

Exports typo

I think there is a typo in the exports of transducers.js:

            keep: transducers.keep,
            Kemove: transducers.Keep,

should likely be:

            keep: transducers.keep,
            Keep: transducers.Keep,

Where does the convention "at-at-slash" come from?

As in @@transducer/init, for example. Just out of curiosity.

I've recently seen this convention used in other libraries (eg. here and here), but I saw it here first. I suppose it's a way of namespacing, but I couldn't find an explanation or explicit mention about it anywhere.

Thanks.

Add .type field to transducers

When creating a library which accepts transducers, it is useful to warn the users, if they mixed up the arguments. For example in

function createChannel(transducer = null) {}

it would be great if createChannel could (at runtime) assert that it got a transducer and not e.g. a string. For this purpose, a .type attribute attached to the transducer fn with a value such as 'transducers-js' could be used. Another possibility is to add a .@@__is_transducer__@@ attribute set to true (idea copied from https://github.com/facebook/immutable-js/blob/master/src/Iterable.js#L38)

Currently transducers are functions. This means we can distinguish them from strings, numbers and objects, but there are still a lot of other functions that could easily be mixed up with transducers.

Finally, if someone wants to implement the transducer protocol themselves, they can assign the .type manually. This is, however, a rather rare scenario.

"reduced" wrapper?

The documentation suggests that some method of wrapping values might be useful for avoiding unnecessary computation. But it seems very vague on what this is all about. Under what circumstances would one use this "reduced" wrapper?
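
Concretely, "reduced" is how a step function signals early termination to the reducing loop, so the rest of the input can be skipped. A minimal self-contained sketch (the helper names and keys are illustrative; the library's internal wrapper may differ):

```javascript
// A reduced wrapper marks a value as "final": stop reducing now.
function reduced(value) {
  return { '@@transducer/reduced': true, '@@transducer/value': value };
}
function isReduced(x) { return !!(x && x['@@transducer/reduced']); }

// A reduce that honors the wrapper: exit as soon as a step returns it.
function reduce(step, acc, coll) {
  for (var i = 0; i < coll.length; i++) {
    acc = step(acc, coll[i]);
    if (isReduced(acc)) return acc['@@transducer/value'];
  }
  return acc;
}

// Example: sum until the total exceeds 10, skipping the rest.
var sumTo10 = function (acc, x) {
  var s = acc + x;
  return s > 10 ? reduced(s) : s;
};
reduce(sumTo10, 0, [1, 2, 3, 4, 5, 100]); // 15 -- the 100 is never visited
```

This is what powers transducers like take: once enough items have been taken, wrapping the accumulator lets the whole pipeline stop without consuming the remaining (possibly infinite) input.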
