bs-decode's Introduction

bs-decode

Note

bs-decode has been stable and used in production for several years, so a v1 release makes sense. This is the final release that will be compatible with BuckleScript as we turn our attention to the newer OCaml features available in Melange.

Read the Documentation

Decode JSON values into structured ReasonML and OCaml types. Inspired by Elm's Json.Decode and the Decode Pipeline, bs-decode is an alternative to bs-json that focuses on structured, type-safe error handling, rather than exceptions. Additionally, bs-decode collects up everything that went wrong while parsing the JSON, rather than failing on the first error.

Installation

Install via npm:

npm install --save bs-decode relude bs-bastet

Update your bsconfig.json

"bs-dependencies": [
  "bs-bastet",
  "bs-decode",
  "relude"
],

Usage

The following example should give you an idea of how the library works, but the complete documentation will probably be more useful if you want to write your own decoders.

// imagine you have a `user` type and `make` function to construct one
type user = {
  name: string,
  age: int,
  isAdmin: bool,
  lastLogin: option(Js.Date.t)
};

let make = (name, age, isAdmin, lastLogin) =>
  { name, age, isAdmin, lastLogin };

/**
 * Given a JSON value that looks like:
 * { "name": "Alice", "age": 44, "roles": ["admin"] }
 *
 * you can write a function to convert this JSON into a value of type `user`
 */
module Decode = Decode.AsResult.OfParseError; // module alias for brevity

let decode = json =>
  Decode.Pipeline.(
    succeed(make)
    |> field("name", string)
    |> field("age", intFromNumber)
    |> field("roles", list(string) |> map(List.contains("admin")))
    |> optionalField("lastLogin", date)
    |> run(json)
  );

let myUser = decode(json); /* Ok({ name: "Alice", ...}) */

Contributing

All contributions are welcome! This obviously includes code changes and documentation improvements (see CONTRIBUTING), but we also appreciate any feedback you want to provide (in the form of Github issues) about concepts that are confusing or poorly explained in the docs.

License

Released under the MIT license.

bs-decode's People

Contributors

alavkx, chinmayakcv, dependabot[bot], emilios1995, flash-gordon, hamza0867, idkjs, johnhaley81, mlms13, phated, trite

bs-decode's Issues

New version

Is it possible to release a new version? I am using master in various projects, but I would like to use a stable version across them.

Encoding?

Just a thought, it might be great to have Encoding in this library, even if it exactly matches the design of bs-json, so that we can have a one-stop shop for JSON decoding & encoding.

But that might necessitate a name change, too :o.

bs-jaysawn 😆

bs-codec 😴

bs-dragonglasses 🔥 🤓

`fallback` is only for fields?

This feels awkward, to the point I'm going to avoid documenting fallback for now. It seems like if you want to use fallback with a field, you should just D.fallback(D.field("foo", D.string), "default", json). That way you could also use fallback in combination with any simple JSON value as well.

I'm not quite sure how this will affect the Pipeline module, which has its own version of fallback.

Deprecate mapN, NonEmptyList, ParseError.ResultOf

Not sure it's worth making an additional release just to mark these things as deprecated, but:

  • map2 ... map5 can be removed with let+/and+ as an easy replacement
  • Decode_NonEmptyList should be deprecated in ...OfParseError.rei
  • Decode_ParseError.ResultOf can go, because this is only useful for building alternatives to the main Decode type

Idea for decoder type

This is purely for discussion/passing consideration, but I was poking around with the code and thought it might be interesting to define a type for the decoder function Js.Json.t => M.t('a).

One way to do it would be like this (the type annotations were for me - you can leave them out):

  // inside DecodeBase
  type t('a) =
    | Decoder(Js.Json.t => M.t('a));

  // run function to apply the json input and produce the M result
  let run: (Js.Json.t, t('a)) => M.t('a) =
    (json, Decoder(decode)) => decode(json);

  module Functor: FUNCTOR with type t('a) = t('a) = {
    type nonrec t('a) = t('a);
    let map = (f, Decoder(decode)) => Decoder(decode >> M.map(f));
  };

  module Apply: APPLY with type t('a) = t('a) = {
    include Functor;
    let apply: (t('a => 'b), t('a)) => t('b) =
      (Decoder(f), Decoder(decode)) =>
        Decoder(json => M.apply(f(json), decode(json)));
  };

  module Applicative: APPLICATIVE with type t('a) = t('a) = {
    include Apply;
    let pure = a => Decoder(_ => M.pure(a));
  };

  module Monad: MONAD with type t('a) = t('a) = {
    include Applicative;
    let flat_map: (t('a), 'a => t('b)) => t('b) =
      (Decoder(decode), f) =>
        Decoder(json => M.flat_map(decode(json), a => f(a) |> run(json)));
  };
  
  // ... etc

Another way would be to leave out the data constructor Decoder and just make it an alias. I don't think you'd have to change much at all if you just did this.

type t('a) = Js.Json.t => M.t('a);

Doing that kind of cleans up the signatures for map/apply/bind/etc, and makes the implementations maybe a little more clear in that the Js.Json.t => M.t('a) bit would be hidden inside a type, rather than repeated. It might also make it more obvious what your core decoder type is for those new to the library - at its core it's just a Js.Json.t => M.t('a) function.

I'd also point out that the decoder function 'a => M.t('b) is basically the same as the definition of ReaderT. https://github.com/reazen/relude/blob/master/src/Relude_ReaderT.re#L10 - there might be some things to reuse from Relude or some general ideas that might come out of that. ReaderT is an abstraction where you can compose functions that will eventually be supplied some "environment" value in order to run and produce the result. In this case the "environment" is the json value.

Add a README

  • Installation instructions
    • Publish to npm
  • Usage
    • Elm Decode style
    • Elm decode pipeline style
    • Haskell applicative validation style
  • Explain error handling
    • Recursive structure
    • To nonempty list of error strings
    • To option

mention bs-bastet as a peer dependency over bs-abstract in the docs

As you can see in the screenshot below, the docs still mention bs-abstract as a peer dependency.
[Screenshot from 2020-04-12: the documentation listing bs-abstract as a peer dependency]

But when I installed bs-decode with yarn, yarn told me that bs-decode needs a peer dependency called bs-bastet.

I believe the docs need a small update in this regard.

Thank you for the amazing work you've put into this library!

Nesting decoders is not obvious

I had a lot of trouble determining how to nest decoders. I believe this is due to the pipe function not being exported from the Pipeline module.

The following code works but it is certainly not obvious this is how it should be done.

module Meta = {
  type t = {
    id: string,
    comment: string,
  };

  let make = (id, comment) => { id, comment };

  let fromJson = json =>
    Decode.AsResult.OfParseError.Pipeline.(
      succeed(make)
      |> at(["meta", "id"], string)
      |> at(["meta", "comment"], string)
      |> run(json)
    );
};

type t = {
  meta: Meta.t,
  example: string,
  other: string,
};

let make = (meta, example, other) => { meta, example, other };

let fromJson = json =>
  Decode.AsResult.OfParseError.Pipeline.(
    succeed(make)
    |> map2((|>), Meta.fromJson) // This is very cryptic, copy of the `pipe` function.
    |> field("example", string)
    |> field("other", string)
    |> run(json)
  );

Is there a better way to do this? Should the pipe function be exposed?

Finish docs

Docs are endless and I'm pretty sure they'll never be "finished" but we're getting to a point where I'm going to merge the docs website into master. Here's what we're still missing:

  • Optional values and recovery
    • D.option
    • D.oneOf
  • Objects/Records
    • D.field
    • Pipeline decoding
    • Haskell-style decoding
  • Variants
    • Simple variants
    • Custom Decode extensions

Add dictJson and arrayJson

Currently, our dict and array decoders expect to also decode the inner values, but there are cases where you may want to preserve the inner JSON. Also, we could probably rewrite the existing dict and array (and maybe at and field) in terms of these new functions, and it might make them quite a bit simpler to understand.
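
For illustration, here's a rough sketch (not the eventual implementation) of how these could be expressed in terms of the existing decoders, using a decoder that simply hands back the raw JSON:

module D = Decode.AsResult.OfParseError;

// a "decoder" that always succeeds with the untouched JSON value
let okJson = (json: Js.Json.t) => Ok(json);

// proposed names; these preserve the inner JSON instead of decoding it
let dictJson = json => D.dict(okJson, json);
let arrayJson = json => D.array(okJson, json);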

Add `oneOf` to try multiple decoders

module D = Decode.AsResult.OfParseError;
type foobar = Foo(FooModule.t) | Bar(BarModule.t);

let decodeFoo = json => FooModule.decode(json) |> D.ResultUtil.map(v => Foo(v));
let decodeBar = json => BarModule.decode(json) |> D.ResultUtil.map(v => Bar(v));

let decodeFoobar = json =>
  D.oneOf([ decodeFoo, decodeBar ], json);

Internally, oneOf should probably be defined in terms of alt (<|>). The list of decoders will all need to return the same success type 'a. oneOf should try the decoders in order and return the first Ok. If all decoders fail, returning the last Error is probably fine, although it might be neat to collect the errors in a structured way.
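
To make the intended behavior concrete, here's a hand-rolled sketch over plain result-returning decoders (the real API takes a NonEmptyList, so the empty-list case below is just a placeholder):

let oneOf = (decoders, json) =>
  switch (decoders) {
  | [] => failwith("oneOf needs at least one decoder")
  | [first, ...rest] =>
    // try each decoder in order, keeping the first Ok (or the last Error)
    List.fold_left(
      (acc, decode) =>
        switch (acc) {
        | Ok(_) as ok => ok
        | Error(_) => decode(json)
        },
      first(json),
      rest,
    )
  };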

Add Map decoder

For simple Belt.Map types with string keys, we can reuse the dict decoder and convert pretty easily, but it would also be nice to take a compare function and a key decoder and return a Map with whatever key type you want (e.g. a Map where the keys are dates).
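
For the string-keyed case, a hedged sketch that reuses the existing dict decoder (the compare-function/key-decoder version would generalize this):

module D = Decode.AsResult.OfParseError;

// decode a JSON object into a Belt.Map.String by going through Js.Dict first
let stringMap = (decoder, json) =>
  switch (D.dict(decoder, json)) {
  | Ok(d) => Ok(d |> Js.Dict.entries |> Belt.Map.String.fromArray)
  | Error(_) as err => err
  };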

Needs bs-abstract upgrade

Just a little note-to-self that bs-decode is another project that can't be compiled with the latest bs-platform until bs-abstract makes a release resolving the empty array issue.

Organize tests better

Part of the challenge of generating lots of modules from a single "Base" module is that we have lots of functions that do basically the same thing but no consistency in how those things get tested.

Proposal:

  • Use Decode_Option_test for testing basic success-vs-failure decode cases
    • Have a section for simple decoders like string, float, date, etc
    • Have a section for nested decoders like field, list, tuple etc
    • Have a section for decoder composition like map, apply, flatMap, alt
    • Decode records
      • ...via map2 and friends
      • ...using infix operators on the decoders (lazy)
      • ...via Pipeline
  • Use Decode_Result_OfParseError_test for checking the errors
    • Each failure produces the right type of error
    • Multiple failures are collected correctly
    • Basic test for converting structured error to debug string

No obvious tests for `at`

There appears to be a test buried in the "custom" Result test, but we should get some tests in the main test module.

Remove ResultUtil from public interface

Internally, we'll still depend on the ability to do certain operations with the output type (option vs result vs whatever), but I don't think we should continue to expose the ResultUtil module of helpers to the outside world. We're not in the business of being a stdlib replacement.

Deprecate tuple decoders

tupleFromFields should definitely go... there have been times I wanted that in the past, and even I didn't remember it existed. With the addition of let-ops, that one is way more easily expressed as:

let+ first = field("first", string)
and+ second = field("second", boolean);
(first, second);

And while thinking about that, I started to wonder if the helpers like tuple2 and tuple3 could also be simplified. I'm thinking something like:

let+ first = arrayAt(0, string)
and+ second = arrayAt(1, boolean)
and+ third = arrayAt(2, date);
(first, second, third);

The downside is that it's more verbose than what we currently have, but I see several upsides:

  • the syntax is consistent with decoding objects/records, so there are fewer concepts to remember
  • you aren't arbitrarily capped at a 5-tuple... tuples can get as big as you want
  • picking specific positions out of arrays might be generally useful even outside of tuples

JSON-encoded tuples aren't a super common occurrence in my experience, and the current approach requires quite a bit of code to support. I'll have to think about it, and I'm happy to hear any feedback.

The Plan:

  • add arrayAt
  • deprecate tupleN
  • deprecate tupleAtLeastN

And when these are actually removed in 2.0, I'll make sure to add tuple examples to the docs and also include this change in an upgrade guide.

For clarity, add a .map and .flatMap for decoders

You clued me in on Discord about mapping the result of a decode function, and I should have thought of that myself. But I think adding map and flatMap to the decoder library would be helpful. You mentioned that you wanted to, so I'm making an issue for it!
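
For context, a hedged sketch of the shape these helpers could take for result-based decoders (illustrative only, not the shipped API):

// map the successful value of a decoder
let map = (f, decoder, json) =>
  switch (decoder(json)) {
  | Ok(a) => Ok(f(a))
  | Error(_) as err => err
  };

// choose a follow-up decoder from the first decoder's output and run it on the same JSON
let flatMap = (f, decoder, json) =>
  switch (decoder(json)) {
  | Ok(a) => f(a, json)
  | Error(_) as err => err
  };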

bs-decode no longer compiles v0.11.1

Hello guys,

Thank you for this amazing library. I just want to point out that bs-decode no longer compiles under bs-platform 7.3.2 or 8.0.2.

I believe the problem comes from this line.

Here is a reproducible repo.

Instructions to reproduce the error and a possible fix are in the repo.

Add a null decoder

null is a valid JSON value, so it makes sense to have a decoder that succeeds only if it encounters the literal null value.

On its own, it probably isn't the most useful decoder, but it should allow us to rewrite the optional decoder as

let optional =
  decoder => alt(null |> map(() => None), decoder |> map(Option.pure))

Bump dependencies

Github has been bugging me about a security vulnerability in merge. Not sure how much direct control we have over this, but in preparation for a release, we should bump the versions of our dependencies to whatever is latest and test aggressively.

No tests for D.dict

It's probably safe to assume that we're able to correctly decode JSON objects into dictionaries... the code looks right. Wouldn't hurt to have a test or two, though.
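
For reference, a hedged bs-jest sketch of the kind of test this is asking for (assuming the usual D alias and the project's existing test setup):

open Jest;
open Expect;

module D = Decode.AsResult.OfParseError;

test("dict decodes a JSON object of strings", () => {
  let json = Js.Json.parseExn({js|{"a": "x", "b": "y"}|js});
  let expected = Js.Dict.fromList([("a", "x"), ("b", "y")]);
  expect(D.dict(D.string, json)) |> toEqual(Ok(expected));
});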

NonEmptyList usage

I saw some new issues/suggestions that mentioned NonEmptyList and it made me remember that working with oneOf is clumsy because it must be constructed with a NonEmptyList (which means the end-user has to install it as a top-level dependency, etc). Could methods that expect a NonEmptyList take a normal list and internally construct the NonEmptyList? I think it'd make the ergonomics nicer.

Fix the Docs website

The docs website finally seems to be working. There are still a few things to tweak, and I don't have the energy right now, so here's a list:

  • Make /bs-decode/docs/ not a 404 (change what-and-why to index?)
  • Remove all non-docs pages and redirect to docs
  • Fix Highlight JS to understand Reason (copying the Reason-React docs)
  • Add website to Github project description
  • Add "homepage" to package.json
  • Make docs link more prominent in the README
  • Proofread, I guess

Add `Alt` for decoder functions

We currently have examples in tests where we use <|> from Result or Option to essentially fall through multiple decoders until we find one that works. This isn't ideal because it requires you to run each decoder, even if the first one succeeds.

Arguably, Decode.oneOf already solves this use case, but in the spirit of making decoders more composable (#23), we could add Decode.alt that looks something like this:

type stringOrInt = S(string) | I(int);
let decode =
  Decode.alt(
    Decode.map(v => S(v), Decode.string),
    Decode.map(v => I(v), Decode.intFromNumber),
  );

// Ok(I(3));
let decoded = decode(Js.Json.number(3.0));

  • Since you'd be applying alt to the decode functions themselves (instead of the output option/result), we should be able to only run subsequent decoders if we haven't yet found a successful value
  • We can probably rewrite oneOf to use this alt function?
  • If possible, this is a good candidate to use the TriedMultiple error to track the failure from each decoder that has been attempted (#26)

Prepare for 1.0

There are a few smallish changes planned for 0.6, but once those are done, there's not a lot of immediate work that I have planned. We've been using bs-decode in production for some time, and the API has been staying pretty stable. I think I'll soon be ready-ish to tag a 1.0 release, but along with that, we should handle some of the fluff that comes along with real projects:

  • Set up CI
  • Track test coverage
  • Add a changelog
  • Add contributing
  • Add badges to the readme
  • add gitattributes
  • add npmignore
  • Add a link to "composable error handling" in the "which monad" section of the docs
  • Submit to redex
  • Remove deprecated functions

Rethink custom/extensible validations

The Problem:

One pervasive challenge in writing a decoding library is that OCaml/Reason's type system can express types that JSON itself can't represent. This can be seen in the intFromNumber and date decoders, which impose rules on top of JSON values that JSON itself doesn't define.

It becomes an even more complex issue when dealing with Reason's "variants" which have no obvious mapping to/from JSON. In the past, bs-decode has provided two solutions to this:

  • the ExpectedValidOption error, which is sort of a catch-all, but it carries no useful error information
  • the ParseError.ResultOf and Decode.Make module functors, a powerful but complex mechanism that allows extending the underlying error type

Variant decoders have gotten easier to express with alt/oneOf and the recent addition of literal decoders, but the foundational literal decoders use ExpectedValidOption, which means they don't give great error messages on failure.

The Proposal:

I haven't fully thought through this, but I'm considering adding a FailedValidation('e) error constructor.

Int and Date decoding would take advantage of this (eliminating the need for ExpectedInt and ExpectedValidDate). Literal decoders could use this, eliminating the need for ExpectedValidOption, and since the error carries a polymorphic payload, it would be easy to extend, hopefully without needing to get module functors involved.

The big thing I haven't fully figured out is how error reporting would work. Since we'll be using these validations internally, I think that means the 'e type will actually be an open polymorphic variant. For functions like ParseError.toString, you'll need to tell us how to convert your extensions to a string, but hopefully not the entire variant. :> might get involved, which is too bad, but I still think this is worth trying.

If implemented, we should:

  • Add the constructor in ParseError
  • Add a new validate function that is sort of like a combination of map and flatMap that lets the user return their own custom validations
  • Use validate for ints (and remove ExpectedInt)
  • Use validate for dates (and remove ExpectedValidDate)
  • Use validate for literal decoders (and remove ExpectedValidOption)
  • Probably remove ExpectedTuple (also see #121)
  • Document usage with really clear examples

Document decoding recursive structures

This may very well already work perfectly. If that's the case, we should just document it and maybe add a test. If it doesn't work, we should make it possible.

The Elm example uses nested comments as its recursive structure. Elm has a special helper function (lazy), but that might be because its decoders have a newtype wrapper, while ours are plain functions (and might be more naturally lazy as a result?).
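
For discussion, a hedged sketch of what this could look like today, since our decoders are plain functions (the comment type and field names are made up, and Decode is aliased the same way as in the README example):

module Decode = Decode.AsResult.OfParseError;

type comment = {
  message: string,
  responses: list(comment),
};

// `let rec` is enough: the recursive call only happens once JSON is supplied
let rec decodeComment = json =>
  Decode.Pipeline.(
    succeed((message, responses) => {message, responses})
    |> field("message", string)
    |> field("responses", list(decodeComment))
    |> run(json)
  );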

Decode tuple from array

It would be handy to be able to decode JSON ["a", 0, true] into Ok(("a", 0, true)). You can emulate this now by decoding into an array of json, then flatmapping and checking the size of the array (then decoding each required position), but it's cumbersome, and in the world of ParseError, it isn't clear what kind of error you want to return if the size is wrong.

Proposal:

  • Add a new constructor ExpectedTuple(int, json)
  • Use size as a first pass, before even worrying about inner decoders
  • If size is ok, do the inner decodes, collecting errors as a normal ParseError.Arr
  • Debug string version is "Expected tuple of size <int> but found <json>"
  • API looks like Decode.(tuple3(string, int, bool))
  • API provides helpers up to size... 5?
  • Fail if array is longer than expected tuple size
  • Provide an option to tolerate longer-than-expected arrays (tupleAtLeast4? not a great name...)
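
To make the behavior concrete, here's a hand-rolled sketch (not the proposed implementation): the real version would produce the proposed ExpectedTuple error rather than taking a caller-supplied ~sizeError.

let tuple3 = (~sizeError, decodeA, decodeB, decodeC, json) =>
  switch (Js.Json.decodeArray(json)) {
  | Some([|a, b, c|]) =>
    // size is right; decode each position and keep the first failure, if any
    switch (decodeA(a), decodeB(b), decodeC(c)) {
    | (Ok(a), Ok(b), Ok(c)) => Ok((a, b, c))
    | (Error(e), _, _) => Error(e)
    | (_, Error(e), _) => Error(e)
    | (_, _, Error(e)) => Error(e)
    }
  | Some(_)
  | None => Error(sizeError)
  };

// usage sketch (decoder names assume the usual Decode.AsResult.OfParseError alias):
// tuple3(~sizeError=myError, D.string, D.intFromNumber, D.boolean, json)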

Update "decoding objects" docs

I don't have time at this exact moment, but there are some docs in need of updating: https://mlms13.github.io/bs-decode/docs/decoding-objects#pipeline-style

In particular:

  • It might be helpful to explain a bit more how the pipeline works (it leverages partial application, and the order of field decoders should match the order of arguments to the make function)
  • The Haskell-like section refers to ResultUtil, which I think has been removed, since decode functions themselves are now members of Functor and Apply. That example should be a lot cleaner now.

Don't use eager/parallel alt function

In thinking about an approach to #36, I realized that DecodeBase currently requires your output monad to implement Alt, but we also require you to include a lazyAlt function (which should only be run if the first monad is unsuccessful).

The normal Alt implementation assumes we're going to actually run two decoders and pick the first success, which isn't great for performance, especially if you're working with large json objects. We should drop any dependency on Alt.alt and just use lazyAlt everywhere.

  • When implementing #36, make sure we use the lazy version of alt
  • Internally, rewrite the date decoder (and any others that currently use alt?) to take advantage of these changes
  • Remove the need to construct DecodeBase with an Alt module
  • Write tests to ensure that subsequent decoders aren't run once a success is found
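
For clarity, a hedged sketch of the lazy-alt idea for a result-based output: the second computation is thunked, so it only runs when the first attempt has failed.

let lazyAlt = (first, getSecond) =>
  switch (first) {
  | Ok(_) as ok => ok
  | Error(_) => getSecond()
  };

// usage sketch: only run decodeB when decodeA failed
// let decode = json => lazyAlt(decodeA(json), () => decodeB(json));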

Delete all deprecated features

First things first:

  • Move all tests out of the "option" and "string nel" -specific test files
  • Add an "Upgrading from v1" guide to the docs website
  • #127

Then the deletions:

  • Delete Decode_AsResult_OfStringNel
  • Delete Decode_AsOption
  • Delete Decode_NonEmptyList
  • Delete Decode_Base.Pipeline
  • Delete Decode.Make
  • Delete Decode_ParseError.ResultOf (and related helper modules)
  • Delete mapN helpers
  • Delete tupleN, tupleAtLeastN decoders
  • Delete stringMap
  • Delete variantFrom*

For each deletion, make sure the upgrade path is clearly outlined and that any recommended new functions are documented in a guide on the website.

Add `unit` decoder

In some cases, it may be useful to have a decoder that takes in JSON and always returns a successful (). For example:

  • When using the new tuple decoders, if you have a JSON array like: ["A", 0, someReallyComplexObject, false], even if you don't care about the complex object, you have to write a decoder for it if you want to pick out the other 3 items
  • If you receive an array of some complex structure and the only thing you care about is its length, it would be nice to Decode.(array(unit) |> map (Array.length)) without needing to write a real decoder for the inner structure
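
A hedged sketch of the decoder itself is tiny, and the array-length example from the second bullet then works without writing a real inner decoder:

module D = Decode.AsResult.OfParseError;

// always succeed, discarding the JSON value
let unit = (_json: Js.Json.t) => Ok();

// count the items of a JSON array without decoding them
let arrayLength = json =>
  switch (D.array(unit, json)) {
  | Ok(items) => Ok(Array.length(items))
  | Error(_) as err => err
  };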

Clean up top-level modules

  • Decode is a bad name for the top-level module because it could easily conflict with other Decode modules (e.g. in a downstream project that depends on bs-decode)
  • Decode.ParseError might not be a very useful alias in the top-level module, because parse errors are only useful inside the context of Decode.AsResult.OfParseError, which provides its own alias
  • DecodeBase isn't aliased anywhere, so if you want to construct your own custom decoders, you have to use a second global module

Proposal:

  • Rename DecodeBase to Decode_Base
  • Alias Decode_Base inside the top-level Decode as Base
  • While we're renaming things, maybe Base should be Make
  • Update some docs, probably

This will be a breaking change (but one that should be a super easy migration for most projects), so I'd like to get it in before 1.0.

decoding-variants#complex-variants

Hi folks, I tried to run this example: complex-variants

module R =
  Decode.ParseError.ResultOf({
    type t = [ Decode.ParseError.base | `InvalidColor | `InvalidShape];
    let handle = x => (x :> t);
  });

module D = Decode.Base.Make(R.TransformError, R);
type color = Blue | Red | Green;

let parseColor =
  fun
  | "Blue" => Ok(Blue)
  | "Red" => Ok(Red)
  | "Green" => Ok(Green)
  | str => Error(Decode.ParseError.Val(`InvalidColor, Js.Json.string(str)));

let decodeColor = json => D.string |> Relude.Result.flatMap(parseColor);

Got the following error:

/DecodingVariantsComplex.re 18:39-71

  16 │   | str => Error(Decode.ParseError.Val(`InvalidColor, Js.Json.string(str)));
  17 │
  18 │ let decodeColor = json => D.string |> Relude.Result.flatMap(parseColor);

  This value might need to be wrapped in a function that takes an extra
  parameter of type Js.Json.t

  Here's the original error message
  This has type:
    Relude.Result.t(string, Decode.ParseError.t(([> `InvalidColor ] as 'a))) =>
    Relude.Result.t(color, Decode.ParseError.t('a))
  But somewhere wanted:
    (Js.Json.t => R.t(Js.String.t)) => 'b

  The incompatible parts:
    Relude.Result.t(string, Decode.ParseError.t('a)) (defined as
      Belt_Result.t(string, Decode.ParseError.t('a)))
    vs
    Js.Json.t => R.t(Js.String.t)

FAILED: subcommand failed.
>>>> Finish compiling(exit: 1)

I tried digging into the source, but I am missing something. Thank you!

Implement MonadThrow

While it's probably not a super common use case, it would make sense to add a throw function that constructs a decoder that always fails with the given error. This is the error version of pure.
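
For illustration, a hedged sketch of the shape (ignoring how it would be wired into the MonadThrow module type):

// a decoder that ignores its JSON input and always fails with the given error
let throw = (error, _json: Js.Json.t) => Error(error);

// usage sketch: throw(someParseError) is a decoder that always produces Error(someParseError)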

Plan for v2

Now that v1 is out the door, it's time to start thinking about all the changes we might want for a v2. At a high level, the goals of a 2.0 release will be:

  • Modernizing the stack
  • Picking a preferred path (decoding into a result with a structured error type) and optimizing for that use case

So with those goals in mind, the roadmap:

Build/CI

  • Switch to Melange for compilation
  • Switch from npm to yarn
  • Use Github Actions for CI
  • Do way better at caching inside of CI
  • Add a CI step to publish to npm

Library features

  • #127
  • #126
  • #122
  • Make ParseError's string-ification less confusing

Docs/website

  • Migrate the website to Docusaurus 2
  • Add a CI step to publish the website
  • #128
  • Build docs in CI
  • Embed odoc documentation in the website

Add interface files

Even if I have to write them by hand, it seems pretty important at this point. Editor completions are confusing, compiler errors are unhelpful (referencing things like BsAbstract.Option.Monad.t instead of option), and in some cases, Decode.AsOption seems to have a hard time working with existing option utilities because of the way the types are being inferred.

Fix `toDebugString`

I think it's pretty close, but...

  • Make a version that can be called with only the error
  • Make an example that takes advantage of the extensible nature of DecodeBase.failure
  • Mention this in the readme

Allow named arguments in record constructors

First of all, I'm really loving this library. It's been extremely useful for several projects I'm working on. So thanks for putting in the time to build this!

I just wanted to suggest a potential enhancement. When constructing large records of data, it can be cumbersome and error-prone to pass 20+ arguments to that constructor, especially when half of them are just string or bool types, so I often use named arguments to alleviate that issue a bit.

It would be nice if I could pass that same make function into a decoder pipeline without any errors. As it stands, I need to duplicate the make function with the arguments left unnamed, and use that for decoding.

It's not a huge issue at all, more of a 'nice-to-have' feature. But I also don't know if this is actually feasible or even possible at all. I'd be interested to hear your thoughts on that.
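
For context, here's a hedged illustration of the duplication being described, using the Decode alias from the README's usage example (the unlabeled wrapper is the part this request would like to eliminate):

module Decode = Decode.AsResult.OfParseError;

type user = {
  name: string,
  age: int,
};

// the labeled constructor I'd like to pass to the pipeline directly
let make = (~name, ~age) => {name, age};

// the unlabeled duplicate currently needed so partial application lines up
let makeUnlabeled = (name, age) => make(~name, ~age);

let decode = json =>
  Decode.Pipeline.(
    succeed(makeUnlabeled)
    |> field("name", string)
    |> field("age", intFromNumber)
    |> run(json)
  );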

Make `decodeInt` fail on float

Currently we use OCaml's int_of_float which is happy to drop the fractional part. Elm decided that it's more appropriate to reject floats when parsing for int, and I tend to agree.

Maybe something like this?

let floatToMaybeInt = f =>
  // mod_float(f, 1.0) is the fractional part; zero means f is a whole number
  switch (mod_float(f, 1.0)) {
  | 0. => Some(int_of_float(f))
  | _ => None
  };

Suggestion: Function to return a Result of list of values

Wow, loving what you've done with this library!

Here's a little feature suggestion:

I often end up with an array(Js.Json.t) from my database library. Each individual row is already parsed into its own JSON.

So I can map over the array, and apply the decoder. Then I get array(Belt.Result.t(v, err)). But most of the time that I'm decoding JSON, I'm already in the context of the Result monad, and I'd like to end up with an inverted type: Belt.Result.t(array(v), array(err)). Because, practically, if I can't decode all of the items, I'd like to fail my outer context, and just log that entire list of decode errors.

I'm writing a utility function right now in my own code to perform that flip. But it'd be really convenient if there were a function to return a type like that already in this library.
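
For reference, here's a hedged, hand-rolled sketch of that flip, producing the Belt.Result.t(array(v), array(err)) shape described above:

// decode every row; succeed with all values or fail with all collected errors
let traverseResults = (decode, rows) => {
  let (values, errors) =
    Array.fold_left(
      ((values, errors), json) =>
        switch (decode(json)) {
        | Ok(v) => ([v, ...values], errors)
        | Error(e) => (values, [e, ...errors])
        },
      ([], []),
      rows,
    );
  switch (errors) {
  | [] => Ok(values |> List.rev |> Array.of_list)
  | _ => Error(errors |> List.rev |> Array.of_list)
  };
};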

Another workaround I could use is to convert the database result back into JSON, and then just make my decoder into a list.

Thanks!

Production users, roadmap?

Hi!

Thanks for implementing this library. I am debating using it, and I wanted to understand its plans for ongoing maintenance. Is this library planning to follow ReScript closely, or is it focused on ReasonML? Also, it would help if you listed known production users in the readme.

I reviewed the relude documentation.

Thanks

Decoding a JSON array with 8000 items blows the stack in Chrome

At the moment, I don't have proof that this issue is coming from bs-decode, but when we updated from v0.3.3 to v0.6.1 in our production app, we started getting "maximum call stack size exceeded" failures in Chrome that seem to be coming from inside bs-decode (it's all minified, so hard to tell for sure at the moment).

I can't think of what specifically would have changed to cause this between 0.3.3 and 0.6.1, but I know we're using List.map from bs-abstract, which uses the version from the OCaml stdlib, which isn't tail-recursive.

  • Do some debugging in code that hasn't been minified to get a better idea of what might be going on
  • Try to make a small, reproducible test case
  • Tweak the array decode function until it works?

Add a new ParseError constructor "TriedMultiple"

This constructor should be used when we attempt multiple decoders for a single value and all fail (for example oneOf, which currently only tracks the final failure).

  • Construct this with a NonEmptyList of parse errors
  • Use this for oneOf
  • Use this for Decode.alt if that exists?
  • Update ParseError docs
  • Probably tests, I guess 😛

Alias "base" decoders in the Pipeline module

This is mostly to allow for easier local opens, e.g.:

let decode = json =>
  Decode.AsResult.OfParseError.Pipeline.(
    succeed(make)
    |> field("name", string)
    |> field("age", intFromNumber)
    |> run(json)
  );

Expose type for `Nel`?

I don't have the non-empty list library added to my project, so I discovered that I can't explicitly type the return type of a decoder function.

It'd be nice either to expose the non-empty list type, for explicit typing, or maybe to expose a type-alias for Decode.resultOfStringNelDecoder('a) or something like that. What do you think?

Report errors as JSON

One common use case for bs-decode is a web server that decodes the payload of a request. In this case, it would be desirable to send back a 400 response whose body includes structured errors describing which parts of the request were invalid.

Something like:

{
  "first_name": {
    "error": "MissingField",
    "description": "Field \"first_name\" is required but was not present"
  },
  "order_ids": [
    {
      "position": 3,
      "error": "InvalidValue",
      "description": "Expected number but found \"3\""
    }
  ]
}

Not sure what the exact structure should be, but it should be something predictable and easy to describe, so that your webserver can document the errors it will return and clients can theoretically do something useful with that data structure.

We already have examples of converting the structured ParseError type to a string in a way that preserves some of the nesting structure, so I think this should be pretty doable.

Polish up Decode_ParseError

  • If you alias like this: module Decode = Decode.AsResult.OfParseError, there's no way to get to the ParseError module. We should probably alias it in ...OfParseError.
  • No rei files exist for Decode_ParseError. I cleaned up some unused junk, so at this point, we can probably just generate the interface and stick to that.
  • Add a test/demo of building a custom collection of errors with Decode.ParseError.ResultOf. We used to have something like this, but it got lost in the test refactor (and apparently wasn't needed for full coverage)

Add optional, fallback, and hardcoded fields to the pipeline

Currently we only have required fields in the Decode_Pipeline version. I imagine it should be pretty easy to build in a few helpers. Some of these functions can probably be written in terms of others... maybe? Like, optional maps a success to Some and adds a fallback of None.
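
A hedged sketch of those relationships in terms of plain field decoders (names are illustrative; the real work is adapting them to the pipeline's shape):

module D = Decode.AsResult.OfParseError;

// succeed with Some on success, fall back to None on any failure
let optionalField = (name, decoder, json) =>
  switch (D.field(name, decoder, json)) {
  | Ok(v) => Ok(Some(v))
  | Error(_) => Ok(None)
  };

// ignore the JSON entirely and supply a fixed value
let hardcoded = (value, _json) => Ok(value);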
