
Comments (8)

gregsdennis commented on August 15, 2024

a lot of the benefit of JSON Schema results when interoperability is achieved

That's absolutely correct. We should minimize what we leave undefined. A specification defines a set of behaviors which an implementation must exhibit. That means that users of the implementation can rely on that set of behaviors. None of this is in question here.

This issue is about the language used to define the behavior. I think it makes more sense for the language to define restrictions rather than give permission.

Saying

A JSON Schema MAY contain properties which are not schema keywords.

is a permission, but saying

Implementations MUST NOT error when encountering properties which are not schema keywords.

is a restriction.

It's defining a "compliance box". Within the box, implementations are expected to behave a certain way. They can still operate outside of the box if they choose, but users shouldn't expect such operation to be interoperable because the spec doesn't address it.
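To make the "compliance box" concrete, here is a minimal Python sketch of a toy validator that satisfies the restrictive phrasing above. Every name in it (`KNOWN_KEYWORDS`, `validate`) is invented for illustration; no real JSON Schema implementation is structured this way.

```python
import json

# Hypothetical toy validator illustrating the "compliance box" framing.
# Only "type" is treated as a known keyword in this sketch.
KNOWN_KEYWORDS = {"type"}

def validate(schema: dict, instance) -> bool:
    for keyword, value in schema.items():
        if keyword not in KNOWN_KEYWORDS:
            # "MUST NOT error when encountering properties which are not
            # schema keywords": unknown properties are skipped, not rejected.
            continue
        if keyword == "type" and value == "integer":
            if not (isinstance(instance, int) and not isinstance(instance, bool)):
                return False
    return True

schema = json.loads('{"type": "integer", "x-note": "not a keyword"}')
print(validate(schema, 42))    # True
print(validate(schema, "42"))  # False
```

The point of the sketch: the restriction ("MUST NOT error") pins down the interoperable behavior without needing a separate permission grant for non-keyword properties.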

@jdesrosiers said it well, I think:

I think the spec should be a lot like JSON Schema itself, by default anything is allowed and the constraints put limits on what implementations can do.

from json-schema-spec.

gregsdennis commented on August 15, 2024

I completely agree, @awwright. My concern isn't with the BCP 14 keywords, but rather how the requirements are defined.

Currently there's a mix of "implementations MAY do X" and "implementations MUST do Y". In these phrases, "MUST" creates a boundary: a line that implementations can't cross (without operating outside of the spec). However, "MAY" is giving permission to do something.

The way I see it, if you define a boundary of behavior, implementations MAY do whatever they want within that boundary without any kind of explicit permission to do so. The only reason to use "MAY", then, is to allow for a behavior that exists outside of the defined boundary. I'd just prefer to define the boundary correctly from the beginning.

some construct might be prohibited in schemas ("MUST NOT") because of known interoperability issues

I also think we should avoid language that places requirements on schema authors. Such requirements have no benefit unless an implementation provides behavior to enforce it. So putting a requirement on the author necessarily creates an implicit requirement for the implementation, and such requirements can be difficult to identify. Instead, we should be defining direct requirements on implementations that enforce a particular behavior from authors. (e.g. "Don't create that construct because the implementation will error.")

So for this example, we'd put a requirement on the implementation to detect and disallow this construct; schema authors will naturally fall in line. Implementations could still offer an opt-in to support the construct, but that's the key for me: such behavior needs to be opt-in.


gregsdennis commented on August 15, 2024

Another style consideration is that in several places, we add requirements on schema authors, e.g. "Schema authors SHOULD NOT use pointers that cross resource boundaries." Would it be better to instead place limitations of support on implementations, e.g. "Implementations SHOULD NOT support pointers that cross resource boundaries"?

By placing the requirement on the implementation, it forces schema authors to conform.
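For illustration, a hedged sketch of what such an implementation-side check might look like. `crosses_resource_boundary` is a hypothetical helper, and real implementations resolve `$id`/`$ref` far more carefully; the sketch only captures the idea of detecting a JSON Pointer that steps into an embedded resource (a subschema carrying its own `$id`).

```python
# Hypothetical check: does walking this JSON Pointer step *into* a
# subschema that declares its own "$id" (i.e. a separate resource)?
def crosses_resource_boundary(root: dict, pointer_tokens: list) -> bool:
    node = root
    for token in pointer_tokens:
        node = node[token]
        if isinstance(node, dict) and "$id" in node:
            # We have entered a new schema resource mid-pointer.
            return True
    return False

root = {
    "$id": "https://example.com/root",
    "$defs": {
        "embedded": {
            "$id": "https://example.com/embedded",  # a separate resource
            "$defs": {"inner": {"type": "string"}},
        }
    },
}

# "#/$defs/embedded/$defs/inner" reaches across the embedded resource:
print(crosses_resource_boundary(root, ["$defs", "embedded", "$defs", "inner"]))  # True
print(crosses_resource_boundary(root, ["$defs"]))  # False
```

An implementation with this check could reject the boundary-crossing pointer outright, which is what would force schema authors to conform.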


jdesrosiers commented on August 15, 2024

I agree that what you call "restrictive language" is better. I advocate for this all the time, although I've never described it that way. I think the spec should be a lot like JSON Schema itself: by default anything is allowed, and the constraints put limits on what implementations can do. For example, pointers crossing resource boundaries shouldn't be allowed or disallowed, just undefined. Implementations can handle this however they want, but schema authors shouldn't rely on that behavior. I think sticking to restrictive language could help with the bloat the spec has accumulated as well.

As for places where we put requirements on schema authors, that's another thing that bothers me as well. We definitely shouldn't be placing requirements on schema authors. That doesn't make sense. The spec is for implementations.


jviotti commented on August 15, 2024

For example, pointers crossing resource boundaries shouldn't be allowed or disallowed, just undefined

I've never written specs myself, so I'm wondering: what is the benefit of leaving things open in this way? It seems to me that a lot of the benefit of JSON Schema results when interoperability is achieved, and these gray areas tend to be where people get really confused. If we can help it, wouldn't it be better to reduce undefined behaviour?


jdesrosiers commented on August 15, 2024

I've never written specs myself, so I'm wondering: what is the benefit of leaving things open in this way?

Sometimes it's just to not invalidate the behavior of existing, well-established implementations. For example, in JSON, the behavior of duplicate keys in an object is undefined. Different implementations handled that in different ways, and it's nonsense anyway, so saying it's undefined allows existing implementations to be compliant rather than insisting on a specific behavior for something that people couldn't sensibly use in the first place.

I personally see the pointers crossing resource boundaries exactly that way. It's nonsense, and people should never do it even if it happens to work, but requiring that it produce an error could require a significant change in the architecture of many existing implementations. Placing that burden on existing implementations isn't necessary for something that doesn't make sense anyway.
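The duplicate-key example is easy to demonstrate. Python's standard `json` module is one concrete data point: it silently keeps the last value for a duplicated key, while other parsers keep the first value or reject the document outright. That divergence is exactly what "undefined" accommodates.

```python
import json

# RFC 8259 leaves duplicate-key behavior undefined. CPython's json
# module resolves the conflict by keeping the LAST occurrence:
doc = '{"port": 8080, "port": 9090}'
print(json.loads(doc))  # {'port': 9090}
```

A spec that mandated one specific resolution here would retroactively make many long-deployed parsers non-compliant, for a construct nobody should be writing anyway.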


jviotti commented on August 15, 2024

Makes sense!

I personally see the pointers crossing resource boundaries exactly that way. It's nonsense, and people should never do it even if it happens to work, but requiring that it produce an error could require a significant change in the architecture of many existing implementations. Placing that burden on existing implementations isn't necessary for something that doesn't make sense anyway.

Do we have a list of things that are possible but that we think are nonsense and people should never do? I would love to write linter rules for these.


awwright commented on August 15, 2024

The normative language could definitely be reviewed and streamlined. The relevant specification is BCP 14. Some amount of mixed language may be necessary, though. The most important purpose of the all-caps BCP 14 language is interoperability, and the selection of prescriptive vs. proscriptive language will still be mixed when defining what makes the protocol or format interoperable or forward compatible. You will often find statements in complementary pairs like "A validator MUST reject schemas that..." and "Schemas MUST NOT specify..." because normative requirements usually target one party at a time, and truly prohibiting something necessitates normative language for both parties.

For example, some construct might be prohibited in schemas ("MUST NOT") because of known interoperability issues. But this doesn't impact validators; the specification might still permit validators to handle the construct, or it might require an error ("MUST reject"), because any new usage would harm interoperability. A prohibition on schemas doesn't imply what validators ought to do one way or the other.

