
avro-util's People

Contributors

abhisheknath2011, ahmedahamid, ajinkya-dande-git, alex-dubrouski, asautins, cbrentharris, dg-builder, felixgv, flowenol, gaojieliu, guawang, idanz, karthikrg, kcepmada, kehuum, kloong87li, krisso-rtb, li-ukumar, lincong, maciejkowalczyk, majisourav, majisourav99, neel24, pugupta-linkedin, radai-rosenblatt, sixpluszero, srramach, volauvent, xiaoyu-yang-gh, yuzheng21

avro-util's Issues

incompatible types: String cannot be converted to Utf8

Using FastSpecificDatumWriter fails with the following compilation errors:

/var/folders/wy/_4jkfljn7hbb5ltnzlm4jggh0000gn/T/generated10204258197036940757/com/linkedin/avro/fastserde/generated/serialization/AVRO_1_8/User_SpecificSerializer_2352263923167790722.java:43: error: incompatible types: String cannot be converted to Utf8
            if (favorite_color0 instanceof Utf8) {
                ^
/var/folders/wy/_4jkfljn7hbb5ltnzlm4jggh0000gn/T/generated10204258197036940757/com/linkedin/avro/fastserde/generated/serialization/AVRO_1_8/User_SpecificSerializer_2352263923167790722.java:44: error: incompatible types: String cannot be converted to Utf8
                (encoder).writeString(((Utf8) favorite_color0));
                                              ^
2 errors
23:52:59.819 [avro-fastserde-compile-thread-1] WARN com.linkedin.avro.fastserde.FastSerdeCache - Serializer class instantiation exception
com.linkedin.avro.fastserde.FastSerdeGeneratorException: com.linkedin.avro.fastserde.FastSerdeGeneratorException: Unable to compile:User_SpecificSerializer_2352263923167790722
	at com.linkedin.avro.fastserde.FastSerializerGenerator.generateSerializer(FastSerializerGenerator.java:87)
	at com.linkedin.avro.fastserde.FastSerdeCache.buildFastSpecificSerializer(FastSerdeCache.java:403)
	at com.linkedin.avro.fastserde.FastSerdeCache.buildSpecificSerializer(FastSerdeCache.java:410)
	at com.linkedin.avro.fastserde.FastSerdeCache.lambda$getFastSpecificSerializer$4(FastSerdeCache.java:260)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.linkedin.avro.fastserde.FastSerdeGeneratorException: Unable to compile:User_SpecificSerializer_2352263923167790722
	at com.linkedin.avro.fastserde.FastSerdeBase.compileClass(FastSerdeBase.java:135)
	at com.linkedin.avro.fastserde.FastSerializerGenerator.generateSerializer(FastSerializerGenerator.java:82)
	... 7 common frames omitted

Looking at the output, it seems the generator produced code that does not compile:

 40         String favorite_color0 = ((String) data.get(2));
 41         if (favorite_color0 instanceof String) {
 42             (encoder).writeIndex(0);
 43             if (favorite_color0 instanceof Utf8) {
 44                 (encoder).writeString(((Utf8) favorite_color0));
 45             } else {
 46                 (encoder).writeString(favorite_color0 .toString());
 47             }
 48         } else {
 49             if (favorite_color0 == null) {
 50                 (encoder).writeIndex(1);
 51                 (encoder).writeNull();
 52             }
 53         }

This is a test demonstrating the failure: https://github.com/izeye/samples-java-branches/blob/avro-util/src/test/java/learningtest/avro/UserTests.java#L136
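
For reference, a minimal reproduction along the lines of the linked test might look like the sketch below. This is an assumption-laden sketch: User stands in for an Avro-generated SpecificRecord class whose optional string field favorite_color is declared with avro.java.string=String, so its Java type is plain String.

import java.io.ByteArrayOutputStream;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;
import com.linkedin.avro.fastserde.FastSpecificDatumWriter;

public class Utf8Repro {
    public static void main(String[] args) throws Exception {
        // "User" is an assumed generated SpecificRecord with an optional string field
        // "favorite_color" declared with avro.java.string=String (Java type: String).
        User user = new User();
        user.setName("test");
        user.setFavoriteColor("blue");

        FastSpecificDatumWriter<User> writer = new FastSpecificDatumWriter<>(User.getClassSchema());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Encoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        // triggers background generation of the serializer shown above;
        // its compilation fails and the WARN above is logged
        writer.write(user, encoder);
        encoder.flush();
    }
}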

avro-fastserde: No support for "avro.java.string" schema property.

Hi,

The current implementation does not support the avro.java.string property, which allows plain Java Strings to be used in place of Utf8 wrappers. This is especially useful with generated specific record classes. The feature is very important for us and blocks our migration from the original avro-fastserde branch. Below is a gist with an example test case that replicates the problem.

test case
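
For context, the property in question looks like this in a schema. A small sketch follows, with illustrative record/field names (assumptions, not taken from the gist):

import org.apache.avro.Schema;

public class StringPropExample {
    public static void main(String[] args) {
        // The "avro.java.string": "String" hint tells Avro code generation to use
        // java.lang.String for this field instead of org.apache.avro.util.Utf8.
        Schema schema = new Schema.Parser().parse(
            "{\"type\": \"record\", \"name\": \"User\", \"fields\": ["
          + "  {\"name\": \"name\", \"type\": {\"type\": \"string\", \"avro.java.string\": \"String\"}}"
          + "]}");
        System.out.println(schema.getField("name").schema().getProp("avro.java.string")); // prints "String"
    }
}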

slf4j logging API

Hi,

Is there any reason why we can't just use the slf4j API for logging? Currently we use the Apache log4j logger via a dependency on the slf4j-log4j12 binding, thus forcing users to use the log4j binding for slf4j. This is inconvenient because, for example, in our case this binding conflicts with the log4j-over-slf4j bridge, as we delegate log4j calls via slf4j to logback. Plain use of the slf4j API does not force users into any specific logging implementation.

This unfortunately forces us to exclude the "slf4j-log4j12" dependency whenever we use avro-fastserde.

[helper] Don't use the existence of one class to infer things about OTHER classes.

In Avro17Utils.java, we check for the existence of org.apache.avro.JsonProperties to see if our runtime Avro is v1.7.3 or later. Assuming that's true, we proceed to operate on the Schema.Field that we are given, expecting that the Field is a subclass of JsonProperties.

But that can throw an exception if an older version of Avro is earlier on the classpath, and Avro 1.7.3+ is merely present later on the classpath. I.e., Schema.Field comes from Avro 1.7.2 or earlier, and thus does not extend JsonProperties. But the class loader will find JsonProperties, since it does exist on the classpath.

Here's an example of the exception that's thrown:

Exception in thread "main" java.lang.IllegalStateException: java.lang.IllegalArgumentException: object is not an instance of declaring class
	at com.linkedin.avroutil1.compatibility.avro17.Avro17Utils.getJsonProp(Avro17Utils.java:80)
	at com.linkedin.avroutil1.compatibility.avro17.Avro17Adapter.getFieldPropAsJsonString(Avro17Adapter.java:361)
	at com.linkedin.avroutil1.compatibility.AvroCompatibilityHelper.getFieldPropAsJsonString(AvroCompatibilityHelper.java:728)
	at com.linkedin.avroutil1.compatibility.AvroCompatibilityHelper.getFieldPropAsJsonString(AvroCompatibilityHelper.java:713)
	...

This is not an uncommon scenario. In our company's codebase, we do often see classpaths where multiple versions of Avro are present.

There may be other instances of such incorrect determination, elsewhere in the compat helper codebase. I'll update this bug as I find them.
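
One possible direction, sketched here as an assumption rather than a patch, is to base the decision on the class we actually intend to operate on, instead of on the mere presence of JsonProperties on the classpath:

import org.apache.avro.Schema;

public final class JsonPropertiesCheck {
    private JsonPropertiesCheck() {
    }

    /**
     * Returns true only if the Schema.Field class that actually got loaded extends
     * org.apache.avro.JsonProperties, i.e. the effective Avro on the classpath is 1.7.3+.
     */
    public static boolean fieldExtendsJsonProperties() {
        try {
            Class<?> jsonProperties = Class.forName("org.apache.avro.JsonProperties");
            return jsonProperties.isAssignableFrom(Schema.Field.class);
        } catch (ClassNotFoundException e) {
            return false; // JsonProperties is not on the classpath at all
        }
    }
}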

Limit on the total number of generated (de)serializers

Currently, there is no limit on the total number of de/serializer classes fast-avro can generate and load at runtime. If there are too many schemas, the generated code may saturate the application's code cache and cause GC issues.

We could set a hard limit, or use an LRU cache to bound the number of de/serializer classes loaded at runtime.
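
A minimal sketch of the LRU idea, using illustrative names and limits (assumptions, not the actual FastSerdeCache design). Note that evicting an entry only drops our reference; actual unloading of the generated class additionally requires its classloader to become unreachable.

import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class BoundedSerdeCacheSketch {
    private static final int MAX_GENERATED_CLASSES = 1024; // hypothetical hard limit

    // access-ordered LinkedHashMap that evicts the least recently used entry beyond the limit
    private final Map<String, Object> generatedSerdes = Collections.synchronizedMap(
        new LinkedHashMap<String, Object>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Object> eldest) {
                return size() > MAX_GENERATED_CLASSES;
            }
        });

    public Object get(String schemaFingerprint) {
        return generatedSerdes.get(schemaFingerprint);
    }

    public void put(String schemaFingerprint, Object serde) {
        generatedSerdes.put(schemaFingerprint, serde);
    }
}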

NPE in FastDeserializerGenerator.java:920

com.linkedin.avro.fastserde.FastDeserializerGeneratorException: java.lang.NullPointerException
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.generateDeserializer(FastDeserializerGenerator.java:156) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastSerdeCache.buildFastSpecificDeserializer(FastSerdeCache.java:315) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastSerdeCache.buildSpecificDeserializer(FastSerdeCache.java:340) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastSerdeCache.lambda$getFastSpecificDeserializer$0(FastSerdeCache.java:213) ~[avro-fastserde-0.2.29.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: java.lang.NullPointerException
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processEnum(FastDeserializerGenerator.java:920) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processSimpleType(FastDeserializerGenerator.java:189) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processArray(FastDeserializerGenerator.java:716) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:169) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processRecord(FastDeserializerGenerator.java:314) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:165) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processArray(FastDeserializerGenerator.java:713) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:169) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processUnion(FastDeserializerGenerator.java:591) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:177) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.processRecord(FastDeserializerGenerator.java:314) ~[avro-fastserde-0.2.29.jar:?]
	at com.linkedin.avro.fastserde.FastDeserializerGenerator.generateDeserializer(FastDeserializerGenerator.java:118) ~[avro-fastserde-0.2.29.jar:?]

Provide version agnostic helper for fetching field properties

We need a version-agnostic helper method to access the JsonProperties of an Avro Schema and Field. That is, we need helper methods to get/add properties on Schema and Field, i.e. addAllProps, getObjectProp, getObjectProps, getProp, hasProps (examples given using the Avro 1.9.2 API). If providing helpers for all of these is too much, our current use case could be served by any of the following version-agnostic helpers, which would:

  1. get a list of property names
  2. getProps
  3. clone a record schema from another while preserving all the properties
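
Sketched below is one possible shape for such helpers; all names are assumptions for illustration, not the actual AvroCompatibilityHelper API:

import java.util.List;
import java.util.Map;
import org.apache.avro.Schema;

// Hypothetical, version-agnostic property-access surface implied by the request above.
public interface SchemaPropsHelper {
    List<String> getPropNames(Schema schema);             // 1. list property names
    Map<String, String> getProps(Schema.Field field);     // 2. getProps-style access
    Schema cloneRecordSchemaWithProps(Schema original);   // 3. clone a record schema, preserving all props
}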

Regression in EnumArray fast-avro serialization under Avro 1.4

Fast-avro serialization is slower than regular Avro for the EnumArray schema under Avro 1.4. See the benchmark results and schema below. It was faster before version 0.1.9.
Serialization under Avro 1.8 is fine.

Avro 1.4 benchmark result:

Benchmark                                                                           Mode  Cnt     Score       Error   Units
FastAvroSerdesBenchmark.testAvroSerialization                                       avgt    5  7431.495 ±   906.343   ns/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate                        avgt    5   413.867 ±    48.562  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate.norm                   avgt    5  3384.000 ±     0.001    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space               avgt    5   389.459 ±   865.511  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space.norm          avgt    5  3143.086 ±  6991.629    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space           avgt    5     0.044 ±     0.315  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space.norm      avgt    5     0.359 ±     2.556    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.count                             avgt    5     4.000              counts
FastAvroSerdesBenchmark.testAvroSerialization:·gc.time                              avgt    5    15.000                  ms
FastAvroSerdesBenchmark.testFastAvroSerialization                                   avgt    5  9664.812 ±  1175.680   ns/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate                    avgt    5   315.128 ±    36.635  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate.norm               avgt    5  3352.000 ±     0.001    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space           avgt    5   259.297 ±   933.239  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space.norm      avgt    5  2794.250 ± 10193.706    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space       avgt    5     0.202 ±     1.727  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space.norm  avgt    5     2.272 ±    19.417    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.count                         avgt    5     3.000              counts
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.time                          avgt    5    35.000                  ms

Avro 1.8 benchmark result:

Benchmark                                                                           Mode  Cnt     Score      Error   Units
FastAvroSerdesBenchmark.testAvroSerialization                                       avgt    5  8829.567 ± 4540.217   ns/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate                        avgt    5   350.364 ±  171.892  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate.norm                   avgt    5  3360.000 ±    0.001    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space               avgt    5   363.535 ±  820.904  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space.norm          avgt    5  3451.205 ± 8388.244    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space           avgt    5     0.046 ±    0.344  MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space.norm      avgt    5     0.504 ±    3.896    B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.count                             avgt    5     4.000             counts
FastAvroSerdesBenchmark.testAvroSerialization:·gc.time                              avgt    5    18.000                 ms
FastAvroSerdesBenchmark.testFastAvroSerialization                                   avgt    5  6423.710 ± 3154.787   ns/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate                    avgt    5   476.969 ±  243.782  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate.norm               avgt    5  3328.000 ±    0.001    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space           avgt    5   491.521 ±  218.395  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space.norm      avgt    5  3467.017 ± 2125.681    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space       avgt    5     0.198 ±    1.649  MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space.norm  avgt    5     1.595 ±   13.316    B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.count                         avgt    5     5.000             counts
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.time                          avgt    5    27.000                 ms

Schema (the test record's array contains 200 enum elements):

{
  "type": "record",
  "name": "BenchmarkSchema",
  "namespace": "com.linkedin.avro.fastserde.generated.avro",
  "doc": "Enum array",
  "fields": [
    {
      "name": "EnumArray",
      "type": {
        "type": "array",
        "items": {
          "type": "enum",
          "name": "Method",
          "symbols": [
            "OPTIONS",
            "GET",
            "HEAD",
            "POST",
            "PUT",
            "DELETE",
            "TRACE",
            "CONNECT",
            "EXTENSION"
          ],
          "doc": "HTTP method of the request.",
          "compliance": "NONE"
        }
      }
    }
  ]
}

avro-fastserde: Not compatible with Java 11.

The avro-fastserde-0.2.1.jar includes the package org.apache.avro.generic, which is also defined in the avro jar (in this case avro-1.9.2.jar). The Java 11 module system does not allow the same package to appear in more than one module, which results in the following error:

The package org.apache.avro.generic is accessible from more than one module: avro.fastserde, org.apache.avro

The older rtbhouse version of avro-fastserde did not include this package and did not have this issue.
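
For context, a minimal sketch of a module descriptor that would trigger this (module names taken from the error message above; the application module name is an assumption):

// module-info.java of a hypothetical application module
module example.app {
    requires org.apache.avro;   // avro-1.9.2.jar, which exports org.apache.avro.generic
    requires avro.fastserde;    // avro-fastserde-0.2.1.jar also contains org.apache.avro.generic -> split package
}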

[helper] Remove commons-text dependency.

It's only used in AvroCompatibilityHelper to unescape inner JSON. If we instead shunt that responsibility through to the adapters, they can do the job using whatever Jackson they already have, allowing us to get rid of this dependency.
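
A hedged sketch of what the adapter-side replacement could look like, assuming the escaped value arrives as a valid JSON string literal (class and method names are illustrative):

import com.fasterxml.jackson.databind.ObjectMapper;

public final class InnerJsonUnescape {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Reads a JSON string literal (surrounding quotes and backslash escapes included) and
    // returns its unescaped value, using Jackson instead of commons-text's StringEscapeUtils.
    public static String unescapeInnerJson(String quotedAndEscaped) throws Exception {
        return MAPPER.readValue(quotedAndEscaped, String.class);
    }
}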

getSchemaPropAsJsonString returns JSON string with quotes.

The API that getSchemaPropAsJsonString replaces (getProp) returns a JSON string without quotes, i.e. {...}. However, getSchemaPropAsJsonString is returning a string with quotes, i.e. "{...}". The behavior of getSchemaPropAsJsonString should match getProp if possible. Thanks!

find solution to the genericWriter + frankenObject problem

If you create a GenericRecord and populate some of its nested fields with specific objects (specifically, you need an enum somewhere in the graph), Avro fails when you try to encode the object.
The root cause is that the specific enum instance is not considered an enum by the GenericData class.

This is very likely an Avro bug, but it exists up to at least 1.10.1, so a solution that works with existing Avro versions is needed.
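
A minimal repro sketch of the failure mode described above. Assumptions: Suit stands in for an Avro-generated specific enum matching the enum declared in the schema string, and the field is a nullable union so the failure surfaces when GenericData resolves the union branch.

import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class FrankenObjectRepro {
    public static void main(String[] args) throws Exception {
        Schema recordSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Hand\",\"fields\":["
          + "{\"name\":\"suit\",\"type\":[\"null\",{\"type\":\"enum\",\"name\":\"Suit\",\"symbols\":[\"SPADES\",\"HEARTS\"]}]}]}");

        GenericRecord record = new GenericData.Record(recordSchema);
        record.put("suit", Suit.SPADES); // Suit: assumed generated specific enum, placed inside a generic record

        GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<>(recordSchema);
        Encoder encoder = EncoderFactory.get().binaryEncoder(new ByteArrayOutputStream(), null);
        // fails while resolving the union: GenericData does not recognize the generated Java enum as an enum
        writer.write(record, encoder);
    }
}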

[helper] Support addProp().

Support for addProp() is needed at least on Schema.

It looks like Schema.Field also extends JsonProperties, so we might need to support it on Schema.Field as well?

Maven cannot resolve avroutil1:helper:0.2.32 dependency

I get the following error from Maven when including avro-codegen version 0.2.32:

Cannot resolve com.linkedin.avroutil1:helper:0.2.32

My dependency is specified as:

<dependency>
    <groupId>com.linkedin.avroutil1</groupId>
    <artifactId>avro-codegen</artifactId>
    <version>0.2.32</version>
    <type>pom</type>
</dependency>

And the repository as:

<repository>
    <id>linkedin</id>
    <name>linkedin</name>
    <url>https://linkedin.bintray.com/maven/</url>
</repository>

Here is the relevant snippet from the avro-codegen POM that Maven downloads. Should helper be helper-all here?

<dependency>
      <groupId>com.linkedin.avroutil1</groupId>
      <artifactId>helper</artifactId>
      <version>0.2.32</version>
      <scope>runtime</scope>
</dependency>

Thanks!

write own code-gen

There's only so much we can do with post-processing after Avro.
At some point it becomes easier to support features like builders across Avro versions by dropping the Avro code generator completely and running our own template engine.

Add constructor for JsonDecoder in AvroCompatibilityHelper

Currently, AvroCompatibilityHelper does not provide a constructor for JsonDecoder that accepts (Symbol, InputStream) as arguments.
org.apache.avro.io.parsing.Symbol exists in all Avro versions, so adding this would help MPs that require it for their Avro migration.

Publishing artifacts to MavenCentral

Hi,

We have seen a significant improvement in performance after using avro-fastserde, thank you for this amazing library.
I noticed that the artifacts are published to Bintray, which is on the deprecation path. I wanted to check if there are any plans to publish artifacts to other repositories such as Maven Central.

Thanks,
Atul

Let fast-avro align with vanilla avro to handle unknown enum value

Right now, fast-avro fails fast during fast-class generation when it encounters an unknown enum value. Vanilla Avro, by contrast, only fails when an unknown enum value is actually encountered at de-serialization time.
Avro enum default values are only supported in Avro 1.9+, so aligning the behavior would help users who are still on older Avro versions.

Array items are not re-used in fastserde

This fork of fastserde provides support for object re-use, but items in arrays are not re-used as in regular Avro.

This should be fixed so that the garbage collection characteristics of fastserde are the same as those of regular Avro.
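
For context, top-level object re-use is typically driven as in the sketch below (schema and payloads are assumed inputs). The issue above concerns the elements inside arrays: vanilla Avro recycles them via the reuse object, while fastserde currently allocates new ones.

import java.util.List;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import com.linkedin.avro.fastserde.FastGenericDatumReader;

public class ReuseSketch {
    // Reads each payload into the same record instance, re-using it across iterations.
    public static void readAll(Schema schema, List<byte[]> payloads) throws Exception {
        FastGenericDatumReader<GenericRecord> reader = new FastGenericDatumReader<>(schema);
        GenericRecord reuse = null;
        BinaryDecoder decoder = null;
        for (byte[] payload : payloads) {
            decoder = DecoderFactory.get().binaryDecoder(payload, decoder);
            reuse = reader.read(reuse, decoder);
        }
    }
}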

Issue with importing the lib

I tried adding the module (group: "com.linkedin.avroutil", name: "avro-fastserde", version: "0.1.5") to my Gradle project, but the dependency doesn't resolve. I also tried different versions, but that doesn't help.

add support for parsing json avro payloads in a compatible way

We know the way JSON payloads are encoded changed in an incompatible way between 1.4 and 1.5 (enum branches encoded as simple record names vs. FQCNs).

Add JSON deserialization (and maybe serialization?) that allows parsing and emitting either JSON format regardless of the runtime Avro version.

NPE in OldSchemaConstructableUsageDetector

java.lang.NullPointerException
	at com.linkedin.avroutil1.spotbugs.OldSchemaConstructableUsageDetector.lookForSchemaConstructableVariables(OldSchemaConstructableUsageDetector.java:101)
	at com.linkedin.avroutil1.spotbugs.OldSchemaConstructableUsageDetector.visitClassContext(OldSchemaConstructableUsageDetector.java:40)
	at edu.umd.cs.findbugs.DetectorToDetector2Adapter.visitClass(DetectorToDetector2Adapter.java:76)
	at edu.umd.cs.findbugs.FindBugs2.analyzeApplication(FindBugs2.java:1080)
	at edu.umd.cs.findbugs.FindBugs2.execute(FindBugs2.java:281)
	at com.github.spotbugs.internal.spotbugs.SpotBugsRunner.run(SpotBugsRunner.java:40)
	at org.gradle.workers.internal.AdapterWorkAction.execute(AdapterWorkAction.java:50)
	at org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:47)
	at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:46)
	at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:36)
	at org.gradle.internal.classloader.ClassLoaderUtils.executeInClassloader(ClassLoaderUtils.java:98)
	at org.gradle.workers.internal.AbstractClassLoaderWorker.executeInClassLoader(AbstractClassLoaderWorker.java:36)
	at org.gradle.workers.internal.IsolatedClassloaderWorker.execute(IsolatedClassloaderWorker.java:54)
	at org.gradle.workers.internal.WorkerDaemonServer.execute(WorkerDaemonServer.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

Feature request: Add a method to get specificRecord from genericRecord

As in title, I am looking for a util method with signature as:

import org.apache.avro.generic.GenericRecord;
import org.apache.avro.specific.SpecificRecord;

public static <T extends SpecificRecord> T genericToSpecificRecord(GenericRecord genericRecord, Class<T> specificClass);

Expected behavior:

  1. The conversion tries to match the fields of the generic and specific records, typically by name and type.
  2. The method should work for nested IndexedRecord cases.
  3. The method should work in any Avro version (if impossible, try the way with the least limitation).
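
For illustration, a hypothetical call site might look like the sketch below; genericToSpecificRecord is the requested method and does not exist yet, and User is an assumed Avro-generated SpecificRecord class:

GenericRecord generic = new GenericData.Record(User.getClassSchema());
generic.put("name", "alice");
// hypothetical helper being requested:
User specific = genericToSpecificRecord(generic, User.class);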

Thank you.

Expose a helper method to return default value as a Json string

I want to compare the default values of a field between two Avro schemas as a JSON object. Can we add a helper method in AvroCompatibilityHelper to get the default value as a JSON string?

Since the equals implementation of GenericData.Record [the type returned from getGenericDefaultValue] also compares schemas and field properties, equality based on the object returned from getGenericDefaultValue / getSpecificDefaultValue fails for record-type default values when the two schemas differ in field properties [one of the schemas specifies an additional custom property]. We have custom schema comparison logic that ignores those properties, so when comparing default values I only want to compare the field values [as when comparing JSON] and not the type information.
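
A sketch of the kind of comparison such a helper would enable. The helper name getFieldDefaultAsJsonString is an assumption for illustration; JsonNode equality is structural, so it ignores schemas and field properties:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DefaultValueComparison {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // fieldDefaultJsonA/B would come from the requested helper, e.g. a hypothetical
    // AvroCompatibilityHelper.getFieldDefaultAsJsonString(field).
    public static boolean sameDefault(String fieldDefaultJsonA, String fieldDefaultJsonB) throws Exception {
        JsonNode a = MAPPER.readTree(fieldDefaultJsonA);
        JsonNode b = MAPPER.readTree(fieldDefaultJsonB);
        return a.equals(b); // structural JSON equality, independent of schema/field properties
    }
}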
