linkedin / avro-util
Collection of utilities to allow writing java code that operates across a wide range of avro versions.
License: BSD 2-Clause "Simplified" License
for example - keep the builders
Using FastSpecificDatumWriter
throws the following errors:
/var/folders/wy/_4jkfljn7hbb5ltnzlm4jggh0000gn/T/generated10204258197036940757/com/linkedin/avro/fastserde/generated/serialization/AVRO_1_8/User_SpecificSerializer_2352263923167790722.java:43: error: incompatible types: String cannot be converted to Utf8
if (favorite_color0 instanceof Utf8) {
^
/var/folders/wy/_4jkfljn7hbb5ltnzlm4jggh0000gn/T/generated10204258197036940757/com/linkedin/avro/fastserde/generated/serialization/AVRO_1_8/User_SpecificSerializer_2352263923167790722.java:44: error: incompatible types: String cannot be converted to Utf8
(encoder).writeString(((Utf8) favorite_color0));
^
2 errors
23:52:59.819 [avro-fastserde-compile-thread-1] WARN com.linkedin.avro.fastserde.FastSerdeCache - Serializer class instantiation exception
com.linkedin.avro.fastserde.FastSerdeGeneratorException: com.linkedin.avro.fastserde.FastSerdeGeneratorException: Unable to compile:User_SpecificSerializer_2352263923167790722
at com.linkedin.avro.fastserde.FastSerializerGenerator.generateSerializer(FastSerializerGenerator.java:87)
at com.linkedin.avro.fastserde.FastSerdeCache.buildFastSpecificSerializer(FastSerdeCache.java:403)
at com.linkedin.avro.fastserde.FastSerdeCache.buildSpecificSerializer(FastSerdeCache.java:410)
at com.linkedin.avro.fastserde.FastSerdeCache.lambda$getFastSpecificSerializer$4(FastSerdeCache.java:260)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.linkedin.avro.fastserde.FastSerdeGeneratorException: Unable to compile:User_SpecificSerializer_2352263923167790722
at com.linkedin.avro.fastserde.FastSerdeBase.compileClass(FastSerdeBase.java:135)
at com.linkedin.avro.fastserde.FastSerializerGenerator.generateSerializer(FastSerializerGenerator.java:82)
... 7 common frames omitted
Looking at the code, it seems that type-incorrect code (an impossible cast) has been generated, as follows:
40 String favorite_color0 = ((String) data.get(2));
41 if (favorite_color0 instanceof String) {
42 (encoder).writeIndex(0);
43 if (favorite_color0 instanceof Utf8) {
44 (encoder).writeString(((Utf8) favorite_color0));
45 } else {
46 (encoder).writeString(favorite_color0 .toString());
47 }
48 } else {
49 if (favorite_color0 == null) {
50 (encoder).writeIndex(1);
51 (encoder).writeNull();
52 }
53 }
This is a test demonstrating the failure: https://github.com/izeye/samples-java-branches/blob/avro-util/src/test/java/learningtest/avro/UserTests.java#L136
Hi,
The current implementation does not support the avro.java.string property, which allows using plain java Strings in place of Utf8 wrappers. This is especially useful with generated specific record classes. This feature is very important for us and blocks our migration from the original avro-fastserde branch. Below I have prepared a gist with an example test case that replicates the problem.
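For reference, a minimal schema sketch using the property (the record and field names here are made up for illustration):

```json
{
  "type": "record",
  "name": "User",
  "fields": [
    {
      "name": "name",
      "type": {
        "type": "string",
        "avro.java.string": "String"
      }
    }
  ]
}
```

With this property set, avro's own generated specific classes use java.lang.String for the field instead of Utf8.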
Hi,
Is there any reason why we can't just use the slf4j API for logging? Currently we are using the apache log4j logger via the dependency on the slf4j-log4j12 binding, thus forcing users to use the log4j binding for slf4j. This is inconvenient because, for example, in our case this binding conflicts with the log4j-over-slf4j bridge, as we delegate log4j calls via slf4j to logback. Plain use of the slf4j API does not force users onto any specific logging flavor.
This unfortunately forces us to exclude the slf4j-log4j12 dependency whenever we use avro-fastserde.
In Avro17Utils.java, we check for the existence of org.apache.avro.JsonProperties
to see if our runtime Avro is v1.7.3 or later. Assuming that's true, we proceed to operate on the Schema.Field
that we are given, expecting that the Field is a subclass of JsonProperties.
But that can throw an exception if an older version of Avro is earlier on the classpath, and Avro 1.7.3+ is merely present later on the classpath. I.e., Schema.Field comes from Avro 1.7.2 or earlier, and thus does not extend JsonProperties. But the class loader will find JsonProperties, since it does exist on the classpath.
Here's an example of the exception that's thrown:
Exception in thread "main" java.lang.IllegalStateException: java.lang.IllegalArgumentException: object is not an instance of declaring class
at com.linkedin.avroutil1.compatibility.avro17.Avro17Utils.getJsonProp(Avro17Utils.java:80)
at com.linkedin.avroutil1.compatibility.avro17.Avro17Adapter.getFieldPropAsJsonString(Avro17Adapter.java:361)
at com.linkedin.avroutil1.compatibility.AvroCompatibilityHelper.getFieldPropAsJsonString(AvroCompatibilityHelper.java:728)
at com.linkedin.avroutil1.compatibility.AvroCompatibilityHelper.getFieldPropAsJsonString(AvroCompatibilityHelper.java:713)
...
This is not an uncommon scenario. In our company's codebase, we do often see classpaths where multiple versions of Avro are present.
There may be other instances of such incorrect determination, elsewhere in the compat helper codebase. I'll update this bug as I find them.
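One possible fix is to make the check capability-based rather than presence-based: instead of only asking whether org.apache.avro.JsonProperties exists on the classpath, ask whether Schema.Field is actually a subclass of it. A minimal sketch of that idea, using stdlib classes as stand-ins for the Avro types (the class and method names below are made up; the real check would use org.apache.avro.Schema$Field and org.apache.avro.JsonProperties):

```java
// Sketch: a presence check alone is not enough; verify the actual
// subclass relationship via reflection before assuming 1.7.3+ behavior.
public class CapabilityCheck {
    public static boolean isSubclassOf(String candidateName, String requiredSuperName) {
        try {
            Class<?> candidate = Class.forName(candidateName);
            Class<?> required = Class.forName(requiredSuperName);
            // true only if the loaded candidate really extends/implements required
            return required.isAssignableFrom(candidate);
        } catch (ClassNotFoundException e) {
            return false; // either class missing => capability absent
        }
    }
}
```

This returns false both when JsonProperties is absent and when it is present but Schema.Field (loaded from an earlier jar on the classpath) does not extend it.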
Currently, there is no limit on the total number of de/serializer classes fast-avro can generate and load at runtime. The generated code may therefore saturate the application's code cache if there are too many schemas, which can cause GC issues.
We can set a hard limit or use an LRU cache to bound the number of de/serializer classes loaded at runtime.
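A minimal sketch of the LRU idea using java.util.LinkedHashMap (the class name and cache size are made up; a real implementation inside FastSerdeCache would also need to address unloading of the generated classes themselves):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Bounded, access-ordered map: the least-recently-used entry is evicted
// once the configured maximum number of entries is exceeded.
public class BoundedSerdeCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedSerdeCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU iteration order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

Note that evicting a map entry alone does not reclaim the generated class; actual class unloading additionally requires the class's ClassLoader to become unreachable.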
It would be better to support an AvroCompatibilityHelper.parse(InputStream in) method, since it is a useful overload mirroring the Schema.parse(InputStream in) function.
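A hedged sketch of how such an overload could work: drain the stream into a String and delegate to the existing String-based parse. The delegation line is commented out because it assumes the AvroCompatibilityHelper API; only the stream-draining part is shown concretely.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public class ParseOverloadSketch {
    // Drain an InputStream into a UTF-8 String.
    public static String readFully(InputStream in) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) {
                buf.write(chunk, 0, n);
            }
            return new String(buf.toByteArray(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Hypothetical overload, delegating to the existing String-based helper:
    // public static Schema parse(InputStream in) {
    //     return AvroCompatibilityHelper.parse(readFully(in));
    // }
}
```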
com.linkedin.avro.fastserde.FastDeserializerGeneratorException: java.lang.NullPointerException
at com.linkedin.avro.fastserde.FastDeserializerGenerator.generateDeserializer(FastDeserializerGenerator.java:156) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastSerdeCache.buildFastSpecificDeserializer(FastSerdeCache.java:315) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastSerdeCache.buildSpecificDeserializer(FastSerdeCache.java:340) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastSerdeCache.lambda$getFastSpecificDeserializer$0(FastSerdeCache.java:213) ~[avro-fastserde-0.2.29.jar:?]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: java.lang.NullPointerException
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processEnum(FastDeserializerGenerator.java:920) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processSimpleType(FastDeserializerGenerator.java:189) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processArray(FastDeserializerGenerator.java:716) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:169) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processRecord(FastDeserializerGenerator.java:314) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:165) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processArray(FastDeserializerGenerator.java:713) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:169) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processUnion(FastDeserializerGenerator.java:591) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processComplexType(FastDeserializerGenerator.java:177) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.processRecord(FastDeserializerGenerator.java:314) ~[avro-fastserde-0.2.29.jar:?]
at com.linkedin.avro.fastserde.FastDeserializerGenerator.generateDeserializer(FastDeserializerGenerator.java:118) ~[avro-fastserde-0.2.29.jar:?]
We need version-agnostic helper methods to access the JsonProperties of an avro Schema and Field. That is, we need helper methods to get/add properties on Schema and Field, i.e. addAllProps, getObjectProp, getObjectProps, getProp, hasProps (examples given using the avro 1.9.2 API). If it is too much to provide helpers for all of these functions, for our current use case we could use any avro-version-agnostic helper function that would:
Fast-avro serialization is slower than regular avro for the EnumArray schema under Avro 1.4. See the benchmark results and schema pattern below. It was faster before version 0.1.9.
Serialization under Avro 1.8 is fine.
Avro 1.4 benchmark result:
Benchmark Mode Cnt Score Error Units
FastAvroSerdesBenchmark.testAvroSerialization avgt 5 7431.495 ± 906.343 ns/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate avgt 5 413.867 ± 48.562 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate.norm avgt 5 3384.000 ± 0.001 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space avgt 5 389.459 ± 865.511 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space.norm avgt 5 3143.086 ± 6991.629 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space avgt 5 0.044 ± 0.315 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space.norm avgt 5 0.359 ± 2.556 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.count avgt 5 4.000 counts
FastAvroSerdesBenchmark.testAvroSerialization:·gc.time avgt 5 15.000 ms
FastAvroSerdesBenchmark.testFastAvroSerialization avgt 5 9664.812 ± 1175.680 ns/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate avgt 5 315.128 ± 36.635 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate.norm avgt 5 3352.000 ± 0.001 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space avgt 5 259.297 ± 933.239 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space.norm avgt 5 2794.250 ± 10193.706 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space avgt 5 0.202 ± 1.727 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space.norm avgt 5 2.272 ± 19.417 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.count avgt 5 3.000 counts
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.time avgt 5 35.000 ms
Avro 1.8 benchmark result:
Benchmark Mode Cnt Score Error Units
FastAvroSerdesBenchmark.testAvroSerialization avgt 5 8829.567 ± 4540.217 ns/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate avgt 5 350.364 ± 171.892 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.alloc.rate.norm avgt 5 3360.000 ± 0.001 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space avgt 5 363.535 ± 820.904 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Eden_Space.norm avgt 5 3451.205 ± 8388.244 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space avgt 5 0.046 ± 0.344 MB/sec
FastAvroSerdesBenchmark.testAvroSerialization:·gc.churn.PS_Survivor_Space.norm avgt 5 0.504 ± 3.896 B/op
FastAvroSerdesBenchmark.testAvroSerialization:·gc.count avgt 5 4.000 counts
FastAvroSerdesBenchmark.testAvroSerialization:·gc.time avgt 5 18.000 ms
FastAvroSerdesBenchmark.testFastAvroSerialization avgt 5 6423.710 ± 3154.787 ns/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate avgt 5 476.969 ± 243.782 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.alloc.rate.norm avgt 5 3328.000 ± 0.001 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space avgt 5 491.521 ± 218.395 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Eden_Space.norm avgt 5 3467.017 ± 2125.681 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space avgt 5 0.198 ± 1.649 MB/sec
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.churn.PS_Survivor_Space.norm avgt 5 1.595 ± 13.316 B/op
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.count avgt 5 5.000 counts
FastAvroSerdesBenchmark.testFastAvroSerialization:·gc.time avgt 5 27.000 ms
Schema (Test record array contains 200 enum elements)
{
  "type": "record",
  "name": "BenchmarkSchema",
  "namespace": "com.linkedin.avro.fastserde.generated.avro",
  "doc": "Enum array",
  "fields": [
    {
      "name": "EnumArray",
      "type": {
        "type": "array",
        "items": {
          "type": "enum",
          "name": "Method",
          "symbols": [
            "OPTIONS",
            "GET",
            "HEAD",
            "POST",
            "PUT",
            "DELETE",
            "TRACE",
            "CONNECT",
            "EXTENSION"
          ],
          "doc": "HTTP method of the request.",
          "compliance": "NONE"
        }
      }
    }
  ]
}
code generated under avro 1.9+ has a catch clause for AvroMissingFieldException in Builder.build() methods.
this exception class does not exist prior to 1.9, so the generated code causes issues when run under avro < 1.9.
the solution is to strip out the catch clause.
offers a significant performance gain
The avro-fastserde-0.2.1.jar includes the package org.apache.avro.generic, which is also defined in the avro jar (in this case avro-1.9.2.jar). The Java 11 module system does not allow the same package to appear in different jars, and this results in the error:
The package org.apache.avro.generic is accessible from more than one module: avro.fastserde, org.apache.avro
The older rtbhouse version of avro-fastserde did not include this package and did not have this issue.
support generating code for "large" collections of schemas (~thousands of avsc files)
only exists in avro 1.8+
only exists in 1.6+
It's only used in AvroCompatibilityHelper to unescape inner JSON. If we instead shunt that responsibility through to the adapters, they can do the job using whatever Jackson they already have, allowing us to get rid of this dependency.
The API that getSchemaPropAsJsonString replaces (getProp) returns a JSON string without quotes, i.e. {...}. However, getSchemaPropAsJsonString is returning a string with quotes, i.e. "{...}". The behavior of getSchemaPropAsJsonString should match getProp if possible. Thanks!
if you create a GenericRecord and populate some of its nested fields with specific objects (specifically, you need an enum somewhere in the graph), then when you try to encode the object avro fails.
the root cause is that the specific enum instance will not be considered an enum by the GenericData class.
this is very likely an avro bug, but it exists up to at least 1.10.1, so a solution for existing avro versions is needed
sometimes it's due to something else (not the avro-compiler jar) being excluded from dependencies.
They are needed at least for Schema. It looks like Schema.Field also extends JsonProperties, so we might need to support them in Schema.Field also?
meaning calls that do not supply both reader and writer schemas, like new GenericDatumReader<>(oneSchema)
I get the following error from Maven when including avro-codegen version 0.2.32:
Cannot resolve com.linkedin.avroutil1:helper:0.2.32
My dependency is specified as:
<dependency>
  <groupId>com.linkedin.avroutil1</groupId>
  <artifactId>avro-codegen</artifactId>
  <version>0.2.32</version>
  <type>pom</type>
</dependency>
And the repository as:
<repository>
  <id>linkedin</id>
  <name>linkedin</name>
  <url>https://linkedin.bintray.com/maven/</url>
</repository>
Here is the relevant snippet from the avro-codegen POM that maven downloads. Should helper be helper-all here?
<dependency>
  <groupId>com.linkedin.avroutil1</groupId>
  <artifactId>helper</artifactId>
  <version>0.2.32</version>
  <scope>runtime</scope>
</dependency>
Thanks!
there's only so much we can do with post-processing after avro.
at some point it becomes easier to support stuff like builders cross-avro by dumping the avro gen completely and running our own template engine
`
FAILURE: Build failed with an exception.
Could not find a default branch to fall back on.
`
from https://github.com/linkedin/avro-util/runs/1205858809
possible solution - https://community.sonarsource.com/t/sonar-maven-plugin-could-not-find-a-default-branch-to-fall-back-on/23973/11
split BadClass up into several such classes, to be compiled under various avro versions.
tests themselves could remain in one place
oops
Currently, AvroCompatibilityHelper does not expose a JsonDecoder constructor that accepts (Symbol, InputStream) as arguments.
org.apache.avro.io.parsing.Symbol exists in all avro versions, so adding it would help out MPs that require this for their Avro migration.
Hi,
We have seen a significant improvement in performance after using avro-fastserde, thank you for this amazing library.
I noticed that the artifacts are published to bintray which is on the deprecation path, I wanted to check if there are any plans to publish artifacts on any other repositories like MavenCentral.
Thanks,
Atul
Right now, Fast-Avro fails fast during fast-class generation when it encounters an unknown enum value. This differs from vanilla Avro, which only fails at de-serialization time, when the unknown enum value is actually read.
Avro enum default values are only supported in Avro 1.9+, so aligning this behavior would help users who are still on older Avro versions.
to begin with to allow control over the type used for Strings - Utf8 vs java.lang.String
to protect against cases where the build modifies files resulting in an accidental snapshot version
This fork of fastserde provides support for object re-use, but items in arrays are not re-used as they are in regular Avro.
This should be fixed so that the garbage-collection characteristics of fastserde match those of regular Avro.
I tried adding the module (group: "com.linkedin.avroutil", name: "avro-fastserde", version: "0.1.5") to my gradle project, but the dependency doesn't resolve. I also tried different versions, but that doesn't help.
private static org.apache.avro.specific.SpecificData MODEL$ = org.apache.avro.specific.SpecificData.get();
^
symbol: variable apache
location: variable org of type CharSequence
solution is either to import some org/com classes (imperfect, since we can't import everything, see https://issues.apache.org/jira/browse/AVRO-1901).
alternatively, since this seems limited to the RHS of expressions ... reflection?
These older Avro versions (1.7.0 through 1.7.2) have a different implementation for props, as compared to later versions (1.7.3 through 1.7.7). Today, the tests in helper/tests/helper-tests-17 bring in Avro 1.7.7, and so test only the newer impl. Add tests to cover the older impl as well.
we know the way json payloads are encoded has changed in an incompatible way between 1.4 and 1.5 (enum branches encoded as simple record names vs FQCNs).
add json deserialization (and maybe serialization?) that allows parsing and emitting any json format regardless of runtime avro version
java.lang.NullPointerException
at com.linkedin.avroutil1.spotbugs.OldSchemaConstructableUsageDetector.lookForSchemaConstructableVariables(OldSchemaConstructableUsageDetector.java:101)
at com.linkedin.avroutil1.spotbugs.OldSchemaConstructableUsageDetector.visitClassContext(OldSchemaConstructableUsageDetector.java:40)
at edu.umd.cs.findbugs.DetectorToDetector2Adapter.visitClass(DetectorToDetector2Adapter.java:76)
at edu.umd.cs.findbugs.FindBugs2.analyzeApplication(FindBugs2.java:1080)
at edu.umd.cs.findbugs.FindBugs2.execute(FindBugs2.java:281)
at com.github.spotbugs.internal.spotbugs.SpotBugsRunner.run(SpotBugsRunner.java:40)
at org.gradle.workers.internal.AdapterWorkAction.execute(AdapterWorkAction.java:50)
at org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:47)
at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:46)
at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:36)
at org.gradle.internal.classloader.ClassLoaderUtils.executeInClassloader(ClassLoaderUtils.java:98)
at org.gradle.workers.internal.AbstractClassLoaderWorker.executeInClassLoader(AbstractClassLoaderWorker.java:36)
at org.gradle.workers.internal.IsolatedClassloaderWorker.execute(IsolatedClassloaderWorker.java:54)
at org.gradle.workers.internal.WorkerDaemonServer.execute(WorkerDaemonServer.java:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
they only exist in avro 1.8+
maybe more generally find generated classes not generated via the helper?
As in title, I am looking for a util method with signature as:
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.specific.SpecificRecord;
public static <T extends SpecificRecord> T genericToSpecificRecord(GenericRecord genericRecord, Class<T> specificClass);
Expected behavior:
Thank you.
External applications should have a way to tell whether fast or vanilla avro is being used for ser/deser. Logging/metrics can then be enabled with this information. Right now this detail is hidden away.
we currently only allow it via cloning an existing field
I want to compare the default values of a field between 2 avro schemas as a Json object. Can we add a helper method in AvroCompatibilityHelper to get the default value as a Json string?
Since the equals implementation of GenericData.Record [the type of object returned from getGenericDefaultValue] also compares the schemas and field properties, equality based on the object returned from getGenericDefaultValue / getSpecificDefaultValue fails for default values of record type when the 2 schemas have some differences in their field properties [one of the schemas specifies an additional custom property]. We have custom schema-comparison logic that ignores those properties - so when comparing the default values, I only want to compare the field values [as when comparing as Json] and not the type information.