
fetch-blob's Introduction

fetch-blob


A Blob implementation in Node.js, originally from node-fetch.

Use the built-in Blob in Node.js 18 and later.

Installation

npm install fetch-blob
Upgrading from 2x to 3x

Updating from 2.x to 3.x should be a breeze, since there are not many changes to the Blob specification. The main reason for the major release is coding standards:
  • internal WeakMaps were replaced with private fields
  • internal Buffer.from was replaced with TextEncoder/TextDecoder
  • internal Buffers were replaced with Uint8Arrays
  • CommonJS was replaced with ESM
  • the Node.js stream returned by blob.stream() was replaced with a WHATWG stream
  • (read "Differences from other Blobs" for more info)

Differences from other Blobs
  • Unlike Node.js buffer.Blob (added in v15.7.0) and the browser-native Blob, this polyfilled version can't be sent via postMessage.
  • This Blob version is more permissive: it can be constructed with blob parts that aren't instances of itself; a part only has to look and behave like a blob to be accepted as a blob part.
    • The benefit of this is that you can create other kinds of blobs that don't hold any internal data and are read in other ways, such as the BlobDataItem created in from.js that wraps a file path into a blob-like item and reads it lazily (Node.js plans to implement this as well).
  • blob.stream() is the most noticeable difference. It now returns a WHATWG stream; to keep it as a Node.js stream you would have to do:
  import {Readable} from 'stream'
  const stream = Readable.from(blob.stream())

Usage

// Ways to import
import { Blob } from 'fetch-blob'
import { File } from 'fetch-blob/file.js'

const { Blob } = await import('fetch-blob')


// Ways to read the blob:
const blob = new Blob(['hello, world'])

await blob.text()
await blob.arrayBuffer()
for await (let chunk of blob.stream()) { ... }
blob.stream().getReader().read()
blob.stream().getReader({mode: 'byob'}).read(view)

Blob part backed up by filesystem

fetch-blob/from.js comes packed with tools to convert any file path into either a Blob or a File. It will not read the content into memory; it only stats the file for its last-modified date and size.

// The default export is sync and uses fs.stat to retrieve the size & last modified date
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'

const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
const fsBlob = await blobFrom('./2-GiB-file.mp4')

// Not a 4 GiB memory snapshot, just holds references
// points to where data is located on the disk
const blob = new Blob([fsFile, fsBlob, 'memory', new Uint8Array(10)])
console.log(blob.size) // ~4 GiB

blobFrom|blobFromSync|fileFrom|fileFromSync(path, [mimetype])

Creating a temporary file on the disk

(requires FinalizationRegistry, available since Node v14.6)

Both createTemporaryBlob and createTemporaryFile write data to the OS's temporary folder. The arguments can be anything that fsPromises.writeFile supports. Node.js v14.17.0+ also supports writing (async) iterables and passing in an AbortSignal, so both Node.js streams and WHATWG streams are supported. Once the file has been written, you get back a Blob/File handle with a reference to this temporary location on disk. When you no longer hold any references to that Blob/File and it has been garbage collected, it will automatically be deleted.

These files are also unlinked upon exiting the process.

import { createTemporaryBlob, createTemporaryFile } from 'fetch-blob/from.js'

const req = new Request('https://httpbin.org/image/png')
const res = await fetch(req)
const type = res.headers.get('content-type')
const signal = req.signal
let blob = await createTemporaryBlob(res.body, { type, signal })
// const file = createTemporaryBlob(res.body, 'img.png', { type, signal })
blob = undefined // losing all references will delete the file from disk
  • createTemporaryBlob(data, { type, signal })
  • createTemporaryFile(data, FileName, { type, signal, lastModified })

Creating Blobs backed up by other async sources

Our Blob & File classes are more generic than other polyfills in that they can accept any blob look-alike as a blob part. An example of this is that our Blob implementation can be constructed with parts coming from BlobDataItem (aka a file path) or from buffer.Blob. A part does not have to implement all of the methods - just enough for it to be read/understood by our Blob implementation. The minimum requirement is that it has Symbol.toStringTag, size, slice() and stream() methods (the stream method can be as simple as a sync or async iterator that yields Uint8Arrays). If you then wrap it in our Blob or File with new Blob([blobDataItem]), you get all of the other methods a blob or file should implement (text(), arrayBuffer(), type and a ReadableStream).

An example of this could be a file- or blob-like item backed by a remote HTTP request or a database, as sketched below.
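
A rough sketch of such a blob look-alike (the class name and the in-memory byte source are made up for illustration; only the Symbol.toStringTag, size, slice() and stream() members matter to fetch-blob):

import { Blob } from 'fetch-blob'

// A minimal blob look-alike. The bytes could just as well come from a
// database or a remote request instead of being held in memory.
class LazyPart {
  #bytes

  constructor (bytes) {
    this.#bytes = bytes
  }

  get size () { return this.#bytes.byteLength }

  get [Symbol.toStringTag] () { return 'Blob' }

  slice (start = 0, end = this.size) {
    return new LazyPart(this.#bytes.subarray(start, end))
  }

  // stream() can simply be an (async) iterator that yields Uint8Arrays
  async * stream () {
    yield this.#bytes
  }
}

const blob = new Blob([new LazyPart(new TextEncoder().encode('hello'))])
console.log(await blob.text()) // 'hello'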

See the MDN documentation and tests for more details of how to use the Blob.

fetch-blob's People

Contributors

benmccann, bitinn, dependabot-preview[bot], dependabot[bot], gozala, jimmywarting, linusu, octet-stream, pimlie, richienb, tinovyatkin, xxczaki


fetch-blob's Issues

Make this deno compatible?

I have recently been thinking about how to make this Deno compatible...
Two things need to happen:

  1. change to ES modules so this can be imported into any Deno code using an import statement.
  2. make the node stream an optional dependency:
stream () {
  if (globalThis.ReadableStream?.from) {
    // runtimes with ReadableStream.from (e.g. Deno) can wrap the internal iterator directly
    return ReadableStream.from(read(...))
  } else {
    // Node.js fallback: lazily import the stream module and wrap the iterator in a Readable
    return import('stream').then(({Readable}) => Readable.from(read(...)))
  }
}

require "stream/web" produces warning

Hi,
On node v16 it produces experimental warning.

(node:18876) ExperimentalWarning: stream/web is an experimental feature

Could you please use a polyfill until the stream/web feature reaches stable status?

Also, instead of using try/catch to load stream/web, you could check process.version and only load stream/web if the Node version is >= 16.5.0.
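
A minimal sketch of that version check (assuming the ponyfill.es2018.js build of web-streams-polyfill that fetch-blob already depends on as the fallback; the exact file path is an assumption):

// streams.cjs-style fallback - a sketch, not the actual implementation
const [major, minor] = process.versions.node.split('.').map(Number)

// stream/web was added in Node v16.5.0; older versions fall back to the ponyfill
const { ReadableStream } =
  major > 16 || (major === 16 && minor >= 5)
    ? require('stream/web')
    : require('web-streams-polyfill/dist/ponyfill.es2018.js')

module.exports = { ReadableStream }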

exec `npm run test test-wpt` with github workflows

The test uses the experimental HTTP loader, so perhaps it should only be executed with Node v16+.

Our own tests can still coexist, but I guess they are not that important anymore - just some basic tests to see if everything runs fine with different Node versions...

Run test against WPT

It would be nicer to rely on the actual tests run by browsers.
Don't know how to implement it though...

Revisit DOMException

Okay, so I have been playing around with blobFrom. It's annoying to have to install DOMException,
and it's easy to forget to add that optional dependency.

So I investigated Node.js core (since I know they have been implementing some Web IDL stuff that depends on DOMException being thrown) and found this hack.

They don't expose it in any way, but you can still obtain it!

import { MessageChannel } from 'worker_threads'

if (!globalThis.DOMException) {
  // Transferring the same ArrayBuffer twice throws a DataCloneError,
  // which exposes Node's internal DOMException constructor.
  const port = new MessageChannel().port1
  const ab = new ArrayBuffer()
  try { port.postMessage(ab, [ab, ab]) }
  catch (err) { globalThis.DOMException = err.constructor }
}

I was thinking that instead of depending on the DOMException package, we could go with this hack instead - it's way smaller in size, and you will use the same instance as Node.js.

Thoughts?

Add `exports` map with ESM variant

It'd be nice to offer a supplemental ESM build that's linked via:

  • "module" in package.json
  • the import key inside each exports entry (inside package.json too)

This'd end up looking something like:

{
  // ...
  "module": "dist/index.mjs",
  "main": "dist/index.js",
  // ...
  "exports": {
    ".": {
      "require": "./dist/index.js",
      "import": "./dist/index.mjs"
    },
    "./from": {
      "require": "./from/index.js",
      "import": "./from/index.mjs"
    }
  }
  // ...
}

This allows for native ESM resolution in Node.js (12.x and >=14) in supporting environments without sacrificing CommonJS (default) support. It also allows bundlers to pick up the ESM entry.

There's one gotcha: Node 13.0 thru 13.7 don't support this version of exports.
This can be patched, but only if you decide to keep the default exports (module.exports / export default) that you currently have. This means that the addition remains non-breaking, even with these Node versions.


I can PR these changes if desired.

Implementing a mutable append only blob type?

In this video https://youtu.be/6EDaayYnw6M?t=1202 he talks about returning a blob from the fetch API.
In theory you can return a blob early if you know the content-length or size of the blob; the content does not have to be known immediately.
You could, for example, make a request for a 4 GB file and have the blob returned right after you get the HTTP response, without having all the data at hand. That is to say: the response has a content-length and isn't compressed.

This idea was brought up long before in Node.js by jasnell, about 4 years ago:

For Blob in general, it is really nothing more than a persistent allocated chunk of memory. It would be possible to create a Blob from one or more TypedArray objects. I'm sketching out additional APIs for the http and http2 modules that would allow a response to draw data from a Blob rather than through the Streams API. There is already something analogous in the http2 implementation in the form of the respondWithFile() and respondWithFD() APIs in the http2 side. Basically, the idea would be to prepare chunks of allocated memory at the native layer, with data that never passes into the JS layer (unless absolutely necessary to do so), then use those to source the data for responses. In early benchmarking this yields a massive boost in throughput without the usual backpressure control issues.

I'm still interested in this idea, but I have no idea/clue how to sketch this up or how to best implement it.

I mean, I built this HTTP File-like class that operates on byte ranges, partial requests and a known content-length.
The goal of it all was to take a zip from a remote source and pass it to a zip parser that could slice and read the central directory, so it could retrieve a list of all the files included and jump/seek within the blob for the stuff you needed, without having to download the whole zip file. This meant that it would make multiple partial HTTP requests for each file later on.
It's a pretty cool concept of optimizing (a rough sketch is shown below).
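
A rough sketch of that kind of HTTP-backed, blob-like item (the class name, the range logic and the use of a global fetch are illustrative assumptions; as with the blob look-alikes described in the README, only Symbol.toStringTag, size, slice() and stream() are needed for it to be usable as a blob part):

// A blob look-alike backed by HTTP range requests - nothing is downloaded
// until the blob is actually read.
class HttpBlobItem {
  #url; #start; #end

  constructor (url, start, end) {
    this.#url = url
    this.#start = start
    this.#end = end
  }

  get size () { return this.#end - this.#start }

  get [Symbol.toStringTag] () { return 'Blob' }

  slice (start = 0, end = this.size) {
    // slicing only adjusts the byte range - no request is made here
    return new HttpBlobItem(this.#url, this.#start + start, this.#start + end)
  }

  async * stream () {
    const res = await fetch(this.#url, {
      headers: { range: `bytes=${this.#start}-${this.#end - 1}` }
    })
    for await (const chunk of res.body) {
      yield chunk
    }
  }
}

Wrapped as new Blob([new HttpBlobItem(url, 0, contentLength)]), each slice() just narrows the range and each read issues one partial request.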

use a byte sequence instead of a buffer

imagine how you would go about solving this:

const File = require('fetch-file') // creates a File-like IDL wrapper around fs
const Blob = require('fetch-blob')

const file = File.from('./package.json')
new Blob([ file ])

the blob's constructor spec says:

  1. Let bytes be an empty sequence of bytes. (it's not like a long ArrayBuffer holding all data)
  2. For each element in parts:
    1. ...
    2. ...
    3. If element is a Blob, append the bytes it represents to bytes.

Step 2.3 doesn't say to read the blob and copy its content into a new instance. Besides, fetch-blob can't read the content of the File synchronously, so it can't handle such blob parts that way. The slicing method is also wrong: it should not slice the Buffer like fetch-blob does today; it should instead update the byte sequence and create a new offset for where it should start reading from. This also means that arrayBuffer, text and stream are wrong too. fetch-blob should instead read each blob part individually at a later stage in order to handle the File - the constructor should not read the file content.


When you take 2x2 GiB of files from an <input type="file" multiple> and concatenate them with b = new Blob([fileA, fileB]), you won't end up with 4 GiB of RAM; you get a new blob container with two references pointing to where a snapshot is located. And if you slice it with b.slice(0, -500), you have one reference to fileA and another reference to fileB with an offset for what it should represent: [[start to end of fileA], [start to end minus 500 of fileB]].

image taken from: https://docs.google.com/presentation/d/1MOm-8kacXAon1L2tF6VthesNjXgx0fp5AP17L7XDPSM/edit

Originally posted by @jimmywarting in node-fetch/node-fetch#835 (comment)

Things that need fixing:

  • this[BUFFER] should be this[PARTS], since you can't concatenate all parts
  • read methods should loop over all parts and concatenate them into one stream/text/arraybuffer, taking the internal offset into consideration
  • slicing becomes more complicated since you are now dealing with parts instead of a buffer
    • what really should be done is just copying all parts (cloning the blob) and adjusting the internal this[OFFSET] for where it should start reading from
var b1 = new Blob([new Uint8Array(1000)])
var b2 = b1.slice(500) // uses the same parts with an offset=500
// b2 should not take up 500 bytes more RAM...
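
A minimal sketch of what part-based slicing could look like (the {part, offset, size} entry shape and the helper name are made up for illustration):

// Slice a list of {part, offset, size} entries without copying any data.
// Each entry keeps a reference to the original part plus the sub-range it represents.
function sliceParts (parts, start, end) {
  const sliced = []
  let position = 0

  for (const { part, offset, size } of parts) {
    if (position >= end) break // past the requested range
    const partStart = Math.max(start - position, 0)
    const partEnd = Math.min(end - position, size)
    if (partEnd > partStart) {
      // only the offsets change - the underlying part is shared, not copied
      sliced.push({ part, offset: offset + partStart, size: partEnd - partStart })
    }
    position += size
  }

  return sliced
}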

Please consider web compatible `stream()` method.

The current implementation deliberately diverges from the Blob spec by using a Node Readable stream instead of a web ReadableStream. This is problematic because code that runs in both the browser and Node cannot use blob.stream() without having to check what the result is.

How about either implementing a web-compatible stream method, or not implementing it at all?

Adding a File class maybe?

Would it be worth it to have a File class that extends the Blob?

It wouldn't have to be more advanced than this:

// file.js
import Blob from './index.js'

export class File extends Blob {
  constructor (blobParts, fileName, options = {}) {
    const { lastModified = Date.now(), ...blobPropertyBag } = options
    super(blobParts, blobPropertyBag)
    this.name = (''+fileName).replace(/\u002F/g, "\u003A")
    this.lastModified = +lastModified
    this.lastModifiedDate = new Date(lastModified)
  }

  [Symbol.toStringTag] = 'File'
}

Support Node.js v12

Currently, only Node.js v14+ is supported:

fetch-blob/package.json

Lines 28 to 30 in 4747497

"engines": {
"node": ">=14.0.0"
},

I didn't realize this when I tried to update to the latest version and ran into problems with optional chaining.

Many packages at the moment, including all of mine, support Node.js ^12.20 || >= 14.13. E.g:

https://github.com/jaydenseric/apollo-upload-client/blob/e19deb9f1fc950f35a2f652c02135408bbe2a38a/package.json#L43-L45

This range supports most of the important Node.js ESM and package exports field features.

Until Node.js v12 is EOL we'll be stuck on the last fetch-blob version to support it; would you consider expanding the current level of support to accept Node.js ^12.20 || >= 14.13?

Possibility to remove web-streams-polyfill dependency?

web-streams-polyfill is a dependency of this package, used only in streams.cjs as a polyfill fallback for ReadableStream on Node < v16.5.0.

However, I personally find it extremely overblown that, in this case, we're pulling in a >7 MB dependency for something that might or might not be used, while we're also not using all of the available po(l/n)yfills, only ponyfill.es2018.js. This considerably increases the size of dependants (beyond depending on an external package at all), such as node-fetch.

Is it possible to include this file inside fetch-blob, or take any other possible solution (while also keeping compatibility with older Node.js versions)? I understand this is relatively low priority, but I thought I'd at least raise it for consideration.

Module not found: Error: Can't resolve 'stream/web'

Reproduction

Module not found: Error: Can't resolve 'stream/web' in '/Users/alan/project/duapp/vscode/packages/du-i18n/node_modules/fetch-blob'


Steps to reproduce the behavior:

  1. yarn
  2. yarn start
  3. start fail

Expected behavior

start success


Your Environment
vscode extension development

  • node-fetch: 3.0.0
  • node: v14.17.3
  • npm: 6.14.13
  • Operating System: macOS

Additional context

Doesn't work with either the latest stable or the beta version of node-fetch

@jimmywarting As I mentioned here, I'm running into this problem when using this package with node-fetch. The problem remains in both the latest stable and the beta, with [email protected] and [email protected].

I was trying to test an example for my form-data-encoder, where the encoder targets a Blob, as you did in formdata-polyfill:

import {Readable} from "stream"

import {FormData, fileFromPath} from "formdata-node"
import {Encoder} from "form-data-encoder"

import Blob from "fetch-blob"
import fetch from "node-fetch"

async function toBlob(form) {
  const encoder = new Encoder(form)
  const chunks = []

  for await (const chunk of encoder) {
    chunks.push(chunk)
  }

  return new Blob(chunks, {type: encoder.contentType})
}

const fd = new FormData()

fd.set("name", "John Doe")
fd.set("avatar", await fileFromPath("avatar.png"))

const options = {
  method: "post",
  body: await toBlob(fd)
}

const response = await fetch("https://httpbin.org/post", options)

console.log(await response.text())

When I run this code with node test.mjs, this happens:

  1. With fetch-blob I get an error thrown from body.stream().pipe(dest); in node-fetch/src/body.js:374:17.
  2. When I run the same code but with [email protected], it lowercases the value of Blob#type, which breaks the boundary string returned by form-data-encoder.

Release with Typings?

I see typings in the current version of the repository, but not when I install from NPM. What is the timeline for a release that includes typings?

Move check to the top

fetch-blob/index.js

Lines 178 to 181 in 5ba554e

// don't add the overflow to new blobParts
if (added >= span) {
  break;
}

I think there might be an error when slicing a blob (or BlobDataItem, aka fileFromPath) in a way that would normally end up with a zero size.
I have not yet tested it out; I used the same slice logic elsewhere and it resulted in an error.

A simple test could be to do:

const blob = fileFromPathSync('./package.json') // or
const blob = new Blob(['hello'])
await blob.slice(0, 0).text()

Whether or not it's working as intended, I think it would be best if this check was moved to the beginning of the loop:

for (const part of parts) { 
  // don't add the overflow to new blobParts 
  if (added >= span) { 
    break; 
  } 
  ...

fileFromSync just part of a file

Hey there! Nice work!

I am trying to append just part of a file to a form.
Is there some way that I could append bytes 0 to 1024, something like fileFromSync(0, 1024, '/path/to/file')?
Thanks
Pucci
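
One possible approach, sketched here under the assumption that the standard Blob#slice() is enough for this (slice only adjusts offsets on the file-backed blob; it does not read the file into memory):

import { fileFromSync } from 'fetch-blob/from.js'

// Create the lazy, file-backed File first, then slice out the range you need
const file = fileFromSync('/path/to/file')
const firstKiB = file.slice(0, 1024)
// firstKiB can now be appended to a form like any other blob part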

text() is fatal on non UTF-8 data, while in browser it works without any issue

The following code works fine in the browser:

const artwork = new File(
      [
        Uint8Array.from([
          0x75, 0xab, 0x5a, 0x8a, 0x66, 0xa0, 0x7b, 0xf8, 0xe9, 0x7a, 0x06, 0xda, 0xb1, 0xee, 0xb8,
          0xff, 0xd8, 0xff, 0xe0, 0x00, 0x10, 0x4a, 0x46, 0x49, 0x46, 0x00, 0x01, 0x01, 0x00, 0x00,
          0x01, 0x00, 0x01, 0x00, 0x00, 0xff, 0xdb, 0x00, 0x43, 0x00, 0x02, 0x01, 0x01, 0x01, 0x01,
          0x01, 0x02, 0x01, 0x01, 0x01, 0x02, 0x02, 0x02, 0x02, 0x02, 0x04, 0x03, 0x02, 0x02, 0x02,
          0x02, 0x05, 0x04, 0x04, 0x03, 0x04, 0x06, 0x05, 0x06, 0x06, 0x06, 0x05, 0x06, 0x06, 0x06,
          0x07, 0x09, 0x08, 0x06, 0x07, 0x09, 0x07, 0x06, 0x06, 0x08, 0x0b, 0x08, 0x09, 0x0a, 0x0a,
          0x0a, 0x0a, 0x0a, 0x06, 0x08, 0x0b, 0x0c, 0x0b, 0x0a, 0x0c, 0x09, 0x0a, 0x0a, 0x0a, 0xff,
          0xc2, 0x00, 0x0b, 0x08, 0x00, 0x10, 0x00, 0x10, 0x01, 0x01, 0x11, 0x00, 0xff, 0xc4, 0x00,
          0x16, 0x00, 0x01, 0x01, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
          0x00, 0x00, 0x00, 0x05, 0x03, 0x09, 0xff, 0xda, 0x00, 0x08, 0x01, 0x01, 0x00, 0x00, 0x00,
          0x00, 0xc6, 0xea, 0x1a, 0x4f, 0xff, 0xc4, 0x00, 0x22, 0x10, 0x00, 0x01, 0x04, 0x02, 0x02,
          0x02, 0x03, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0x01, 0x04, 0x05,
          0x06, 0x03, 0x07, 0x08, 0x11, 0x00, 0x22, 0x12, 0x13, 0x21, 0x42, 0xff, 0xda, 0x00, 0x08,
          0x01, 0x01, 0x00, 0x01, 0x3f, 0x00, 0xd1, 0x1c, 0x47, 0xd9, 0xbc, 0x92, 0x7e, 0xdb, 0x05,
          0x5e, 0x66, 0xbd, 0x02, 0xd6, 0x46, 0x67, 0x1c, 0x3c, 0x54, 0xbd, 0xbe, 0x61, 0x18, 0x34,
          0x7f, 0x26, 0x6a, 0x1f, 0x06, 0x38, 0x32, 0x10, 0x97, 0xd9, 0x9f, 0xac, 0x80, 0x4a, 0x08,
          0x9f, 0x88, 0x48, 0xab, 0xd7, 0x69, 0xde, 0xf9, 0xe0, 0x4e, 0xe0, 0xd0, 0x3a, 0xbe, 0x4b,
          0x6d, 0x59, 0xee, 0x14, 0x99, 0x68, 0xe8, 0x5b, 0xa2, 0x55, 0x27, 0xdb, 0x56, 0x6c, 0xe0,
          0xf5, 0xcc, 0x5c, 0xaa, 0x83, 0x83, 0x4c, 0x19, 0xc0, 0x47, 0xd7, 0xd5, 0xae, 0x6f, 0xd4,
          0x55, 0xf2, 0xbf, 0xca, 0xad, 0xb9, 0x45, 0xa3, 0x54, 0xb5, 0xdc, 0x03, 0xe6, 0x23, 0x19,
          0x47, 0xd8, 0x4b, 0x74, 0x80, 0x0c, 0xac, 0x44, 0xcc, 0x25, 0x54, 0x1b, 0x87, 0xcc, 0xcb,
          0xbf, 0x7c, 0x7d, 0x35, 0xc5, 0xe9, 0xe5, 0xe3, 0x95, 0x1b, 0x72, 0xfd, 0x45, 0xb7, 0xeb,
          0xab, 0x0b, 0xe6, 0x25, 0x19, 0x76, 0xd8, 0x63, 0x75, 0x9d, 0x0c, 0x2c, 0x44, 0x0c, 0xe5,
          0x50, 0x1d, 0x02, 0x10, 0x17, 0xf1, 0x8f, 0xa7, 0x79, 0xbd, 0x3c, 0xff, 0xd9,
        ]),
      ],
      'metaplex.jpg',
      { type: 'image/jpeg' },
    );
    await artwork.text();

However, in fetch-blob it throws an error. This is due to TextDecoder. I suggest setting fatal to false to avoid this inconsistency between browser & Node.
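
For reference, this is the TextDecoder behaviour being described (a standalone illustration, not fetch-blob's actual code):

const bytes = Uint8Array.from([0xff, 0xd8, 0xff]) // not valid UTF-8

// In browsers Blob#text() never throws - invalid sequences become U+FFFD:
console.log(new TextDecoder('utf-8', { fatal: false }).decode(bytes)) // '\ufffd\ufffd\ufffd'

// With { fatal: true } the same input throws a TypeError:
try {
  new TextDecoder('utf-8', { fatal: true }).decode(bytes)
} catch (err) {
  console.log(err instanceof TypeError) // true
}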

object key naming should not contain '#'

Problem

In a source file, some keys of a class have # in their name. This may be OK in most situations (I haven't verified, though), but it breaks when using webpack. The error message is:

Module parse failed: Unexpected character '#' (48:1)

How can I implement the below code in older version ?


node-fetch/node-fetch#1340 (comment)

import fetch from 'node-fetch'
import blobFromSync from 'fetch-blob/from.js'

const file = blobFromSync('./largeMovie.mkv')

const chunkSize = 40000
const url = 'https://httpbin.org/post'

for (let start = 0; start < file.size; start += chunkSize) {
  const chunk = file.slice(start, start + chunkSize + 1)
  await fetch(url, {method: 'post', body: chunk}).then(res => res.text())
}

Consider exposing intenal `read` method as separate export

In our code base we abstract between Node & the browser via a function that reads contents as AsyncIterable<Uint8Array>. This library internally has a read function that pretty much does that, which is then wrapped in a Node Readable.

Maybe instead of providing the web-incompatible stream() method, the internal read() could be exposed instead?
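
For context, such an abstraction currently has to look something like this (a sketch; the helper name is made up):

// Read any Blob as AsyncIterable<Uint8Array>, regardless of whether
// stream() returns a Node Readable or a WHATWG ReadableStream.
async function * blobChunks (blob) {
  const stream = blob.stream()
  if (Symbol.asyncIterator in stream) {
    // Node Readable streams (and newer WHATWG streams) are async iterable
    yield * stream
  } else {
    // fall back to a WHATWG reader
    const reader = stream.getReader()
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      yield value
    }
  }
}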

Behavior of .slice() is incorrect when using blobs backed by a file on the disk

When using something like fileFrom(), the behavior is incorrect: .slice() breaks if used more than once.

blob.slice(10, 40).slice(10, 20) should be equivalent to blob.slice(20, 30), and it is if you use a blob that's backed by an ArrayBuffer; however, when using a blob backed by a file on disk, it becomes equivalent to blob.slice(10, 20) instead.

v2.1.2 contains breaking changes.

I don't have a clue about this; all my related test cases failed on 2.1.2.

In addition, v2.1.2 does not have a git tag, and 2.1.1 and 2.1.0 are also very confusing in the commits.
