
node-fetch's Introduction

Node Fetch

A light-weight module that brings Fetch API to Node.js.


Consider supporting us on our Open Collective.

You might be looking for the v2 docs

Motivation

Instead of implementing XMLHttpRequest in Node.js to run a browser-specific Fetch polyfill, why not go from native http to the fetch API directly? Hence node-fetch: minimal code for a window.fetch-compatible API on the Node.js runtime.

See Jason Miller's isomorphic-unfetch or Leonardo Quixada's cross-fetch for isomorphic usage (exports node-fetch for server-side, whatwg-fetch for client-side).

Features

  • Stay consistent with window.fetch API.
  • Make conscious trade-offs when following WHATWG fetch spec and stream spec implementation details; document known differences.
  • Use native promise and async functions.
  • Use native Node streams for body, on both request and response.
  • Decode content encoding (gzip/deflate/brotli) properly, and convert string output (such as res.text() and res.json()) to UTF-8 automatically.
  • Useful extensions such as redirect limit, response size limit, explicit errors for troubleshooting.

Difference from client-side fetch

  • See known differences.
  • If you happen to use a missing feature that window.fetch offers, feel free to open an issue.
  • Pull requests are welcomed too!

Installation

Current stable release (3.x) requires at least Node.js 12.20.0.

npm install node-fetch

Loading and configuring the module

ES Modules (ESM)

import fetch from 'node-fetch';

CommonJS

node-fetch from v3 is an ESM-only module - you are not able to import it with require().

If you cannot switch to ESM, please use v2 which remains compatible with CommonJS. Critical bug fixes will continue to be published for v2.

npm install node-fetch@2

Alternatively, you can use the async import() function from CommonJS to load node-fetch asynchronously:

// mod.cjs
const fetch = (...args) => import('node-fetch').then(({default: fetch}) => fetch(...args));

Providing global access

To use fetch() without importing it, you can patch the global object in node:

// fetch-polyfill.js
import fetch, {
  Blob,
  blobFrom,
  blobFromSync,
  File,
  fileFrom,
  fileFromSync,
  FormData,
  Headers,
  Request,
  Response,
} from 'node-fetch'

if (!globalThis.fetch) {
  globalThis.fetch = fetch
  globalThis.Headers = Headers
  globalThis.Request = Request
  globalThis.Response = Response
}

// index.js
import './fetch-polyfill'

// ...

Upgrading

Using an old version of node-fetch? Check out the following files:

Common Usage

NOTE: The documentation below is up-to-date with 3.x releases; if you are using an older version, please check how to upgrade.

Plain text or HTML

import fetch from 'node-fetch';

const response = await fetch('https://github.com/');
const body = await response.text();

console.log(body);

JSON

import fetch from 'node-fetch';

const response = await fetch('https://api.github.com/users/github');
const data = await response.json();

console.log(data);

Simple Post

import fetch from 'node-fetch';

const response = await fetch('https://httpbin.org/post', {method: 'POST', body: 'a=1'});
const data = await response.json();

console.log(data);

Post with JSON

import fetch from 'node-fetch';

const body = {a: 1};

const response = await fetch('https://httpbin.org/post', {
	method: 'post',
	body: JSON.stringify(body),
	headers: {'Content-Type': 'application/json'}
});
const data = await response.json();

console.log(data);

Post with form parameters

URLSearchParams is available on the global object in Node.js as of v10.0.0. See official documentation for more usage methods.

NOTE: The Content-Type header is only set automatically to x-www-form-urlencoded when an instance of URLSearchParams is given as such:

import fetch from 'node-fetch';

const params = new URLSearchParams();
params.append('a', 1);

const response = await fetch('https://httpbin.org/post', {method: 'POST', body: params});
const data = await response.json();

console.log(data);

Handling exceptions

NOTE: 3xx-5xx responses are NOT exceptions, and should be handled in then(), see the next section.

Wrapping the fetch function into a try/catch block will catch all exceptions, such as errors originating from node core libraries, like network errors, and operational errors which are instances of FetchError. See the error handling document for more details.

import fetch from 'node-fetch';

try {
	await fetch('https://domain.invalid/');
} catch (error) {
	console.log(error);
}

Handling client and server errors

It is common to create a helper function to check that the response contains no client (4xx) or server (5xx) error responses:

import fetch from 'node-fetch';

class HTTPResponseError extends Error {
	constructor(response) {
		super(`HTTP Error Response: ${response.status} ${response.statusText}`);
		this.response = response;
	}
}

const checkStatus = response => {
	if (response.ok) {
		// response.status >= 200 && response.status < 300
		return response;
	} else {
		throw new HTTPResponseError(response);
	}
}

const response = await fetch('https://httpbin.org/status/400');

try {
	checkStatus(response);
} catch (error) {
	console.error(error);

	const errorBody = await error.response.text();
	console.error(`Error body: ${errorBody}`);
}

Handling cookies

Cookies are not stored by default. However, cookies can be extracted and passed by manipulating request and response headers. See Extract Set-Cookie Header for details.
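As a minimal sketch of that approach, the helper below collapses raw Set-Cookie values into a single Cookie header. It keeps only each name=value pair and deliberately drops attributes such as Path, Expires, and HttpOnly, so it is enough for simple session forwarding but is not a spec-complete cookie jar; the URLs in the usage comment are illustrative.

```javascript
// Sketch: collapse raw Set-Cookie values into a single Cookie header.
// Attributes (Path, Expires, HttpOnly, ...) are intentionally dropped;
// this is not a spec-complete cookie jar.
function toCookieHeader(setCookieValues) {
	return setCookieValues
		.map(entry => entry.split(';')[0].trim())
		.join('; ');
}

// Hypothetical usage with node-fetch:
//   const res = await fetch('https://example.com/login', {method: 'POST', body});
//   const cookie = toCookieHeader(res.headers.raw()['set-cookie'] ?? []);
//   const profile = await fetch('https://example.com/profile', {headers: {cookie}});
```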

Advanced Usage

Streams

The "Node.js way" is to use streams when possible. You can pipe res.body to another stream. This example uses stream.pipeline to attach stream error handlers and wait for the download to complete.

import {createWriteStream} from 'node:fs';
import {pipeline} from 'node:stream';
import {promisify} from 'node:util'
import fetch from 'node-fetch';

const streamPipeline = promisify(pipeline);

const response = await fetch('https://github.githubassets.com/images/modules/logos_page/Octocat.png');

if (!response.ok) throw new Error(`unexpected response ${response.statusText}`);

await streamPipeline(response.body, createWriteStream('./octocat.png'));

In Node.js 14 you can also use async iterators to read body; however, be careful to catch errors -- the longer a response runs, the more likely it is to encounter an error.

import fetch from 'node-fetch';

const response = await fetch('https://httpbin.org/stream/3');

try {
	for await (const chunk of response.body) {
		console.dir(JSON.parse(chunk.toString()));
	}
} catch (err) {
	console.error(err.stack);
}

In Node.js 12 you can also use async iterators to read the body; however, async iterators with streams did not mature until Node.js 14, so you need to do some extra work to ensure you handle errors directly from the stream and wait for the response to fully close.

import fetch from 'node-fetch';

const read = async body => {
	let error;
	body.on('error', err => {
		error = err;
	});

	for await (const chunk of body) {
		console.dir(JSON.parse(chunk.toString()));
	}

	return new Promise((resolve, reject) => {
		body.on('close', () => {
			error ? reject(error) : resolve();
		});
	});
};

try {
	const response = await fetch('https://httpbin.org/stream/3');
	await read(response.body);
} catch (err) {
	console.error(err.stack);
}

Accessing Headers and other Metadata

import fetch from 'node-fetch';

const response = await fetch('https://github.com/');

console.log(response.ok);
console.log(response.status);
console.log(response.statusText);
console.log(response.headers.raw());
console.log(response.headers.get('content-type'));

Extract Set-Cookie Header

Unlike browsers, you can access raw Set-Cookie headers manually using Headers.raw(). This is a node-fetch only API.

import fetch from 'node-fetch';

const response = await fetch('https://example.com');

// Returns an array of values, instead of a string of comma-separated values
console.log(response.headers.raw()['set-cookie']);

Post data using a file

import fetch, {
  Blob,
  blobFrom,
  blobFromSync,
  File,
  fileFrom,
  fileFromSync,
} from 'node-fetch'

const mimetype = 'text/plain'
const blob = fileFromSync('./input.txt', mimetype)
const url = 'https://httpbin.org/post'

const response = await fetch(url, { method: 'POST', body: blob })
const data = await response.json()

console.log(data)

node-fetch comes with a spec-compliant FormData implementation for posting multipart/form-data payloads.

import fetch, { FormData, File, fileFrom } from 'node-fetch'

const httpbin = 'https://httpbin.org/post'
const formData = new FormData()
const binary = new Uint8Array([ 97, 98, 99 ])
const abc = new File([binary], 'abc.txt', { type: 'text/plain' })

formData.set('greeting', 'Hello, world!')
formData.set('file-upload', abc, 'new name.txt')

const response = await fetch(httpbin, { method: 'POST', body: formData })
const data = await response.json()

console.log(data)

If you for some reason need to post a stream coming from an arbitrary place, you can append a Blob or a File look-alike item.

The minimum requirement is that it has:

  1. A Symbol.toStringTag getter or property whose value is either Blob or File
  2. A known size.
  3. And either a stream() method or an arrayBuffer() method that returns an ArrayBuffer.

The stream() method can return any async iterable object as long as it yields Uint8Array (or Buffer), so Node.js Readable streams and WHATWG streams both work fine.

formData.append('upload', {
	[Symbol.toStringTag]: 'Blob',
	size: 3,
	*stream() {
		yield new Uint8Array([97, 98, 99])
	},
	arrayBuffer() {
		return new Uint8Array([97, 98, 99]).buffer
	}
}, 'abc.txt')

Request cancellation with AbortSignal

You may cancel requests with AbortController. A suggested implementation is abort-controller.

An example of timing out a request after 150ms could be achieved as the following:

import fetch, { AbortError } from 'node-fetch';

// AbortController was added in node v14.17.0 globally
const AbortController = globalThis.AbortController || await import('abort-controller')

const controller = new AbortController();
const timeout = setTimeout(() => {
	controller.abort();
}, 150);

try {
	const response = await fetch('https://example.com', {signal: controller.signal});
	const data = await response.json();
} catch (error) {
	if (error instanceof AbortError) {
		console.log('request was aborted');
	}
} finally {
	clearTimeout(timeout);
}

See test cases for more examples.

API

fetch(url[, options])

  • url A string representing the URL for fetching
  • options Options for the HTTP(S) request
  • Returns: Promise<Response>

Perform an HTTP(S) fetch.

url should be an absolute URL, such as https://example.com/. A path-relative URL (/file/under/root) or protocol-relative URL (//can-be-http-or-https.com/) will result in a rejected Promise.
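Since only absolute URLs are accepted, a relative path must be resolved against a base before fetching. A brief sketch using the global WHATWG URL class (the base URL below is illustrative):

```javascript
// fetch() rejects path-relative URLs, so resolve them against an
// absolute base first using the global WHATWG URL class.
const base = 'https://example.com/api/';          // illustrative base
const absolute = new URL('users/github', base).href;
// absolute is now 'https://example.com/api/users/github'

// Hypothetical usage:
//   const response = await fetch(absolute);
```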

Options

The default values are shown after each option key.

{
	// These properties are part of the Fetch Standard
	method: 'GET',
	headers: {},            // Request headers. Format is identical to that accepted by the Headers constructor (see below)
	body: null,             // Request body. can be null, or a Node.js Readable stream
	redirect: 'follow',     // Set to `manual` to extract redirect headers, `error` to reject redirect
	signal: null,           // Pass an instance of AbortSignal to optionally abort requests

	// The following properties are node-fetch extensions
	follow: 20,             // maximum redirect count. 0 to not follow redirect
	compress: true,         // support gzip/deflate content encoding. false to disable
	size: 0,                // maximum response body size in bytes. 0 to disable
	agent: null,            // http(s).Agent instance or function that returns an instance (see below)
	highWaterMark: 16384,   // the maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource.
	insecureHTTPParser: false	// Use an insecure HTTP parser that accepts invalid HTTP headers when `true`.
}

Default Headers

If no values are set, the following request headers will be sent automatically:

Header             Value
-----------------  --------------------------------------------------
Accept-Encoding    gzip, deflate, br (when options.compress === true)
Accept             */*
Content-Length     (automatically calculated, if possible)
Host               (host and port information from the target URI)
Transfer-Encoding  chunked (when req.body is a stream)
User-Agent         node-fetch

Note: when body is a Stream, Content-Length is not set automatically.
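To illustrate the automatic calculation: for string bodies it is byte-based (Buffer.byteLength), which matters for multi-byte UTF-8 characters. For stream bodies you may set the header yourself when the total size is known; the identifiers in the usage comment are hypothetical.

```javascript
// Content-Length counts bytes, not characters; Buffer.byteLength gives
// the correct value for multi-byte UTF-8 strings.
const body = JSON.stringify({name: 'Wärting'});  // 18 characters
const contentLength = Buffer.byteLength(body);   // 19 bytes ('ä' is 2 bytes)

// For a stream body node-fetch cannot know the length, so supply it
// yourself when you do (hypothetical url/someStream/knownSize):
//   await fetch(url, {method: 'POST', body: someStream,
//                     headers: {'Content-Length': String(knownSize)}});
```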

Custom Agent

The agent option allows you to specify networking related options which are out of the scope of Fetch, including and not limited to the following:

  • Support self-signed certificate
  • Use only IPv4 or IPv6
  • Custom DNS Lookup

See http.Agent for more information.
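As one hedged example of the first bullet above, a custom https.Agent can accept a self-signed certificate by disabling TLS verification. This is a sketch, not a recommendation: disabling verification is only reasonable for hosts you control.

```javascript
import https from 'node:https';

// Sketch: an agent that accepts self-signed certificates. Disabling
// certificate verification is unsafe for hosts you do not control.
const selfSignedAgent = new https.Agent({rejectUnauthorized: false});

// Hypothetical usage:
//   await fetch('https://self-signed.internal/', {agent: selfSignedAgent});
```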

If no agent is specified, the default agent provided by Node.js is used. Note that this changed in Node.js 19 to have keepalive true by default. If you wish to enable keepalive in an earlier version of Node.js, you can override the agent as per the following code sample.

In addition, the agent option accepts a function that returns an http(s).Agent instance given the current URL. This is useful during a redirection chain across HTTP and HTTPS protocols.

import http from 'node:http';
import https from 'node:https';

const httpAgent = new http.Agent({
	keepAlive: true
});
const httpsAgent = new https.Agent({
	keepAlive: true
});

const options = {
	agent: function(_parsedURL) {
		if (_parsedURL.protocol === 'http:') {
			return httpAgent;
		} else {
			return httpsAgent;
		}
	}
};

Custom highWaterMark

Streams in Node.js have a smaller internal buffer size (16 kB, aka highWaterMark) than client-side browsers (>1 MB, not consistent across browsers). Because of that, when you are writing an isomorphic app and using res.clone(), it will hang with large responses in Node.js.

The recommended way to fix this problem is to resolve cloned response in parallel:

import fetch from 'node-fetch';

const response = await fetch('https://example.com');
const r1 = response.clone();

const results = await Promise.all([response.json(), r1.text()]);

console.log(results[0]);
console.log(results[1]);

If for some reason you don't like the solution above, since 3.x you are able to modify the highWaterMark option:

import fetch from 'node-fetch';

const response = await fetch('https://example.com', {
	// About 1MB
	highWaterMark: 1024 * 1024
});

const result = await response.clone().arrayBuffer();
console.dir(result);

Insecure HTTP Parser

Passed through to the insecureHTTPParser option on http(s).request. See http.request for more information.

Manual Redirect

The redirect: 'manual' option for node-fetch is different from the browser & specification, which results in an opaque-redirect filtered response. node-fetch gives you the typical basic filtered response instead.

import fetch from 'node-fetch';

const response = await fetch('https://httpbin.org/status/301', { redirect: 'manual' });

if (response.status === 301 || response.status === 302) {
	const locationURL = new URL(response.headers.get('location'), response.url);
	const response2 = await fetch(locationURL, { redirect: 'manual' });
	console.dir(response2);
}

Class: Request

An HTTP(S) request containing information about URL, method, headers, and the body. This class implements the Body interface.

Due to the nature of Node.js, the following properties are not implemented at this moment:

  • type
  • destination
  • mode
  • credentials
  • cache
  • integrity
  • keepalive

The following node-fetch extension properties are provided:

  • follow
  • compress
  • counter
  • agent
  • highWaterMark

See options for exact meaning of these extensions.

new Request(input[, options])

(spec-compliant)

  • input A string representing a URL, or another Request (which will be cloned)
  • options Options for the HTTP(S) request

Constructs a new Request object. The constructor is identical to that in the browser.

In most cases, calling fetch(url, options) directly is simpler than creating a Request object.

Class: Response

An HTTP(S) response. This class implements the Body interface.

The following properties are not implemented in node-fetch at this moment:

  • trailer

new Response([body[, options]])

(spec-compliant)

Constructs a new Response object. The constructor is identical to that in the browser.

Because Node.js does not implement service workers (for which this class was designed), one rarely has to construct a Response directly.

response.ok

(spec-compliant)

Convenience property representing whether the request ended normally. Will evaluate to true if the response status was greater than or equal to 200 but smaller than 300.

response.redirected

(spec-compliant)

Convenience property representing whether the request has been redirected at least once. Will evaluate to true if the internal redirect counter is greater than 0.

response.type

(deviation from spec)

Convenience property representing the response's type. node-fetch only supports 'default' and 'error' and does not make use of filtered responses.

Class: Headers

This class allows manipulating and iterating over a set of HTTP headers. All methods specified in the Fetch Standard are implemented.

new Headers([init])

(spec-compliant)

  • init Optional argument to pre-fill the Headers object

Construct a new Headers object. init can be either null, a Headers object, a key-value map object, or any iterable object.

// Example adapted from https://fetch.spec.whatwg.org/#example-headers-class
import {Headers} from 'node-fetch';

const meta = {
	'Content-Type': 'text/xml'
};
const headers = new Headers(meta);

// The above is equivalent to
const meta2 = [['Content-Type', 'text/xml']];
const headers2 = new Headers(meta2);

// You can in fact use any iterable object, like a Map or even another Headers
const meta3 = new Map();
meta3.set('Content-Type', 'text/xml');
const headers3 = new Headers(meta3);
const copyOfHeaders = new Headers(headers3);

Interface: Body

Body is an abstract interface with methods that are applicable to both Request and Response classes.

body.body

(deviation from spec)

Data are encapsulated in the Body object. Note that while the Fetch Standard requires the property to always be a WHATWG ReadableStream, in node-fetch it is a Node.js Readable stream.

body.bodyUsed

(spec-compliant)

  • Boolean

A boolean property indicating whether this body has been consumed. Per the spec, a consumed body cannot be used again.
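A small illustration of the consumed-body rule, using the global WHATWG Response available in Node.js 18+ for self-containedness (node-fetch's Response behaves the same way for this property): reading the body flips bodyUsed on that object only, so clone() beforehand if you need to read it twice.

```javascript
// bodyUsed becomes true once a body-reading method has consumed the
// stream; a clone made beforehand keeps its own independent flag.
const response = new Response('{"a": 1}');
const backup = response.clone();

const data = await response.json();
console.log(response.bodyUsed); // true
console.log(backup.bodyUsed);   // false
```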

body.arrayBuffer()

body.formData()

body.blob()

body.json()

body.text()

fetch comes with methods to parse multipart/form-data payloads as well as x-www-form-urlencoded bodies using .formData(). This comes from the idea that a Service Worker can intercept such messages before they are sent to the server and alter them. It is also useful for anybody building a server, as you can use it to parse & consume payloads.

Code example
import http from 'node:http'
import { Response } from 'node-fetch'

http.createServer(async function (req, res) {
  const formData = await new Response(req, {
    headers: req.headers // Pass along the boundary value
  }).formData()
  const allFields = [...formData]

  const file = formData.get('uploaded-files')
  const arrayBuffer = await file.arrayBuffer()
  const text = await file.text()
  const whatwgReadableStream = file.stream()

  // Other ways to consume the request body:
  // const json = await new Response(req).json()
  // const text = await new Response(req).text()
  // const arrayBuffer = await new Response(req).arrayBuffer()
  // const blob = await new Response(req, {
  //   headers: req.headers // So that `type` inherits `Content-Type`
  // }).blob()
})

Class: FetchError

(node-fetch extension)

An operational error in the fetching process. See ERROR-HANDLING.md for more info.

Class: AbortError

(node-fetch extension)

An Error thrown when the request is aborted in response to an AbortSignal's abort event. It has a name property of AbortError. See ERROR-HANDLING.md for more info.

TypeScript

Since 3.x, types are bundled with node-fetch, so you don't need to install any additional packages.

For older versions please use the type definitions from DefinitelyTyped:

npm install --save-dev @types/[email protected]

Acknowledgement

Thanks to github/fetch for providing a solid implementation reference.

Team

David Frank Jimmy Wärting Antoni Kepinski Richie Bendall Gregor Martynus

License

MIT

node-fetch's Issues

Expose and document Response and Headers

I'm working on a utility for making unit testing code that uses fetch easier https://github.com/wheresrhys/fetch-mock. I'm relying on your Response and Headers classes, but as these are not exposed directly I'm having to include them via `require('node-fetch/lib/Response')`. I'd be a lot more comfortable relying on them if the path to these files was documented or they were exposed as properties on the Fetch constructor. Would you be open to doing this?

Transfer encoding in HTTP headers

Hello,

It seems that window.fetch in the browser will treat Content-Length and Transfer-Encoding differently compared to node-fetch. This causes some headache as a receiving server will have to treat these results differently (stream vs no stream).

This is how I'm calling fetch (both on the client and the server)

fetch("http://localhost:5000/users", { method: "POST" });

Using node-fetch in Node 4.1 yields the following headers on the receiving server:

Transfer-Encoding: chunked
Content-Length:

Where as using window.fetch in Chrome 45.0.2454.85 produces the following:

Content-Length: 0

I'm not sure about this, but my guess is that Transfer-Encoding: chunked should only be applied in the cases where you're providing a stream as body. As in this instance: https://github.com/bitinn/node-fetch/blob/master/test/test.js#L436

Otherwise, it should set the content-length header. I.e something like this:

options.headers['Content-Length'] = Buffer.byteLength(body);

I may have misunderstood the spec, so let me know if this fits within the scope of this project or not. If so, I'm happy to submit a PR.

Regards,
Daniel

Support for safari?

Hello,
I am using safari 5.1.7 on my windows machine and I am getting the error: "Can't find variable: Promise" I read that safari does not support promises. So, Is there any way I can offer promise support on my app for safari users?

I am using babel 6.

EDIT Please remove this post. It does not belong here. Thank you.

.blob() support

Hi

So I have been using node-fetch for a while now and for the first time I needed to request an image and get its size, now in the client I can do.

fetch('imageURL').then(function(response) {
    return response.blob();
}).then(function(resp){
    console.log(resp.size)
});

But unfortunately blob isn't supported by node-fetch so this won't work

Thanks

Jonathan

res.text() doesn't work after res.json() fail

Hi, I would like to get the response raw text when res.json() throws an error.

Example scenario: expected json response is not json but an html error. Therefore, res.json() throws an error.

var response;
fetch(url, {
  headers: headers,                                                                                                                                                                           
  body: JSON.stringify(payload),
  method: method.toUpperCase()
}).then(function(res) {
  response = res;
  return res.json();
}).then(function(json) {
     // do stuff
  }
}).catch(function(err) {
   response.text().then(function(html) {                               // <---- doesnt work
    console.log("instead of json, I got this response:");
    return console.log(html);
  });
  return console.error(err);
});

What's the best way of printing 'corrupt json' responses?

Note #1 : Im not asking how to check the header (http status codes etc)
Note #2 : This is a pretty straightforward scenario imho, so an example of this would be very handy in README.md

tasks for consideration (2.0 release)

just collecting some common complain about 1.x release

  • provide some guideline on how to handle rejected promise (failed fetch), this can be caused by many reasons: url, headers.location, network timeout, request config etc; currently the only way to distinguish them is to check error message.
  • provide some alternatives to extract redirect headers, it's not possible with current fetch spec, only the final url will be expose to user when following redirects.
  • some way for cookies to persist when you do a POST request that redirects (which should be followed by a GET request, often with cookie as credential). Though if we implement extraction for redirect headers we can make this possible (keep the cookie headers for your next GET request).
  • expose Request/Headers/Response interfaces, we don't currently have a Request interface due to the fact that no one is using them.
  • normalize Headers .set and .append to prevent non-string input, it's recommended to already cast your input to String type first.
  • Headers/Response interface should support ES6 features, such as keys() and entries()

fr: error on not ok

would be nice to have an option to error when res.ok is false. how minimal does this library want to be?

Headers get wrapped

https://github.com/bitinn/node-fetch/blob/master/index.js#L83 wraps existing headers in another headers instance. The existing headers get lost. Effectively, this is doing:

require('es6-promise').polyfill();
global.Headers = require('../node_modules/node-fetch/lib/headers.js');
var headers = new Headers({Accept: 'application/json'})
console.log(new Headers(headers).raw());

Resulting in:

{}

The spec says that headers passed into the fetch options should implement the Headers interface. The present node-fetch implementation treats it as a hash.

Use case for res.arrayBuffer

I'm currently working on a .docx file parser that should work on both browsers and NodeJS environments.

https://github.com/MrOrz/hacktabl-parser/blob/fcfc7626c533b0434d11215e4c3c1d305990db26/src/fetchDoc.js

.docx files are a bunch of XML files archived in a zip format, thus I am using JSZip to extract XML files from .docx. Since node-fetch does not support res.arrayBuffer(), I cannot use isomorphic-fetch to grab .docx files on NodeJS.

If node-fetch were to support res.arrayBuffer(), the large if-statement that differentiates server & browser behavior could be gone. http and https library don't have to be required in the code anymore, leading us to a smaller bundle when the code is compiled using webpack for use in browsers.

Critical dependencies: request of a dependency is an expression in encoding.js

Hello

I get an error when I try to build universal react app with redux and icomorphic-fetch.

WARNING in (server) ./~/isomorphic-fetch/~/node-fetch/~/encoding/lib/encoding.js
Critical dependencies:
9:12-34 the request of a dependency is an expression
 @ ./~/isomorphic-fetch/~/node-fetch/~/encoding/lib/encoding.js 9:12-34

It's this piece of code in encoding.js that is causing the problems

try {
    // this is to fool browserify so it doesn't try (in vain) to install iconv.
    var iconv_package = 'iconv';
    Iconv = require(iconv_package).Iconv;
} catch (E) {
    // node-iconv not present
}

Any ideas on how to fix this?

file:// URLs

Are there plans to support local filesystem URLs?

Disable timeout

I'm trying to make a call to an external API. The response could take up to a couple minutes, so I need to disable the timeout on fetch. Setting the timeout option to 0 does not disable it, and it timeouts after about 20 seconds. Am I missing something?

POST redirection

Hi there!

I'm doing a fetch to a URL with POST (a login), which will redirect to another URL if the login is successful.

When this happens, fetch will try to follow the new URL with the POST method too.
Shouldn't it follow the URL with a GET?

like in here: https://github.com/bitinn/node-fetch/blob/master/index.js#L127
Adding options.method='GET' just before that line.

Thanks!

Memory leak when body is not used

I wanted to test a URL for validity; verifying that it returns a 200 response.

Using this code:

const fetch = require('node-fetch');

module.exports = (url) => fetch(url)
    .then(res => res.status === 200 && url);

...I found it caused a memory leak. Note that I'm not calling .body() or .json(). Profiling the memory usage over time, I was suspicious that lots of stream objects were being kept in memory. I speculatively added {method: 'HEAD'} to the options (which is of course the right thing to do in this case anyway), which fixed the memory leak.

Is this a bug in node-fetch? Should memory be freed whether or not the body of the response is ever used? If it's not a bug, I wonder if the documentation should make it clear that responses with a body must be cleared in some way to free memory?

Thanks for a great library!

Warning with webpack

WARNING in ../~/node-fetch/~/encoding/lib/encoding.js
Critical dependencies:
9:12-34 the request of a dependency is an expression
 @ ../~/node-fetch/~/encoding/lib/encoding.js 9:12-34

Absolute URL check should be a part of fetch() but not Request class

It should be possible to create an instance of Request with any URL (non-absolute or non http(s) protocol) because it's just a container, and such constraint is not the responsibility of Request class:

var req = new Request('foo'); // this is legit

Absolute URLs constraint is a responsibility of fetch() function, as it can't send such Requests, but Requests themselves are valid.

Definition as a global

Is it possible to set a flag global: true to have this fetch method exposed?

Reason being is that if we don't have something like this, then writing tests for this is a pain with mocha, and we have to define our own global.fetch = require('fetch');

Tag releases?

E.g. for the latest release, this will do the trick:

git tag -a v1.3.3 fd3f89fcd94
git push --tags

This makes it easier to tell which commit corresponds to which NPM release.

Offer caching

node-fetch is a pleasure to use and has made my API fetching much easier to work with and manage. Thanks!

I was wondering if it would be possible to add request caching support? While I'm sure such an implementation isn't trivial, it would save lots of resources and speed up fetching for already-known responses. Thanks!

(I'm not sure whether it's of any use, but here is a caching implementation offered for superagent.)
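To make the idea concrete, here's a minimal userland sketch (all names are mine; a real implementation would have to honor Cache-Control, vary on headers, clone bodies for repeated reads, and evict entries):

```javascript
// Hypothetical caching wrapper: memoize GET requests by URL in a Map.
// `fetchFn` is whatever fetch implementation you pass in.
function cachedFetch(fetchFn, cache = new Map()) {
  return (url, opts = {}) => {
    const method = (opts.method || 'GET').toUpperCase();
    if (method !== 'GET') return fetchFn(url, opts); // only cache GETs
    if (!cache.has(url)) cache.set(url, fetchFn(url, opts));
    return cache.get(url);
  };
}

// Usage: const fetch = cachedFetch(require('node-fetch'));
// Note that reading the same cached body twice would need res.clone().
```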

request <url> failed, reason: self signed certificate

Today, while using node-fetch to request a Facebook API endpoint, I received a "self signed certificate" error.

Error: request to https://graph.facebook.com/v2.3/<POST_ID>?fields=id,from,created_time,updated_time,picture&access_token=<FB_TOKEN> failed, reason: self signed certificate
    at ClientRequest.<anonymous> (/home/<PROJECT_PATH>/node_modules/node-fetch/index.js:116:11)
    at emitOne (events.js:77:13)
    at ClientRequest.emit (events.js:166:7)
    at TLSSocket.socketErrorListener (_http_client.js:254:9)
    at emitOne (events.js:77:13)
    at TLSSocket.emit (events.js:166:7)
    at TLSSocket.<anonymous> (_tls_wrap.js:931:16)
    at emitNone (events.js:67:13)
    at TLSSocket.emit (events.js:163:7)
    at TLSSocket._finishInit (_tls_wrap.js:506:8)

How to pass OAuth credentials?

Can this module be used with OAuth credentials? I figured out how to do Basic Auth, e.g.

fetch('https://awesome-service.com/api/v1/metrics.json', { headers: { Authorization: 'Basic xxxxxxxxxxxxxxxxxxxx' } })

But what is the API for OAuth authentication? Thanks :)
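For OAuth 2.0 the usual answer is a bearer token in the same Authorization header; obtaining the token (authorization-code flow, client credentials, etc.) happens outside of fetch itself. A sketch, with a placeholder token:

```javascript
// Hypothetical token; acquiring it is up to your OAuth flow.
const accessToken = 'xxxxxxxxxxxxxxxxxxxx';

const options = {
  headers: { Authorization: 'Bearer ' + accessToken }
};

// fetch('https://awesome-service.com/api/v1/metrics.json', options);
console.log(options.headers.Authorization); // 'Bearer ' followed by the token
```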

More easily checkable errors

Would you be open to a pull request that creates more specific errors than just Error? If I remember correctly, the whatwg-fetch spec leaves error handling up to implementations, so I think we should be free to do what we want…

It would be really helpful if there was something like a FetchError for all errors (although a different type per error would be OK too) so we can do things like

var fetch = require('node-fetch');
var FetchError = require('node-fetch').FetchError;

fetch('https://time.out/error')
  .catch(function(err) {
    if (err instanceof FetchError) {
      // send the failure to our metrics system but don't raise it as a fault
    } else {
      // send the error to our error aggregation system
    }
  });

Progress indicator

I'd like to see the current progress of my requests, to check whether they've stalled. There doesn't seem to be an official solution for this.
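Since res.body is a Node readable stream, one unofficial sketch is to count bytes yourself against the Content-Length header (`trackProgress` is a name I made up):

```javascript
// Sketch: report download progress for any Node readable stream, given the
// expected total size (e.g. taken from the Content-Length header).
function trackProgress(stream, total, onProgress) {
  let received = 0;
  stream.on('data', chunk => {
    received += chunk.length;
    onProgress(received, total);
  });
  return stream;
}

// Usage with node-fetch would look like:
//   const res = await fetch(url);
//   trackProgress(res.body, Number(res.headers.get('content-length')),
//                 (done, total) => console.log(done + '/' + total + ' bytes'));
```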

Helper methods

What are your thoughts on providing helper methods such as fetch.post and fetch.put? They would just automatically add { method: 'WHATEVER' } to the options argument.
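For what it's worth, such sugar can live entirely in userland; a sketch (names are mine):

```javascript
// Hypothetical helpers layered on top of an existing fetch function.
function withMethods(fetchFn) {
  for (const method of ['post', 'put', 'patch', 'delete', 'head']) {
    fetchFn[method] = (url, opts) =>
      fetchFn(url, Object.assign({}, opts, { method: method.toUpperCase() }));
  }
  return fetchFn;
}

// Usage: const fetch = withMethods(require('node-fetch'));
//        fetch.post('http://example.com/', { body: 'a=1' });
```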

Support of http proxy

If I want to use an HTTP proxy, I will write:

require('http').request({
    host: '127.0.0.1',
    port: 8580,
    method: 'POST',
    path: 'https://www.google.com',
}, function (res) {
    // ...
});

But when I use node-fetch,

fetch(obj).then(function (res) {
    // ...
});

it throws TypeError: Parameter 'url' must be a string, not object

It does not support URL objects

Unhandled error in deflate compression handling

Okay, I have to admit that I have no idea what's going on here, sorry for the vague issue.

Node v0.12.7

var fetch = require('node-fetch')
fetch('http://time-attendance.co.uk')

gives

events.js:85
      throw er; // Unhandled 'error' event
            ^
Error: incorrect header check
    at Zlib._handle.onerror (zlib.js:366:17)

...
The fetch results in an ok response... The promise is not rejected, but I still get this error thrown somewhere. Confused. :/

Missing forEach method in Headers object

whatwg-fetch implements a forEach method on the Headers object prototype. I've tried to use node-fetch as a backend for restful.js, because I'm using the isomorphic-fetch package. Unfortunately, restful.js uses the forEach method, so it doesn't work on the backend.

The question is: should the Headers object provide a forEach method? Is it in the specification? I couldn't find anything.

I'm happy to provide a PR with patch unless there's a reason why Headers doesn't have forEach.
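For reference, a forEach shim is tiny if Headers already supports entries(); this sketch uses the browser callback signature (value, name, headers):

```javascript
// Sketch of a forEach shim for a Headers-like object that already supports
// iteration via entries(), matching the callback(value, name, headers)
// signature used by browser Headers.
function headersForEach(headers, callback, thisArg) {
  for (const [name, value] of headers.entries()) {
    callback.call(thisArg, value, name, headers);
  }
}
```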

ENETUNREACH error for ipv6

Hi there, new issue unfortunately with this library.

Node: v0.12.7

On Linux (Linux version 3.13.0-57-generic (buildd@brownie) (gcc version 4.8.2 (Ubuntu 4.8.2-19ubuntu1) ) #95-Ubuntu SMP Fri Jun 19 09:28:15 UTC 2015), when I fetch the URL http://healthy-america.com.co/ I get error code ENETUNREACH, whereas on OS X it works fine with the same code.

Going back to curl, I get "Network is unreachable" on my Linux server unless I add the --ipv4 flag; then it works. Any suggestions on what I need to do in node-fetch to make requests like these succeed on my Linux server?

Thanks

node domains are not being propagated within promises.

Ok, so in my app I use deprecated node domains (I know, but until a valid substitute is proposed I kinda need them), and it seems that node-fetch loses the current domain within the promise.

Given that I substitute the native Promise with Bluebird:

fetch.Promise = require('bluebird');

node domains become available within the promise.

I can produce a proper test + a fix, but the question is: will you accept this fix, given node domains are deprecated?

Request for Collaborators

There are quite a few issues and minor features I intend to work on or review (e.g. #8 and existing PRs), but I haven't had the time; I guess being CEO of a startup has finally taken its toll on me.

Yes, we still heavily use node-fetch in production, and I still intend to drive its development. But I would much appreciate your help on this effort.

Given the number of projects depending on node-fetch, I believe it's the right thing to do.

My hope for this change:

  • We continue to stick with our feature guideline, Fetch as described in spec: we are not going to replace request anytime soon. https://github.com/bitinn/node-fetch#features
  • We continue to heavily test our implementation when adding features and bugfixes; my contribution during 1.x development was mostly answering issues, adding more edge-case tests, and documenting new code.

If you are interested in becoming a collaborator, leave a message to let me know :)

PS: I think the next release warrants a minor version bump?

Encountered error "Error: no window object present" when prerendering

Hi,

I use node-fetch in a React + Rails project, using https://github.com/shakacode/react_on_rails as the glue between React and Rails. Everything works fine, except that when I do server-side rendering it fails with "Error: no window object present" when prerendering. I know this error is caused by the module, because when I don't load it the error disappears.

I am willing to work on a PR, I just need some directions. Thanks!

Can I use it with ES7 async/await?

example:

let fetch = require('node-fetch');

(async () => {
  try {
    // request
    let response = await fetch('http://localhost:8088/api/v1/hooy');
    // parsing
    let data = await response.json();
    console.log('data: ', data);
  } catch (error) {
    console.log('error: ', error);
  }
}());

sh session:

❯ es6 test1.js
error:  [TypeError: Cannot read property 'json' of undefined]

where es6 alias is:

function es6() { babel --stage=0 --experimental "$@" | iojs; }

POST requests not working

I don't seem to be able to make POST requests. I've sent the example from the docs to PostCatcher and it shows that no data is being sent.

Post catcher URL: http://postcatcher.in/catchers/554104bdc5959603000001bc

My code:

var fetch = require('node-fetch');

fetch('http://postcatcher.in/catchers/554104bdc5959603000001bc', { method: 'POST', body: 'a=1' })
  .then(function(res) {
      return res.json();
  }).then(function(json) {
      console.log(json);
  });

Add status process method

It would be nice to have that handy checkStatus function included, for handling HTTP error statuses...

function checkStatus(response) {
  if (response.status >= 200 && response.status < 300) {
    return response
  } else {
    var error = new Error(response.statusText)
    error.response = response
    throw error
  }
}
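For reference, here is how it slots into a chain (the helper is repeated so the snippet stands alone; the URL is illustrative):

```javascript
// Same helper as above, repeated so this snippet is self-contained. It works
// on anything with status/statusText, which also makes it easy to unit-test.
function checkStatus(response) {
  if (response.status >= 200 && response.status < 300) {
    return response;
  }
  var error = new Error(response.statusText);
  error.response = response;
  throw error;
}

// Typical chain (URL illustrative):
//   fetch('https://example.com/api')
//     .then(checkStatus)
//     .then(function (res) { return res.json(); })
//     .catch(function (err) { console.error(err.response || err); });

console.log(checkStatus({ status: 200, statusText: 'OK' }).status); // 200
```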
