
azure-functions-nodejs-library's Introduction

Azure Functions Node.js Programming Model

Branch           Support level   Node.js versions
v4.x (default)   GA              20, 18
v3.x             GA              20, 18, 16, 14

Version 4 is Generally Available! 🎉✨ Read our blog post and let us know what you think by reacting or commenting on our GA discussion thread.

Install

npm install @azure/functions

Documentation

Considerations

  • The Node.js "programming model" shouldn't be confused with the Azure Functions "runtime".
    • Programming model: Defines how you author your code and is specific to JavaScript and TypeScript.
    • Runtime: Defines underlying behavior of Azure Functions and is shared across all languages.
  • The programming model version is strictly tied to the version of the @azure/functions npm package, and is versioned independently of the runtime. Both the runtime and the programming model use "4" as their latest major version, but that is purely a coincidence.
  • You can't mix the v3 and v4 programming models in the same function app. As soon as you register one v4 function in your app, any v3 functions registered in function.json files are ignored.

Usage

TypeScript

import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

export async function httpTrigger1(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    context.log(`Http function processed request for url "${request.url}"`);

    const name = request.query.get('name') || await request.text() || 'world';

    return { body: `Hello, ${name}!` };
}

app.http('httpTrigger1', {
    methods: ['GET', 'POST'],
    authLevel: 'anonymous',
    handler: httpTrigger1
});

JavaScript

const { app } = require('@azure/functions');

app.http('httpTrigger1', {
    methods: ['GET', 'POST'],
    authLevel: 'anonymous',
    handler: async (request, context) => {
        context.log(`Http function processed request for url "${request.url}"`);

        const name = request.query.get('name') || await request.text() || 'world';

        return { body: `Hello, ${name}!` };
    }
});

azure-functions-nodejs-library's Issues

Multiple calls to `context.res.end()` affect each other

From CRI

In the HttpResponseFull object context.res that is provided to HTTP functions, the context.res.end() method does two things: first, it sets the body property to whatever argument is provided, then it calls context.done().

If context.res.end() is called more than once, the response.body property is still modified even though subsequent calls to context.done() should be ignored, because the RPC converters copy the input by reference rather than by value. As a result, subsequent calls to context.res.end() modify the .body property of the HTTP output in the invocation response after the RPC conversions have run; semantically this should not happen, and since it bypasses the RPC conversions it causes a crash.

Even though multiple calls to context.done() are not supported, this should still be fixed because:

  1. Semantically, subsequent calls should not have any effect
  2. It is dangerous to allow our response to the host to be modified after passing our checks
  3. Such an error shouldn't completely break customers

Therefore, RPC converters should make sure that they copy all provided arguments by value.
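A minimal sketch of the proposed fix. `toRpcHttp` is an illustrative name, not the library's actual converter:

```typescript
// Hypothetical sketch of copying by value before RPC conversion, so later
// mutations by user code cannot reach the serialized message.
function toRpcHttp(userResponse: { body?: unknown }): { body?: unknown } {
    // A deep copy detaches the RPC message from the object the user can
    // still mutate. (structuredClone would also work on Node.js 17+.)
    return JSON.parse(JSON.stringify(userResponse)) as { body?: unknown };
}

const res: { body?: unknown } = { body: 'first' };
const rpcMessage = toRpcHttp(res);
res.body = 'second'; // e.g. a second context.res.end('second') call
// rpcMessage.body is still 'first'
```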

Repro steps

  1. Create an HTTP triggered function that calls context.res.end() twice, once without an argument and once with a string argument:
const httpTrigger: AzureFunction = function (context: Context, req: HttpRequest): void {
    context.log('HTTP trigger function processed a request.');
    context.res.end();
    context.res.end('test test')
};
  2. Trigger the function
  3. You see an error like this:
System.Private.CoreLib: Exception while executing function: Functions.index. System.Private.CoreLib: node exited with code 1
     at Object.callErrorFromStatus (C:\Users\hossamnasr\ms\azure\nodejs-worker\azure-functions-nodejs-worker\node_modules\@grpc\grpc-js\build\src\call.js:31:26),  details: 'Request message serialization failure: .AzureFunctionsRpcMessages.RpcHttp.body: object expected',,LanguageWorkerConsoleLog[error] Worker c4d2aa66-b3c5-4e81-90c1-dfbe44b82f92 uncaught exception: Error: Error: 13 INTERNAL: Request message serialization failure: .AzureFunctionsRpcMessages.RpcHttp.body: object expected     at ClientDuplexStreamImpl.<anonymous> (C:\Users\hossamnasr\ms\azure\nodejs-worker\azure-functions-nodejs-worker\dist\src\setupEventStream.js:38:15)     at ClientDuplexStreamImpl.emit (node:events:390:28)     at Object.onReceiveStatus (C:\Users\hossamnasr\ms\azure\nodejs-worker\azure-functions-nodejs-worker\node_modules\@grpc\grpc-js\build\src\client.js:390:28)     at Object.onReceiveStatus (C:\Users\hossamnasr\ms\azure\nodejs-worker\azure-functions-nodejs-worker\node_modules\@grpc\grpc-js\build\src\client-interceptors.js:299:181)     at C:\Users\hossamnasr\ms\azure\nodejs-worker\azure-functions-nodejs-worker\node_modules\@grpc\grpc-js\build\src\call-stream.js:145:78     at processTicksAndRejections (node:internal/process/task_queues:78:11).

Expected behavior

  • Function triggers without an error
  • The second call to context.res.end() is ignored

Actual behavior

  • Function throws the error above
  • The second call to context.res.end() modifies the response.body field after the RPC conversions are made. Because the conversions are not initiated again, the actual invocation response ends up containing an RPC HTTP output object with incompatible types, which throws the error above.

Known workarounds

  • Don't use multiple context.done() calls, which is unsupported anyway
  • Pin to Host version 3.5.2.0

Support streams (azure resources)

The blob trigger binding for Node.js functions always binds data as a Buffer, even if 'dataType' is set to 'stream' or 'string'.

Repro steps

  1. Create a node-based function using the blob trigger
  2. Set "dataType" to "stream" in function.json:
{
  "bindings": [
    {
      "name": "fileBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "container/{fileName}.csv",
      "connection": "AzureWebJobsStorage",
      "dataType": "stream"
    }
  ],
  "scriptFile": "../dist/myFunction/index.js"
}

Expected behavior

fileBlob input parameter should be some kind of Stream type

Actual behavior

fileBlob input parameter is a Buffer

Known workarounds

None
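Although no workaround is listed, one possible user-land mitigation (a sketch, not the requested fix): since the binding always delivers a Buffer, wrap it in a stream yourself when downstream code expects one.

```typescript
import { Readable } from 'node:stream';

// Workaround sketch: wrap the Buffer the binding delivers in a Readable,
// for downstream consumers (e.g. a CSV parser) that expect a stream.
function bufferToStream(fileBlob: Buffer): Readable {
    return Readable.from(fileBlob);
}

const stream = bufferToStream(Buffer.from('a,b\n1,2'));
```

Readable.from pushes a string or Buffer as a single chunk rather than iterating it byte by byte, so this is cheap even for large blobs.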

Related information

Functions runtime ~3
Node ~14

Model v4 - Hot reloading not working

I am testing out the new model v4 functions using Node.js (not TypeScript). I am using VS Code, and in previous versions npm start would run func start, which in turn enabled hot reloading. I could edit files, save, and the changes would apply.

Currently I have to stop the running process after every change, then restart it to get the new code to apply...

After every save in VS Code I see Worker process started and initialized, which previously indicated that the new code was in play.

package.json

  "name": "fa-modelv4",
  "version": "1.0.0",
  "description": "",
  "scripts": {
    "start": "func start",
    "test": "echo \"No tests yet...\""
  },
  "dependencies": {
    "@azure/functions": "^4.0.0-alpha.9"
  },
  "devDependencies": {},
  "main": "src/functions/*.js"
}

npm start

> fa-modelv4@1.0.0 start
> func start


Azure Functions Core Tools
Core Tools Version:       4.0.5095 Commit hash: N/A  (64-bit)
Function Runtime Version: 4.16.5.20396

[2023-03-29T10:26:46.998Z] Worker process started and initialized.

Functions:

        httpTrigger1: [GET,POST] http://localhost:7071/api/httpTrigger1

Any thoughts?

Function Name parameter seems confusing when providing a `HttpFunctionOptions`

Since the function name parameter of an HTTP trigger registration (in the v4 programming model) is used to define the route, it is confusing why you would provide both a name and a route when you pass an HttpFunctionOptions argument rather than a callback.

I'd prefer to be able to do the following:

// name becomes route
app.get('hello-world-1', (_, req) => ({ body: "Hello World" }));

// explicit route, with name derived from route
app.get({
  route: '/hello-world-2',
  handler: (_, req) => ({ body: "Hello World" })
});

// name and route defined
app.get({
  route: '/hello-world-3',
  handler: (_, req) => ({ body: "Hello World" }),
  name: "Hello World 3"
});

ApplicationInsights:Failed to propagate context in Azure Functions [

Investigative information

Please provide the following:

  • Timestamp: Between 16 Jan, 12:14 PM CST (2023-01-16T20:14:00Z) and 17 Jan 12:57 CST (2023-01-17T20:57:00Z)
  • Invocation ID: 32fd51cf-df2e-4920-8eca-25b7abf15ad4
  • Region: West US 2

Repro steps

Provide the steps required to reproduce the problem:

  1. Create and deploy a function app
  2. In an HTTP trigger, attempt to log a JSON object as the message:
const healthTrigger: AzureFunction = async function(context: Context) {
  context.log.info('{"method":"healthTrigger","message":"triggered"}');
  context.res = { body: 'OK', status: 200 };
};

Expected behavior

Message is logged to AI and no errors are generated.

Actual behavior

The following error is written to the console:

ApplicationInsights:Failed to propagate context in Azure Functions [
  at AzureFunctionsHook.<anonymous> (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:95:38)
  at AzureFunctionsHook._propagateContext (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:87:16)
  at AzureFunctionsHook._propagateContext (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:87:16)
  at step (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:33:23)
  at Object.next (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:14:53)
  at new Promise (<anonymous>)
  at __awaiter (D:\home\site\wwwroot\node_modules\[redacted]\node_modules\applicationinsights\out\AutoCollection\AzureFunctionsHook.js:4:12)
  at t.InvocationModel.<anonymous> (D:\Program Files (x86)\SiteExtensions\Functions\4.14.0\workers\node\dist\src\worker-bundle.js:2:51947)
  at new Promise (<anonymous>)
  at y.<anonymous> (D:\Program Files (x86)\SiteExtensions\Functions\4.14.0\workers\node\dist\src\worker-bundle.js:2:34778)
]

Related information

Provide any related information

  • Programming language used: TypeScript

v4 programming model - req.json is returning Body is unusable for post with JSON

Local development with new Node.js programming model.

I'm sending in JSON in a curl command:

curl --location 'http://localhost:7071/api/worldFromText' --data '{"id":"abc"}' --header 'Content-Type: application/json' --verbose

I've also tried it with a data file from cURL as well as Postman. I must be missing something. I have two functions: one uses req.text() and parses the JSON, which works; the other reads req.json() and throws an error. Where am I going wrong with this? The undici fetch error makes it look like I can't do a POST with JSON in the new runtime, because the suggested fix is to downgrade to Node 16.

import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

const worlds: string[] = [];

function generateRandomNumber() {
    return Math.floor((Math.random() * 100) + 1);
}

function processError(err: unknown): any {
    if (typeof err === 'string') {
        return { body: err?.toUpperCase(), status: 500 };
    } else if (
        err['stack'] &&
        process.env?.NODE_ENV?.toLowerCase() !== 'production'
    ) {
        return { jsonBody: { stack: err['stack'], message: err['message'] } };
    } else if (err instanceof Error) {
        return { body: err?.message, status: 500 };
    } else {
        return { body: JSON.stringify(err) };
    }
}
// curl --location 'http://localhost:7071/api/worldFromText' --data '{"id":"abc"}' --header 'Content-Type: application/json' --verbose
app.post("addWorldText", {
    route: "worldFromText",
    handler: async (req, context) => {
        try {

            const bodyAsText = await req.text();
            const id:string = JSON.parse(bodyAsText)?.id;

            worlds.push(id);
            return {
                jsonBody: { id }
            }
        } catch (err: unknown) {
            return processError(err);
        }

    }
})
// curl --location 'http://localhost:7071/api/worldFromJson' --data '{"id":"abc"}' --header 'Content-Type: application/json' --verbose
// curl --location 'http://localhost:7071/api/worldFromJson' --data @mydata.json --header 'Content-Type: application/json' --verbose
type WorldJsonBody = {
    id: string;
}
app.post("addWorldJson", {
    route: "worldFromJson",
    handler: async (req, context) => {
        try {

            // debug only to see what came in
            const allText = await req.text();
            context.log(allText)

            const body: WorldJsonBody = await req.json() as WorldJsonBody;

            worlds.push(body?.id);
            return {
                jsonBody: { id: body?.id }
            }
        } catch (err: unknown) {
            return processError(err);
        }

    }
})
// curl --location 'http://localhost:7071/api/worlds2' 
app.get("getWorlds2", {
    route: "worlds2",
    handler: (req, context) => {
        try {
            return {
                jsonBody: {
                    worlds
                }
            }
        } catch (err: unknown) {
            return processError(err);
        }
    }
})

Complete error:

node ➜ /workspaces/20230410 $ curl --location 'http://localhost:7071/api/worldFromJson' --data @mydata.json --header 'Content-Type: application/json' --verbose
*   Trying 127.0.0.1:7071...
* Connected to localhost (127.0.0.1) port 7071 (#0)
> POST /api/worldFromJson HTTP/1.1
> Host: localhost:7071
> User-Agent: curl/7.74.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 12
> 
* upload completely sent off: 12 out of 12 bytes
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Type: application/json
< Date: Mon, 10 Apr 2023 21:21:06 GMT
< Server: Kestrel
< Transfer-Encoding: chunked
< 
* Connection #0 to host localhost left intact
{"stack":"TypeError: Body is unusable\n    at specConsumeBody (/workspaces/20230410/node_modules/undici/lib/fetch/body.js:497:11)\n    at Request.json (/workspaces/20230410/node_modules/undici/lib/fetch/body.js:360:14)\n    at HttpRequest.<anonymous> (/workspaces/20230410/node_modules/@azure/functions/dist/azure-functions.js:1236:73)\n    at Generator.next (<anonymous>)\n    at /workspaces/20230410/node_modules/@azure/functions/dist/azure-functions.js:1155:71\n    at new Promise (<anonymous>)\n    at __webpack_modules__../src/http/HttpRequest.ts.__awaiter (/workspaces/20230410/node_modules/@azure/functions/dist/azure-functions.js:1151:12)\n    at HttpRequest.json (/workspaces/20230410/node_modules/@azure/functions/dist/azure-functions.js:1235:16)\n    at /workspaces/20230410/dist/src/functions/helloworld2.js:58:36\n    at Generator.next (<anonymous>)","message":"Body is unusable"}
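The v4 HttpRequest follows the Fetch spec, where the body is a one-shot stream: the debug call to req.text() in the handler consumes it, so the subsequent req.json() throws "Body is unusable". A minimal sketch of reading the body exactly once, using a hypothetical request-like object for illustration (makeOneShotRequest is not a library API):

```typescript
// Read the body once, then parse the captured string. This avoids the
// Fetch-spec restriction that text()/json() can only consume the body
// a single time.
async function readBodyOnce(req: { text(): Promise<string> }): Promise<unknown> {
    const raw = await req.text(); // first and only read of the body
    return JSON.parse(raw);       // parse the string we already hold
}

// Illustrative stand-in for HttpRequest: throws on a second read,
// mimicking the "Body is unusable" behavior.
function makeOneShotRequest(payload: string) {
    let used = false;
    return {
        async text(): Promise<string> {
            if (used) throw new TypeError('Body is unusable');
            used = true;
            return payload;
        },
    };
}
```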

Pass data from a Hook to the Function being invoked

Split out from Azure/azure-functions-nodejs-worker#664.

One use-case for hooks is to inject additional dependencies into a Function, say a CosmosClient, so that the Function can do operations against Cosmos DB that aren't possible via a binding.

Initially, it might seem like that's something you could use the hookData property for, but that's only persisted across the hooks, and not provided to the Function.

It is possible to use the InvocationContext as a way to pass things along, but it's typed as unknown (something we could address with #7), so you have to cast it before you can work with it. Here's how I did it with the new programming model:

registerHook("preInvocation", async (context) => {
  const client = new CosmosClient(process.env.CosmosConnectionString);
  const { database } = await client.databases.createIfNotExists({
    id: "TodoList",
  });
  const { container } = await database.containers.createIfNotExists({
    id: "Items",
    partitionKey: { paths: ["/id"] },
  });

  const ctx = context.invocationContext as any;
  ctx.extraInputs.set("cosmosContainer", container);
});

But if instead we had an explicit method, like setFunctionData on the hook context object, then we could write hooks that work the same way across the different programming models.

Create a true "express.js" programming model package

The idea is we would create a separate npm package (@azure/functions-express?) that lets you bring your existing Express.js app and it "just works" on Azure Functions. I assume this model would focus just on HTTP, and not other Azure triggers.

How do I reference parameters of output bindings in v4

I would like to make an app with scheduled data source azure functions that queue up the data for later processing. I would like each function to scrape data, and then upload the results to a blob with the name products/{name}/{date}.json and post a queue message with the following structure:

{
  "blobPath": "products/{name}/{date}.json",
  "retries": 0
}

My current output bindings look like so:

const queueOutput = output.storageQueue({
  connection: "AzureWebJobsStorage",
  queueName: "pricex",
});

const blobOutput = output.storageBlob({
  connection: "AzureWebJobsStorage",
  path: "products/{name}/{date}.json",
});

How would I access the runtime path of the blob binding? The documentation is scarce for v4 programming model bindings and the typings are unknown in many places.

Known workarounds

Manage the blob connection with @azure/storage-blob instead of bindings.
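A workaround sketch in that direction (an assumption, not a documented v4 feature): compute the concrete path yourself at runtime and use the same value for both outputs, instead of relying on the `{name}/{date}` binding expression. `buildBlobPath` is an illustrative helper, not a library API.

```typescript
// Compute the blob path in user code so the queue message can carry the
// same concrete value the blob was written under.
function buildBlobPath(name: string, date: Date): string {
    const day = date.toISOString().slice(0, 10); // e.g. "2023-05-01"
    return `products/${name}/${day}.json`;
}

const blobPath = buildBlobPath('widget', new Date('2023-05-01T12:00:00Z'));
const queueMessage = { blobPath, retries: 0 };
```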

Create a decorator programming model package

Creating a rollup issue to start collecting all the feedback we've been getting on decorators (there's a lot, and it's often conflicting). The idea is we can create a separate npm package (@azure/functions-decorators?) that lets you register functions using decorators.

As a bit of background, decorators have a stage 3 TC39 proposal here. TypeScript has older support for decorators, which is still labeled experimental but has already received a decent amount of adoption. As the proposal puts it:

Unfortunately, we're in the classic trap of, "The old thing is deprecated, and the new thing is not ready yet!" For now, best to keep using the old thing.

We pretty quickly decided we would not be able to use decorators for the first iteration of the new programming model package because of the above complicated state. However, we could still explore a preview package with decorator support.

Here is a summary of some of the conflicting feedback we've received:

For

  • Feedback from @sinedied here

    I would suggest thinking of how the new API model could leverage TS to make it a first class citizen. For example, I would push forward TS decorators as a way to define bindings and other functions settings. I know decorators' TC39 proposal is currently only stage 2, and TS decorators are already diverging a bit from that proposal. But as syntactic sugar, they're unmatched and AFAIK all the most popular TS-first frameworks and tools use decorators: Angular, NestJS, TypeORM, TypeGraphQL, Loopback, Ts.ED...

    the fact is that decorators are hugely popular in all TS-first Node.js libs. This is what I think is the most important point here.

Against

  • Feedback from @ChuckJonas here

    IMO, decorators don't bring much other than making the code read closer to what's done in C#. It feels like the TS community has shifted away from OOP in favor of more functional code (decorators are only supported on classes). The current approach seems more flexible and composable.

  • Even some .NET people don't want decorators - see here

    I came to this need because I find myself defining many functions very similar to each other, having the same binding types but requiring different values in the bindings, e.g.
    ...
    So, instead of copying the functions over and over again, I'd love to automate this process.

Ability to provide the type information for params/query/body

These three parts of the HTTP request object are something that would be really useful to have type information provided to them, so that when they are used, we can get good type safety on their access.

I'd like to be able to write something like this:

type QS = {
  limit?: number
  offset: number
}
app.get<QS>('typed-qs', (_, { query }) => {
  const limit = query.get('limit');
  const offset = query.get('offset');

  const invalid = query.get('invalid');  // this results in a TypeScript compiler error as `invalid` isn't a member of `QS`
});

type P = { id: string };
app.get<P>('person/{slug}', (_, { params }) => {
  const { slug } = params;

  const invalid = params.invalid; // this results in a TypeScript compiler error as `invalid` isn't a member of `P`
});

type B = { id: string };
app.post<B>('person', async (_, req) => {
  const body = await req.json(); // json method return Promise<B>
});
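Until such typing exists in the library, a user-land sketch can approximate the query case using only the standard URLSearchParams (typedQuery is an illustrative helper, not a library API):

```typescript
// User-land approximation of typed query access: keys are checked at
// compile time against T, while values remain strings at runtime.
type QS = { limit?: string; offset?: string };

function typedQuery<T>(query: URLSearchParams) {
    return {
        get<K extends keyof T & string>(key: K): string | null {
            return query.get(key);
        },
    };
}

const q = typedQuery<QS>(new URLSearchParams('limit=10&offset=0'));
const limit = q.get('limit'); // "10"
// q.get('invalid') would be a compile-time error, as requested above
```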

Add hooks api to framework package

At the moment hooks are only exposed in the core api. However, we want the new programming model packages to wrap the core api and expose a "nicer" version of the hooks api (better intellisense, maybe better naming, full docs, etc.)

Uncatchable exception parsing formData

This might just be a really basic thing, but I feel like I've hit an issue right at the start of trying out the new v4 API. I'm just trying to parse some data out of a POST call and return useful information to the caller if the right parameters are not present. I've followed the documentation, but there is no exemplar version of the code, and the behaviour seems wrong to me. I'm getting 500s for calls that are not quite right, but I don't want these logged as failures, as they will then show up in monitoring as application failures. I want the app to handle these cases.

Repro steps

I have a basic function that works until I try to parse POST data. It looks like this:

export async function TestFunction(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    let body;
    try {
        body = await request.formData();
    } catch (e: any) {
        return {
            status: 400,
            body: 'Cannot parse form body'
        }
    }
    // Success path: return a response, otherwise the handler resolves undefined.
    return { body: 'Form parsed successfully' };
}

Expected behavior

I expect the API to return a 400 if there is no POST data and/or the data is malformed?

Actual behavior

The API returns an HTTP 500 response.

Error output in logs:

Exception: Request.formData: Could not parse content as FormData.
Stack: TypeError: Request.formData: Could not parse content as FormData.
    at webidl.errors.exception (C:\Users\...\node_modules\undici\lib\fetch\webidl.js:13:10)
    at Request.formData (C:\Users\...\node_modules\undici\lib\fetch\body.js:468:29)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5).

The exception doesn't appear to be in a place where it can be caught.

Known workarounds

None.
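One possible mitigation (a sketch under the assumption that HttpRequest exposes the standard Fetch-style headers API): guard on the Content-Type before calling formData(), so malformed or absent form bodies can be rejected with a 400 before undici throws.

```typescript
// Check whether a request's Content-Type is one formData() can parse,
// so the handler can return a 400 instead of hitting the undici throw.
function isFormContentType(contentType: string | null): boolean {
    if (!contentType) return false;
    return contentType.includes('multipart/form-data')
        || contentType.includes('application/x-www-form-urlencoded');
}
```

In the handler above, request.headers.get('content-type') could feed this check before `await request.formData()`.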

Add end-to-end tests for v4

Here's the tests we have for the v3 model: https://github.com/Azure/azure-functions-nodejs-worker/tree/v3.x/test/end-to-end

We need at least the same thing for the v4 model, although it wouldn't hurt to upgrade them to something better. A few things I feel we could improve:

  1. The tests take forever (over an hour)
  2. The tests fail intermittently if you have multiple builds at the same time
  3. The tests are written in C#. This makes it easier to share tests across workers, but discourages us Node.js folks from improving our own e2e tests

Cancel Function execution from a Hook

As touched on in Azure/azure-functions-nodejs-worker#664, a scenario for a pre/post invocation hook, when used like middleware, would be to cancel/abort the execution of the Function, and even provide an alternative output binding.

Here's a scenario:

You have a REST API using Azure Functions to handle the user interactions. To ensure that you don't rely on the client to provide valid data you want to perform server validation based off a validation schema. As this is a REST API you can infer the object you're validating from the URL of the request, so rather than having each function perform its own validation you want a piece of middleware to do it.
If the validation passes, then you want to call to "original" Function so that data is persisted/etc. but if it fails validation you want to respond with a 400 Bad Request and provide the error messages.

Currently there is no way to say "stop execution" or "execute this Function instead" from within a hook without overriding the functionCallback property of the HookContext, which is probably not recommended.

From a billing/usage standpoint, we should still treat the Function as "executed", even if there is an early abort provided, as you've still executed something in context of that Function invocation.

Provide extensibility scenario for `app` namespace

Related to #12. Currently, the app object provided from the root of the @azure/functions library (in v4), is a namespace:

import { app } from '@azure/functions';

app.get(/* function*/);

This provides limited extensibility when it comes to users/other packages adding their own methods to the root app namespace. For example, the Durable SDK had no easy way to add an app.durableOrchestration() method to register durable functions, and instead has to export its own functions under its own namespace.

This issue is to discuss how/if we should provide a way for another package to extend the app namespace with their own methods. One such suggestion would be to make app an instance of a Class instead of an object, which can then be extended by other packages. Example:

import { FuncApp } from '@azure/functions'
import * as df from 'durable-functions';

let app = new FuncApp();
app = df.addDurableMagic(app);

app.get(/* HTTP function */); // same methods exist as before

app.durableOrchestration(/* Durable function */);  // external package can add its own methods too

Instead of using the new FuncApp() syntax, which may not be very idiomatic to node, we could also follow Express's model and export a function that does this behind the scenes. Example:

const func = require('@azure/functions');
const df = require('durable-functions');

let app = func();
app = df.addDurableMagic(app);

// same as before
app.get(/* */);
app.durableOrchestration(/* */);

There could be other ways of providing this extensibility too, but this is the easiest one I could think of 🙂

TypeScript errors thrown from `undici` when building project

Trying to build a TS project which references the latest v4 version (4.0.0-alpha.3) will result in TS errors being thrown:

node_modules/undici/types/client.d.ts:5:8 - error TS1259: Module '"C:/Users/hossamnasr/ms/azure/durable/js-sdk/node_modules/undici/types/connector"' can only be default-imported using the 'esModuleInterop' flag

5 import buildConnector, {connector} from "./connector";
         ~~~~~~~~~~~~~~

  node_modules/undici/types/connector.d.ts:4:1
    4 export = buildConnector
      ~~~~~~~~~~~~~~~~~~~~~~~
    This module is declared with using 'export =', and can only be used with a default import when using the 'esModuleInterop' flag.

node_modules/undici/types/dispatcher.d.ts:6:13 - error TS1259: Module '"C:/Users/hossamnasr/ms/azure/durable/js-sdk/node_modules/undici/types/readable"' can only be default-imported using the 'esModuleInterop' flag

6 import type BodyReadable from './readable'
              ~~~~~~~~~~~~

  node_modules/undici/types/readable.d.ts:4:1
    4 export = BodyReadable
      ~~~~~~~~~~~~~~~~~~~~~
    This module is declared with using 'export =', and can only be used with a default import when using the 'esModuleInterop' flag.

node_modules/undici/types/handlers.d.ts:1:8 - error TS1259: Module '"C:/Users/hossamnasr/ms/azure/durable/js-sdk/node_modules/undici/types/dispatcher"' can only be default-imported using the 'esModuleInterop' flag

1 import Dispatcher from "./dispatcher";
         ~~~~~~~~~~

  node_modules/undici/types/dispatcher.d.ts:12:1
    12 export = Dispatcher;
       ~~~~~~~~~~~~~~~~~~~~
    This module is declared with using 'export =', and can only be used with a default import when using the 'esModuleInterop' flag.


Found 3 errors.

Repro steps

Provide the steps required to reproduce the problem:

  1. Create a new TypeScript app with the following tsconfig.json:
{
    "compilerOptions": {
        "module": "commonjs",
        "target": "es6",
        "outDir": "dist",
        "rootDir": ".",
        "sourceMap": true,
        "strict": false,
        "baseUrl": "./"
    }
}
  2. Install @azure/functions version 4.0.0-alpha.3
  3. Run npm run build
  4. You get the above TypeScript errors

Expected behavior

Builds without errors

Actual behavior

See the above TypeScript errors

Known workarounds

Pin undici to 5.9.1
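Besides pinning undici, the compiler message itself points at another possible workaround: enabling esModuleInterop in the tsconfig.json shown above (note this changes how all CommonJS modules are imported, so the rest of the project should be re-verified):

```json
{
    "compilerOptions": {
        "module": "commonjs",
        "target": "es6",
        "outDir": "dist",
        "rootDir": ".",
        "sourceMap": true,
        "strict": false,
        "baseUrl": "./",
        "esModuleInterop": true
    }
}
```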

Filtering hooks

Building on #7 and Azure/azure-functions-nodejs-worker#664.

When a preInvocation or postInvocation hook is defined, it is executed for every single Function invocation, but there may be scenarios where you don't want it to run every time. Here are some examples:

  • Performing validation of an inbound HTTP payload for a POST request on a HTTP Trigger
  • Creating a CosmosClient to provide for Functions that can't use the input/output bindings (for removing or replacing items in a Cosmos collection)
  • Only executing for a certain trigger type in an app that has multiple different trigger types in use

Here's a pseudocode API:

// only apply to a specific trigger type
app.preInvocation({
  hook: () => {},
  filter: 'timer'
});

// filter based on a function name
app.preInvocation({
  hook: () => {},
  filter: (context) => {
    return context.functionName.contains('Delete') || context.functionName.contains('Put');
  }
});

// filter based on some custom logic
app.preInvocation({
  hook: () => {},
  filter: (context) => {
    return context.trigger.type === 'httpTrigger' && context.trigger.payload.method === "POST";
  }
});

Worker doesn't respect `languageWorkers:node:arguments` in v4 using placeholders in Azure

As of this morning (March 9), all my functions stopped working. The runtime no longer invokes Node with the arguments specified in languageWorkers:node:arguments. In my case, this caused the app to fail to start, since I use Yarn PnP and rely on the startup args to include the PnP loader.

Investigative information

  • Timestamp: 2023-03-09T07:00:00Z
  • Function App name: tzur-portal, tzur-portal-test
  • Function name(s) (as appropriate): Api
  • Invocation ID: N/A, app doesn't start
  • Region: West Europe

Repro steps

  1. Using Functions runtime ~4, add a languageWorkers:node:arguments configuration. In my case: "-r ./.pnp.cjs --max-old-space-size=1272 --expose-gc"
  2. Attempt to start the Functions app

Expected behavior

App launches correctly. The included file gets executed. The command line arguments show up in the node.exe process in the Process Explorer.

Actual behavior

The args are not passed to node.exe. They do not show up in Process Explorer for the node process, and the included file is not run. In my app, this leads to no third-party packages being loaded. The logs report 'tslib' as the missing package because it is the first one to load.

Known workarounds

I downgraded to Functions runtime ~3 and Node version ~14, after which it worked as before.

Related information

  • Programming language used: TypeScript
  • Platform: Windows
  • Bindings used: HTTP

Should we auto parse json trigger inputs or leave as string

By default, we try to parse trigger inputs (for example, a queue/event hub/service bus message) as if they were JSON, and leave them as a string if that fails. However, that can occasionally lead to weird behavior. For example, this "copy blob" example takes a blob input and copies it to a blob output. If the blob being copied is a JSON file, it loses all formatting after it's copied, because we parsed and serialized it instead of just leaving it as a string throughout the transaction.

Fwiw, it appears C# functions leave these as strings, so we're also inconsistent across languages.

We could always make this a configurable option, but now would be a good time to decide whether we want to change the default as part of the v4 breaking-change release.

Provide a `Response` class for HTTP trigger functions

With a HTTP Trigger callback, we return an object with all the information for the response, from the body to status code, headers, etc.

This can feel a bit cumbersome, especially if you're coming from other API frameworks, such as Express or Fastify, where you have a response parameter and the function only returns the body.

Doing so would mean you could simplify code such as this:

app.get('route', (_, req) => {
  const headers: { [key: string]: string } = {};

  if (req.headers.has('accepts')) {
    const accepts = req.headers.get('accepts');
    if (accepts.includes('application/json')) {
      headers['content-type'] = 'application/json';
    }
  }

  return { headers, body: { foo: 'bar' } };
});

To:

app.get('route', (_, req, res) => {
  if (req.headers.has('accepts')) {
    const accepts = req.headers.get('accepts');
    if (accepts.includes('application/json')) {
      res.headers.set('content-type', 'application/json');
    }
  }

  return { foo: 'bar' };
});

Admittedly, this is the kind of thing that could be provided by a "higher level" wrapper around the low-level Functions API.

Name uniqueness isn't enforced with the new programming model

Since the developer provides the name of a Function in the new programming model, rather than it coming from the file system, it's possible to register multiple Functions with the same name:

import { app } from "@azure/functions";

app.get("todo", () => {});
app.post("todo", () => {});

In this example you expect todo to be the route (which it would be set as), with different verbs registered on the same route. But since the first argument is both the name and the route, only the last registration takes effect.

I see there being two options:

  1. We take the user-provided name and add some additional context information, e.g.: http-todo-get, where it's <trigger>-<name>-<additional info if required>.
  2. We raise an error if a duplicate name is detected, or warn in the logs that only the last function registered with that name takes effect.

Option 1 does a better job at enforcing unique names but with the drawback that it won't be as obvious in the logs until people realise the name they provided is only part of it. Option 2 avoids "magic naming" but with the drawback of a runtime warning or error, which might not be the ideal DX.
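
Option 2 could look something like this internally; a minimal sketch (names are illustrative, not the library's actual registration code):

```typescript
// Sketch of option 2: track registered names and warn on duplicates.
// (Illustrative only; not the library's actual implementation.)
const registeredNames = new Set<string>();
const warnings: string[] = [];

function registerFunction(name: string): void {
    if (registeredNames.has(name)) {
        warnings.push(
            `Duplicate function name "${name}"; only the last registration takes effect.`
        );
    }
    registeredNames.add(name);
}

registerFunction('todo'); // app.get("todo", ...)
registerFunction('todo'); // app.post("todo", ...) — would trigger the warning
```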

v4 doesn't detect my functions

I just switched my existing project to the new programming model v4 using this migration guide. I love the idea that I can structure my directories and files as I want, but it doesn't seem to work.

Repro steps

Provide the steps required to reproduce the problem:

  1. Create functions inside nested folders. My folder structure:
src
+---functions
|   +---criteria
|   |   |   create.ts
|   |   |   getall.ts
|   |   |   getone.ts
|   |   |
|   |   \---types
|   |           createCriteria.dto.ts
|   |
|   +---events
|   |       getall.ts
|   |
|   \---ratings
|       |   create.ts
|       |   getcriteria.ts
|       |   rateone.ts
|       |
|       \---types
|               createRating.dto.ts
|
\---lib ...

All my functions look something like this:

import { HttpRequest, HttpResponseInit, InvocationContext, app } from '@azure/functions'

export const getCriteria = async (req: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
  ...
  return { body: ... }
}

app.http('ratings-getcriteria', {
  methods: ['GET'],
  route: 'ratings/{id}',
  handler: getCriteria
})
  2. Set the main property in package.json to dist/src/functions/**/*.js (yes, there is an src directory inside dist).
    I also tried assigning all the http routes in one index.ts file in /functions and referencing that one file in package.json, but that didn't work either.

Expected behavior

My functions are recognised and mapped to HTTP routes

Actual behavior

I get the following warning:

No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).

Sorry if this isn't the right place to ask (this probably isn't an issue with the package, I just couldn't figure out how to use it) but there is very little information about this version right now.

Add a flag to allow users to suppress the unrecommended `context.done()` patterns error log

This issue was inspired by this Durable SDK issue: Azure/azure-functions-durable-js#373.

Some patterns related to the use of context.done() are unsafe or unrecommended and cause an error to be logged when used in user code, such as calling context.done() within an async function, or calling context.done() twice. However, a bug in the Node worker previously suppressed this error message from being logged. That bug was fixed in Azure/azure-functions-nodejs-worker#555, which means the error message is now being shown. Since the message was suppressed before, some users who weren't getting it are now getting it, leaving them with the option to either live with the error message or make a potentially breaking change. One such example is the Durable SDK; see the discussion here for more details: Azure/azure-functions-durable-js#380 (comment).

This issue proposes adding a flag to the context.done() method that lets users indicate they know what they are doing and suppress the error message.

Context Logging not in Sequential order

Created the basic HTTP trigger in a Node.js 18 LTS Azure Function using VS Code.

module.exports = async function (context) {
    context.log('JavaScript HTTP trigger function processed a request.');

    context.log.info({ a: 'first object', b: { q: 'test 1' } });
    context.log.info({ a: 'second object', b: { q: 'test 2' } });
    context.log.info({ a: 'third object', b: { q: 'test 3' } });
  
    context.log.info(JSON.stringify({ a: 'first string', b: { q: 'test 1' } }));
    context.log.info(JSON.stringify({ a: 'second string', b: { q: 'test 2' } }));
    context.log.info(JSON.stringify({ a: 'third string', b: { q: 'test 3' } }));
  
    context.log.info('first simple string');
    context.log.info('second simple string');
    context.log.info('third simple string');
    const responseMessage = "Hello Krishna, This Http Triggered Function executed successfully."

    context.res = {
        // status: 200, /* Defaults to 200 */
        body: responseMessage
    };
}

The log statements do not appear in the terminal in the order in which I wrote them when I run my function.


Is this the behavior of context.log(), or am I missing something?

extraOutputs throws error if unused in handler (v4)

When using extraOutputs on an httpTrigger, I get an error message if the output is not used inside the handler.

For example, the following code throws:

Error message when testing locally:
[2023-03-31T07:21:24.197Z] Executed 'Functions.ping' (Failed, Id=c1bd6ece-961f-44df-86cb-34ebe0310847, Duration=155ms)
[2023-03-31T07:21:24.200Z] System.Private.CoreLib: Exception while executing function: Functions.ping. Microsoft.Azure.WebJobs.Script.Grpc: Unknown ParameterBindingType.

Error message when deployed
2023-03-31T07:37:35Z [Error] Executed 'Functions.ping' (Failed, Id=d28c8178-0784-4f89-b234-ee23544da68d, Duration=109ms)

Code

const cosmosOutput = output.cosmosDB({
    connectionStringSetting: 'CosmosDbConnectionString',
    databaseName: 'test',
    collectionName: 'test'
});

app.http('ping', {
    methods: ['GET'],
    authLevel: 'function',
    extraOutputs: [cosmosOutput],
    handler: async (request, context): Promise<HttpResponseInit> => {
        return;
    }
});

Working example
If I do the same thing but use the output at least once, everything works fine. So this would work, for example:

const cosmosOutput = output.cosmosDB({
    connectionStringSetting: 'CosmosDbConnectionString',
    databaseName: 'test',
    collectionName: 'test'
});

app.http('ping', {
    methods: ['GET'],
    authLevel: 'function',
    extraOutputs: [cosmosOutput],
    handler: async (request, context): Promise<HttpResponseInit> => {
        context.extraOutputs.set(cosmosOutput, []);       // this line is needed
        return;
    }
});

Since I want the output binding to be used only inside a certain if statement, it would be nice if I didn't have to set a placeholder like in the working example above.
I've only tested this with the httpTrigger, so I'm not sure whether other triggers have this problem as well.
I am also not 100% sure whether this is a bug in the new v4 model or whether I'm doing something wrong.

Add method to list all bindings in v4

In the old model, it was possible to list all the available bindings on the context object by simply accessing context.bindings.

In v4, this is how extra input/output bindings are retrieved:

const inputBinding = input.generic(/* whatever config */)
//...
// inside function definition
const inputValue = context.extraInputs.get(inputBinding);

Unlike before, there is no longer a way for the function to list all the available bindings. There may be other use cases for customers too, but in Durable Functions this allowed the Durable SDK to find the Durable Client input binding by simply passing the invocation context, as below:

const client = context.df.getClient(context);

But now, the user also has to provide the input binding info, so the SDK is able to retrieve it:

const client = context.df.getClient(context, clientInput);

To make an experience comparable to the old model (see example here: Azure/azure-functions-durable-js#414 (comment)), we would need a way for the SDK to retrieve all input bindings registered to the function, for example:

const allInputBindings = context.extraInputs.list();
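
For illustration, if such a list() API existed, SDK code could filter it by binding type; a minimal sketch with an assumed binding shape (not the real API):

```typescript
// Hypothetical sketch of how an SDK could locate a specific binding if a
// list() API existed. The InputBinding shape here is an assumption.
type InputBinding = { type: string; name: string };

function findByType(bindings: InputBinding[], type: string): InputBinding | undefined {
    return bindings.find((b) => b.type === type);
}

// Example: the Durable SDK looking for its client input binding
const bindings: InputBinding[] = [
    { type: 'blob', name: 'myBlob' },
    { type: 'durableClient', name: 'client' },
];
const client = findByType(bindings, 'durableClient');
```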

Should first parameter for http triggers be route or function name

Some feedback here from @ChuckJonas

Is the first parameter the route? Or just the function name? I'm hoping it's the former as this would make it similar to other popular JS frameworks.

Fwiw, we did have the route as the first parameter in one of the prototypes (see here), but we got mixed feedback on that. Sure, it's more similar to Express, but a lot of people view the function name as very important. Our initial decision was to use the function name for the sake of simplicity and for consistency with other triggers. We are considering a true Express.js programming model, and that may be the better place to use the route as the first parameter; tracked here: #16

Use an `app` root object for all function registrations

Copied from ejizba/func-nodejs-prototype#3

Collecting feedback from various places on the app root object. Seems like everyone is overwhelmingly in favor of it, but we can have further discussion here.

From @aaronpowell:

  • I like this approach as it mirrors the kind of thing we find with Express, Fastify, and other web frameworks
  • It also makes for a clear ordering to the Functions that are defined, so if you've got multiple HTTP triggers that work off similar routes, you should have a clear order of precedence
  • It would also make it easier to have discovery; rather than having to know what you can import, you import one thing and IntelliSense will tell you what it can do

From @YunchuWang

With a central object like app for managing functions, it is more intuitive for users to look up methods via IntelliSense compared to option 1 (e.g., typing app. autocompletes a list of methods).

Allow users to specify IncludeEmptyEntriesInMessagePayload capability

For trigger payloads of collection type (e.g. an event hub trigger payload with batched messages), the host skips empty entries by default. We recently made a host-level change so that the host will not skip empty entries if the worker includes the IncludeEmptyEntriesInMessagePayload capability when advertising its capability list to the host. The .NET worker was updated to send this capability if the user opts in (in the next major version, it will be enabled by default). The Node.js worker should consider making a similar change.

Update bundle/package process as fitting for an npm package

We pulled this code out of the Node.js worker, which is essentially a server-side app, but haven't updated the bundle/package process to reflect the fact that this is now an npm package. Here are the main changes I think are necessary:

Very obvious - don't ship dependencies

This repo should not ship its dependencies in its own package. Each npm package should only ship its own code and npm will handle installing any dependencies in the user's "node_modules" folder. Currently the dependencies are being bundled into the "index-bundle.js" file and we should remove that.

Somewhat obvious - ship files that help with debuggability

Right now, if the user tries to "step into" the @azure/functions package, they will be taken to a minified mess of a file with no chance of debugging it at all. Instead, I think npm packages should ship their original *.ts files and source map files so that users can debug through the package much more easily.

The main downside is that this increases the size of the package, but honestly it's a pretty minimal difference and the user should get rid of these files in their own bundling process before deploying their app to prod.

More debatable - should the code be bundled/minified

As stated above, we shouldn't use webpack to bundle our dependencies. However, should we use webpack to bundle our own source files into one file? And should it be minified (or "uglified") to make it as performant as possible? There's no clear answer online, but there are a few places where people discuss this (example one, example two). The main argument for bundling is that sadly many users don't use a bundling process before deploying to prod (even though they should), so we should bundle to improve performance as much as possible. I took a look at some other packages (including Azure SDKs like @azure/storage-blob) and many appear to ship both a bundled file (storage-blob.js) and a bundled/minified file (storage-blob.min.js). The user can hypothetically choose which one to use, but they get the bundled (and not minified) one by default. I think this makes sense because the bundling process probably makes a way bigger difference to performance than minification, and minification drastically reduces readability.

Support changing dataType for a binding in v4

v3 function.json content:

{
    "name": "myQueueItem",
    "type": "queueTrigger",
    "direction": "in",
    "queueName": "helloworldqueue",
    "connection": "storage_APPSETTING",
    "dataType": "binary"
}

v3 index.js content:

module.exports = async function (context, myQueueItem) {
    context.log('JavaScript queue trigger function processed work item', myQueueItem);
};

In v3 the type of myQueueItem changes based on the value of dataType. If you set dataType to binary, myQueueItem will be a Buffer. Otherwise, myQueueItem will either be a string or possibly a parsed object if the string was JSON.

In v4, we don't allow users to change myQueueItem to a Buffer type, which we should. This does not apply to http triggers, but probably applies to several other triggers/bindings other than storage queue.
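
Until a dataType option exists in v4, a userland workaround is to convert the delivered value back to bytes yourself; a sketch (note this is only safe for data that survives the string decode, so not for true binary content):

```typescript
// Userland workaround sketch: convert the delivered queue item back to a
// Buffer. Lossy for true binary payloads, because the worker has already
// decoded the bytes to a string before the function sees them.
function toBuffer(queueItem: unknown): Buffer {
    if (Buffer.isBuffer(queueItem)) return queueItem;
    if (typeof queueItem === 'string') return Buffer.from(queueItem);
    return Buffer.from(JSON.stringify(queueItem)); // auto-parsed JSON payloads
}

const buf = toBuffer('hello world');
// buf.toString() === 'hello world'
```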

Transfer-Encoding Chunked does not work in Javascript functions

An HTTP request sent using Transfer-Encoding: Chunked to an Azure Function written in Javascript causes the body to be undefined.

I discovered this issue when a C# client was communicating with a JavaScript Azure Function.
I refactored the client to change from StringContent to ObjectContent, so that it streams the request directly to the HTTP request stream instead of creating an intermediate string.
When sending the request this way, the Content-Length isn't known, and so chunked transfer encoding is used.

When experimenting, the issue occurs whenever the Content-Length is not known before the request is sent.

Minimum reproduction:
A javascript azure function:

// index.js
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    // req.rawBody produces the same result.
    let body = req.body;

    context.log(body);

    let responseString = body.toString();

    context.res = {
        body: responseString
    };
}

// function.json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}

A console client in C#

// Program.cs
var content = new StringContent(Guid.NewGuid().ToString());
		
// Clear the ContentLength to trigger chunked transfer encoding
content.Headers.ContentLength = null;
		
var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:5000/api/HttpTrigger")
{
    Content = content
};

var httpClient = new HttpClient();
var response = await httpClient.SendAsync(request);
Console.WriteLine(response.StatusCode);

In this example, req.body (or req.rawBody) will be undefined in the JavaScript function.
Removing the content.Headers.ContentLength line will set req.body to the expected value.

This issue does not occur if the function is re-written in C#:

public static class ChunkedEncodingFunction
{
    [FunctionName("HttpTrigger")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        // requestBody has the expected request content.
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation(requestBody);

        return new OkObjectResult(requestBody);
    }
}

Our current workaround is to call HttpContent.LoadIntoBufferAsync().
The azure function we're calling is a mock for a real service, and the real service supports chunked encoding. So we're having to put in a check to see if the client is communicating with the mock. Obviously, we would rather not require this check.

`InvocationContext` should be second parameter to HTTP trigger function

In the v4 programming model, HTTP trigger functions have a type signature of:

export type HttpHandler = (context: InvocationContext, request: HttpRequest) => FunctionResult<HttpResponse>;

But the InvocationContext is not something that you necessarily need, meaning that it's quite common to have it as an ignored argument, resulting in a TypeScript warning, ts(6133).

Proposed solution

Reorder the type signature (and underlying code) so that the HttpRequest is the first argument, allowing the InvocationContext argument to be dropped when it's not needed.

Adding middleware to new programming model

This builds on some of the ideas proposed in Azure/azure-functions-nodejs-worker#664

Here's an idea I have for how we could do middleware:

import { app } from "@azure/functions";

app.get("home", () => ({ body: "Hello World" }));

app.middleware({
  hookName: "preInvocation",
  hook: (context) => {
    //do stuff
  }
});

app.preInvocation(context => {
  //do stuff
});

app.middleware({
  hookName: "preInvocation",
  filter: "httpTrigger",
  hook: () => {}
});

app.middleware({
  hookName: "preInvocation",
  filter: (context) => context.trigger.url.contains("foo"),
  hook: () => {}
});

There's probably a bunch of additional permutations that we could support to make it more akin to the kind of middleware like you would come across in web frameworks, but also support the scenarios of having different trigger types within your application.
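
As a point of comparison, some of this can already be approximated in user code with a higher-order wrapper around the handler; a simplified sketch (handler and context shapes are assumptions, not the real API):

```typescript
// Userland middleware sketch: compose middleware functions around a handler
// with a higher-order function, no framework support needed.
type Ctx = { log: (msg: string) => void };
type Handler = (req: { url: string }, ctx: Ctx) => { body: string };
type Middleware = (req: { url: string }, ctx: Ctx) => void;

function withMiddleware(middleware: Middleware[], handler: Handler): Handler {
    return (req, ctx) => {
        for (const mw of middleware) mw(req, ctx); // run each middleware first
        return handler(req, ctx);
    };
}

const logs: string[] = [];
const handler = withMiddleware(
    [(req, ctx) => ctx.log(`pre ${req.url}`)],
    () => ({ body: 'Hello World' })
);
const res = handler({ url: '/home' }, { log: (m) => logs.push(m) });
// res.body === 'Hello World', logs === ['pre /home']
```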

Auto-generated Timer Function Should Use Timer Type Instead of `any` in TypeScript

Instead of using the Timer type exported from @azure/functions, a Timer-triggered function generated by the az cli or VSCode extension uses any:

import { AzureFunction, Context } from '@azure/functions';
const timerTrigger: AzureFunction = async (context: Context, myTimer: any): Promise<void> => {

should become

import type { AzureFunction, Context, Timer } from '@azure/functions';
const timerTrigger: AzureFunction = async (context: Context, myTimer: Timer): Promise<void> => {

Model V4 will not run with a local emulator in WSL via VSCode

It does not appear that Model V4 functions can be run locally inside WSL; functions are not correctly detected when attempting to launch. I am not sure whether this is an issue with Azure Functions Core Tools or the Node.js library.

Investigative information

Please provide the following:

  • Timestamp: N/A
  • Function App name: N/A
  • Function name(s) (as appropriate): N/A
  • Invocation ID: N/A
  • Region: N/A

This is an issue running with a local emulator.

Repro steps

Provide the steps required to reproduce the problem:

  1. Launch VSCode in WSL
  2. Install Azure Functions, Azure Functions Core Tools and Azurite via VSCode Extensions
  3. Via the Azure Functions extension, create a new function in the local workspace
  4. Select folder > JavaScript > Model V4 > HTTP Trigger > httpTrigger1
  5. Launch Azurite in VSCode via "Azurite: Start"
  6. Under Workspace in the Azure Extension select "Start debugging to update this list...."
  7. Choose "Local Emulator" to add "AzureWebJobsStorage": "UseDevelopmentStorage=true" to the local.settings.json file.

Expected behavior

Example HTTP Trigger function launches, listening on http://localhost:7071/api/httpTrigger1

Actual behavior

Azure Functions Core Tools Launches, reports "No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.)."

Running with "func start --verbose" shows no functions are detected.

Known workarounds

Currently only able to get this running on Windows without WSL

Related information

N/A

Support Arrays In Query String

There are several conventions for passing an array in query parameters:

  1. Passing the same parameter multiple times:
    ?foo=bar&foo=qux
  2. Passing the same parameter multiple times with a [] suffix:
    ?foo[]=bar&foo[]=qux - URL encoded as ?foo%5B%5D=bar&foo%5B%5D=qux
  3. Passing the same parameter multiple times with an explicit index in the [] suffix:
    ?foo[0]=bar&foo[1]=qux - URL encoded as ?foo%5B0%5D=bar&foo%5B1%5D=qux
    or, to demonstrate the functionality of the explicit index:
    ?foo[1]=qux&foo[0]=bar - URL encoded as ?foo%5B1%5D=qux&foo%5B0%5D=bar
  4. Passing a comma-separated list:
    ?foo=bar,qux

Sadly, there is no unified standard or W3C spec, and every library and framework handles these cases differently.
As of today, none of these options will produce an array in Azure Functions.

The change I propose is a breaking change, so perhaps an opt-in should be added, or it could be merged to version 5 if such a version is upcoming.

Investigative information

N/A

Repro steps

  1. Create a sample HTTP function that prints its query parameters
  2. Send a request that uses one of the syntaxes for arrays in query params (for example ?foo[]=bar&foo[]=qux)

Expected behaviour

We should receive an object that contains the parameter as the field and an array as the value.
For example, when sending ?foo[]=bar&foo[]=qux we should get { foo: ["bar", "qux"] } as our query.

Actual behaviour

"?foo=bar&foo=qux"         => { "foo": "bar,qux" }
"?foo[]=bar&foo[]=qux"     => { "foo[]": "bar,qux" }
"?foo[1]=qux&foo[0]=bar"   => { "foo[0]": "bar", "foo[1]": "qux" }
"?foo=bar,qux"             => { "foo": "bar,qux" }

Known workarounds

A workaround is available if we use the qs (short for query string) or query-string libraries and parse the query params with the following code.
Be advised this code is overly simplified and does not cover edge cases.

import type { AzureFunction, Context, HttpRequest } from "@azure/functions";
import qs from "qs";

const httpTrigger: AzureFunction = async (context: Context, req: HttpRequest): Promise<void> => {
  const query = qs.parse(req.url.split("?")[1]);
  context.res = { body: query };
};

qs and query-string are two powerful libraries that can parse query parameters into arrays and optionally, even to objects.
Be advised, however, that both libraries require certain options to be passed to handle arrays for different syntaxes and avoid unexpected behaviour, especially when it comes to trickier parts like objects.
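
For the repeated-parameter convention specifically, Node's built-in URLSearchParams can collect the values without any extra dependencies; a minimal sketch (it does not interpret the [] suffix or comma-separated conventions):

```typescript
// Dependency-free sketch using Node's built-in URLSearchParams, which already
// understands the repeated-parameter syntax (?foo=bar&foo=qux).
const params = new URLSearchParams('foo=bar&foo=qux&single=1');

const query: Record<string, string | string[]> = {};
params.forEach((_value, key) => {
    const values = params.getAll(key); // all values for this key
    query[key] = values.length > 1 ? values : values[0];
});
// query is { foo: ['bar', 'qux'], single: '1' }
```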

