
autometrics-ts's People

Contributors

arendjr, benjibuiltit, brettimus, flenter, keturiosakys, oscarvz, stephlow


autometrics-ts's Issues

🐛 Change histogram buckets to reflect seconds, not milliseconds

Environment information

n/a

What happened?

We recently changed the library (an unreleased change as of writing) to report seconds instead of milliseconds to the Prometheus latency histogram.

One problem this introduced, which I noticed while playing with the push gateway support, is that the histogram still uses the following millisecond-scale buckets:

[ 0, 5, 10, 25, 50, 75, 100, 250, 500, 750, 1000, 2500, 5000, 7500, 10000 ]

There are two possible solutions to this; I'll follow up on trying them out later:

Solution 1: Report the unit when we create the histogram. Based on intuition alone, I don't expect this to work, since the unit is likely just metadata and won't rescale the default buckets.

const histogram = meter.createHistogram(HISTOGRAM_NAME, {
  description: HISTOGRAM_DESCRIPTION,
  unit: "s",
});

Solution 2: Pass modified buckets to our MeterProvider

autometricsMeterProvider = new MeterProvider({
  views: [
    new View({
      aggregation: new ExplicitBucketHistogramAggregation([
        0, 0.005, 0.01, 0.025, 0.05, 0.075, 0.1, 0.25, 0.5, 0.75, 1, 2.5, 5,
        7.5, 10,
      ]),
      instrumentName: HISTOGRAM_NAME,
    }),
  ],
});

Expected result

Buckets for the latency histogram should be

[0, 0.005, 0.01, 0.025, 0.05, 0.075, 0.1, 0.25, 0.5, 0.75, 1, 2.5, 5, 7.5, 10]

not

[ 0, 5, 10, 25, 50, 75, 100, 250, 500, 750, 1000, 2500, 5000, 7500, 10000 ]

TypeError when using Hono framework

Environment information

TypeScript, Hono framework

in index.ts:


import { Hono } from 'hono'
import { prettyJSON } from 'hono/pretty-json'

import { getUsers } from './model'

const app = new Hono()

app.get('/', (c) => c.text('Hello Hono!'))
app.use('*', prettyJSON())
app.notFound((c) => c.json({ message: 'Not Found', ok: false }, 404))

app.get('/users', (c) => {
  const users = getUsers()
  return c.json({ users })
})

export default app

and model.ts:

import { autometrics } from 'autometrics'

export interface User {
  id: string,
  name: string,
  email: string,
}

export const getUsers = autometrics(async function getUsers(): Promise<User[]> {
  const users: User[] = []
  return users
})


What happened?

$ bun run --hot src/index.ts
63 | var _a, b;
64 | const defaultPrepareStackTrace = Error.prepareStackTrace;
65 | Error.prepareStackTrace = (
, stack2) => stack2;
66 | const { stack: stackConstructor } = new Error();
67 | Error.prepareStackTrace = defaultPrepareStackTrace;
68 | const stack = stackConstructor.map((callSite) => ({
^
TypeError: stackConstructor.map is not a function. (In 'stackConstructor.map((callSite) => ({
name: callSite.getFunctionName(),
file: callSite.getFileName()
}))', 'stackConstructor.map' is undefined)
at getModulePath (/Users/mies/p/am-samples/hono-autometrics/node_modules/@autometrics/autometrics/dist/index.js:68:16)
at autometrics (/Users/mies/p/am-samples/hono-autometrics/node_modules/@autometrics/autometrics/dist/index.js:309:17)
at /Users/mies/p/am-samples/hono-autometrics/src/model.ts:10:24


Expected result

No error.

Add a minimum required version of Node

Some APIs we use are only available, or only available globally, since certain recent versions of Node. We should pin that requirement (to at least Node 16).
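For example, the requirement could be pinned with an `engines` field in package.json (a sketch; the exact minimum version is still to be decided):

```json
{
  "engines": {
    "node": ">=16"
  }
}
```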

Rename metrics

  • Rename counter to function.calls and ensure it is exported to Prometheus as function_calls_total
  • Ensure histogram is exported to Prometheus as function_calls_duration_seconds

🐛 Quickstart throws an error in Deno deploy

Environment information

Supabase functions (Deno deploy edge functions) with supabase cli v1.77.9

What happened?

Importing autometrics via import { autometrics } from "https://esm.sh/[email protected]"; and wrapping a simple function gave me the following error:

[Info] Autometrics: Initiating a Prometheus Exporter on port: 9464, endpoint: /metrics

[Error] TypeError: (0 , re.createServer) is not a function
    at new r (https://esm.sh/v128/@opentelemetry/[email protected]/esnext/exporter-prometheus.mjs:18:1309)
    at getMetricsProvider (https://esm.sh/v128/@autometrics/[email protected]/esnext/autometrics.mjs:4:2276)
    at getMeter (https://esm.sh/v128/@autometrics/[email protected]/esnext/autometrics.mjs:4:2387)
    at https://esm.sh/v128/@autometrics/[email protected]/esnext/autometrics.mjs:4:3459
    at Server.<anonymous> (file:///home/deno/functions/rabbit/index.ts:15:26)


Expected result

I would expect a friendlier error message instead of a runtime TypeError.

🐛 Class decorators break in the browser when using the init function

Environment information

What happened?

In the current implementation, when you use a class decorator and a custom init configuration, the decorator will run before the init function.

Because of this, it will try to spin up a PrometheusExporter, even though we configure a push gateway in the init function.

Expected result

The class decorator should use the configuration provided in the init call
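To illustrate the ordering problem, here is a minimal, self-contained sketch (hypothetical names, not the actual autometrics-ts internals): a wrapper that captures configuration eagerly at decoration time sees the defaults, while one that reads configuration lazily at call time sees whatever `init()` set up.

```typescript
// Illustrative sketch: decorators run at class-definition time, before any
// top-level init() call executes, so eagerly captured config is the default.
let config: { pushGateway?: string } = {};

function init(options: { pushGateway?: string }) {
  config = options;
}

const exporterLog: string[] = [];

// Broken: the exporter choice is captured when the wrapper is created,
// i.e. at decoration time, before init() has run.
function wrapEager<T extends (...args: any[]) => any>(fn: T): T {
  const exporter = config.pushGateway ?? "prometheus-exporter";
  return ((...args: any[]) => {
    exporterLog.push(exporter);
    return fn(...args);
  }) as T;
}

// Fixed: the exporter choice is read lazily, on each call, so it reflects
// whatever init() configured by the time the method is invoked.
function wrapLazy<T extends (...args: any[]) => any>(fn: T): T {
  return ((...args: any[]) => {
    exporterLog.push(config.pushGateway ?? "prometheus-exporter");
    return fn(...args);
  }) as T;
}
```

With this shape, decorating a class method with the lazy variant would pick up the push gateway configured in `init()` even though the decorator itself ran first.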

Implement web builds

Currently we offer two types of packages: Deno and NPM. Unfortunately, both have some issues for consumption in browsers. Related issues:

To resolve this, there are two types of web builds that we can offer.

Adding browser support to NPM packages

The first thing we can (and should) do is add explicit browser support to the NPM packages. We currently rely on bundlers to make the NPM package web-compatible, but there's only so much they can do. To help them, we should use the `browser` field in package.json to replace files containing Node-specific functionality with files that offer web-compatible versions.

Small refactorings might be necessary to accomplish this effectively.
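As a sketch, the `browser` field can map Node-specific files to web-compatible replacements; the file names below are hypothetical, not the package's actual layout:

```json
{
  "main": "./dist/index.js",
  "browser": {
    "./dist/platform.node.js": "./dist/platform.web.js"
  }
}
```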

ESM module

We can also compile the Deno package into JS bundles that can be consumed by people who don't use a bundler. My gut feeling is that most people building clients they wish to instrument with Autometrics do use bundlers, so I would consider this lower priority. But once we have done the work of splitting Deno/Node-specific functionality into separate files, compiling the Deno package into a pure JS build from the web-compatible files should be relatively easy as well.

🐛 API Reference mentions BuildInfo type but there's no link to the type

Environment information

n/a

What happened?

I clicked the link to the API Reference (https://github.com/autometrics-dev/autometrics-ts/blob/main/packages/lib/reference/README.md) to find details about the init function's initOptions.

The buildInfo parameter is of type BuildInfo, which the documentation says is necessary for client-side apps. However, there is no link to this type.

See: https://github.com/autometrics-dev/autometrics-ts/blob/main/packages/lib/reference/README.md#initoptions


Expected result

I'd expect the type to be in the reference

@opentelemetry/exporter-prometheus does not work with create-react-app (bundling with Webpack V5)

When trying to add autometrics with a push gateway to a create-react-app project, I get the following error:

ERROR in ./node_modules/@opentelemetry/exporter-prometheus/build/src/PrometheusExporter.js 28:14-28
Module not found: Error: Can't resolve 'url' in '/Users/brettbeutell/fiber/aplos-app/node_modules/@opentelemetry/exporter-prometheus/build/src'

BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.

If you want to include a polyfill, you need to:
	- add a fallback 'resolve.fallback: { "url": require.resolve("url/") }'
	- install 'url'
If you don't want to include a polyfill, you can use an empty module like this:
	resolve.fallback: { "url": false }

webpack compiled with 2 errors
No issues found.

The issue appears to be that we need polyfills for node.js core modules.

Maybe it'd be good to have a client-side tailored package of Autometrics, since browser-based apps might not need all of the OpenTelemetry deps.

🐛 Caller information is sometimes missing

Environment information

Server-side code running in Node or Deno can extract caller information using AsyncLocalStorage. Unfortunately, we load this from the node:async_hooks module, which is not available in browsers. Because of this, we attempt to load it in a promise, but while this promise is still pending, any (synchronous) calls that should be tracked by Autometrics will lack their caller information.

What happened?

Calls that are tracked by Autometrics before the AsyncLocalStorage is initialized lack caller information.

Expected result

Caller information should be available at all times if the environment supports it.
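The race can be reduced to a minimal sketch (names are illustrative, not the actual autometrics-ts internals): the AsyncLocalStorage instance only exists once the dynamic import resolves, so calls made before that see an empty caller.

```typescript
import type { AsyncLocalStorage } from "node:async_hooks";

// The ALS instance is assigned only after the dynamic import resolves.
let als: AsyncLocalStorage<string> | undefined;

const alsReady = import("node:async_hooks").then((mod) => {
  als = new mod.AsyncLocalStorage<string>();
});

function callerLabel(): string {
  // Returns "" while the import is still pending — that is the bug.
  return als?.getStore() ?? "";
}
```

Any synchronous call to `callerLabel()` made before `alsReady` resolves yields an empty caller label; after awaiting it, `als.run("getUsers", callerLabel)` returns `"getUsers"` as expected.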

Support custom `errorIf` and `okIf` callbacks

Currently, the only way a function will register an error in Autometrics is if it throws. This limits the usefulness of the library in contexts such as route handlers, where any further calls are typically wrapped in a try/catch and handled. So, as an example, a case like this...

export async function getUsers(req: express.Request, res: express.Response) {
  try {
    const users = await db.getUsers();
    return res.status(200).json(users);
  } catch (e) {
    return res.status(500).send(e.message);
  }
}

app.get("/users", autometrics(getUsers));

...would always register result="ok" in metrics because it never throws, even though the actual result is an error.

Proposal

Similar to autometrics-dev/autometrics-rs#32, we should add ways for users to tweak what counts as an error and what doesn't. As we don't have trait interfaces in TS/JS, one way we can solve this is by exposing errorIf and okIf callbacks in the AutometricsOptions block, to which users can supply conditions.

Here's how the above example would change so that the error is correctly registered:

const errorIf = async (res: Promise<express.Response>) => {
  return (await res).statusCode >= 400;
};

app.get("/users", autometrics({ errorIf }, getUsers));

🐛 Filesystem details are leaked by the `module` label when using autometrics-ts with Supabase Edge Functions

Environment information

Supabase Edge Functions (prod deployment)

What happened?

I deployed an edge function with Supabase, and configured autometrics-ts to work with a push gateway.

The module label contained the filesystem path on my local machine to the script containing the edge function.

function_calls_duration_sum{caller="",function="getRabbit",module="file:///Users/brettbeutell/fiber/to-err-is-panda/supabase/functions/rabbit/index.ts"} 0.004

Interestingly, if I serve the edge function runtime locally (while testing), the module is specific to the Docker container hosting the edge runtime, and the module label looks like /home/deno/...

Expected result

I would expect my filesystem details to not be present in the module label.
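One possible mitigation (a sketch, not something autometrics-ts implements as of this issue) would be to report the module relative to a known project root instead of as an absolute path:

```typescript
import * as path from "node:path";

// Convert a file:// URL or absolute path into a path relative to `root`,
// stripping the user-specific prefix from the module label.
function relativeModule(fileUrl: string, root: string): string {
  const filePath = fileUrl.startsWith("file://")
    ? new URL(fileUrl).pathname
    : fileUrl;
  return path.relative(root, filePath);
}
```

For the example above, `relativeModule("file:///Users/.../supabase/functions/rabbit/index.ts", <project root>)` would yield a label like `functions/rabbit/index.ts`. The open question is how to determine the root reliably across runtimes.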

Consider adding support for Webpack-bundled projects

Currently, any Webpack-bundled project will by default fail at build time, as Webpack attempts to resolve Node-native modules regardless of the fact that they're guarded for Node-only use. The affected modules are http (polyfilled by stream-http) and url, both used by the bundled Prometheus Exporter, and async_hooks, which is used by the library itself.

The issue can be worked around by manually installing and setting up polyfills as documented in #40, and/or probably by setting resolve.fallback values to false for the aforementioned Node-native modules. However, that is somewhat brittle and will not work with a default create-react-app config.

We should investigate supporting Webpack without extra configuration if we're looking to support client-side Autometrics.
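For reference, the manual workaround sketched from the error messages below would look roughly like this in webpack.config.js (assuming the polyfill packages are installed; this is a workaround, not a recommended final setup):

```javascript
// webpack.config.js (sketch): either polyfill the Node core modules,
// or stub them out entirely with `false`.
module.exports = {
  resolve: {
    fallback: {
      http: require.resolve("stream-http"), // or `false` to stub it out
      url: require.resolve("url/"),         // or `false`
      async_hooks: false,                   // no browser polyfill exists
    },
  },
};
```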


This is the full error that shows:

ERROR in ./node_modules/@opentelemetry/exporter-prometheus/build/src/PrometheusExporter.js 25:15-30
Module not found: Error: Can't resolve 'http' in '<USR_PATH>/react-repro/node_modules/@opentelemetry/exporter-prometheus/build/src'

BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.

If you want to include a polyfill, you need to:
        - add a fallback 'resolve.fallback: { "http": require.resolve("stream-http") }'
        - install 'stream-http'
If you don't want to include a polyfill, you can use an empty module like this:
        resolve.fallback: { "http": false }

ERROR in ./node_modules/@opentelemetry/exporter-prometheus/build/src/PrometheusExporter.js 28:14-28
Module not found: Error: Can't resolve 'url' in '<USR_PATH>/react-repro/node_modules/@opentelemetry/exporter-prometheus/build/src'

BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.

If you want to include a polyfill, you need to:
        - add a fallback 'resolve.fallback: { "url": require.resolve("url/") }'
        - install 'url'
If you don't want to include a polyfill, you can use an empty module like this:
        resolve.fallback: { "url": false }

ERROR in ./node_modules/@autometrics/autometrics/dist/index.mjs 53:12-33
Module not found: Error: Can't resolve 'async_hooks' in '<USR_PATH>/react-repro/node_modules/@autometrics/autometrics/dist'

🐛 Low priority: Remove tests from autometrics package distro

Environment information

n/a

What happened?

When you npm i @autometrics/autometrics and inspect node_modules, it seems we include our tests folder.

% ls -1 node_modules/@autometrics/autometrics
README.md
dist
package.json
reference
src
tests
tsconfig.json
tsup.config.ts
typedoc.json

We can use an .npmignore to exclude these from the package. We just need to be careful, as using .npmignore means npm will no longer respect .gitignore.
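A sketch of what that .npmignore could contain, mirroring the listing above (which entries to keep, e.g. the reference docs, is a judgment call; alternatively, a `files` allow-list in package.json avoids the .gitignore caveat entirely):

```
# .npmignore (sketch) — exclude development-only files from the tarball
tests/
src/
tsconfig.json
tsup.config.ts
typedoc.json
```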

Expected result

I wouldn't expect tests to be packaged with the autometrics library

OTLP export support

As part of this task, I will be adding support for exporting to the OTel Collector using OTLP.

Since we already bundle two methods for exporting (the Prometheus exporter and the push gateway), and we decided we no longer want to auto-start a webserver under the default configuration, I will rework how the exporter configuration works and will split the exporters into separate packages.

  • Create package for the existing Prometheus exporter
  • Create package for the existing push gateway exporter
  • Create package for the new OTLP push exporter
  • Update the core package and the init() function to work with these exporter packages.

Add initial experimental support for immediately pushing metrics in a FaaS context

Motivation

When using autometrics with a push configuration inside, e.g., an edge function, we run into the issue that the library sets up an interval at which it pushes metrics to a push/aggregation gateway.

This makes the assumption that the parent script is a long-running process.

In an environment like that of Supabase's Edge Functions, when autometrics is imported and configured in an edge function, the interval keeps the edge function alive until it either exceeds its allowed CPU time or times out entirely. In production, anyone using autometrics would effectively face extra billing costs because of this.

Solution

As a first pass, we configure autometrics to immediately push new metrics whenever they are available, in the case where the pushInterval configuration parameter is set to 0.

For a future iteration, we can support a more robust push configuration, as there are likely other relevant options for the FaaS usecase (e.g., configuring a timeout of requests to the gateway).
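The dispatch logic described above can be reduced to a small sketch (illustrative names, not the actual autometrics-ts implementation): pushInterval === 0 means "push immediately whenever a new metric is recorded", while any positive value keeps the existing interval-based behavior.

```typescript
type PushFn = () => void;

function createPusher(pushInterval: number, push: PushFn) {
  if (pushInterval === 0) {
    // FaaS mode: no timer is created, so nothing keeps the process alive,
    // and every recorded metric is pushed right away.
    return { onMetricRecorded: push, stop: () => {} };
  }
  // Long-running mode: push on a fixed interval, independent of recording.
  const timer = setInterval(push, pushInterval);
  return { onMetricRecorded: () => {}, stop: () => clearInterval(timer) };
}
```

The key property for the edge-function case is that no timer exists in the `pushInterval === 0` branch, so the runtime can shut the function down as soon as the pending pushes complete.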
