
eventstore-client-nodejs's Introduction

EventStoreDB

EventStoreDB is the event-native database, where business events are immutably stored and streamed. It is designed for event-sourced, event-driven, and microservices architectures.

What is EventStoreDB

EventStoreDB is a new category of operational database that has evolved from the Event Sourcing community. Powered by the state-transition data model, events are stored with the context of why they have happened, providing flexible, real-time data insights in the language your business understands.

Download the latest version. For more product information visit the website.

What is Event Store Cloud?

Event Store Cloud is a fully managed cloud offering that's designed to make it easy for developers to build and run highly available and secure applications that incorporate EventStoreDB without having to worry about managing the underlying infrastructure. You can provision EventStoreDB clusters in AWS, Azure, and GCP, and connect these services securely to your own cloud resources.

For more details visit the website.

Licensing

View Event Store Ltd's licensing information.

Docs

For guidance on installation, development, deployment, and administration, see the User Documentation.

Getting started with EventStoreDB

Follow the getting started guide.

Getting started with Event Store Cloud

Event Store can manage EventStoreDB for you, so you don't have to run your own clusters. See the online documentation: Getting started with Event Store Cloud.

Client libraries

This guide shows you how to get started with EventStoreDB by setting up an instance or cluster and configuring it. EventStoreDB supports two protocols: gRPC and TCP (legacy).

EventStoreDB supported gRPC clients

Community supported gRPC clients

Read more in the documentation.
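
As a quick illustration, here is a minimal sketch of connecting with the Node.js gRPC client (@eventstore/db-client) and appending a JSON event. The connection string, stream name, and event shape below are placeholder assumptions for a local, insecure single node:

import { EventStoreDBClient, jsonEvent } from "@eventstore/db-client";

// Assumes a local, insecure single node on the default port (placeholder values).
const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

// Hypothetical event type and payload.
const event = jsonEvent({
  type: "example-event",
  data: { message: "hello" },
});

// Append the event and log the next expected revision of the stream.
const result = await client.appendToStream("example-stream", [event]);
console.log(result.nextExpectedRevision);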

Legacy TCP Clients (support ends with 23.10 LTS)

Deployment

Communities

Contributing

Development is done on the master branch. We do our best to keep the history clean, so we generally ask contributors to squash their commits into a single logical commit (or a small set of them).

If you want to switch to a particular release, you can check out the release branch for that particular release. For example:
git checkout release/oss-v22.10

Building EventStoreDB

EventStoreDB is written in a mixture of C# and JavaScript. It can run on Windows, Linux, and macOS (using Docker) on the .NET Core runtime.

Prerequisites

Once you've installed the prerequisites for your system, you can launch a Release build of EventStore as follows:

dotnet build -c Release src

The build scripts build.sh and build.ps1 are also available for Linux and Windows, respectively, to simplify the build process.

To start a single node, you can then run:

dotnet ./src/EventStore.ClusterNode/bin/x64/Release/net8.0/EventStore.ClusterNode.dll --dev --db ./tmp/data --index ./tmp/index --log ./tmp/log

Running the tests

You can launch the tests as follows:

dotnet test src/EventStore.sln

Build EventStoreDB Docker image

You can also build a Docker image by running the command:

docker build --tag myeventstore . \
--build-arg CONTAINER_RUNTIME={container-runtime} \
--build-arg RUNTIME={runtime}

For instance:

docker build --tag myeventstore . \
--build-arg CONTAINER_RUNTIME=bookworm-slim \
--build-arg RUNTIME=linux-x64

Note: Because of a known Docker issue, if you're building a Docker image on Windows, you may need to set the DOCKER_BUILDKIT=0 environment variable. For instance, running in PowerShell:

$env:DOCKER_BUILDKIT=0; docker build --tag myeventstore . `
--build-arg CONTAINER_RUNTIME=bookworm-slim `
--build-arg RUNTIME=linux-x64

Currently, we support the following configurations:

  1. Bookworm slim:
  • CONTAINER_RUNTIME=bookworm-slim
  • RUNTIME=linux-x64
  2. Jammy:
  • CONTAINER_RUNTIME=Jammy
  • RUNTIME=linux-x64
  3. Alpine:
  • CONTAINER_RUNTIME=alpine
  • RUNTIME=linux-musl-x64

You can verify the built image by running:

docker run --rm myeventstore --insecure --what-if

Need help?

eventstore-client-nodejs's People

Contributors

alexeyzimarev, cdevarenne, coffeemakingtoaster, condron, dependabot[bot], dhingrak, george-payne, hayley-jean, imanghvs, kivancguckiran, ledniy, leopoldodonnell, lucaspaganini, mat-mcloughlin, nhacsam, oskardudycz, pvanbuijtene, shaan1337, tambeau, thefringeninja, thomastoye, timothycoleman, w1am, yoeight

eventstore-client-nodejs's Issues

Stream subscriptions should allow for backpressure

I noticed this because I had a couple of events where one is expected to be processed completely before the next one runs. Checking the typing, it seems the handler function takes the form on(event: "data", listener: (event: E) => void): this.

Not sure if it is in scope for the subscription to await a promise returned by the handler, but it is IMO important for back-pressure to prevent overwhelming a service or database with requests.

An impromptu solution on my end would be to use an async worker "thread" or something, but then I run the risk of my application's memory overflowing if the database gets behind.

This is mostly relevant for me right now as I am working with an all stream subscription that is parsing through a bunch of older events; a slightly-less-jury-rigged solution would be to read from the all stream until the end and then switch to a subscription. That may be one of the only good options depending on how the gRPC connection works.
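
For reference, a rough workaround sketch under the assumption that the subscription behaves like a Node.js Readable stream (so pause()/resume() are available); the handler name is hypothetical:

// Hedged sketch: serialize event handling by pausing the subscription while a handler runs.
// Assumes `sub` is the object returned by client.subscribeToAll() and behaves as a Readable stream.
sub.on("data", async (resolvedEvent) => {
  sub.pause();                          // stop delivery while we process this event
  try {
    await handleEvent(resolvedEvent);   // hypothetical async handler
  } finally {
    sub.resume();                       // allow the next event through
  }
});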

Unable to create a subscription for eventStore Cloud

I'm having an issue connecting to Event Store Cloud and creating a subscription. The same code works in my local environment but not in the cloud.

const connectionStr = "esdb://localhost:2113?tls=false"
    const client = EventStoreDBClient.connectionString(connectionStr)
    const received: ResolvedEvent[] = []
    const sub = client.subscribeToAll()
    sub.on('error', err => console.log(err, err?.message))
    sub.on('close', () => {})
    sub.on('data', evt => {
      console.log(evt)
    })

This works, but if I change the connection string to the cloud endpoint esdb://<cloud-endpoint>:2113?tls=true,

I get this error:

(node:43812) UnhandledPromiseRejectionWarning: Error: 7 PERMISSION_DENIED: Access Denied
    at Object.convertToCommandError (.../node_modules/@eventstore/db-client/dist/utils/CommandError.js:246:20)
    at ClientReadableStreamImpl.<anonymous> (.../node_modules/@eventstore/db-client/dist/utils/OneWaySubscription.js:32:38)
    at ClientReadableStreamImpl.emit (events.js:315:20)
    at Object.onReceiveStatus (.../node_modules/@grpc/grpc-js/build/src/client.js:327:28)
    at Object.onReceiveStatus (.../node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181)
    at .../node_modules/@grpc/grpc-js/build/src/call-stream.js:130:78
    at processTicksAndRejections (internal/process/task_queues.js:75:11)
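
For context, connections to Event Store Cloud are secured with TLS and require authenticated users, so one hedged guess is that the subscription is being attempted without credentials. A connection string can carry default credentials and use DNS discovery; the hostname and credentials below are placeholders:

const client = EventStoreDBClient.connectionString(
  // esdb+discover resolves cluster members via DNS; username/password are placeholders
  "esdb+discover://admin:changeit@example.mesdb.eventstore.cloud:2113"
);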

Unable to connect to ES Cloud

Tried many things but nothing works. Here is my code:

import es from "@eventstore/db-client";

const endpoint = {
    address: "<node-address>",
    port: 2113
}
const connection = es.EventStoreConnection
    .builder()
    .singleNodeConnection(endpoint);
const events = await es.readEventsFromStream("test")
    .fromStart()
    .forward()
    .count(10)
    .execute(connection);
console.log(events)

This throws with UNAVAILABLE: Connection dropped

When using

const connection = es.EventStoreConnection
    .builder()
    .gossipClusterConnection([endpoint]);

and pointing endpoint to the cluster DNS name (works with C#), it hangs forever: no messages, no errors, it just stops on reading events.

When using the same code with three endpoints, each pointing to a specific cluster node (0, 1, 2), it also hangs forever.

BTW, I found out that the client uses gRPC gossip, which (afaik) was never intended to be used by clients. The C# client uses normal HTTP gossip (JSON).

Stock example gives UnknownError: 13 INTERNAL: Received RST_STREAM with code 2 (Internal server error)

Using the current image and the stock example from the readme I get an error.
I have a single server running locally that I can access normally via 127.0.0.1:2113 using the example config https://developers.eventstore.com/server/20.6/server/installation/docker.html

UnknownError: 13 INTERNAL: Received RST_STREAM with code 2 (Internal server error)
    at Object.exports.convertToCommandError (/home/mbneto/Development/event-store/node_modules/@eventstore/db-client/dist/utils/CommandError.js:278:12)
    at Object.callback (/home/mbneto/Development/event-store/node_modules/@eventstore/db-client/dist/command/streams/WriteEventsToStream.js:69:50)
    at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client.ts:451:26)
    at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client-interceptors.ts:434:34)
    at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client-interceptors.ts:397:48)
    at /home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/call-stream.ts:237:24
    at processTicksAndRejections (internal/process/task_queues.js:79:11) {
  code: 13,
  _raw: Error: 13 INTERNAL: Received RST_STREAM with code 2 (Internal server error)
      at Object.callErrorFromStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/call.ts:81:24)
      at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client.ts:451:36)
      at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client-interceptors.ts:434:34)
      at Object.onReceiveStatus (/home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/client-interceptors.ts:397:48)
      at /home/mbneto/Development/event-store/node_modules/@grpc/grpc-js/src/call-stream.ts:237:24
      at processTicksAndRejections (internal/process/task_queues.js:79:11) {
    code: 13,
    details: 'Received RST_STREAM with code 2 (Internal server error)',
    metadata: Metadata { internalRepr: Map {}, options: {} }
  },
  type: 'unknown'
}

Allow tests to run on LTS node

Should be tested against the lowest supported version of Node (12.x LTS).

  • Allow tests to run against LTS node
  • CI should test on LTS node

FWIW - jest fails by choking on: const version = process.env.EVENTSTORE_IMAGE ?? "github:ci";
Using node v12.18.4 and everything else derived from package.json

Originally posted by @leopoldodonnell in #41 (comment)

Monitoring persistent subscriptions: Make Client.resolveUri public (?)

We are monitoring our persistent subscriptions by periodically accessing the HTTP endpoint /subscriptions on a node. In a clustered environment it is not obvious which node that should be. Even though it is declared private, we are calling Client.resolveUri because it does exactly what we need.

  • Is there an easier/better way?
  • Is it a problem to make Client.resolveUri public?
  • Are there plans for something like PersistentSubscriptionsManager for this client?
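
For reference, a minimal sketch of the kind of monitoring described above, assuming the node address and credentials are known up front (all values below are placeholders):

// Hedged sketch: poll the /subscriptions HTTP endpoint on a known node.
// Assumes Node 18+ (global fetch) and placeholder address/credentials.
const monitorSubscriptions = async (): Promise<void> => {
  const response = await fetch("https://node-0.example.com:2113/subscriptions", {
    headers: {
      Authorization: "Basic " + Buffer.from("admin:changeit").toString("base64"),
    },
  });
  const stats = await response.json(); // per-subscription statistics
  console.log(stats);
};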

Cannot read property 'getMembersList' of undefined

Cluster discovery fails without a certificate, giving an unhelpful runtime error:

EventStoreConnection.builder()
    .gossipClusterConnection([
        { address: '127.0.0.1', port: 3456 },
        { address: '127.0.0.1', port: 3457 },
        { address: '127.0.0.1', port: 3458 },
    ]);
/node_modules/@eventstore/db-client/dist/connection/discovery.js:111
            for (const grpcMember of info.getMembersList()) {
                                          ^
TypeError: Cannot read property 'getMembersList' of undefined

API for manipulating Projections

The API for manipulating projections is missing. There are gRPC protos that would cover this, but no API is exported or implemented yet.

I would need this very much. Thanks in advance and best regards, Jürgen

Subscription async iterator interface

It would be nice to have an async iterator interface to subscriptions.
Something like:

for await (const event of subscribeToStream(<stream-name>).authenticated("admin", "changeit").fromStart().execute(connection)) {
  await projectEventIntoSQLStore(event);
}

This would make writing async handlers easier, e.g. projecting a stream into another store.
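
Until such an interface exists, one possible stopgap is Node's events.on helper, which turns an event emitter into an async iterator. A sketch, assuming subscription is the emitter returned by the current API and the projection function is the hypothetical one from the example above:

import { on } from "events";

// Hedged sketch: consume the current event-emitter-style subscription as an async iterator.
// events.on yields listener arguments as an array, hence the destructuring.
for await (const [resolvedEvent] of on(subscription, "data")) {
  await projectEventIntoSQLStore(resolvedEvent);
}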

Flakey connectToPersistentSubscription test

Probably a timing issue

FAIL src/__test__/persistentSubscription/connectToPersistentSubscription.test.ts (30.778 s)
  ● connectToPersistentSubscription › should connect to a persistant subscription › nack

    expect(jest.fn()).toBeCalledTimes(expected)

    Expected number of calls: 61
    Received number of calls: 62

      222 |       expect(onConfirmation).toBeCalledTimes(1);
      223 | 
    > 224 |       expect(onEvent).toBeCalledTimes(
          |                       ^
      225 |         // skipped
      226 |         skipCount +
      227 |           // retried

      at Object.<anonymous> (src/__test__/persistentSubscription/connectToPersistentSubscription.test.ts:224:23)

Connection string syntax not compatible with documentation.

The documentation at https://developers.eventstore.com/clients/grpc/getting-started/#connection-details makes it possible to generate a connection string. For example, by picking "Specify manually", "Single node", "Insecure" and "localhost:2113", the connection string will be esdb://localhost:2113?Tls=false.

On the subsequent page, the NodeJS code for creating the client is:

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?Tls=false");

When executing this code, a warning is printed saying

Unknown option key "Tls", setting will be ignored.

and the connection to the database fails. If I change Tls to tls in the connection string (lowercase t), which is consistent with what the parsing code expects at https://github.com/EventStore/EventStore-Client-NodeJS/blob/master/src/Client/parseConnectionString.ts#L265, the connection works fine.

Tested on: 0.0.0-alpha.12

Cannot find type definition file for 'debug'

Hi, running into the following error when trying to build:

node_modules/@eventstore/db-client/dist/utils/debug.d.ts:1:23 - error TS2688: Cannot find type definition file for 'debug'.

1 /// <reference types="debug" />

There may be a ts compiler config to workaround or ignore the issue, but wanted to verify.

GRPC assertion failure when subscribing to streams

Not exactly sure what is causing this error; it seems to be pretty sporadic and only happens after running a few iterations of testing.

Assertion failed: stream->dep_prev (../deps/nghttp2/lib/nghttp2_stream.c: nghttp2_stream_dep_remove: 777)

It seems that this test is able to reproduce it most of the time with the added loop:

describe('Eventstore GRPC Client', () => {
  let client: EventStoreDBClient

  beforeAll(() => {
    client = new EventStoreDBClient(
      {
        endpoint: {
          address: process.env.ES_HOSTNAME ?? 'eventstore',
          port: parseInt(process.env.ES_GRPC_PORT ?? '2113'),
        },
      },
      { insecure: true },
      {
        username: process.env.ES_USERNAME!,
        password: process.env.ES_PASSWORD!,
      },
    )
  })

  it('can subscribe to a stream', async () => {
    jest.setTimeout(10000)

    for (let j = 0; j < 10; ++j) {
      const stream = `test_${uuidv4().replace(/\-/g, '')}`
      const events = []
      for (let i = 0; i < 10; ++i) {
        events.push(
          jsonEvent({
            type: 'test-event',
            id: uuidv4(),
            data: {
              a: i,
            },
          }),
        )
      }

      const appendRes = await client.appendToStream(stream, events.slice(0, 3), {
        expectedRevision: NO_STREAM,
      })

      const received: ResolvedEvent[] = []
      const sub = client.subscribeToStream(stream, {
        fromRevision: START,
      })
      sub.on('error', (err) => console.log(err, err?.message))
      sub.on('close', () => console.log('subscription stopped'))
      sub.on('data', (evt) => {
        console.log(`received event ${events.length}`)
        received.push(evt)
      })
      while (received.length < 3) await forMs(10)
      await client.appendToStream(stream, events.slice(3, 10), {
        expectedRevision: appendRes.nextExpectedRevision,
      })
      while (received.length < 10) await forMs(10)
      expect(received.map((e) => e.event?.id)).toStrictEqual(events.map((e) => e.id))
      await sub.unsubscribe()
    }
  })
})

This is my eventstore configuration (from docker compose) for testing

  eventstore:
    image: eventstore/eventstore:20.10.0-bionic
    environment:
      EVENTSTORE_START_STANDARD_PROJECTIONS: 'true'
      EVENTSTORE_INSECURE: 'true'
      EVENTSTORE_RUN_PROJECTIONS: 'All'

This was tested with Alpha 13.

There doesn't appear to be a straightforward way to provide custom metadata

CustomMetadata is provided in

 export namespace ProposedMessage {
        export type AsObject = {
            id?: shared_pb.UUID.AsObject,
            metadataMap: Array<[string, string]>,
            customMetadata: Uint8Array | string,
            data: Uint8Array | string,
        }
    }

but writeEventsToStream does not appear to offer a way to include customMetadata.

It isn't clear what the preferred approach to setting metadata is. Important metadata includes such things as JsonSchema version.

Am I missing something?
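
For what it's worth, more recent releases of the client appear to accept a metadata field on jsonEvent alongside data, which is sent as the event's custom metadata; a hedged sketch with placeholder values:

import { jsonEvent } from "@eventstore/db-client";

// Hedged sketch: attach custom metadata (e.g. a JSON schema version) to an event.
const event = jsonEvent({
  type: "user-registered",                 // hypothetical event type
  data: { userId: "123" },                 // hypothetical payload
  metadata: { jsonSchemaVersion: "1.0" },  // stored as the event's custom metadata
});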

Reading out of bounds backwards yields cryptic error

In one of my tests I apparently had it calling readStream with out-of-bounds values; I would expect this to either return an empty array or at least give a somewhat useful error message.

I called readStream('somestream', 16, { direction: BACKWARDS, fromRevision: -1n }) and got the error:

13 INTERNAL: Request message serialization failure: Cannot read property 'lo' of null

      at Object.convertToCommandError (../node_modules/@eventstore/db-client/dist/utils/CommandError.js:281:12)
      at ClientReadableStreamImpl.<anonymous> (../node_modules/@eventstore/db-client/dist/utils/handleBatchRead.js:27:38)
      at Object.onReceiveStatus (../node_modules/@grpc/grpc-js/src/client.ts:570:18)
      at Object.onReceiveStatus (../node_modules/@grpc/grpc-js/src/client-interceptors.ts:387:48)
      at ../node_modules/@grpc/grpc-js/src/call-stream.ts:243:24

If it matters, the stream did already have several elements in it.

I would not say this is a bug per se, but the way it broke makes me wonder whether there could be some other edge cases.

This was tested with alpha 12.

Support 64bit ints

Currently uint64 is being passed around as a Number, causing it to fail (too big). These should be strings in gRPC, and BigInt by the time they reach the consumer.

This is the cause of the currently skipped (failing) tests.

  • Update protos to specify that a string should be used:
uint64 stream_revision = 3 [jstype = JS_STRING];
  • Convert received strings to BigInt where needed
  • Take a number or BigInt, and convert it to a string when passing to gRPC (see the sketch below)
  • Remove the skips in tests to check the issue is fixed
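
A minimal sketch of the conversion described above (function names are hypothetical):

// Hedged sketch of the number/BigInt <-> string conversions described above.
const toGrpcUint64 = (revision: number | bigint): string => revision.toString();
const fromGrpcUint64 = (revision: string): bigint => BigInt(revision);

// Round-trips values larger than Number.MAX_SAFE_INTEGER without loss.
const revision = fromGrpcUint64(toGrpcUint64(9007199254740993n)); // 9007199254740993n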

PersistentSubscriptionSettings.fromRevision only accepts START and bigint

I already asked this at https://discuss.eventstore.com/t/persistentsubscriptionsettings-fromrevision-only-accepts-start-and-bigint/3206 but so far I have not gotten any answers and it is important to us.

Could you explain why PersistentSubscriptionSettings.fromRevision [1] cannot be set to END? In most cases we would not want to consume all events from the beginning. Would using ReadRevision [2] not make more sense? Or is it that way because it is safer to set it to an explicit revision if we don't start from the beginning?

Judging from the implementation [3] it does not look like an oversight.

We will soon be migrating our existing ESDB to ESDB cloud using https://replicator.eventstore.org. The replicator does not copy persistent subscriptions, and rather than specifying an explicit revision we would like them to start at the end. Also, the server still seems to understand -1 for startFrom, so I don't see why the client does not.

[1] https://github.com/EventStore/EventStore-Client-NodeJS/blob/master/src/utils/persistentSubscriptionSettings.ts#L17
[2] https://github.com/EventStore/EventStore-Client-NodeJS/blob/master/src/types/index.ts#L35
[3] https://github.com/EventStore/EventStore-Client-NodeJS/blob/master/src/persistentSubscription/createPersistentSubscription.ts#L54

Rename Connection to Client

EventStoreConnection only makes sense for TCP as it requires Open and Close.

Since gRPC uses a stateless client, which doesn't require opening and closing a connection, we should avoid using the confusing Connection term.

We must also align all the clients with those that are already widely used and stick to the same terminology and semantics across all the clients.

Even though it introduces a breaking change, we still should consider doing it for the sake of consistency. Any user of the client can easily replace one name with another, especially considering the number of places where the client is instantiated (normally just one).

Cannot connect as described in the README

I have a single EventStoreDB node running locally as Docker container and want to connect to it. I can verify the instance is running and available by accessing the web interface. However, I cannot connect with the Node.js client as shown in the README.

When instantiating the class EventStoreDBClient and subsequently performing an action, the error message "Error: 14 UNAVAILABLE: No connection established" is logged. However, when using the static factory function as described in the docs, it works correctly.

My setup:

  • OS: Linux (Ubuntu)
  • Node.js: v15.2.0
  • Module syntax: ESM
  • EventStoreDB running as local docker container

Independent of my issue, the example seems to be incomplete. While the code correctly attempts to connect, the example usage function simpleTest() is never executed.

Append result `nextExpectedVersion` is last written revision and not the next

Sample code where this issue became apparent:

import { EventStoreDBClient, FORWARDS, jsonEvent, NO_STREAM, START } from '@eventstore/db-client'
import { v4 as uuidv4 } from 'uuid'

describe('Eventstore GRPC Client', () => {
  let client: EventStoreDBClient

  beforeAll(() => {
    client = new EventStoreDBClient(
      {
        endpoint: {
          address: process.env.ES_HOSTNAME ?? 'eventstore',
          port: parseInt(process.env.ES_GRPC_PORT ?? '2113'),
        },
      },
      { insecure: true },
      {
        username: process.env.ES_USERNAME!,
        password: process.env.ES_PASSWORD!,
      },
    )
  })

  it('can connect to event store', async () => {
    const id = uuidv4().replace(/\-/g, '')
    const stream = uuidv4().replace(/\-/g, '')
    const event = jsonEvent({
      type: 'context-client-test',
      data: {
        a: id,
      },
    })

    const appendResult = await client.appendToStream(stream, [event], {
      expectedRevision: NO_STREAM,
    })
    // THIS IS THE ERROR
    expect(appendResult.nextExpectedVersion).toEqual(1n)

    const events = await client.readStream(stream, 10, {
      fromRevision: START,
      direction: FORWARDS,
    })
    expect(events).toHaveLength(1)
    expect((events[0].event!.data as any).a).toBe(id)
  })
})

I would expect that after writing to an empty stream, the next expected version would be 1 or 2 depending on whether it is zero- or one-indexed, but I am actually getting 0, which seems to indicate that it is returning the revision that was written and not the next version.

Personally I prefer having the last written revision returned, but the name needs to be changed, or at minimum more documentation added around it, assuming this is in fact doing what I think it is doing.

(Tested with alpha 12)

Resolve TLS warning in tests

(node:253237) [DEP0123] DeprecationWarning: Setting the TLS ServerName to an IP address is not permitted by RFC 6066. This will be ignored in a future version.

Make `JSONRecordedEvent` generic (TypeScript)

Hi, first of all thanks for all the work on this library.

I would like for JSONRecordedEvent to be generic. The current definition of this type is:

export interface JSONRecordedEvent extends RecordedEventBase {
    /**
     * Indicates whether the content is internally marked as JSON.
     */
    isJson: true;
    /**
     * Payload of this event.
     */
    data: unknown;
    /**
     * Representing the metadata associated with this event.
     */
    metadata: Record<string, unknown>;
}

This forces me to cast event.data every time. I would like the type definition to be something like this:

export interface JSONRecordedEvent<D = unknown, T = string> extends RecordedEventBase {
    /**
     * Indicates whether the content is internally marked as JSON.
     */
    isJson: true;
    /**
     * Payload of this event.
     */
    data: D;
    /**
     * Representing the metadata associated with this event.
     */
    metadata: Record<string, unknown>;
    /**
     * Type of this event.
     */
    eventType: T;
}

This would allow me to define and pass events like this:

type EventX = JSONRecordedEvent<{ userId: string, repositoryName: string }, 'UserStarredRepository'>
type EventY = JSONRecordedEvent<{ userId: string, repositoryName: string, reason: string }, 'UserUnstarredRepository'>

It will also allow narrowing, since JSONRecordedEvent.eventType can now be used as a discriminator:

const fun = (event: EventX | EventY) => {
  if (event.eventType === 'UserStarredRepository') {
    // type of `event` inferred as  EventX
  } else {
    // type of `event` inferred as EventY
  }
}

Due to the type parameters having a default value, this is not a breaking change.

All stream subscriptions have errors with filters

Now I have not tried anywhere close to all combinations, but I know that in Alpha 12 filters basically caused it to never find any events, and now with Alpha 13 filters seem to cause an error. I did (in Alpha 12) try playing around with regex/prefix and some different options, but all seemed to return no events even when I know there should have been some.

Anyway, this is the code I expect should work:

describe('Eventstore GRPC Client', () => {
  let client: EventStoreDBClient

  beforeAll(() => {
    client = new EventStoreDBClient(
      {
        endpoint: {
          address: process.env.ES_HOSTNAME ?? 'eventstore',
          port: parseInt(process.env.ES_GRPC_PORT ?? '2113'),
        },
      },
      { insecure: true },
      {
        username: process.env.ES_USERNAME!,
        password: process.env.ES_PASSWORD!,
      },
    )
  })

  it('can subscribe to the all stream with a filter', async () => {
    jest.setTimeout(10000)
    const stream = `test_${uuidv4().replace(/\-/g, '')}`
    const events = []
    for (let i = 0; i < 10; ++i) {
      events.push(
        jsonEvent({
          type: 'test-event',
          id: uuidv4(),
          data: {
            a: i,
          },
        }),
      )
    }

    const appendRes = await client.appendToStream(stream, events.slice(0, 3), {
      expectedRevision: NO_STREAM,
    })

    const received: AllStreamResolvedEvent[] = []
    const sub = client.subscribeToAll({
      filter: { filterOn: STREAM_NAME, regex: `^${stream}$` },
      fromPosition: START,
    })
    sub.on('error', (err) => console.log(err, err?.message))
    sub.on('close', () => console.log('subscription stopped'))
    sub.on('data', (evt) => {
      console.log(`received event ${events.length}`)
      received.push(evt)
    })
    while (received.length < 3) await forMs(10)
    await client.appendToStream(stream, events.slice(3, 10), {
      expectedRevision: appendRes.nextExpectedRevision,
    })
    while (received.length < 10) await forMs(10)
    expect(received.map((e) => e.event?.id)).toStrictEqual(events.map((e) => e.id))
    await sub.unsubscribe()
  })
})

And this is the error I see:

  console.log
    UnknownError: 2 UNKNOWN: Exception was thrown by handler.
        at Object.convertToCommandError (/app/node_modules/@eventstore/db-client/dist/utils/CommandError.js:281:12)
        at ClientReadableStreamImpl.<anonymous> (/app/node_modules/@eventstore/db-client/dist/utils/OneWaySubscription.js:32:38)
        at ClientReadableStreamImpl.emit (events.js:315:20)
        at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client.ts:570:18)
        at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:387:48)
        at /app/node_modules/@grpc/grpc-js/src/call-stream.ts:249:24
        at processTicksAndRejections (internal/process/task_queues.js:75:11) {
      code: 2,
      _raw: Error: 2 UNKNOWN: Exception was thrown by handler.
          at Object.callErrorFromStatus (/app/node_modules/@grpc/grpc-js/src/call.ts:81:24)
          at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client.ts:570:32)
          at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:387:48)
          at /app/node_modules/@grpc/grpc-js/src/call-stream.ts:249:24
          at processTicksAndRejections (internal/process/task_queues.js:75:11) {
        code: 2,
        details: 'Exception was thrown by handler.',
        metadata: Metadata { internalRepr: [Map], options: {} }
      },
      type: 'unknown'
    } 2 UNKNOWN: Exception was thrown by handler.

      at OneWaySubscription.<anonymous> (eventstore-connection.spec.ts:83:38)

Again, in Alpha 12 this error did not occur, but filters still did not work, so I suspect there are two layers of errors here; I also did not test extensively, so I may be mistaken.

This is my eventstore configuration (from docker compose) for testing

  eventstore:
    image: eventstore/eventstore:20.10.0-bionic
    environment:
      EVENTSTORE_START_STANDARD_PROJECTIONS: 'true'
      EVENTSTORE_INSECURE: 'true'
      EVENTSTORE_RUN_PROJECTIONS: 'All'

PersistentSubscription silently drops

We are experiencing consumer drops from persistent subscriptions that do not raise any error. The consumer service hangs indefinitely because of this. The drop is only noticeable in the EventStoreDB frontend, and it happens on a weekly basis.

Is this a known issue? We suspect that this might be a server-side drop, but it should somehow raise an error on the client side so that we can restart the consumer service.

We are currently using the 0.0.0-alpha.7 release.
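
Until the client surfaces the drop, one possible client-side mitigation is a watchdog that recreates the consumer when nothing has been received for a while. A rough sketch, with the subscription factory and idle threshold as hypothetical placeholders:

import { EventEmitter } from "events";

// Hedged sketch: restart the consumer if the subscription goes quiet for too long.
// `createSubscription` is a hypothetical factory returning the event-emitter-style subscription.
const watchSubscription = (createSubscription: () => EventEmitter, idleMs = 60_000): void => {
  let lastSeen = Date.now();
  let sub = createSubscription();
  sub.on("data", () => { lastSeen = Date.now(); });

  setInterval(() => {
    if (Date.now() - lastSeen > idleMs) {
      // Nothing received for a while: assume a silent drop and recreate the subscription.
      sub.removeAllListeners();
      sub = createSubscription();
      sub.on("data", () => { lastSeen = Date.now(); });
      lastSeen = Date.now();
    }
  }, idleMs);
};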

Persistent subscription connection failed with NOT_FOUND: Leader info available

We are seeing an error with persistent subscription connections on our EventStore Cloud cluster.

We are using the v0.0.0-alpha.7 release.
Non-persistent subscriptions work correctly.

We are initializing the client with the 3 GRPC addresses shown in the cloud console.

Connection.builder().gossipClusterConnection(clusterAddresses)

esdb:connection Importing root certificate from path "/etc/ssl/certs/ca-certificates.crt" +0ms
esdb:connection Using credentials {
esdb:connection   insecure: false,
esdb:connection   rootCertificate: <Buffer 2d 2d 2d 2d 2d 42 45 47 49 4e 20 43 45 52 54 49 46 49 43 41 54 45 2d 2d 2d 2d 2d 0a 4d 49 49 48 30 7a 43 43 42 62 75 67 41 77 49 42 41 67 49 49 58 73 ... 198366 more bytes>,
esdb:connection   privateKey: undefined,
esdb:connection   certChain: undefined,
esdb:connection   verifyOptions: undefined
esdb:connection } +3ms
esdb:command CreatePersistentSubscription: {
esdb:command   requiresLeader: false,
esdb:command   stream: 'test',
esdb:command   group: 'eventStorePoc',
esdb:command   resolveLink: false,
esdb:command   extraStats: false,
esdb:command   revision: 0n,
esdb:command   messageTimeout: 30000,
esdb:command   maxRetryCount: 10,
esdb:command   checkpointAfter: 2000,
esdb:command   minCheckpointCount: 10,
esdb:command   maxCheckpointCount: 1000,
esdb:command   maxSubscriberCount: 0,
esdb:command   liveBufferSize: 500,
esdb:command   readBatchSize: 20,
esdb:command   historyBufferSize: 500,
esdb:command   strategy: 'round_robin',
esdb:command   credentials: { username: 'xxxxxxx', password: 'xxxxxxxx' }
esdb:command } +0ms
esdb:command:grpc CreatePersistentSubscription: {
esdb:command:grpc   options: {
esdb:command:grpc     streamIdentifier: { streamname: 'dGVzdA==' },
esdb:command:grpc     groupName: 'eventStorePoc',
esdb:command:grpc     settings: {
esdb:command:grpc       resolveLinks: false,
esdb:command:grpc       revision: '0',
esdb:command:grpc       extraStatistics: false,
esdb:command:grpc       maxRetryCount: 10,
esdb:command:grpc       minCheckpointCount: 10,
esdb:command:grpc       maxCheckpointCount: 1000,
esdb:command:grpc       maxSubscriberCount: 0,
esdb:command:grpc       liveBufferSize: 500,
esdb:command:grpc       readBatchSize: 20,
esdb:command:grpc       historyBufferSize: 500,
esdb:command:grpc       namedConsumerStrategy: 1,
esdb:command:grpc       messageTimeoutTicks: '0',
esdb:command:grpc       messageTimeoutMs: 30000,
esdb:command:grpc       checkpointAfterTicks: '0',
esdb:command:grpc       checkpointAfterMs: 2000
esdb:command:grpc     }
esdb:command:grpc   }
esdb:command:grpc } +0ms
esdb:connection Createing client for CreatePersistentSubscription +25ms
esdb:connection Connecting to http://xxxxxxxxxxxxxx-0.mesdb.eventstore.cloud:2113 +0ms
NotLeaderError: 5 NOT_FOUND: Leader info available
    at Object.exports.convertToCommandError (/app/node_modules/@eventstore/db-client/dist/utils/CommandError.js:239:20)
    at Object.callback (/app/node_modules/@eventstore/db-client/dist/command/persistentSubscription/CreatePersistentSubscription.js:217:50)
    at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client.ts:334:26)
    at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:434:34)
    at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:397:48)
    at /app/node_modules/@grpc/grpc-js/src/call-stream.ts:237:24
    at processTicksAndRejections (internal/process/task_queues.js:79:11) {
  code: 5,
  _raw: Error: 5 NOT_FOUND: Leader info available
      at Object.callErrorFromStatus (/app/node_modules/@grpc/grpc-js/src/call.ts:81:24)
      at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client.ts:334:36)
      at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:434:34)
      at Object.onReceiveStatus (/app/node_modules/@grpc/grpc-js/src/client-interceptors.ts:397:48)
      at /app/node_modules/@grpc/grpc-js/src/call-stream.ts:237:24
      at processTicksAndRejections (internal/process/task_queues.js:79:11) {
    code: 5,
    details: 'Leader info available',
    metadata: Metadata { internalRepr: [Map], options: {} }
  },
  type: 'not-leader',
  leader: { address: 'xxxxxxxxxxxxxxxxx-2.mesdb.eventstore.cloud', port: 2113 }
}

Package requires @types/google-protobuf as prod dependency but declares as a dev dependency

The package not only internally uses google-protobuf but also exports some of those library types. To export types from other libraries, you need them to be present when users install your library, which won't happen if they are declared as dev dependencies.

Places where it's used:

/generated/gossip_pb.d.ts:7:23
7 import * as jspb from "google-protobuf";

/generated/persistent_pb.d.ts:7:23
7 import * as jspb from "google-protobuf";

/generated/projections_pb.d.ts:7:23
7 import * as jspb from "google-protobuf";

/generated/projections_pb.d.ts:8:44
8 import * as google_protobuf_struct_pb from "google-protobuf/google/protobuf/struct_pb";

/generated/shared_pb.d.ts:7:23
7 import * as jspb from "google-protobuf";

/generated/streams_pb.d.ts:7:23
7 import * as jspb from "google-protobuf";

Unicode Issue

Hi,

Hope you can help us with the following:

We are experiencing an issue with unicode, we append a event to stream with the following data:

{ name: CC ‐ 1830 }

When we read that event from the stream we got a string as data instead an object, and the value of the string is the following:

'{"name":"CC \u0010 1830"}'

We can't parse that string to JSON because \u0010 is not valid.

You can reproduce the issue with the following script: https://codeshare.io/G7nk6L

Clearer testing instructions

  • Update the readme with instructions for logging into the GitHub Docker registry
  • Add a preflight check to the tests that checks for dependencies and pulls required images
