
ndc-typescript-deno's Introduction

ndc-typescript-deno


The TypeScript (Deno) Connector allows a running connector to be inferred from a TypeScript file (optionally with dependencies).



Overview

The connector runs in the following manner:

  • TypeScript sources are assembled (with index.ts acting as your interface definition)
  • Dependencies are fetched
  • Inference is performed and made available via the /schema endpoint
  • Functions are served via the connector protocol

Note: The Deno runtime is used and this connector assumes that dependencies are specified in accordance with Deno conventions.

Before you get Started

It is recommended that you:

The last item is currently required for access to the Hasura DDN in order to deploy and manage V3 projects.

Once the Hasura DDN is generally available, this will no longer be required.

TypeScript Functions Format

Your functions should be organised into a directory with one file acting as the entrypoint.

An example TypeScript entrypoint:
// functions/index.ts

import { Hash, encode } from "https://deno.land/x/checksum@1.4.0/mod.ts";

export function make_bad_password_hash(pw: string): string {
  return new Hash("md5").digest(encode(pw)).hex();
}

/**
 * Returns the github bio for the userid provided
 *
 * @param username Username of the user whose bio will be fetched.
 * @returns The github bio for the requested user.
 * @pure This function should only query data without making modifications
 */
export async function get_github_profile_description(username: string): Promise<string> {
  const foo = await fetch(`https://api.github.com/users/${username}`);
  const response = await foo.json();
  return response.bio;
}

export function make_array(): Array<string> {
  return ['this', 'is', 'an', 'array']
}

type MyObjectType = {'foo': string, 'baz': boolean}

export function make_object(): MyObjectType {
  return { 'foo': 'bar', 'baz': true}
}

export function make_object_array(): Array<MyObjectType> {
  return [make_object(), make_object()]
}

/**
 * @pure
 */
export function has_optional_args(a: string, b?: string) {
  if(b) {
    return `Two args: ${a} ${b}`;
  } else {
    return `One arg: ${a}`;
  }
}

Top-level exported function definitions with the @pure tag will be made available as functions, and all others as procedures; these become queries and mutations respectively.

  • Return types are inferred
  • Parameters are inferred and named after their input parameter names
  • Simple scalar, array, and object types should be supported
  • Exceptions can be thrown (see the sketch after this list)
  • Optional parameters will become optional arguments
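
As a brief illustration of the last two points, here is a minimal sketch (not part of the original example) of a function without a @pure tag, which would be exposed as a procedure, takes an optional argument, and throws:

// functions/index.ts (continued) — illustrative sketch only
export function delete_nothing(confirm?: boolean): string {
  if (!confirm) {
    // Thrown exceptions surface as errors in the connector response.
    throw new Error("Refusing to proceed without confirmation");
  }
  return "Nothing was deleted";
}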

Limitations:

  • The deno vendor step previously had to be run by hand for local development; vendoring is now performed automatically by default (see the preVendor configuration option)
  • Functions can be sync or async, but Promises can't be nested
  • All numbers are exported as Floats
  • Unrecognised types will become opaque scalars, for example union types (see the sketch after this list)
  • Optional object fields are not currently supported
  • Functions can be executed via both the /query and /mutation endpoints
  • Conflicting type names in dependencies will be namespaced with their relative path
  • Generic type parameters will be treated as scalars when referenced
  • Importing multiple versions of the same package using Deno npm: package specifiers does not work properly with type inference (only a single version of the package is imported)
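
For instance (a hypothetical illustration, not from the original README), a union return type is not recognised and will be inferred as an opaque scalar rather than an object type:

type Outcome =
  | { ok: true, value: string }
  | { ok: false, error: string };

// The Outcome union will appear in the schema as an opaque scalar.
export function try_parse(input: string): Outcome {
  return input.length > 0
    ? { ok: true, value: input.trim() }
    : { ok: false, error: "empty input" };
}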

Please file an issue for any problems you encounter during usage of this connector.

Local Development of your Functions

While you can deploy your functions and have errors returned in the hasura3 connector logs, local development will reward you with much more rapid feedback.

In order to develop your functions locally the following is the recommended practice:

  • Have a ./functions/ directory in your project
  • Create your functions in an index.ts file inside the ./functions/ directory
  • Create a development config for your connector in ./config.json:
{
  "functions": "./functions/index.ts",
  "vendor": "./vendor",
  "schemaMode": "INFER"
}
  • Make sure to .gitignore your computed vendor files.
  • Start the connector
deno run -A --watch=./functions --check https://deno.land/x/hasura_typescript_connector/mod.ts serve --configuration ./config.json
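
Once the connector is running, you can sanity-check the inferred schema with a small script (a sketch only; the port shown is an assumption — use whatever port the connector reports on startup):

// check_schema.ts — run with: deno run -A check_schema.ts
const response = await fetch("http://localhost:8080/schema");
console.log(JSON.stringify(await response.json(), null, 2));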

Local Development of your Functions (Docker)

You can also perform local development with rapid feedback by using the Docker container instead of deno run. You don't need a config.json in this case.

  • Have a ./functions/ directory in your project
  • Create your functions in an index.ts file inside the ./functions/ directory
  • Start the connector using Docker:
docker run -it --rm -v ./functions:/functions/src -p 8080:8080 -e WATCH=1 ghcr.io/hasura/ndc-typescript-deno:latest

Config Format

The configuration object has the following properties:

  functions      (string):  Location of your functions entrypoint
  vendor         (string):  Location of the dependencies vendor folder (optional)
  preVendor      (boolean): Perform vendoring prior to inference in a sub-process (default: true)
  schemaMode     (string):  INFER the schema from your functions, or READ it from a file (default: INFER)
  schemaLocation (string):  Location of your schema file. Required when schemaMode=READ (the file is read); optional when schemaMode=INFER (the file is written)
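
For example, a configuration that reads a pre-computed schema from disk might look like this (the paths shown are illustrative):

{
  "functions": "./functions/index.ts",
  "vendor": "./vendor",
  "preVendor": true,
  "schemaMode": "READ",
  "schemaLocation": "./schema.json"
}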

NOTE: When deploying the connector with the connector create command, your config is currently replaced with:

{
  "functions": "/functions/src/index.ts",
  "vendor": "/functions/vendor",
  "schemaMode": "READ",
  "schemaLocation": "/functions/schema.json"
}

This means that your functions volume will have to be mounted to /functions/src.

Deployment for Hasura Users

You will need:

  • V3 CLI (With Logged in Session)
  • Connector Plugin
  • (Optionally) A value to use with SERVICE_TOKEN_SECRET
  • A TypeScript sources directory, e.g. --volume ./my_functions_directory:/functions

Create the connector:

hasura3 connector create my-cool-connector:v1 \
  --github-repo-url https://github.com/hasura/ndc-typescript-deno/tree/main \
  --config-file <(echo '{}') \
  --volume ./functions:/functions \
  --env SERVICE_TOKEN_SECRET=MY-SERVICE-TOKEN # (optional)

Note: While you can use the main branch to deploy the latest connector features, see the Hasura Connector Hub for verified release tags.

Monitor the deployment status by name; this will indicate an in-progress, complete, or failed status:

hasura3 connector status my-cool-connector:v1

List all your connectors with their deployed URLs:

hasura3 connector list

View logs from your running connector:

hasura3 connector logs my-cool-connector:v1

Usage

This connector is intended to be used with Hasura v3 projects.

Find the URL of your connector once deployed:

hasura3 connector list

my-cool-connector:v1 https://connector-9XXX7-hyc5v23h6a-ue.a.run.app active

In order to use the connector once deployed, you will first want to reference the connector in your project metadata:

kind: "AuthConfig"
allowRoleEmulationFor: "admin"
webhook:
  mode: "POST"
  webhookUrl: "https://auth.pro.hasura.io/webhook/ddn?role=admin"
---
kind: DataConnector
version: v1
definition:
  name: my_connector
  url:
    singleUrl: 'https://connector-9XXX7-hyc5v23h6a-ue.a.run.app'

If you have the Hasura VSCode Extension installed you can run the following code actions:

  • Hasura: Refresh data source
  • Hasura: Track all collections / functions ...

This will integrate your connector into your Hasura project which can then be deployed or updated using the Hasura3 CLI:

hasura3 cloud build create --project-id my-project-id --metadata-file metadata.hml

Service Authentication

If you don't wish your connector to be publicly accessible, then you must set a service token by specifying the SERVICE_TOKEN_SECRET environment variable when creating your connector:

  • --env SERVICE_TOKEN_SECRET=SUPER_SECRET_TOKEN_XXX123

Your Hasura project metadata must then set a matching bearer token:

kind: DataConnector
version: v1
definition:
  name: my_connector
  url:
    singleUrl: 'https://connector-9XXX7-hyc5v23h6a-ue.a.run.app'
  headers:
    Authorization:
      value: "Bearer SUPER_SECRET_TOKEN_XXX123"

While you can specify the token inline as above, it is recommended to use the Hasura secrets functionality for this purpose:

kind: DataConnector
version: v1
definition:
  name: my_connector
  url:
    singleUrl: 'https://connector-9XXX7-hyc5v23h6a-ue.a.run.app'
  headers:
    Authorization:
      valueFromSecret: BEARER_TOKEN_SECRET

NOTE: This secret should contain the Bearer prefix.

Debugging Issues

Errors may arise from any of the following:

  • Dependency errors in your functions
  • Type errors in your functions
  • Implementation errors in your functions
  • Invalid connector configuration
  • Invalid project metadata
  • Connector Deployment Failure
  • Misconfigured project authentication
  • Misconfigured service authentication
  • Insufficient query permissions
  • Invalid queries
  • Unanticipated bug in connector implementation

For a bottom-up debugging approach:

  • First check your functions:
    • Run deno check on your functions to determine if there are any obvious errors
    • Write a deno test harness to ensure that your functions are correctly implemented (a sketch follows this list)
  • Then check your connector:
    • Check that the connector deployed successfully with hasura3 connector status my-cool-connector:v1
    • Check the build/runtime logs of your connector with hasura3 connector logs my-cool-connector:v1
  • Then check your project:
    • Ensure that your metadata and project build were successful
  • Then check end-to-end integration:
    • Run test queries and view the connector logs to ensure that your queries are propagating correctly
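
As an example of such a harness, here is a minimal sketch (the file name and std version are assumptions) that exercises the example functions shown earlier; run it with deno test functions/:

// functions/index_test.ts — a minimal sketch of a Deno test harness.
import { assertEquals } from "https://deno.land/std@0.208.0/assert/mod.ts";
import { has_optional_args, make_array } from "./index.ts";

Deno.test("make_array returns the expected values", () => {
  assertEquals(make_array(), ["this", "is", "an", "array"]);
});

Deno.test("has_optional_args handles a single argument", () => {
  assertEquals(has_optional_args("only"), "One arg: only");
});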

Development

For contribution to this connector you will want to have the following dependencies:

In order to perform local development on this codebase:

  • Check out the repository: git clone https://github.com/hasura/ndc-typescript-deno.git
  • This assumes that you will be testing against functions in ./functions
  • Serve your functions with deno run -A --watch=./functions --check ./src/mod.ts serve --configuration <(echo '{"functions": "./functions/index.ts", "vendor": "./vendor", "schemaMode": "INFER"}')
  • The connector should now be running on localhost:8100 and will respond to any changes to your functions and the connector source
  • Use the hasura3 tunnel commands to reference this connector from a Hasura Cloud project

If you are fixing a bug, then please consider adding a test case to ./src/test/data.

Support & Troubleshooting

The documentation and community will help you troubleshoot most issues. If you have encountered a bug or need to get in touch with us, you can contact us using one of the following channels:


ndc-typescript-deno's Issues

Input type objects should be supported [engine]

I cannot currently declare a function with input type set to an object. It should be possible to do that easily.

Eg:

export async function InsertUser({
  first_name,
  last_name
}: {
  first_name: string,
  last_name: string
}): Promise<{ status: string, message: string }> {
  // Body elided in the issue; a real implementation would insert the user.
  return { status: "ok", message: `Inserted ${first_name} ${last_name}` };
}

Adding this as a procedure in a connector to Hasura results in the build (created using the CLI) failing with this error:

[Error { message: "building metadata failed: invalid metadata: error building schema: unable to build schema: internal error: no support for: object types as input", locations: None, path: None, extensions: Some({"code": String("legacyError"), "id": String("aac85ca2-d551-4a81-bc6e-d47ec3abcede")}) }]

Connection Pooling

Want to be able to have a shared pool for connections from functions.

Could we have a lib or proxy solution for this?
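
One possible shape for this (a hypothetical sketch only, assuming deno-postgres; the issue does not prescribe a library) is to hold the pool in module scope so it is created once and reused across function invocations:

import { Pool } from "https://deno.land/x/postgres@v0.17.0/mod.ts";

// Created once when the module is loaded, then shared by every function call.
const pool = new Pool({ database: "app", user: "app", hostname: "localhost" }, 4);

export async function count_users(): Promise<number> {
  const client = await pool.connect();
  try {
    const result = await client.queryObject<{ count: number }>(
      "SELECT count(*)::int AS count FROM users"
    );
    return result.rows[0].count;
  } finally {
    client.release();
  }
}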

Generic type parameters are not handled correctly

Generic type parameters are not handled correctly. For example, the following will keep the name Bar<X> and infer Bar.y as a scalar.

type Foo = {
  a: number,
  b: string
}

type Bar<X> = {
  x: number,
  y: X
}

export function bar(): Bar<Foo> {
  return {
    x: 1,
    y: {
      a: 2,
      b: 'hello'
    }
  }
}

Should be able to use resolvedTypeArguments as in the promise code:

  // PROMISE
  // TODO: https://github.com/hasura/ndc-typescript-deno/issues/32 There is no recursion that resolves inner promises.
  //       Nested promises should be resolved in the function definition.
  if (type_name == "Promise") {
    const inner_type = ty.resolvedTypeArguments[0];
    const inner_type_result = validate_type(root_file, checker, object_names, schema_response, name, inner_type, depth + 1);
    return inner_type_result;
  }

Support ForEach queries

This isn't required for simple N+1 relationships, we would use this for optimised commands.

For example, a function definition could support batch api calls if exposed with array parameter annotation.

/**
 * Answers the question: `Are these users active?`
 * @pure
 * @param ids user-ids to query
 * @batch user_is_active ids id
 */
function users_are_active(ids: Array<string>): boolean {
    ...
}

This would expose users_are_active as normal, and user_is_active as if it had an id parameter instead of ids, using the Variables capability to wire up relationships.

N+1 Engine Feature: https://hasurahq.atlassian.net/browse/V3API-194

While we design a solution that would allow per-function batch behaviour declaration that the engine can dispatch on, we could implement blanket ForEach support in the TS connector and make the decision to perform a batch call in the connector.

Node modules not respected in vendoring

We may have to use DENO_UNSTABLE_BYONM=true.
The vendor import map will need to know the location of the node modules if they are going to be used.

Ideas:

  • Spelunk the deno cache directories for NPM deps
  • Ignore the npm related errors

We may not need to completely solve this for November milestone, but we should at least make some progress and address the issue in documentation if it isn't solved.

Use a better check for inline object type detection

Currently using the following code:

function lookup_type_name(root_file: string, checker: ts.TypeChecker, names: TypeNames, name: string, ty: ts.Type): string {
  const type_str = checker.typeToString(ty);
  if(/{/.test(type_str)) {
    return name;
  }
...

Which is janky. Let's find a better way to do this.
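
One purely illustrative alternative (not the approach referenced below) would be to check the type's object flags rather than its printed representation:

import ts from "npm:typescript";

// Treat a type as an inline/anonymous object if its object flags say so,
// instead of pattern-matching on the printed type string.
function is_anonymous_object_type(ty: ts.Type): boolean {
  return (ty.flags & ts.TypeFlags.Object) !== 0 &&
    ((ty as ts.ObjectType).objectFlags & ts.ObjectFlags.Anonymous) !== 0;
}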

Should be resolved by Daniel's POC: main...daniel/complex-object-types

Implemented: #70

Build a "Type Graph" to visualise the relationships between nominal types in your functions

Could possibly even live directly in the engine/console rather than in the connector.

For example:

function foo(): FooOutput {
}

function bar(x: FooOutput): BarOutput {
}

function baz(x: FooOutput, y: BarOutput): BazOutput {
}

function quux(x: BazOutput): string {
}

Could generate the following (interesting?) graph:

flowchart LR
  foo-->|FooOutput|bar
  foo-->|FooOutput|baz
  bar-->|BarOutput|baz
  baz-->|string| x([" "])  

Perhaps this could aid in "suggest relationships"-style functionality.

Use Deno.serve instead of fastify

Update SDK to allow use of Deno.serve instead of fastify serve.

This should address the Not implemented: Server.setTimeout() error.
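
For reference, Deno.serve only needs a handler, so the connector's HTTP surface could in principle look something like this sketch (the routes and port are illustrative, not the SDK's actual API):

Deno.serve({ port: 8080 }, (req: Request): Response => {
  const url = new URL(req.url);
  if (url.pathname === "/healthz") {
    return new Response("OK");
  }
  return new Response("Not Found", { status: 404 });
});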

Implement /explain

This will require some design work in order to decide what exactly to return.

/schema infers on every request if INFER mode is set

Due to the spec/SDK now providing the connector state to the schema endpoint, the schema is recomputed (or re-read, in READ mode) on every request to /schema.

This is obviously not ideal.

Ideally the pre-computed schema that is put into the connector state could be used, but since the SDK doesn't support this data flow, we should probably configurably cache the schema.
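
A configurable cache could be as simple as memoising the schema in module state, along these lines (a hypothetical sketch; the names are illustrative and not the SDK's API):

let cachedSchema: unknown | undefined;

// Compute the schema once and reuse it for subsequent /schema requests.
export async function getSchemaCached(infer: () => Promise<unknown>): Promise<unknown> {
  if (cachedSchema === undefined) {
    cachedSchema = await infer();
  }
  return cachedSchema;
}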

Test Deno Deploy with `deployctl`

It would be very cool to be able to use Deno hosting to deploy connectors.

WARNING: This could lead to mutable connectors, which connector create avoids.

https://docs.deno.com/deploy/manual

https://github.com/denoland/deployctl#deployctl

This may require a separate entrypoint that uses a convention for the config, or some other trick:

> deno_deployctl deploy --help
deployctl deploy
Deploy a script with static files to Deno Deploy.

To deploy a local script:
  deployctl deploy --project=helloworld main.ts

To deploy a remote script:
  deployctl deploy --project=helloworld https://deno.com/examples/hello.js

To deploy a remote script without static files:
  deployctl deploy --project=helloworld --no-static https://deno.com/examples/hello.js

To ignore the node_modules directory while deploying:
  deployctl deploy --project=helloworld --exclude=node_modules main.tsx

USAGE:
    deployctl deploy [OPTIONS] <ENTRYPOINT>

OPTIONS:
        --exclude=<PATTERNS>  Exclude files that match this pattern
        --include=<PATTERNS>  Only upload files that match this pattern
        --import-map=<FILE>   Use import map file
    -h, --help                Prints help information
        --no-static           Don't include the files in the CWD as static files
        --prod                Create a production deployment (default is preview deployment)
    -p, --project=NAME        The project to deploy to
        --token=TOKEN         The API token to use (defaults to DENO_DEPLOY_TOKEN env var)
        --dry-run             Dry run the deployment process.

Reimplement `start` to avoid options passthrough mess

See:

export function start<Configuration, State>(
  connector: Connector<Configuration, State>
) {
  const program = new Command();
  program.addCommand(get_serve_command(connector));
  program.addCommand(get_serve_configuration_command(connector));
  program.parseAsync(process.argv).catch(console.error);
}

Resolved by: #50
