
airbyte-api-python-sdk's Introduction

Programmatically control Airbyte Cloud through an API.

Authentication

Developers will need to create an API Key within the Developer Portal to make API requests. You can use your existing Airbyte account to log in to the Developer Portal. Once you are in the Developer Portal, use the API Keys tab to create or remove API Keys. You can see a walkthrough demo here. 🎦

The Developer Portal UI can also be used to help build your integration by showing information about network requests in the Requests tab. API usage information is also available to you in the Usage tab.
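
Once you have an API Key, it can be passed to the SDK as a bearer token (see the Per-Client Security Schemes section below for the full list of supported schemes). The snippet below is a minimal sketch, assuming the bearer_auth field of models.Security accepts the raw key string:

import airbyte_api
from airbyte_api import models

# Minimal sketch: authenticate with a Developer Portal API Key as a bearer token.
# "<YOUR_API_KEY_HERE>" is a placeholder for the key created in the Developer Portal.
s = airbyte_api.AirbyteAPI(
    security=models.Security(
        bearer_auth="<YOUR_API_KEY_HERE>",
    ),
)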

SDK Installation

pip install airbyte-api

SDK Example Usage

Example

import airbyte_api
from airbyte_api import models

s = airbyte_api.AirbyteAPI(
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

res = s.connections.create_connection(request=models.ConnectionCreateRequest(
    destination_id='e478de0d-a3a0-475c-b019-25f7dd29e281',
    source_id='95e66a59-8045-4307-9678-63bc3c9b8c93',
    name='Postgres-to-Bigquery',
    namespace_format='${SOURCE_NAMESPACE}',
))

if res.connection_response is not None:
    # handle response
    pass

Available Resources and Operations

Error Handling

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an error. If Error objects are specified in your OpenAPI Spec, the SDK will raise the appropriate Error type.

Error Object      Status Code   Content Type
errors.SDKError   4xx-5xx       */*

Example

import airbyte_api
from airbyte_api import errors, models

s = airbyte_api.AirbyteAPI(
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

res = None
try:
    res = s.connections.create_connection(request=models.ConnectionCreateRequest(
        destination_id='e478de0d-a3a0-475c-b019-25f7dd29e281',
        source_id='95e66a59-8045-4307-9678-63bc3c9b8c93',
        name='Postgres-to-Bigquery',
        namespace_format='${SOURCE_NAMESPACE}',
    ))
except errors.SDKError as e:
    # handle exception
    raise e

if res.connection_response is not None:
    # handle response
    pass

Server Selection

Select Server by Index

You can override the default server globally by passing a server index to the server_idx: int optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the indexes associated with the available servers:

#   Server                       Variables
0   https://api.airbyte.com/v1   None

Example

import airbyte_api
from airbyte_api import models

s = airbyte_api.AirbyteAPI(
    server_idx=0,
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

res = s.connections.create_connection(request=models.ConnectionCreateRequest(
    destination_id='e478de0d-a3a0-475c-b019-25f7dd29e281',
    source_id='95e66a59-8045-4307-9678-63bc3c9b8c93',
    name='Postgres-to-Bigquery',
    namespace_format='${SOURCE_NAMESPACE}',
))

if res.connection_response is not None:
    # handle response
    pass

Override Server URL Per-Client

The default server can also be overridden globally by passing a URL to the server_url: str optional parameter when initializing the SDK client instance. For example:

import airbyte_api
from airbyte_api import models

s = airbyte_api.AirbyteAPI(
    server_url="https://api.airbyte.com/v1",
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

res = s.connections.create_connection(request=models.ConnectionCreateRequest(
    destination_id='e478de0d-a3a0-475c-b019-25f7dd29e281',
    source_id='95e66a59-8045-4307-9678-63bc3c9b8c93',
    name='Postgres-to-Bigquery',
    namespace_format='${SOURCE_NAMESPACE}',
))

if res.connection_response is not None:
    # handle response
    pass

Custom HTTP Client

The Python SDK makes API calls using the requests HTTP library. In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with a custom requests.Session object.

For example, you could specify a header for every request that this SDK makes as follows:

import airbyte_api
import requests

http_client = requests.Session()
http_client.headers.update({'x-custom-header': 'someValue'})
s = airbyte_api.AirbyteAPI(client=http_client)
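
Beyond custom headers, the same requests.Session object can also carry proxy settings and a retry policy. The snippet below is a minimal sketch using standard requests/urllib3 features; the proxy URL is a placeholder:

import airbyte_api
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

http_client = requests.Session()

# Route HTTPS traffic through a proxy (placeholder URL).
http_client.proxies.update({'https': 'http://proxy.example.com:8080'})

# Retry transient server errors with exponential backoff.
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])
http_client.mount('https://', HTTPAdapter(max_retries=retries))

s = airbyte_api.AirbyteAPI(client=http_client)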

Authentication

Per-Client Security Schemes

This SDK supports the following security schemes globally:

Name                 Type     Scheme
basic_auth           http     HTTP Basic
bearer_auth          http     HTTP Bearer
client_credentials   oauth2   OAuth2 token

You can set the security parameters through the security optional parameter when initializing the SDK client instance. The selected scheme will be used by default to authenticate with the API for all operations that support it. For example:

import airbyte_api
from airbyte_api import models

s = airbyte_api.AirbyteAPI(
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

res = s.connections.create_connection(request=models.ConnectionCreateRequest(
    destination_id='e478de0d-a3a0-475c-b019-25f7dd29e281',
    source_id='95e66a59-8045-4307-9678-63bc3c9b8c93',
    name='Postgres-to-Bigquery',
    namespace_format='${SOURCE_NAMESPACE}',
))

if res.connection_response is not None:
    # handle response
    pass
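
The basic_auth example above can be swapped for either of the other schemes in the table: bearer_auth takes a plain token string (as sketched in the Authentication section above), and client_credentials takes an OAuth2 client ID and secret. The snippet below is a hypothetical sketch: the SchemeClientCredentials class and its field names are assumptions made by analogy with SchemeBasicAuth; check the models module of your installed version for the exact names:

import airbyte_api
from airbyte_api import models

# Hypothetical sketch: OAuth2 client credentials authentication.
# The class and field names below are assumptions; verify them against the
# generated models in your installed version of the SDK.
s = airbyte_api.AirbyteAPI(
    security=models.Security(
        client_credentials=models.SchemeClientCredentials(
            client_id="<YOUR_CLIENT_ID_HERE>",
            client_secret="<YOUR_CLIENT_SECRET_HERE>",
        ),
    ),
)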

Maturity

This SDK is in beta, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.
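
For example, to pin the dependency to a release you have tested against (the version below is a placeholder):

pip install airbyte-api==<pinned-version>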

Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Feel free to open a PR or a GitHub issue as a proof of concept and we'll do our best to include it in a future release!

SDK Created by Speakeasy

airbyte-api-python-sdk's People

Contributors

aaronsteers, bgroff, github-actions[bot], jonsspaghetti, simplesagar, speakeasybot, terencecho


airbyte-api-python-sdk's Issues

`/health` endpoint missing?

Hi,

I have noticed there are no methods in this SDK to invoke the Airbyte API /health endpoint.

Is this on purpose?

I think it should be added to the SDK for completeness' sake, and it's a nice thing to have anyway.

Thanks,
Dan

Airbyte create connection is failing

Script used to test create_connection for Airbyte

import airbyte
from airbyte.models import shared

s = airbyte.Airbyte(
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="password",
            username="airbyte",
        ),
    ),
)

req = shared.ConnectionCreateRequest(
    configurations=shared.StreamConfigurations(
        streams=[
            shared.StreamConfiguration(
                cursor_field=[
                    'violet',
                ],
                name='at BMW',
                primary_key=[
                    [
                        'pfft',
                    ],
                ],
                sync_mode=shared.ConnectionSyncModeEnum.FULL_REFRESH_APPEND,
            ),
        ],
    ),
    data_residency=shared.GeographyEnum.AUTO,
    destination_id='083eafc8-5591-44e0-a570-f6dd427d83a5',
    name='mesh interactive',
    namespace_definition=shared.NamespaceDefinitionEnum.DESTINATION,
    namespace_format='${SOURCE_NAMESPACE}',
    non_breaking_schema_updates_behavior=shared.NonBreakingSchemaUpdatesBehaviorEnum.IGNORE,
    prefix='port Idaho',
    schedule=shared.ConnectionSchedule(
        cron_expression='productivity',
        schedule_type=shared.ScheduleTypeEnum.MANUAL,
    ),
    source_id='b3fd2fd3-07d6-40cb-97ea-6dfc635b80f2',
    status=shared.ConnectionStatusEnum.INACTIVE,
)

res = s.connections.create_connection(req)

if res.connection_response is not None:
    print("connected!!!!")

Testing

Setup and launch of Airbyte was done using Docker, as given here.

Error

Traceback (most recent call last):
  File "/Users/shlokabhalgat/Documents/github/test_lego/test_lego.py", line 1, in <module>
    import airbyte
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/__init__.py", line 3, in <module>
    from .sdk import *
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/sdk.py", line 4, in <module>
    from .connections import Connections
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/connections.py", line 5, in <module>
    from airbyte.models import errors, operations, shared
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/models/operations/__init__.py", line 3, in <module>
    from .canceljob import *
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/models/operations/canceljob.py", line 6, in <module>
"""Code generated by Speakeasy (https://speakeasyapi.dev). DO NOT EDIT."""
    from ..shared import jobresponse as shared_jobresponse
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/models/shared/__init__.py", line 114, in <module>
    from .source_aws_cloudtrail import *
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/models/shared/source_aws_cloudtrail.py", line 14, in <module>
    class SourceAwsCloudtrail:
  File "/Users/shlokabhalgat/opt/anaconda3/envs/myenv/lib/python3.9/site-packages/airbyte/models/shared/source_aws_cloudtrail.py", line 23, in SourceAwsCloudtrail
    start_date: Optional[date] = dataclasses.field(default=dateutil.parser.parse('1970-01-01').date(), metadata={'dataclasses_json': { 'letter_case': utils.get_field_name('start_date'), 'encoder': utils.dateisoformat(True), 'decoder': utils.datefromisoformat, 'exclude': lambda f: f is None }})
NameError: name 'dateutil' is not defined

Error in `Airbyte.connections.list_connections` with offset equal to the number of connections available

As described in the following code:

import airbyte


airbyte_workspace = airbyte.Airbyte(
    server_url=SERVER_URL
)


response = airbyte_workspace.connections.list_connections(
    airbyte.models.operations.ListConnectionsRequest(
        workspace_ids=[WORKSPACE_ID],
        offset=0, limit=100
    )
)
# There are only 20 connections
assert len(response.connections_response.data) == 20

response = airbyte_workspace.connections.list_connections(
    airbyte.models.operations.ListConnectionsRequest(
        workspace_ids=[WORKSPACE_ID],
        offset=0, limit=20
    )
)
# Still, with a limit of 20, a "next" page is provided
assert response.connections_response.next is not None

# But it breaks if we try to fetch data from it
airbyte_workspace.connections.list_connections(
    airbyte.models.operations.ListConnectionsRequest(
        workspace_ids=[WORKSPACE_ID],
        offset=20, limit=20
    )
)

which raises:

KeyError                                  Traceback (most recent call last)
     28 assert response.connections_response.next is not None
     30 # But it breaks if we try to fetch data from it
---> 31 airbyte_workspace.connections.list_connections(
     32     airbyte.models.operations.ListConnectionsRequest(
     33         workspace_ids=[WORKSPACE_ID],
     34         offset=20, limit=20
     35     )
     36 )

File .../.venv/lib/python3.11/site-packages/airbyte/connections.py:135, in Connections.list_connections(self, request)
    133 if http_res.status_code == 200:
    134     if utils.match_content_type(content_type, 'application/json'):
--> 135         out = utils.unmarshal_json(http_res.text, Optional[shared.ConnectionsResponse])
    136         res.connections_response = out
    137     else:

File .../.venv/lib/python3.11/site-packages/airbyte/utils/utils.py:695, in unmarshal_json(data, typ, decoder)
    693 json_dict = json.loads(data)
    694 try:
--> 695     out = unmarshal.from_dict({"res": json_dict})
    696 except AttributeError as attr_err:
    697     raise AttributeError(
    698         f'unable to unmarshal {data} as {typ} - {attr_err}') from attr_err

File .../.venv/lib/python3.11/site-packages/dataclasses_json/api.py:70, in DataClassJsonMixin.from_dict(cls, kvs, infer_missing)
     65 @classmethod
     66 def from_dict(cls: Type[A],
     67               kvs: Json,
     68               *,
     69               infer_missing=False) -> A:
---> 70     return _decode_dataclass(cls, kvs, infer_missing)

File .../.venv/lib/python3.11/site-packages/dataclasses_json/core.py:220, in _decode_dataclass(cls, kvs, infer_missing)
    218     init_kwargs[field.name] = value
    219 elif _is_supported_generic(field_type) and field_type != str:
--> 220     init_kwargs[field.name] = _decode_generic(field_type,
    221                                               field_value,
    222                                               infer_missing)
    223 else:
    224     init_kwargs[field.name] = _support_extended_types(field_type,
    225                                                       field_value)

File .../.venv/lib/python3.11/site-packages/dataclasses_json/core.py:309, in _decode_generic(type_, value, infer_missing)
    307 type_arg = _get_type_arg_param(type_, 0)
    308 if is_dataclass(type_arg) or is_dataclass(value):
--> 309     res = _decode_dataclass(type_arg, value, infer_missing)
    310 elif _is_supported_generic(type_arg):
    311     res = _decode_generic(type_arg, value, infer_missing)

File .../.venv/lib/python3.11/site-packages/dataclasses_json/core.py:172, in _decode_dataclass(cls, kvs, infer_missing)
    169 if not field.init:
    170     continue
--> 172 field_value = kvs[field.name]
    173 field_type = types[field.name]
    174 if field_value is None:

KeyError: 'data'

when executing stream_details = s.streams.get_stream_properties(req)

line 697, in unmarshal_json
raise AttributeError(
AttributeError: unable to unmarshal

To reproduce:

  1. Install the latest Airbyte Helm chart.
  2. Set up a GitHub and Snowflake connection.
  3. Run this code:

from airbyte import Airbyte
from airbyte.models import errors, operations, shared

s = Airbyte(
    server_url="http://127.0.0.1:8006/v1",
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

r = s.connections.list_connections(operations.ListConnectionsRequest())
for connection_id in r.raw_response.json()['data']:
    print(connection_id['connectionId'])
    cr = s.connections.get_connection(operations.GetConnectionRequest(connection_id['connectionId']))
    req = operations.GetStreamPropertiesRequest(
        destination_id=cr.raw_response.json()['destinationId'],
        source_id=cr.raw_response.json()['sourceId'],
        ignore_cache=True,
    )

    stream_details = s.streams.get_stream_properties(req)

Wrong source configuration class

Hello!

I have different source types in my Airbyte.
When getting them using sources.list_sources or sources.get_source, all configurations are encapsulated in the SourceAirtable class.

For example, here is a printed source: you can see source_type is postgres, but the configuration class is wrong; I expect SourcePostgres.

SourceResponse(configuration=SourceAirtable(credentials=None, SOURCE_TYPE=<SourceAirtableAirtable.AIRTABLE: 'airtable'>), name='my postgres', source_id='XXX', source_type='postgres', workspace_id='YYY')

NB: the API returns correct configuration data

Can we add a `CONTRIBUTING.md` page?

In starting the effort on

I'm not sure where to start or how to control / tweak / experiment with the codegen process.

Do we have anything like a Contributing guide as of now, either in our repo or in the Speakeasy docs?

I'm happy to contribute to this one if somebody can point me in the right direction.

The SDK doesn't seem to work for custom sources/destinations, as source_type seems non-optional even though it's optional in the actual API

Script Used To Test

import airbyte
from airbyte.models import operations, shared

airbyte_client = airbyte.Airbyte(
    server_url="http://localhost:8006/v1",
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<pwd>",
            username="<username>",
        ),
    ),
)

req = operations.GetSourceRequest(
    source_id="<custom-source-id>",
)

res = airbyte_client.sources.get_source(req)
if res.source_response is not None:
    print("source fetched")

Error Observed

Traceback (most recent call last):
File "/mnt/c/ACA-Group/alpha-airbyte/web/api/source/test.py", line 173, in
result = AirbyteConnectorConfig.process_get({'source_id':'c5f9b2ed-60ec-4e46-93dc-b00a54fcab11'})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/source/test.py", line 50, in process_get
res = airbyte_client.sources.get_source(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/airbyte/sources.py", line 102, in get_source
out = utils.unmarshal_json(http_res.text, Optional[shared.SourceResponse])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/airbyte/utils/utils.py", line 695, in unmarshal_json
out = unmarshal.from_dict({"res": json_dict})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/dataclasses_json/api.py", line 70, in from_dict
return _decode_dataclass(cls, kvs, infer_missing)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/dataclasses_json/core.py", line 220, in _decode_dataclass
init_kwargs[field.name] = _decode_generic(field_type,
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/dataclasses_json/core.py", line 309, in _decode_generic
res = _decode_dataclass(type_arg, value, infer_missing)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/c/web/api/.venv/lib/python3.11/site-packages/dataclasses_json/core.py", line 172, in _decode_dataclass
field_value = kvs[field.name]
~~~^^^^^^^^^^^^
KeyError: 'source_type'

Additionally

The SDK expects source_type to be mandatorily present in SourceResponse, which is true only for non-custom connectors, although hitting the API via Postman works without issues (http://localhost:8006/v1/sources/).

The dataclasses for the source response also look restrictive and tailor-made for non-custom connectors: source_type is not defined as Optional[str], and configuration is a Union of the defined sources only, so it fails to accommodate custom sources with custom configurations.
https://github.com/airbytehq/airbyte-api-python-sdk/blob/main/src/airbyte/models/shared/sourceresponse.py#L207

A few questions on OAuth and Airbyte Enterprise

Hi,

A few questions:

Can this SDK be used against non-Cloud versions of Airbyte, i.e. Airbyte OSS and Airbyte Enterprise?

I have noticed there is support for overriding the Airbyte API base URL, but I don't see any support for OAuth, as used in the on-premise Airbyte Enterprise version.

Thanks,
Dan

Stripe Connection Creation Fails with status 422

Request-

from datetime import datetime

from airbyte.models import shared

request: shared.SourceCreateRequest = shared.SourceCreateRequest(
    configuration=shared.SourceStripe(
        'valid_account_id',
        'valid_client_secret',
        shared.SourceStripeStripe.STRIPE,
        datetime(2023, 1, 1),
    ),
    name="some_string",
    workspace_id="valid_workspace_id",
)

Response-

{
  "type": "https://reference.airbyte.com/reference/errors#unprocessable-entity",
  "title": "unprocessable-entity",
  "status": 422,
  "detail": "The provided configuration does not fulfill the specification. Errors: json schema validation failed when comparing the data to the json schema. \nErrors: $.start_date: 2023-01-01T00:00:00 is an invalid date-time, $.start_date: does not match the regex pattern ^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}Z$ "
}

Resolve compatibility conflicts with PyAirbyte (rename library and rebuild)

Currently, this package (airbyte-api) imports with import airbyte. The new PyAirbyte library also imports with import airbyte.

I'm starting to work through this issue, trialing a few approaches in

Options I'm aware of:

  1. Rename the package to something like airbyte_api or airbyte_cdk.
  2. Create a namespace package, where airbyte-api imports as airbyte.sdk and is grafted into the same top-level airbyte namespace as PyAirbyte, without the two packages conflicting.

AttributeError: unable to unmarshal <response> as StreamPropertiesResponse

Making the following request

airbyte.Airbyte(server_url=server_url).streams.get_stream_properties(
    airbyte.models.operations.GetStreamPropertiesRequest(
        destination_id, source_id))

raises

AttributeError: unable to unmarshal <response> as typing.Optional[airbyte.models.shared.streampropertiesresponse.StreamPropertiesResponse]

while running

curl '{apiUrl}/streams?sourceId={sourceId}&destinationId={destinationId}' --header 'accept: application/json'

successfully returns the same <response>

Cannot use list workspaces

Hello,

Thank you so much for providing this toolset to interact with Airbyte. I am building a Pulumi provider so we can automate this via Pulumi.
While writing some test code, I have found that some examples do not work; the case was with List Workspaces. Here is an example snippet:

import airbyte
import dateutil.parser
from airbyte.models import operations, shared

source_conn = airbyte.Airbyte(
    server_url="http://127.0.0.1:8000/api/v1",
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            username="airbyte",
            password="password"
        )
    )
)

request = operations.ListWorkspacesRequest()
result = source_conn.workspaces.list_workspaces(request)
print(result)

This gives the following error:

-> % ./examples/create_source.py
ListWorkspacesRequest(include_deleted=False, limit=20, offset=0, workspace_ids=None)
Traceback (most recent call last):
  File "/path/to/my/repo/pulumi-airbyte/./examples/create_source.py", line 23, in <module>
    result = source_conn.workspaces.list_workspaces(request)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/path/to/my/repo/venv/lib/python3.11/site-packages/airbyte/workspaces.py", line 176, in list_workspaces
    raise errors.SDKError('API error occurred', http_res.status_code, http_res.text, http_res)
airbyte.models.errors.sdkerror.SDKError: API error occurred: Status 404
Object not found.

So I did some research and inspection to try to find some error or misuse on my side. Using Wireshark to capture the request, this is what I get:

GET /api/v1/workspaces?includeDeleted=false&limit=20&offset=0 HTTP/1.1
Host: 127.0.0.1:8000
user-agent: speakeasy-sdk/python 0.44.3 2.237.2 1.0.0 airbyte-api
Accept-Encoding: gzip, deflate
Accept: application/json
Connection: keep-alive
Authorization: Basic YWlyYnl0ZTpwYXNzd29yZA==

HTTP/1.1 404 Not Found
Server: nginx/1.25.3
Date: Thu, 01 Feb 2024 13:15:56 GMT
Content-Type: application/json
Content-Length: 17
Connection: keep-alive

Object not found.

I have replicated the request with Insomnia and indeed I get the same response. Looking at your OpenAPI docs, the only request that exists is a POST /v1/workspaces/list, and inspecting the source it looks like that is not the case.

Version information:

  • Airbyte Version: 0.50.45
  • Airbyte SDK Version: 0.44.3

I noticed that you are using https://speakeasyapi.dev for the code generation; maybe it needs to be re-run against the API? Or am I using it wrong?

Regards,
Alfredo Palhares
