aiochclient's Introduction

aiochclient

An async http(s) ClickHouse client for python 3.6+ supporting type conversion in both directions, streaming, lazy decoding on select queries, and a fully typed interface.

Installation

You can use it with either aiohttp or httpx http connectors.

To use it with aiohttp, install it with:

> pip install aiochclient[aiohttp]

Or aiochclient[aiohttp-speedups] to install with extra speedups.

To use it with httpx, install it with:

> pip install aiochclient[httpx]

Or aiochclient[httpx-speedups] to install with extra speedups.

Installing with [*-speedups] adds the following:

  • cChardet for aiohttp speedup
  • aiodns for aiohttp speedup
  • ciso8601 for ultra-fast datetime parsing while decoding data from ClickHouse for aiohttp and httpx.

Additionally, the installation process attempts to use Cython for a speed boost (roughly 30% faster).

Quick Start

Connecting to ClickHouse

aiochclient needs aiohttp.ClientSession or httpx.AsyncClient to connect to ClickHouse:

from aiochclient import ChClient
from aiohttp import ClientSession


async def main():
    async with ClientSession() as s:
        client = ChClient(s)
        assert await client.is_alive()  # returns True if connection is Ok
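
The same check works with the httpx connector; a minimal sketch, assuming aiochclient was installed with the httpx extra:

from aiochclient import ChClient
from httpx import AsyncClient


async def main():
    async with AsyncClient() as http_client:
        client = ChClient(http_client)
        assert await client.is_alive()  # returns True if connection is Ok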

Querying the database

await client.execute(
    "CREATE TABLE t (a UInt8, b Tuple(Date, Nullable(Float32))) ENGINE = Memory"
)

For INSERT queries you can pass values as *args. Values should be iterables:

import datetime as dt

await client.execute(
    "INSERT INTO t VALUES",
    (1, (dt.date(2018, 9, 7), None)),
    (2, (dt.date(2018, 9, 8), 3.14)),
)
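
Since each positional argument is one row, a prepared list of rows can be passed with unpacking. A small illustrative sketch, reusing the table created above:

more_rows = [
    (3, (dt.date(2018, 9, 9), 1.5)),
    (4, (dt.date(2018, 9, 10), None)),
]
await client.execute("INSERT INTO t VALUES", *more_rows)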

For fetching all rows at once use the fetch method:

all_rows = await client.fetch("SELECT * FROM t")

For fetching first row from result use the fetchrow method:

row = await client.fetchrow("SELECT * FROM t WHERE a=1")

assert row[0] == 1
assert row["b"] == (dt.date(2018, 9, 7), None)

You can also use the fetchval method, which returns the first value of the first row of the query result:

val = await client.fetchval("SELECT b FROM t WHERE a=2")

assert val == (dt.date(2018, 9, 8), 3.14)

With async iteration on the query results stream you can fetch multiple rows without loading them all into memory at once:

async for row in client.iterate(
        "SELECT number, number*2 FROM system.numbers LIMIT 10000"
):
    assert row[0] * 2 == row[1]

Use fetch/fetchrow/fetchval/iterate for SELECT queries, and execute (or any of the above) for INSERT and all other queries.

Working with query results

All fetch queries return rows as lightweight, memory-efficient objects. Before v1.0.0, rows were only returned as tuples. All rows have a full mapping interface, so you can get fields by name or index:

row = await client.fetchrow("SELECT a, b FROM t WHERE a=1")

assert row["a"] == 1
assert row[0] == 1
assert row[:] == (1, (dt.date(2018, 9, 7), None))
assert list(row.keys()) == ["a", "b"]
assert list(row.values()) == [1, (dt.date(2018, 9, 7), None)]
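
Because rows implement the mapping interface, they can also be turned into plain dictionaries when that is more convenient. A small illustrative example:

row = await client.fetchrow("SELECT a, b FROM t WHERE a=1")
assert dict(row) == {"a": 1, "b": (dt.date(2018, 9, 7), None)}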

Documentation

To check out the API docs, visit the readthedocs site.

Type Conversion

aiochclient automatically converts types from ClickHouse to python types and vice-versa.

ClickHouse type Python type
Bool bool
UInt8 int
UInt16 int
UInt32 int
UInt64 int
UInt128 int
UInt256 int
Int8 int
Int16 int
Int32 int
Int64 int
Int128 int
Int256 int
Float32 float
Float64 float
String str
FixedString str
Enum8 str
Enum16 str
Date datetime.date
DateTime datetime.datetime
DateTime64 datetime.datetime
Decimal decimal.Decimal
Decimal32 decimal.Decimal
Decimal64 decimal.Decimal
Decimal128 decimal.Decimal
IPv4 ipaddress.IPv4Address
IPv6 ipaddress.IPv6Address
UUID uuid.UUID
Nothing None
Tuple(T1, T2, ...) Tuple[T1, T2, ...]
Array(T) List[T]
Nullable(T) None or T
LowCardinality(T) T
Map(T1, T2) Dict[T1, T2]
Nested(T1, T2, ...) List[Tuple[T1, T2, ...]]
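
A short sketch illustrating a few of these conversions in both directions (the table and column names here are made up for the example):

import uuid

await client.execute(
    "CREATE TABLE conv_demo (id UUID, tags Array(String), score Nullable(Float32)) ENGINE = Memory"
)
await client.execute(
    "INSERT INTO conv_demo VALUES",
    (uuid.uuid4(), ["a", "b"], None),
)
row = await client.fetchrow("SELECT * FROM conv_demo")
assert isinstance(row["id"], uuid.UUID)
assert row["tags"] == ["a", "b"]
assert row["score"] is None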

Connection Pool Settings

aiochclient uses the aiohttp.TCPConnector to determine pool size. By default, the pool limit is 100 open connections.
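
To change the limit, pass your own connector to the session. A minimal sketch, assuming you want a smaller pool than the default 100:

from aiochclient import ChClient
from aiohttp import ClientSession, TCPConnector


async def main():
    async with ClientSession(connector=TCPConnector(limit=10)) as s:
        client = ChClient(s)  # at most 10 open connections to ClickHouse
        assert await client.is_alive()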

Notes on Speed

It's highly recommended to use uvloop and to install aiochclient with speedups for the sake of speed. Some recent benchmarks on our machines, without parallelization:

  • 180k-220k rows/sec on SELECT
  • 50k-80k rows/sec on INSERT

Note: these benchmarks are system dependent
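
To enable uvloop as recommended above, install it before starting the event loop. A minimal sketch, assuming uvloop is installed separately (pip install uvloop):

import asyncio

import uvloop

uvloop.install()  # make uvloop the default event loop policy
asyncio.run(main())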


aiochclient's Issues

Wrong unconvert for Decimal

There is a problem with unconverting Decimal: str(value) produces a float-like string, which causes a "No operation equals between Decimal(19, 2) and Float64" error.

@staticmethod
def unconvert(value: Decimal) -> bytes:
    return str(value).encode()

So, we made a monkey patch.

from aiochclient.types import Decimal, PY_TYPES_MAPPING

def unconvert_patch(value: Decimal) -> bytes:
    return f"toDecimal64('{value}', {abs(value.as_tuple().exponent)})".encode()

PY_TYPES_MAPPING[Decimal] = unconvert_patch

It had been working until the Cython implementation was used inside Docker =(
We are not experienced with Cython, so there is no monkey patch for it, nor a pull request for the whole problem.

Query CSV Format

Hi, is it possible to generate a CSV-formatted query and return a CSV?

Make create_chclient function, that returns initialized ChClient with connection pool size option

It is possible to implement this yourself, but it would be great if aiochclient had a default one.
It should initialize aiohttp.ClientSession, limit the connection pool size with aiohttp.TCPConnector, and possibly proxy some base args to aiohttp.ClientSession plus all args for ChClient.

Like this (create_chclient is the needed function):

from aiochclient import create_chclient, ChClient

client = await create_chclient(maxsize=100, compress_response=True)  # type: ChClient
assert await client.is_alive()

It would be perfect for use in asyncio web apps, with their startup/cleanup pattern.
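
A minimal sketch of such a helper as it could be written today (create_chclient here is the requested function, not an existing aiochclient API; the caller stays responsible for closing the underlying session):

from aiochclient import ChClient
from aiohttp import ClientSession, TCPConnector


async def create_chclient(maxsize: int = 100, **chclient_kwargs) -> ChClient:
    # limit the connection pool size via TCPConnector and pass the rest to ChClient
    session = ClientSession(connector=TCPConnector(limit=maxsize))
    return ChClient(session, **chclient_kwargs)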

New Release

Hello @maximdanilchenko

We have been using the master branch of aiochclient in production for 15 days now, without any issue.
So I think aiochclient is ready for a new release :D

Thanks

Query "EXISTS TABLE event" returns None

Something changed between the v1.2.1 and v1.3.0 releases; v2.0.0 doesn't work correctly either. The following code no longer works as expected -- ChClient.fetchrow returns None whether the table event exists or not.

from aiochclient import ChClient
from aiohttp import ClientSession
import asyncio


async def main():
    async with ClientSession() as s:
        client = ChClient(s, url='http://localhost:49637')
        assert await client.is_alive()
        res = await client.fetchrow('EXISTS TABLE event')
        if res is None:
            print('None') # 1.3.1 +
        else:
            print(dict(res))  # <= 1.2.1

asyncio.run(main())
# 1.3.1 +
None

# <= 1.2.1
{'result': 1}

impossible to import py2ch from _types

there's no binding for the py2ch Python function in _types.pyx

In [1]: import aiochclient._types

In [2]: import pprint

In [3]: pprint.pprint(dir(aiochclient._types))
['ArrayType',
 'ChClientError',
 'DateTime64Type',
 'DateTimeType',
 'DateType',
 'Decimal',
 'DecimalType',
 'FloatType',
 'IPv4Address',
 'IPv4Type',
 'IPv6Address',
 'IPv6Type',
 'Int16Type',
 'Int32Type',
 'Int64Type',
 'Int8Type',
 'LowCardinalityType',
 'NothingType',
 'NullableType',
 'RE_ARRAY',
 'RE_LOW_CARDINALITY',
 'RE_NULLABLE',
 'RE_TUPLE',
 'StrType',
 'TupleType',
 'UInt16Type',
 'UInt32Type',
 'UInt64Type',
 'UInt8Type',
 'UUID',
 'UUIDType',
 '__all__',
 '__builtins__',
 '__doc__',
 '__file__',
 '__loader__',
 '__name__',
 '__package__',
 '__spec__',
 '__test__',
 'date_parse',
 'datetime_parse',
 'datetime_parse_f',
 'empty_convertor',
 'json2ch',
 're',
 'rows2ch',
 'what_py_converter']

In [4]:

so this line will always fail

from aiochclient._types import rows2ch, json2ch, py2ch

Datetime fromisoformat method is not supported in python 3.6

A few days ago I caught a bug in my services on Python 3.6, which use the async ClickHouse client:

Traceback (most recent call last):
  File "aiochclient/_types.pyx", line 13, in aiochclient._types
ModuleNotFoundError: No module named 'ciso8601'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  ...
  File "/requirements/aiochclient/__init__.py", line 1, in <module>
    from aiochclient.client import ChClient
  File "/requirements/aiochclient/client.py", line 8, in <module>
    from aiochclient.records import Record, RecordsFabric
  File "/requirements/aiochclient/records.py", line 6, in <module>
    from aiochclient._types import what_py_converter
  File "aiochclient/_types.pyx", line 19, in init aiochclient._types
AttributeError: type object 'datetime.datetime' has no attribute 'fromisoformat'

According to the docs, the method was introduced in Python 3.7, so this code breaks compatibility with 3.6.

release 2.0.X

Hello,

I'm testing the latest master code of aiochclient. It works well for me so far.
(By the way, sorry for the bad PR #52.)

I think it's safe to release a 2.x.y version.

Thanks a lot!

⭐️ in README and Dockerfile

I'm trying to install aiochclient==1.0.2 in a Dockerfile and get the following error:

Collecting aiochclient==1.0.2 (from -r /app/requirements.txt (line 2))
  Downloading https://files.pythonhosted.org/packages/ca/5c/33326e9edb67be58b96ae2f24d30c3cdbfbd8876ef422dc0922a04140edd/aiochclient-1.0.2.tar.gz (118kB)
    ERROR: Command errored out with exit status 1:
     command: /usr/bin/python3.6 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-8wcsrg2m/aiochclient/setup.py'"'"'; __file__='"'"'/tmp/pip-install-8wcsrg2m/aiochclient/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
         cwd: /tmp/pip-install-8wcsrg2m/aiochclient/
    Complete output (9 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-8wcsrg2m/aiochclient/setup.py", line 53, in <module>
        long_description=read('README.md'),
      File "/tmp/pip-install-8wcsrg2m/aiochclient/setup.py", line 45, in read
        content = fp.read()
      File "/usr/lib/python3.6/encodings/ascii.py", line 26, in decode
        return codecs.ascii_decode(input, self.errors)[0]
    UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 4787: ordinal not in range(128)
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

It seems that your ⭐️ in README is a problem.

Record Converter Exception for Invalid UTF-8 Bytestrings

Hi,

in the scenario where the byte-string content being queried from ClickHouse contains invalid bytes (they cannot be decoded to UTF-8), aiochclient will throw an exception. The relevant code is here: https://github.com/maximdanilchenko/aiochclient/blob/master/aiochclient/records.py#L74

So for example, let's suppose we pull some data that looks like this:

Column A  (UInt32Type) | Column B (StrType)
----------------------------------------------
             1         |  b'Hello\xf1aWorldTest'

Then the corresponding tuple expansion in decode will look like:

( (<function UInt32Type.convert>, b'1'),
(<function StrType.convert>, b'Hello\xf1aWorldTest'))

Which will throw an exception akin to UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf1 in position 5: invalid continuation byte.

Now - while it is not generally advisable to store invalid bytestring content in one's database, there are scenarios where this might be required. In these scenarios it would be preferable if aiochclient could simply return the raw bytestring data instead of throwing an exception. Thank you!

Feature Request: Generalize Format Output of Query

Hey - would be great if the output format could be generalized to other output types like Parquet or Arrow. See: https://clickhouse.tech/docs/en/interfaces/formats/

Use Case
I have use cases where New-line delimited JSON works just fine. I have other uses where Parquet, for example, is desirable. This is mainly driven by the applications consuming data. Right now, when I need a format other than JSON, I use the CH HTTP Interface directly.

Potential Implementation
I think this could be achieved by adding an argument format_type, which would be a string naming one of the supported formats. Do some checks on whether CH supports it or not. In ChClient._execute(), append FORMAT {format_type} to the input query.

Where there might be complexity is how to handle type conversion.

Add Github Actions

  • Automatic builds, tests and releases for python 3.8, 3.9, 3.10 and 3.11

DB::Exception: Empty query when executing is_alive() through chproxy

code from aiochclient

async def is_alive(self) -> bool:
    """Checks if connection is Ok.

    Usage:

    .. code-block:: python

        assert await client.is_alive()

    :return: True if connection Ok. False instead.
    """
    async with self._session.get(
        url=self.url
    ) as resp:  # type: client.ClientResponse
        return resp.status == 200

Clickhouse exception stacktrace

2020.03.06 14:11:56.086811 [ 393 ] {15F9BC3CD4B6DFAF} <Error> HTTPHandler: Code: 62, e.displayText() = DB::Exception: Empty query, Stack trace:
0. /usr/bin/clickhouse-server(StackTrace::StackTrace()+0x30) [0x7f0ef70]
1. /usr/bin/clickhouse-server(DB::Exception::Exception(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int)+0x25) [0x3c53485]
2. /usr/bin/clickhouse-server() [0x3aa1529]
3. /usr/bin/clickhouse-server() [0x6eac2b8]
4. /usr/bin/clickhouse-server(DB::executeQuery(DB::ReadBuffer&, DB::WriteBuffer&, bool, DB::Context&, std::function<void (std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)>, std::function<void (std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)>)+0x1aa) [0x6eada9a]
5. /usr/bin/clickhouse-server(DB::HTTPHandler::processQuery(Poco::Net::HTTPServerRequest&, HTMLForm&, Poco::Net::HTTPServerResponse&, DB::HTTPHandler::Output&)+0x1a5e) [0x3ca07fe]
6. /usr/bin/clickhouse-server(DB::HTTPHandler::handleRequest(Poco::Net::HTTPServerRequest&, Poco::Net::HTTPServerResponse&)+0x45a) [0x3ca2b5a]
7. /usr/bin/clickhouse-server(Poco::Net::HTTPServerConnection::run()+0x2a9) [0x79c0c59]
8. /usr/bin/clickhouse-server(Poco::Net::TCPServerConnection::start()+0x10) [0x79bba30]
9. /usr/bin/clickhouse-server(Poco::Net::TCPServerDispatcher::run()+0xed) [0x79bc14d]
10. /usr/bin/clickhouse-server(Poco::PooledThread::run()+0x81) [0x807ffc1]
11. /usr/bin/clickhouse-server(Poco::ThreadImpl::runnableEntry(void*)+0x3c) [0x807dd6c]
12. /usr/bin/clickhouse-server() [0xba53c60]
13. /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba) [0x7fa138b2c6ba]
14. /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7fa13835541d]
 (version 19.13.4.32 (official build))

chproxy config:

server:
  http:
      listen_addr: ":<listen-address>"
      allowed_networks: [<networks-list>]
users:
  - name: "readonly"
    allowed_networks: [<networks-list>]
    to_cluster: "clickhouse"
    to_user: "readonly"
    max_concurrent_queries: 6
    max_execution_time: 1m
    password: "$READONLY_PASS"

  - name: "default"
    allowed_networks: [<networks-list>]
    to_cluster: "clickhouse"
    to_user: "default"
    max_concurrent_queries: 6
    max_execution_time: 1m
    password: "$DEFAULT_PASS"

clusters:
  - name: "clickhouse"
    nodes: [
      "$CH_NODES"
    ]
    users:
    - name: "readonly"
      password: "$READONLY_PASS"
    - name: "default"
      password: "$DEFAULT_PASS"
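
A possible workaround for proxies that reject the empty health-check request: replace is_alive() with a trivial query through the existing client API. A sketch only:

async def is_alive_via_query(client) -> bool:
    try:
        return await client.fetchval("SELECT 1") == 1
    except Exception:
        return False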

SHOW TABLES does not work

Here is my example, I cannot get anything from SHOW TABLES

from aiochclient import ChClient
from aiohttp import ClientSession
from asyncio import run


async def get():
    async with ClientSession() as s:
        client = ChClient(s)
        print(await client.is_alive())
        print(await client.execute("CREATE TABLE test(uint8 UInt8) ENGINE=Null"))
        print(await client.fetchrow("SHOW TABLES;"))
        print(await client.execute("DROP TABLE test;"))

run(get()) outputs:

True
None
None
None

Sometimes error can't be decoded

Sometimes ClickHouse returns an error message that contains data, and the data is truncated at an arbitrary byte, which leads to the following error:

  File "/usr/local/lib/python3.6/dist-packages/aiochclient/client.py", line 161, in execute                                                                  
    async for _ in self._execute(query, *args):                                                                                                              
  File "/usr/local/lib/python3.6/dist-packages/aiochclient/client.py", line 131, in _execute                     
    raise ChClientError((await resp.read()).decode())                          

Changing the line to (await resp.read()).decode(errors='ignore') avoids the problem.

FYI, error looks like this:

aiochclient.exceptions.ChClientError: Code: 321, e.displayText() = DB::Exception: Expression returns value NULL, that is out of range of type UInt64, at: NULL,1,NULL,1),('Some long unicode text абвгдеё (version 19.7.3.9 (official build)) 

impossible to select data with type DateTime64(3)

It is impossible to select data with type DateTime64(3):

CREATE TABLE IF NOT EXISTS tablename (timestamp DateTime64(3),.........

SELECT timestamp from tablename

Traceback (most recent call last):
  File "aiochclient/_types.pyx", line 588, in aiochclient._types.what_py_type
KeyError: 'DateTime64'

  File "...../venv/lib/python3.7/site-packages/aiochclient/records.py", line 89, in <listcomp>
    what_py_converter(tp) for tp in tps.decode().strip().split("\t")
  File "aiochclient/_types.pyx", line 593, in aiochclient._types.what_py_converter
  File "aiochclient/_types.pyx", line 595, in aiochclient._types.what_py_converter
  File "aiochclient/_types.pyx", line 590, in aiochclient._types.what_py_type
aiochclient.exceptions.ChClientError: Unrecognized type name: 'DateTime64(3)'

Question

Hi!

How can I insert a batch of rows?

      await client.execute(
          f"INSERT INTO {database_name}.{table_name} (col1, col2, col3) VALUES",
          [r.values() for r in batch],
      )

I got error:
Unrecognized type: '<class 'collections.abc.ValuesView'>'. The value type should be exactly one of int, float, str, dt.date, dt.datetime, tuple, list, uuid.UUID (or None). No subclasses yet.
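
A possible workaround, based on the error message above: each row must be a tuple or list passed as its own positional argument, so the ValuesView objects need to be converted and unpacked. A sketch only:

await client.execute(
    f"INSERT INTO {database_name}.{table_name} (col1, col2, col3) VALUES",
    *[tuple(r.values()) for r in batch],  # one tuple per row, unpacked into *args
)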

LowCardinality type support

I got aiochclient.exceptions.ChClientError: Unrecognized type name: 'LowCardinality(String)'

Could you please add LowCardinality type support?

Is it possible?

Thank you.

DateTime64 Conversion Error

Hi there,

Currently running into a crash when trying to access a DateTime64 value at the precision level of nanoseconds instead of milliseconds.

I'll try to fix it locally but to be honest I'm completely lost.

Traceback:

Traceback (most recent call last):
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/user.name/folder/folder/__main__.py", line 48, in <module>
    entry_point()
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/user.name/folder/folder/__main__.py", line 27, in entry_point
    asyncio.run(main(config))
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/home/user.name/folder/folder/__main__.py", line 41, in main
    artifacts = await current_comparer.main()
  File "/home/user.name/folder/folder/comparers/default_comparer.py", line 40, in main
    print(list(row.values()))
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/_collections_abc.py", line 762, in __iter__
    yield self._mapping[key]
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/aiochclient/records.py", line 46, in __getitem__
    self._decode()
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/aiochclient/records.py", line 74, in _decode
    self._row = tuple(
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/site-packages/aiochclient/records.py", line 75, in <genexpr>
    converter(val)
  File "aiochclient/_types.pyx", line 389, in aiochclient._types.DateTime64Type.convert
  File "aiochclient/_types.pyx", line 390, in aiochclient._types.DateTime64Type.convert
  File "aiochclient/_types.pyx", line 384, in aiochclient._types.DateTime64Type._convert
  File "aiochclient/_types.pyx", line 379, in aiochclient._types.DateTime64Type._convert
  File "stringsource", line 67, in cfunc.to_py.__Pyx_CFunc_datetime____unicode___to_py.wrap
  File "aiochclient/_types.pyx", line 20, in aiochclient._types._datetime_parse_f
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/_strptime.py", line 568, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/home/user.name/anaconda3/envs/folder/lib/python3.8/_strptime.py", line 352, in _strptime
    raise ValueError("unconverted data remains: %s" %
ValueError: unconverted data remains: 132

Thanks so much

microseconds are lost for datetime unconvert

>>> from aiochclient.types import py2ch
>>> dt = datetime.datetime.fromisoformat('2021-10-10 06:30:00.999')
>>> py2ch(dt)
b"'2021-10-10 06:30:00'"

same behavior for cython extension

Password may leak through the `aiohttp.ClientOSError` exception when server is behaving badly

Hi.

aiochclient, as of the current master (27d93c7, also tested on v2.2.0 from PyPI), with aiohttp==3.8.4, may leak the password to the client's logs if the ClickHouse server closes the connection immediately before the client sends the entire request.
In such a case, aiohttp will raise aiohttp.ClientOSError with the full URL in the formatted message - which may include the password, since aiochclient sends it as a query parameter by default (if password=... is passed to the aiochclient.ChClient constructor).

Reproducer

I wrote a simple reproducer for the issue: https://gist.github.com/leenr/d23e8043d54545d1d16d2d7f54204475.

In it, a TCP server is created in one thread on localhost:1234, which immediately closes the connection when one is made. In another thread, an aiochclient.ChClient instance is created that points to the aforementioned server, and a simple SELECT 1 query is sent.

Executing the script will reproduce the issue:

$ python --version
Python 3.11.2
$ pip freeze
aiochclient @ git+https://github.com/maximdanilchenko/aiochclient@27d93c7e7e145e46d7af8ab5361d351d62111611
aiohttp==3.8.4
aiosignal==1.3.1
async-timeout==4.0.2
attrs==22.2.0
charset-normalizer==3.1.0
frozenlist==1.3.3
idna==3.4
multidict==6.0.4
sqlparse==0.4.3
yarl==1.8.2
$ python password-leak-final.py
ERROR:root:Leak!
Traceback (most recent call last):
  File ".../site-packages/aiohttp/client_reqrep.py", line 581, in write_bytes
    await self.body.write(writer)
  File ".../site-packages/aiohttp/payload.py", line 247, in write
    await writer.write(self._value)
  File ".../site-packages/aiohttp/http_writer.py", line 115, in write
    self._write(chunk)
  File ".../site-packages/aiohttp/http_writer.py", line 75, in _write
    raise ConnectionResetError("Cannot write to closing transport")
ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "password-leak-final.py", line 38, in amain
    await ch_client.fetchval('SELECT 1')
  File ".../site-packages/aiochclient/client.py", line 342, in fetchval
    async for row in self._execute(
  File ".../site-packages/aiochclient/client.py", line 182, in _execute
    names=await response.__anext__(),
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../site-packages/aiochclient/http_clients/aiohttp.py", line 25, in post_return_lines
    async with self._session.post(url=url, params=params, data=data) as resp:
  File ".../site-packages/aiohttp/client.py", line 1141, in __aenter__
    self._resp = await self._coro
                 ^^^^^^^^^^^^^^^^
  File ".../site-packages/aiohttp/client.py", line 560, in _request
    await resp.start(conn)
  File ".../site-packages/aiohttp/client_reqrep.py", line 899, in start
    message, payload = await protocol.read()  # type: ignore[union-attr]
                       ^^^^^^^^^^^^^^^^^^^^^
  File ".../site-packages/aiohttp/streams.py", line 616, in read
    await self._waiter
aiohttp.client_exceptions.ClientOSError: [Errno None] Can not write request body for http://localhost:1234/?user=admin&password=secret-password&database=database
ERROR:asyncio:Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x7fd3d81ccc50>

Notice the password=secret-password in the exception string.

It's not a made-up issue: unfortunately, it occurs regularly in our production setup.
Also, I would imagine a password leak may occur when there is a proxy server between the client and the server that writes a conventional access log with the URL as-is.

Workaround for users

Instead of using the password= argument for aiochclient.ChClient, you may supply the username and password in the url= argument - in this case aiohttp will convert them into a Basic authentication string in the Authorization header, and will not save the password in the aiohttp.ClientOSError exception:

$ diff -u password-leak-final.py password-leak-final_test_no.py
--- password-leak-final.py      2023-03-27 04:11:15.982189013 +0300
+++ password-leak-final_test_no.py      2023-03-27 04:12:55.138649578 +0300
@@ -28,10 +28,8 @@
 def client_thread_target() -> None:
     async def amain() -> None:
         ch_client = aiochclient.ChClient(
-            url=f'http://localhost:{PORT}/',
-            database='database',
-            user='admin',
-            password='secret-password'
+            url=f'http://admin:secret-password@localhost:{PORT}/',
+            database='database'
         )
         while True:
             try:
$ python password-leak-final_test_no.py
ERROR:root:Leak!
...
aiohttp.client_exceptions.ClientOSError: [Errno None] Can not write request body for http://localhost:1234/?database=database
...

Proposed solution for the aiochclient

Do not use the password= query argument when talking to the ClickHouse server (in fact, the official ClickHouse documentation does not recommend its usage), and use HTTP Basic Authentication or the X-ClickHouse-Key header instead.
I will make a PR with the latter solution shortly.
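
For reference, a sketch of the header-based approach a user can apply today by configuring the HTTP session directly (X-ClickHouse-User and X-ClickHouse-Key are ClickHouse HTTP interface headers; passing them through the session's default headers is a workaround, not an official aiochclient option):

import aiochclient
from aiohttp import ClientSession


async def amain() -> None:
    headers = {"X-ClickHouse-User": "admin", "X-ClickHouse-Key": "secret-password"}
    async with ClientSession(headers=headers) as session:
        ch_client = aiochclient.ChClient(session, url="http://localhost:8123/", database="database")
        print(await ch_client.fetchval("SELECT 1"))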

Can't execute list of dict when inserting data into the table.

Getting an error when inserting into the table as a dict:

insert_statement = f"INSERT INTO {table_name} VALUES"
await cls._client.execute(insert_statement, data_list)

data_list is a list of dict.

aiochclient.exceptions.ChClientError: Unrecognized type: '<class 'dict'>'. The value type should be exactly one of int, float, str, dt.date, dt.datetime, tuple, list, uuid.UUID (or None). No subclasses yet.

UUID type support

Hi!

Pls add transparent UUID type support to your client (Python UUID to ClickHouse UUID type)

Query escaping to prevent SQL injection?

I found the function py2ch in types.py, but it has no usage. It would be nice if we could pass params to SELECT queries and the lib would automatically escape them to prevent SQL injection.
#20

Cannot use fetch with sqlparse=0.4.4

Hi!

I am using aiochclient to connect to a ClickHouse database. I am sending the string "SHOW TABLES from user" to get my tables, using the aiochclient.ChClient().fetch function.

It works as expected with sqlparse=0.4.3.

However, updating to sqlparse=0.4.4 breaks it and returns the error below.

So, I guess it should either be corrected or the requirements should be updated to force sqlparse=0.4.3.

File "/Users/renaud/Documents/platforms/backend-python/common/clickhouse/clickhouse.py", line 196, in get_tables
  result = await self.__clickhouse_connection.client.fetch(
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/aiochclient/client.py", line 258, in fetch
  return [
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/aiochclient/client.py", line 258, in <listcomp>
  return [
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/aiochclient/client.py", line 145, in _execute
  need_fetch, is_json, statement_type = self._parse_squery(query)
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/aiochclient/client.py", line 409, in _parse_squery
  statement = sqlparse.parse(query)[0]
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/__init__.py", line 30, in parse
  return tuple(parsestream(sql, encoding))
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/engine/filter_stack.py", line 26, in run
  stream = lexer.tokenize(sql, encoding)
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/lexer.py", line 155, in tokenize
  return Lexer.get_default_instance().get_tokens(sql, encoding)
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/lexer.py", line 52, in get_default_instance
  cls._default_intance.default_initialization()
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/lexer.py", line 59, in default_initialization
  self.set_SQL_REGEX(keywords.SQL_REGEX)
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/lexer.py", line 78, in set_SQL_REGEX
  self._SQL_REGEX = [
File "/Users/renaud/Library/Caches/pypoetry/virtualenvs/platforms-FlyH1ih9-py3.10/lib/python3.10/site-packages/sqlparse/lexer.py", line 79, in <listcomp>
  (re.compile(rx, FLAGS).match, tt)
File "/Users/renaud/.asdf/installs/python/3.10.5/lib/python3.10/re.py", line 251, in compile
  return _compile(pattern, flags)
File "/Users/renaud/.asdf/installs/python/3.10.5/lib/python3.10/re.py", line 302, in _compile
  raise TypeError("first argument must be string or compiled pattern")

Rename cursor method to iterate

From

async for row in client.cursor(
    "SELECT number, number*2 FROM system.numbers LIMIT 10000"
):
    assert row[0] * 2 == row[1]

to

async for row in client.iterate(
    "SELECT number, number*2 FROM system.numbers LIMIT 10000"
):
    assert row[0] * 2 == row[1]

because it is more logical naming for such case.

Support for Python 3.11

Hi team,
I noticed that currently the python package requirements specifically mention < 3.11a0
Now that 3.11 is GA, would it be possible to allow installation on python 3.11?

Thanks!

No error forwarding when talking to clickhouse in "cluster mode"

How to reproduce

import httpx, asyncio, aiochclient

c = aiochclient.ChClient(httpx.AsyncClient(), url="http://localhost:8123")


async def main():
    # Errors responded by clickhouse in non-cluster mode are correctly forwarded as Exceptions
    try:
        res = await c.execute("CREATE TABLE x (a StupidType) ENGINE MergeTree()")
        print(res)
    except Exception as e:
        print("THIS ERROR IS CORRECTLY FORWARDED: ", e)

    # Errors in cluster mode are not raised, nor are they retrievable from the HTTP response... simply lost
    try:
        res = await c.execute("CREATE TABLE x ON CLUSTER mycluster (a StupidType) ENGINE MergeTree()")
        print(res) # Execute returns nothing, so you can't even retrieve the JSON-formatted error output
    except Exception as e:
        # We should get some exception...
        print("ERRORS FROM THE CLUSTER'S NODES ARE NOT FORWARDED, YOU'LL NEVER SEE THIS MESSAGE", e)

    await c.close()

asyncio.run(main())

Why it happens

  • ClickHouse always responds with HTTP code 200, even if some node has failed to execute the statement. The error is embedded in a JSON response like, e.g.:
{
	"meta":
	[
		{
			"name": "host",
			"type": "String"
		},
		{
			"name": "port",
			"type": "UInt16"
		},
		{
			"name": "status",
			"type": "Int64"
		},
		{
			"name": "error",
			"type": "String"
		},
		{
			"name": "num_hosts_remaining",
			"type": "UInt64"
		},
		{
			"name": "num_hosts_active",
			"type": "UInt64"
		}
	],

	"data":
	[
		["server101", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "5", "0"],
		["server2", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "4", "0"],
		["server103", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "3", "0"],
		["server3", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "2", "0"],
		["server102", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "1", "0"],
		["server1", 9000, "450", "Code: 450, e.displayText() = DB::Exception: TTL Expression GROUP BY key should be a prefix of primary key (version 21.3.20.1 (official build))", "0", "0"]
  • We don't retrieve the HTTP response body for queries that went well (HTTP 200 code) in ChClient.execute => we don't have any access to those error messages embedded in the JSON response

Incriminated code

https://github.com/maximdanilchenko/aiochclient/blob/master/aiochclient/client.py#L189

we only exploit the HTTP response when we anticipate some records as a result. As CREATE TABLE and the like never return records, we simply drop the useful error messages when calling self._http_client.post_no_return(...).

Solution

Would you accept a PR that doesn't drop the HTTP response body and, when the body embeds error messages, wraps them into an exception raised at the ChClient._execute level?

Are there any bad side effects to doing this?

Cannot use an INSERT query with a special format

ClickHouse handles different input formats; in some cases it can be really useful to insert a file such as a CSV/TSV for optimization purposes.

https://clickhouse.tech/docs/en/interfaces/formats/

According to the documentation, the only requirement for the *args is that they be iterables, but when we try to send an iterable that isn't a tuple - just a list of strings representing a CSV - aiochclient crashes, as it expects a list of tuples for an INSERT query.

https://github.com/maximdanilchenko/aiochclient/blob/master/aiochclient/types.py#L365-L366

It would be really good for the library to give developers more freedom to use formats that aren't just a simple list of tuples when doing an INSERT query.

Clickhouse 'Map' type not supported in inserts via ChClient.execute()

CREATE TABLE map_test(
map_field Map(String, String)
)

client.execute("INSERT INTO map_test VALUES", {'hello': 'world'})

Gives:
aiochclient.exceptions.ChClientError: Unrecognized type: '<class 'dict'>'. The value type should be exactly one of int, float, str, dt.date, dt.datetime, tuple, list, uuid.UUID (or None). No subclasses yet.

Shame :(

CREATE LIVE VIEW and WATCH support

I wonder if support for the LIVE VIEW and WATCH statements is planned. ClickHouse 19.16.3.6 introduced this feature. The client can get notified when new data is inserted, which eliminates polling with SELECT.

Some of the things that I thought about.

  • A cursor should be able to watch existing LIVE VIEW
  • A cursor should be able to create a TEMPORARY LIVE VIEW with hash ID (maybe application should handle this. Each app has a different security requirement)
  • handle response in event driven manner
  • handle heart beat for the long-lived connection

Should it be implemented in a different package? Any thoughts?

Relevant articles
https://www.altinity.com/blog/2019/11/13/making-data-come-to-life-with-clickhouse-live-view-tables
https://www.altinity.com/blog/2019/12/05/taking-a-closer-look-at-clickhouse-live-view-tables

Map type with cython doesn't work

Python 3.11 with Cython, when selecting a Map type:
TypeError: Expected unicode, got dict
and
TypeError: Argument 'string' has incorrect type (expected str, got dict)

Support macros in sql queries

ClickHouse macros are extremely useful when you're working in a clustered environment. This library is great, but it doesn't support them.

Steps to reproduce:

  1. Configure clustered ClickHouse. I use 2 shards and 2 replicas.
  2. Run the code
import asyncio
from aiochclient import ChClient
from aiohttp import ClientSession

async def main():
    async with ClientSession() as s:
        client = ChClient(s)
        await client.execute("drop table if exists user_shard on cluster '{cluster}'")

if __name__ == '__main__':
    asyncio.run(main())

The string drop table if exists user_shard on cluster '{cluster}' is valid SQL code.

Expected behaviour:
no errors, the user_shard table is dropped on all nodes.

Observed behaviour
There is an exception

Traceback (most recent call last):
  File "test.py", line 11, in <module>
    asyncio.run(main())
  File "/usr/local/Cellar/[email protected]/3.9.7_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/Cellar/[email protected]/3.9.7_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
    return future.result()
  File "test.py", line 8, in main
    await client.execute("drop table if exists user_shard on cluster '{cluster}'")
  File "venv/lib/python3.9/site-packages/aiochclient/client.py", line 233, in execute
    async for _ in self._execute(
  File "venv/lib/python3.9/site-packages/aiochclient/client.py", line 145, in _execute
    query = query.format(**query_params)
KeyError: 'cluster'

P.S.
The library automatically tries to replace variables in curly braces. I found the following code in client.py file

    async def _execute(
        self,
        query: str,
        *args,
        json: bool = False,
        query_params: Optional[Dict[str, Any]] = None,
        query_id: str = None,
        decode: bool = True,
    ) -> AsyncGenerator[Record, None]:
        query_params = self._prepare_query_params(query_params)
        query = query.format(**query_params)

Is it possible to add a parameter to the execute method to make this transformation optional? If I comment out the line "query = query.format(**query_params)", the method works fine.

As a temporary solution I use the following trick

import asyncio
from aiochclient import ChClient
from aiohttp import ClientSession


async def main():
    async with ClientSession() as s:
        client = ChClient(s)
        await client.execute("drop table if exists user_shard on cluster {cluster}",
                             params={"cluster": "{cluster}"})


if __name__ == '__main__':
    asyncio.run(main())

but it's not ideal because the SQL code is invalid. Creating a Replicated* table is a more complicated case:

CREATE TABLE IF NOT EXISTS user_ on cluster '{cluster}'
(
    userid       UInt64,
    emailaddress String,
)
 ENGINE = ReplicatedMergeTree('/clickhouse/{installation}/{cluster}/tables/{shard}/{database}/{table}', '{replica}')

I think the clickhouse data types could be improved

Currently only a few kinds of ClickHouse data types are handled; the mapping could be extended, e.g. with SimpleAggregateFunction:

CH_TYPES_MAPPING = {
    "UInt8": IntType,
    "UInt16": IntType,
    "UInt32": IntType,
    "UInt64": IntType,
    "Int8": IntType,
    "Int16": IntType,
    "Int32": IntType,
    "Int64": IntType,
    "Float32": FloatType,
    "Float64": FloatType,
    "String": StrType,
    "FixedString": StrType,
    "Enum8": StrType,
    "Enum16": StrType,
    "Date": DateType,
    "DateTime": DateTimeType,
    "DateTime64": DateTime64Type,
    "Tuple": TupleType,
    "Array": ArrayType,
    "Nullable": NullableType,
    "Nothing": NothingType,
    "UUID": UUIDType,
    "LowCardinality": LowCardinalityType,
    "Decimal": DecimalType,
    "Decimal32": DecimalType,
    "Decimal64": DecimalType,
    "Decimal128": DecimalType,
    "IPv4": IPv4Type,
    "IPv6": IPv6Type,
}

aiohttp.client_exceptions.ServerDisconnectedError

When I executed the following code:

async with ClientSession() as s:
    client = ChClient(s, url='http://192.168.1.90:8123/', user='algorithmrw', password='dd', database='dd')
    alive = await client.is_alive()
    all_rows = await client.fetch("SELECT * FROM phalgorithm.DataAnalyse")

I ran into this problem: "aiohttp.client_exceptions.ServerDisconnectedError".

But when I execute an insert statement, there is no problem.

Unexpected params behaviour: Syntax error

Got unexpected params behaviour.

Example from docs:

await client.execute(
    "INSERT INTO {table_name} VALUES",
    (1, (dt.date(2018, 9, 7), None)),
    (2, (dt.date(2018, 9, 8), 3.14)),
    params={"table_name": "t"}
)

My request

row = await client.fetchrow(
    "SELECT * FROM {table_name}",
    params={"table_name": "sales"}
)

Exception

aiochclient.exceptions.ChClientError: Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 15 (''sales''): 'sales' FORMAT TSVWithNamesAndTypes. Expected one of: VIEW, SELECT subquery, compound identifier, identifier, element of expression with optional alias, list of elements, function, table, table function, subquery or list of joined tables, table or subquery or table function (version 21.3.18.4 (official build))

geohash TypeError

I would like to use the built-in CH function geohashDecode().
geohashDecode returns a Tuple(longitude <float>, latitude <float>)

I added the patch in the aiochclient tests:

diff --git a/tests.py b/tests.py
index 872d030..a78e386 100644
--- a/tests.py
+++ b/tests.py
@@ -686,6 +686,9 @@ class TestTypes:
         assert record[0] == result
         assert record["datetime"] == result
 
+    async def test_geohash(self):
+        assert await self.ch.fetchval("SELECT geohashDecode(geohashEncode(1.0, 2.0))")
+
 
 @pytest.mark.fetching
 @pytest.mark.usefixtures("class_chclient")

Here is what I found:

aiochclient/_types.pyx:624: in aiochclient._types.what_py_type
    return CH_TYPES_MAPPING[ch_type](name, container=container)
aiochclient/_types.pyx:404: in genexpr
    self.types = tuple(what_py_type(tp, container=True).p_type for tp in tps.split(","))
aiochclient/_types.pyx:404: in genexpr
    self.types = tuple(what_py_type(tp, container=True).p_type for tp in tps.split(","))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   raise ChClientError(f"Unrecognized type name: '{name}'")
E   aiochclient.exceptions.ChClientError: Unrecognized type name: 'longitude Float64'

I'm not sure how to fix this myself. @maximdanilchenko if you don't have the time to do it, I'm open to have some tips to try to do it myself.

Problem with request answer parsing

Hello!
I have a problem with your driver when parsing query results. When I request an array of arrays of ints from the database, I get the ints misplaced in the sub-arrays.

This is create table code:
CREATE TABLE IF NOT EXISTS user_logs
(
created_at Int64, user_id Int32, event_id Int32, entity String, intention String, result Array(Array(Int32))
) ENGINE = MergeTree
PARTITION BY (toYYYYMM(toDate(intDiv(created_at, 1000))))
ORDER BY (created_at)

Insert data code:
INSERT INTO user_logs (created_at, user_id, event_id, entity, intention, result) values
[[1579524424420, 3562, 4, 'camera_contract_binding_to_stage', '{}', [[1, 1], [2, 2]]]]

Request:
sql = """SELECT created_at, user_id, event_id, entity, intention, result
FROM user_logs
WHERE hasAny(result, [[toInt32(1), toInt32(1)]])"""

Executing request:
rows: List[aiochclient.records.Record] = await client.fetch(sql)
print([list(row.values()) for row in rows])

Here is what I expect:
[['1579256044578', '3562', '4', 'camera_contract_binding_to_stage', '{}', '[[1, 1], [2, 2]]']]

Here is what I get:
[['1579256044578', '3562', '4', 'camera_contract_binding_to_stage', '{}', '[[1], [1], [2], [2]]']]

I've checked this behavior with 1.2.1 and 1.3.0rc0.

results encoding twice with httpx

With aiohttp we get results in bytes and then decode them in the Fabric*, but httpx returns unicode directly, so we need to encode the result before sending it to the Fabric*.

aiohttp (bytes) -> http_client (keep bytes) -> Fabric* decode() (unicode)
httpx (unicode) -> http_client encode() -> Fabric* decode() (unicode)

We could avoid this by changing the http_client interface and converting to unicode at this level.

aiohttp (bytes) -> http_client decode() -> Fabric*
httpx (unicode) -> http_client -> Fabric*

I think performance should be almost the same for aiohttp, but it should be a boost for httpx.
Thanks

sqlparse is a hard requirement but not listed as such

See the following traceback:

import: 'aiochclient'
Traceback (most recent call last):
  File "/home/conda/feedstock_root/build_artifacts/aiochclient_1582137580250/test_tmp/run_test.py", line 2, in <module>
    import aiochclient
  File "/home/conda/feedstock_root/build_artifacts/aiochclient_1582137580250/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac/lib/python3.6/site-packages/aiochclient/__init__.py", line 1, in <module>
    from aiochclient.client import ChClient
  File "/home/conda/feedstock_root/build_artifacts/aiochclient_1582137580250/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac/lib/python3.6/site-packages/aiochclient/client.py", line 10, in <module>
    from aiochclient.sql import sqlparse
  File "/home/conda/feedstock_root/build_artifacts/aiochclient_1582137580250/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac/lib/python3.6/site-packages/aiochclient/sql.py", line 3, in <module>
    import sqlparse.keywords
ModuleNotFoundError: No module named 'sqlparse'
