jayvynl / django-clickhouse-backend
Django ClickHouse database backend.
License: MIT License
Hi,
I am just starting to use ClickHouse with Django and found this package. It seems like a lifesaver.
I tried to set it up on my MacBook to connect to my ClickHouse Cloud database. I already have an AWS RDS instance and its replica that I am connecting to.
I added this to my DATABASES:
'clickhouse': {
    'ENGINE': 'clickhouse_backend.backend',
    'NAME': 'Demo cluster',
    'HOST': env.str("CLICKHOUSE_HOST", default='localhost'),
    'PORT': env.int("CLICKHOUSE_PORT", default=8443),
    'USER': env.str("CLICKHOUSE_USERNAME", default='default'),
    'PASSWORD': env.str("CLICKHOUSE_PASSWORD", default=''),
    'TEST': {
        'fake_transaction': True
    }
}
I am getting this error:
django.db.utils.OperationalError: Code: 210. Connection reset by peer
I am on:
Django==4.1.4
clickhouse-connect==0.6.3
clickhouse-driver==0.2.6
clickhouse-pool==0.5.3
django-clickhouse-backend==1.0.3
and on:
Python 3.10.9
Any guidance on how to resolve this?
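One plausible cause, offered as a guess: port 8443 is ClickHouse Cloud's HTTPS port, but this backend connects through clickhouse-driver's native TCP protocol, so a handshake over the HTTPS port would plausibly end in "Connection reset by peer". A sketch of settings that may work instead, assuming ClickHouse Cloud's secure native port 9440, a real database name in NAME, and that OPTIONS entries are forwarded to clickhouse-driver:

```python
'clickhouse': {
    'ENGINE': 'clickhouse_backend.backend',
    'NAME': 'default',  # the database name, not the cluster's display name
    'HOST': env.str("CLICKHOUSE_HOST", default='localhost'),
    'PORT': env.int("CLICKHOUSE_PORT", default=9440),  # secure native port, not 8443 (HTTPS)
    'USER': env.str("CLICKHOUSE_USERNAME", default='default'),
    'PASSWORD': env.str("CLICKHOUSE_PASSWORD", default=''),
    'OPTIONS': {
        'secure': True,  # clickhouse-driver TLS flag
    },
}
```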
Is there any way to view the SQL that is generated during migrations?
I get this error when I try to apply migrations and want to understand what is going wrong:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/dbapi/cursor.py", line 111, in execute
    response = execute(
               ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/clickhouse_backend/driver/client.py", line 53, in execute
    rv = self.process_ordinary_query(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/client.py", line 571, in process_ordinary_query
    return self.receive_result(with_column_types=with_column_types,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/client.py", line 204, in receive_result
    return result.get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/result.py", line 50, in get_result
    for packet in self.packet_generator:
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/client.py", line 220, in packet_generator
    packet = self.receive_packet()
             ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/client.py", line 237, in receive_packet
    raise packet.exception
clickhouse_driver.errors.ServerException: Code: 62.
DB::Exception: Syntax error: failed at position 268 ('<'): <clickhouse_backend.models.fields.integer.UInt64Field>)). Expected one of: literal, NULL, number, Bool, true, false, string literal, SELECT query, possibly with UNION, list of union elements, SELECT query, subquery, possibly with UNION, SELECT subquery, SELECT query, WITH, FROM, SELECT, EXPLAIN, token, Comma, ClosingRoundBracket, CAST operator, NOT, INTERVAL, CASE, DATE, TIMESTAMP, tuple, collection of literals, array, asterisk, qualified asterisk, compound identifier, list of elements, identifier, COLUMNS matcher, COLUMNS, qualified COLUMNS matcher, substitution, MySQL-style global variable, end of query. Stack trace:

0. DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000000c07e5a8 in /usr/bin/clickhouse
1. DB::Exception::createDeprecated(String const&, int, bool) @ 0x0000000007aa41c0 in /usr/bin/clickhouse
2. DB::parseQueryAndMovePosition(DB::IParser&, char const*&, char const*, String const&, bool, unsigned long, unsigned long) @ 0x0000000012669bd8 in /usr/bin/clickhouse
3. DB::executeQueryImpl(char const*, char const*, std::shared_ptr<DB::Context>, bool, DB::QueryProcessingStage::Enum, DB::ReadBuffer*) @ 0x00000000112e6eb0 in /usr/bin/clickhouse
4. DB::executeQuery(String const&, std::shared_ptr<DB::Context>, bool, DB::QueryProcessingStage::Enum) @ 0x00000000112e63fc in /usr/bin/clickhouse
5. DB::TCPHandler::runImpl() @ 0x0000000011f4abdc in /usr/bin/clickhouse
6. DB::TCPHandler::run() @ 0x0000000011f5bfc8 in /usr/bin/clickhouse
7. Poco::Net::TCPServerConnection::start() @ 0x0000000014462e44 in /usr/bin/clickhouse
8. Poco::Net::TCPServerDispatcher::run() @ 0x0000000014464360 in /usr/bin/clickhouse
9. Poco::PooledThread::run() @ 0x00000000145a3b7c in /usr/bin/clickhouse
10. Poco::ThreadImpl::runnableEntry(void*) @ 0x00000000145a1d24 in /usr/bin/clickhouse
11. start_thread @ 0x0000000000007624 in /lib/libpthread.so.0
12. ? @ 0x00000000000d149c in /lib/libc.so.6
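Two observations may help. First, to see the SQL a migration would run, Django's sqlmigrate command accepts a --database flag (e.g. python manage.py sqlmigrate yourapp 0001 --database clickhouse). Second, the failing DDL contains the literal repr of a field object (<clickhouse_backend.models.fields.integer.UInt64Field>), which suggests a Python object was interpolated into the SQL string instead of being compiled. That failure mode is easy to reproduce in plain Python; the class below is a hypothetical stand-in, not the package's real code:

```python
class Unrendered:
    """Hypothetical stand-in: any object whose repr leaks into SQL."""
    def __repr__(self):
        return "<clickhouse_backend.models.fields.integer.UInt64Field>"

# "%s" string formatting pastes the repr straight into the statement;
# ClickHouse then fails to parse at the '<', as in the traceback above.
sql = "ALTER TABLE event ADD COLUMN n UInt64 DEFAULT %s" % (Unrendered(),)
print(sql)
```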
Hi! In my Django models I define req_start_time = models.DateTime64Field(precision=3). After running migrate, the column type in the database is req_start_time DateTime64(3, 'UTC'). When I then insert req_start_time=1684278000000, ClickHouse stores a time that is 8 hours off.
How should I set the time zone, e.g. to Asia/Shanghai?
The repository defines the type like this:
"DateTime64Field": "DateTime64(%(precision)s, 'UTC')" if settings.USE_TZ else "DateTime64(%(precision)s)",
Could this be extended to support a custom time zone? For example: req_start_time = models.DateTime64Field(precision=3, time_zone="Asia/Shanghai")
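Judging from the template above, the column type is just a formatted string, so a per-field time zone could in principle be rendered the same way. A minimal sketch of the string the backend would need to emit (the time_zone parameter is hypothetical, not part of the package today):

```python
def datetime64_db_type(precision, time_zone=None):
    """Render a ClickHouse DateTime64 column type.

    Mirrors the backend's "DateTime64(%(precision)s, 'UTC')" template,
    with an optional per-field time zone (hypothetical extension).
    """
    if time_zone:
        return f"DateTime64({precision}, '{time_zone}')"
    return f"DateTime64({precision})"

print(datetime64_db_type(3, "Asia/Shanghai"))  # DateTime64(3, 'Asia/Shanghai')
```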
Add this to operations.py:
def format_for_duration_arithmetic(self, sql):
    # https://clickhouse.com/docs/en/sql-reference/data-types/special-data-types/interval
    return "INTERVAL %s MICROSECOND" % sql
Without it, our project crashes.
Could you also enable pull requests? I planned to open a PR, but they are disabled.
Is the syntax for creating array fields correct? Migrations currently fail with the syntax Int64[] but succeed with Array(Int64).
Describe the bug
When I try to apply migrations (python src/manage.py migrate --database clickhouse) for the ClickHouse database, I get an error like this:
File "/usr/local/lib/python3.11/site-packages/clickhouse_driver/dbapi/cursor.py", line 117, in execute
raise OperationalError(orig)
django.db.utils.OperationalError: Code: 81.
DB::Exception: Database INFORMATION_SCHEMA doesn't exist. Stack trace:
To Reproduce
Project structure
docker/
    scripts/
        core_app_startup.sh
    docker-compose.env
    docker-compose.yml
    Dockerfile
src/
    core/
    parser/
        migrations/
            0001_initial.py
        models.py
    project/
        settings.py
        dbrouters.py
    manage.py
python src/manage.py migrate --database default
python src/manage.py migrate --database clickhouse
python src/manage.py runserver 0.0.0.0:8000
POSTGRES_HOST="postgres_database"
POSTGRES_PORT="5432"
POSTGRES_DB="postgres"
POSTGRES_USER="postgres"
POSTGRES_PASSWORD="postgres"
API_SECRET_KEY="***"
CLICKHOUSE_DB="clickhouse_database"
CLICKHOUSE_USER="clickhouse"
CLICKHOUSE_PASSWORD="clickhouse"
CLICKHOUSE_HOST="clickhouse"
CLICKHOUSE_PORT="9000"
version: '3.8'

services:
  api:
    container_name: dt_api_container
    build:
      dockerfile: docker/Dockerfile
      context: ..
    command: [ "/app/docker/scripts/core_app_startup.sh" ]
    ports:
      - "8000:8000"
    volumes:
      - ${PWD}/src:/app/src
    depends_on:
      - postgres_db
  postgres_db:
    image: postgres:15.4
    container_name: dt_postgres_db_container
    env_file:
      - ./docker-compose.env
    volumes:
      - ${PWD}/.postgres_data:/var/lib/postgresql/data/
    ports:
      - "5432:5432"
  clickhouse_db:
    image: yandex/clickhouse-server:21.3
    container_name: dt_clickhouse_db_container
    hostname: clickhouse
    env_file:
      - ./docker-compose.env
    volumes:
      - ${PWD}/.clickhouse_data:/var/lib/clickhouse
    ports:
      - "8123:8123"
      - "9011:9000"
FROM python:3.11.4
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_ROOT_USER_ACTION=ignore
WORKDIR /app
COPY ../requirements.txt ./requirements.txt
COPY ../docker/docker-compose.env ./docker/docker-compose.env
COPY ../docker/scripts/*.sh ./docker/scripts/
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
RUN chmod a+x docker/scripts/*.sh
# Generated by Django 4.2.4 on 2023-09-13 14:54
import clickhouse_backend.models
from django.db import migrations, models
import django.utils.timezone
class Migration(migrations.Migration):

    dependencies = []

    operations = [
        migrations.CreateModel(
            name='Event',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('ip', clickhouse_backend.models.GenericIPAddressField(default='::')),
                ('ipv4', clickhouse_backend.models.GenericIPAddressField(default='127.0.0.1')),
                ('ip_nullable', clickhouse_backend.models.GenericIPAddressField(null=True)),
                ('port', clickhouse_backend.models.UInt16Field(default=0)),
                ('protocol', clickhouse_backend.models.StringField(default='', low_cardinality=True)),
                ('content', clickhouse_backend.models.StringField(default='')),
                ('timestamp', clickhouse_backend.models.DateTime64Field(default=django.utils.timezone.now)),
                ('created_at', clickhouse_backend.models.DateTime64Field(auto_now_add=True)),
                ('action', clickhouse_backend.models.EnumField(choices=[(1, 'Pass'), (2, 'Drop'), (3, 'Alert')], default=1)),
            ],
            options={
                'verbose_name': 'Network event',
                'db_table': 'event',
                'ordering': ['-id'],
                'engine': clickhouse_backend.models.ReplacingMergeTree(enable_mixed_granularity_parts=1, index_granularity=1024, index_granularity_bytes=1048576, order_by=['id'], partition_by=models.Func('timestamp', function='toYYYYMMDD')),
                'indexes': [clickhouse_backend.models.Index(fields=['ip'], granularity=4, name='ip_set_idx', type=clickhouse_backend.models.Set(1000)), clickhouse_backend.models.Index(fields=['ipv4'], granularity=1, name='ipv4_bloom_idx', type=clickhouse_backend.models.BloomFilter(0.001))],
            },
        ),
        migrations.AddConstraint(
            model_name='event',
            constraint=models.CheckConstraint(check=models.Q(('port__gte', 0), ('port__lte', 65535)), name='port_range'),
        ),
    ]
from clickhouse_backend import models
from django.db.models import CheckConstraint, Func, IntegerChoices, Q
from django.utils import timezone
class Event(models.ClickhouseModel):
    class Action(IntegerChoices):
        PASS = 1
        DROP = 2
        ALERT = 3

    ip = models.GenericIPAddressField(default="::")
    ipv4 = models.GenericIPAddressField(default="127.0.0.1")
    ip_nullable = models.GenericIPAddressField(null=True)
    port = models.UInt16Field(default=0)
    protocol = models.StringField(default="", low_cardinality=True)
    content = models.StringField(default="")
    timestamp = models.DateTime64Field(default=timezone.now)
    created_at = models.DateTime64Field(auto_now_add=True)
    action = models.EnumField(choices=Action.choices, default=Action.PASS)

    class Meta:
        verbose_name = "Network event"
        ordering = ["-id"]
        db_table = "event"
        engine = models.ReplacingMergeTree(
            order_by=["id"],
            partition_by=Func("timestamp", function="toYYYYMMDD"),
            index_granularity=1024,
            index_granularity_bytes=1 << 20,
            enable_mixed_granularity_parts=1,
        )
        indexes = [
            models.Index(
                fields=["ip"],
                name="ip_set_idx",
                type=models.Set(1000),
                granularity=4,
            ),
            models.Index(
                fields=["ipv4"],
                name="ipv4_bloom_idx",
                type=models.BloomFilter(0.001),
                granularity=1,
            ),
        ]
        constraints = (
            CheckConstraint(
                name="port_range",
                check=Q(port__gte=0, port__lte=65535),
            ),
        )
import os
import sys
from pathlib import Path

from dotenv import load_dotenv

# Base project directory (== repository root) -------------------------------------------------------------------------#
BASE_DIR = Path(__file__).resolve().parent.parent.parent

# Load environment variables ------------------------------------------------------------------------------------------#
if 'test' in sys.argv:
    load_dotenv(dotenv_path=f'{BASE_DIR}/docker/docker-compose.example.env')
else:
    load_dotenv(dotenv_path=f'{BASE_DIR}/docker/docker-compose.env')

# Core settings -------------------------------------------------------------------------------------------------------#
SECRET_KEY = os.getenv('API_SECRET_KEY')
DEBUG = True
ALLOWED_HOSTS = [
    '127.0.0.1',
    '0.0.0.0'
]
CORS_ALLOWED_ORIGINS = [
    'http://127.0.0.1:8080',
]
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'rest_framework',
    'rest_framework.authtoken',
    'dj_rest_auth',
    'django.contrib.sites',
    'allauth',
    'allauth.account',
    'dj_rest_auth.registration',
    'corsheaders',
    'core.apps.CoreConfig',
    'parser.apps.ParserConfig'
]
SITE_ID = 1
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
ACCOUNT_AUTHENTICATION_METHOD = "email"
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_USERNAME_REQUIRED = False
ACCOUNT_EMAIL_VERIFICATION = 'optional'
AUTHENTICATION_BACKENDS = [
    'django.contrib.auth.backends.ModelBackend',
    'allauth.account.auth_backends.AuthenticationBackend',
]
MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'corsheaders.middleware.CorsMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'project.urls'
WSGI_APPLICATION = 'project.wsgi.application'

# Templates (HTML) ----------------------------------------------------------------------------------------------------#
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [
            os.path.join(BASE_DIR, 'templates')
        ],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

# Configured databases ------------------------------------------------------------------------------------------------ #
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.getenv('POSTGRES_DB'),
        'USER': os.getenv('POSTGRES_USER'),
        'PASSWORD': os.getenv('POSTGRES_PASSWORD'),
        'HOST': os.getenv('POSTGRES_HOST'),
        'PORT': os.getenv('POSTGRES_PORT')
    },
    "clickhouse": {
        "ENGINE": "clickhouse_backend.backend",
        "NAME": os.getenv('CLICKHOUSE_DB'),
        "USER": os.getenv('CLICKHOUSE_USER'),
        "PASSWORD": os.getenv('CLICKHOUSE_PASSWORD'),
        "HOST": os.getenv('CLICKHOUSE_HOST'),
        'PORT': os.getenv('CLICKHOUSE_PORT')
    }
}
DATABASE_ROUTERS = ["project.dbrouters.ClickHouseRouter"]

# Password validation ------------------------------------------------------------------------------------------------- #
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]

# Localization (language, date and time formats) ---------------------------------------------------------------------- #
LANGUAGE_CODE = 'ru-ru'
DATETIME_FORMAT = 'd-m-Y H:i:s'
TIME_ZONE = 'Europe/Chisinau'
USE_L10N = False
USE_I18N = True
USE_TZ = True

# Static files (CSS, JavaScript, Images) ------------------------------------------------------------------------------ #
STATIC_URL = 'static/'

# Default primary key type for tables created from models -------------------------------------------------------------#
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'

# REST framework ------------------------------------------------------------------------------------------------------ #
REST_FRAMEWORK = {
    'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
    'PAGE_SIZE': 10,
    'DEFAULT_PERMISSION_CLASSES': [
        'rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly',
    ],
    'DEFAULT_AUTHENTICATION_CLASSES': [
        'rest_framework.authentication.TokenAuthentication',
    ],
}
REST_AUTH = {
    'SESSION_LOGIN': False,
}
from clickhouse_backend.models import ClickhouseModel


def get_subclasses(class_):
    classes = class_.__subclasses__()
    index = 0
    while index < len(classes):
        classes.extend(classes[index].__subclasses__())
        index += 1
    return list(set(classes))


class ClickHouseRouter:
    def __init__(self):
        self.route_model_names = set()
        for model in get_subclasses(ClickhouseModel):
            if model._meta.abstract:
                continue
            self.route_model_names.add(model._meta.label_lower)

    def db_for_read(self, model, **hints):
        if (model._meta.label_lower in self.route_model_names
                or hints.get("clickhouse")):
            return "clickhouse"
        return None

    def db_for_write(self, model, **hints):
        if (model._meta.label_lower in self.route_model_names
                or hints.get("clickhouse")):
            return "clickhouse"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if (f"{app_label}.{model_name}" in self.route_model_names
                or hints.get("clickhouse")):
            return db == "clickhouse"
        elif db == "clickhouse":
            return False
        return None
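The recursive subclass walk the router relies on can be checked in isolation with plain classes (no Django required). This sketch copies get_subclasses from the router above and exercises it on a toy hierarchy standing in for ClickhouseModel subclasses:

```python
def get_subclasses(class_):
    # Breadth-first walk of the whole subclass tree, copied from the router.
    classes = class_.__subclasses__()
    index = 0
    while index < len(classes):
        classes.extend(classes[index].__subclasses__())
        index += 1
    return list(set(classes))


# Toy hierarchy standing in for ClickhouseModel and its models.
class Base: pass
class Child(Base): pass
class Grandchild(Child): pass

print(sorted(c.__name__ for c in get_subclasses(Base)))  # ['Child', 'Grandchild']
```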
Note: The ClickHouse database container works fine (I can access the database from the PyCharm UI).
Expected behavior
The table event is created.
Versions
ClickHouse (SELECT version()): 21.3.20.1
Python: 3.11.4
clickhouse-driver: 0.2.6
Django: 4.2.4
django-clickhouse-backend: 1.1.1

I am using OAuth to authenticate the user. On the first login of any user I get an exception:
"ValueError at /oauth2/callback User: [email protected] needs to have a value for field "id" before this many-to-many relationship can be used."
Request Method: GET
Request URL: http://localhost:8000/oauth2/callback?code=0.Aa8AyJqK8agTxUGeKQfNkaLngKyK9yFuRJlLowdUuQ9I7zOvAAA.AgABAAIAAAD--DLA3VO7QrddgJg7WevrAgDs_wUA9P-Q7GuTyy8frIzyqyW92Mp9vr-fkoVb-0PTvBG0gIC1pcWPLSdjvNxeFSRykIW2CDlOjH3hHUha20WZ-6J9BZVwFwLVrksxT1N27gjGEy8Q4aVjfjX_OwJEbJVD0ZNePRVWiquxgTzaa10t3hT6X0SeVp_sQcktyHoLgJnjlOvq2zL0Cp01L_FrS2lrDvZzA0rviS8yrh6z2hmB2imoSGtEOMOu3zXNoDbjfAdMPQb3K2mAcK129X0D78C_uSEJ2t6tpVxqOnuQA7v_NSpMPvSWEgQX5pR7UtvkU_vzf2AMzWa-S3PxBs2GD2nJ8WPuxR-XY6lADvhHdWm82OVgiQuCmojMjkzvYpk7EXZr7f9Q1tlxE9zC6SxnDcBUr1GQZ29KBAPlYA7CcEnqxuKL99bgGU55B7Vx3G5crGJKK80oXrMnyuWG4UrjLipzHmzWMqa75ppqEC-COJy8DpaLeQkF548scn2fr5_I_w0csyFUCqRSadJXllGafZDA2HuOOe1ozXQztu8kanojALpzNoOQlPbdoDTUnVR3ybjqxd5d2RgIJqQBIn3tgLnaLNRV_r15XoxdJKAlUJ2H5QV8qp1xTRnDXTyU&state=L2luZGV4Lmh0bWw%3d&session_state=778f99e7-882e-47f7-a107-281cb2c2260a
Django Version: 4.1.1
Exception Type: ValueError
Exception Value:
"<User: [email protected]>" needs to have a value for field "id" before this many-to-many relationship can be used.
Exception Location: /home/xxxxx/env/lib64/python3.11/site-packages/django/db/models/fields/related_descriptors.py, line 979, in __init__
Raised during: django_auth_adfs.views.OAuth2CallbackView
Python Executable: /home/xxxxx/env/bin/python3
Python Version: 3.11.4
Python Path:
['/home/xxx/work/repos/project/app',
'/usr/lib64/python311.zip',
'/usr/lib64/python3.11',
'/usr/lib64/python3.11/lib-dynload',
'/home/xxx/env/lib64/python3.11/site-packages',
'/home/xxx/work/repos/project/tests',
'/home/xxx/work/repos/project/pymodules',
'/home/xxx/env/lib/python3.11/site-packages']
On Django console:
Connected to ClickHouse server version 23.5.3, revision: 54462
Query: SELECT "auth_user"."id", "auth_user"."password", "auth_user"."last_login", "auth_user"."is_superuser", "auth_user"."username", "auth_user"."first_name", "auth_user"."last_name", "auth_user"."email", "auth_user"."is_staff", "auth_user"."is_active", "auth_user"."date_joined" FROM "auth_user" WHERE "auth_user"."username" = '[email protected]' LIMIT 21
Block "" send time: 0.000024
Query: INSERT INTO "auth_user"("password", "last_login", "is_superuser", "username", "first_name", "last_name", "email", "is_staff", "is_active", "date_joined") VALUES
Block "" send time: 0.000030
Block "" send time: 0.000224
Block "" send time: 0.000032
User '[email protected]' has been created.
Attribute 'first_name' for instance '[email protected]' was set to 'Test'.
Attribute 'last_name' for instance '[email protected]' was set to 'User E2E'.
Attribute 'email' for instance '[email protected]' was set to '[email protected]'.
Internal Server Error: /oauth2/callback
....
But the entry seems to have been created somehow: if I go back to the start page, everything works fine, the user is authenticated and can access the site. Login/logout works fine, no issues.
If a second user logs in, the same happens. Afterwards I get a second problem (not sure if it is related to the missing id, which is why I put it in here):
get() returned more than one User -- it returned 2!
Request Method: GET
Request URL: http://localhost:8000/oauth2/callback?code=0.Aa8AyJqK8agTxUGeKQfNkaLngKyK9yFuRJlLowdUuQ9I7zOvAAA.AgABAAIAAAD--DLA3VO7QrddgJg7WevrAgDs_wUA9P8uM_O2KwxsGRB2rqMqALBcUu1_IlMBqqmTqBCiuP1wbmVCTyMrWUK5-chrWgWyAhTFGPa4wgKCvnlSx6A_uUDEG1MvLsTQwjFH2TP3dWgIfWZDdV2AwLGk_L20BepiAkSVB_TtE9tY3vBU30PPFug18jL9hyWSgoZPBvoHuCaECr_ANRBUnZjHmSf6PgW1bqT05LQixHwp7WRcbxYTNdRa_WjDt-ts47iuMlO3hX2lZ18spPF2VSmfZjp9poJrjSANWuoe7Do_oSsUNi70qDzcltqvrn1IhLN8YHYBFwg1SaNgfEJiQfj5z0lubymID1Dd6wyiq2qclm0ycfoW2E7Gii2B3W81oSRr5KEh9JDkKMa3D9qHGjzbZDwr7m3E8-GQ8ZCH2OdUTnJWagMi1nQYRnmOSnNx_MSjSQIRUHMaf_l_Jk7XdNCZ0tzzOxMGGiwR5DIIh0nqT28BGpnb50mUYehh8ETzvcWY51BqF1QPl8rLIFUb7kqzKF156Xi_W7FJP3Rgnm9qAkBvYDp40x8Qebzjs0egPTLFcPJYMIvf9ED1VKeJrA2i4n4MlqTvmF2sKd02eSyMekSUWOsZagKgNdEZwrdXmw&state=L2luZGV4Lmh0bWw%3d&session_state=5860aa97-abc1-4b8e-9764-3b101f4fadd9
Django Version: 4.1.1
Exception Type: MultipleObjectsReturned
Exception Value:
get() returned more than one User -- it returned 2!
Exception Location: /home/xxx/env/lib64/python3.11/site-packages/django/db/models/query.py, line 653, in get
Raised during: django_auth_adfs.views.OAuth2CallbackView
Python Executable: /home/xxx/env/bin/python3
Python Version: 3.11.4
Python Path:
['/home/xxx/work/repos/projekt/app',
'/usr/lib64/python311.zip',
'/usr/lib64/python3.11',
'/usr/lib64/python3.11/lib-dynload',
'/home/xxx/env/lib64/python3.11/site-packages',
'/home/xxx/work/repos/projekt/tests',
'/home/xxx/work/repos/projekt/pymodules',
'/home/xxx/env/lib/python3.11/site-packages']
Server time: Tue, 18 Jul 2023 11:56:54 +0000
In the database I see two identical (!) entries, same username, same timestamp...
When using an update query on the main (postgres) database, I get the following error:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <django.db.models.sql.subqueries.UpdateQuery object at 0x7fa278f94130>
def clone(self):
> obj = super().clone()
E TypeError: super(type, obj): obj must be an instance or subtype of type
.../lib/python3.10/site-packages/clickhouse_backend/models/sql/query.py:20: TypeError
I believe this code is the reason:
for query_class in [subqueries.UpdateQuery, subqueries.DeleteQuery]:
    for attr in ['clone', 'explain']:
        setattr(query_class, attr, getattr(Query, attr))
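The zero-argument super() inside Query.clone is compiled against Query's __class__ cell, so copying the method onto a class that is not a Query subclass reproduces the error in plain Python (toy class names, not the real Django classes):

```python
class Query:
    def clone(self):
        # Zero-argument super() is bound to Query via the __class__ cell.
        return super().clone()

class UpdateQuery:  # deliberately NOT a subclass of Query
    pass

# Copy the method across unrelated classes, as the backend's loop does:
UpdateQuery.clone = Query.clone

error = None
try:
    UpdateQuery().clone()
except TypeError as exc:
    # TypeError: super(type, obj): obj must be an instance or subtype of type
    error = exc
print(error)
```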
Describe the bug
python manage.py migrate not working
To Reproduce
Create a new django app try connecting to Clickhouse then run migrate command
Expected behavior
It should create all default tables without any error
Logs
$ python manage.py migrate
Operations to perform:
Apply all migrations: admin, auth, contenttypes, polls, sessions
Running migrations:
Traceback (most recent call last):
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\dbapi\cursor.py", line 111, in execute
response = execute(
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_backend\driver\client.py", line 53, in execute
rv = self.process_ordinary_query(
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\client.py", line 571, in process_ordinary_query
return self.receive_result(with_column_types=with_column_types,
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\client.py", line 204, in receive_result
return result.get_result()
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\result.py", line 50, in get_result
for packet in self.packet_generator:
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\client.py", line 220, in packet_generator
packet = self.receive_packet()
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\client.py", line 237, in receive_packet
raise packet.exception
clickhouse_driver.errors.ServerException: Code: 57.
DB::Exception: Table django_local.django_migrations already exists. Stack trace:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\django\db\backends\utils.py", line 87, in _execute
return self.cursor.execute(sql)
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_backend\driver\connection.py", line 105, in execute
super().execute(operation, parameters)
File "C:\Users\vignesh\AppData\Local\Programs\Python\Python38\lib\site-packages\clickhouse_driver\dbapi\cursor.py", line 117, in execute
raise OperationalError(orig)
clickhouse_driver.dbapi.errors.OperationalError: Code: 57.
DB::Exception: Table django_local.django_migrations already exists. Stack trace:
Versions
SELECT version() query: 23.10.1.1976

When executing the command "python manage.py migrate --database clickhouse", an error occurs:
"django.db.utils.OperationalError: Code: 102. Unexpected packet from server localhost:8123 (expected Hello or Exception, got Unknown packet)"
My settings:
INSTALLED_APPS = [
    # ...
    'clickhouse_backend',
    # ...
]
DATABASES = {
    'default': {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "test",
        "USER": "postgres",
        "PASSWORD": "",
        "HOST": "localhost",
        "PORT": "5432",
    },
    "clickhouse": {
        "ENGINE": "clickhouse_backend.backend",
        "NAME": "test",
        "HOST": "localhost",
        "PORT": "8123",
        "USER": "default",
        "PASSWORD": "",
    }
}
DATABASE_ROUTERS = ['dbrouters.ClickHouseRouter']
Clickhouse is deployed in a docker container. I have tried different versions.
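A likely explanation, offered as a guess: 8123 is ClickHouse's HTTP interface, while this backend connects through clickhouse-driver's native TCP protocol (default port 9000), which would produce exactly this "Unexpected packet" handshake error. A sketch of the database entry with the native port, assuming the container exposes 9000 locally:

```python
"clickhouse": {
    "ENGINE": "clickhouse_backend.backend",
    "NAME": "test",
    "HOST": "localhost",
    "PORT": "9000",  # native TCP port; 8123 is the HTTP interface
    "USER": "default",
    "PASSWORD": "",
}
```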
Describe the bug
It seems that GROUP BY contains extra fields.
To Reproduce
qs = model.objects.filter(
    timestamp__gte=start, sku=self.kwargs.get("sku")
).annotate(
    c=Count("sku")
).values(
    "sku",
    "marketplace",
)
With this queryset I get the following SQL:
SELECT
"prices"."sku", "prices"."marketplace"
FROM "prices"
WHERE ("prices"."sku" = 126129665 AND "prices"."timestamp" >= 2023-08-29 15:59:08+00:00)
GROUP BY "prices"."timestamp", "prices"."sku", "prices"."name", "prices"."priceU", "prices"."salePriceU", "prices"."marketplace", "prices"."url", "prices"."brandId", "prices"."categoryId", "prices"."rating", "prices"."saleQuantity", "prices"."review", "prices"."brand"
Expected behavior
GROUP BY should only contain "sku", "marketplace"
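For what it's worth, this matches Django's documented clause ordering: annotating whole model instances groups over the model's selected columns, while calling .values() before .annotate() restricts the GROUP BY to the listed fields. A sketch of the reordered queryset (same names as the repro above; not runnable outside a configured project):

```python
# .values() before .annotate() so GROUP BY covers only sku, marketplace.
qs = (
    model.objects
    .filter(timestamp__gte=start, sku=self.kwargs.get("sku"))
    .values("sku", "marketplace")
    .annotate(c=Count("sku"))
)
```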
Versions
Describe the bug
Migrations will be run several times if the Django application connects to a specific cluster shard.
I manually added the table on the cluster to fix the issue:
CREATE TABLE IF NOT EXISTS django_migrations_local ON CLUSTER '{cluster}'
(
`id` Int64,
`app` FixedString(255),
`name` FixedString(255),
`applied` DateTime64(6, 'UTC')
)
ENGINE = ReplicatedMergeTree('/clickhouse/tables/{cluster}/default/{table}', '{replica}')
ORDER BY id
SETTINGS index_granularity = 8192;
CREATE TABLE IF NOT EXISTS django_migrations ON CLUSTER '{cluster}' AS django_migrations_local
ENGINE = Distributed('{cluster}', default, django_migrations_local, id);
However, this assumes that you have a macro for the cluster name. That way, you ensure the migration data is synced on every shard and every replica.
insert_distributed_sync=1 might be needed before running the migration.
Describe the bug
partitioning via toYYYYMMDD raises TypeError: Value after * must be an iterable, not toYYYYMMDD.
To Reproduce
from clickhouse_backend import models


class Event(models.ClickhouseModel):
    ip = models.GenericIPAddressField(default='::')
    ipv4 = models.IPv4Field(default="127.0.0.1")
    ip_nullable = models.GenericIPAddressField(null=True)
    port = models.UInt16Field(default=0)
    protocol = models.StringField(default='', low_cardinality=True)
    content = models.JSONField()
    timestamp = models.DateTime64Field()

    class Meta:
        engine = models.ReplacingMergeTree(
            order_by=['id'],
            partition_by=models.toYYYYMMDD('timestamp')
        )
The exception is raised during migrate.
Expected behavior
No exception.
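Elsewhere in this thread the same partition expression is written with django.db.models.Func rather than a bare function wrapper, which may sidestep the error. A sketch of the Meta block under that assumption:

```python
from django.db.models import Func
from clickhouse_backend import models


class Event(models.ClickhouseModel):
    timestamp = models.DateTime64Field()

    class Meta:
        engine = models.ReplacingMergeTree(
            order_by=['id'],
            # Express toYYYYMMDD as a Func, as in the package's own examples.
            partition_by=Func('timestamp', function='toYYYYMMDD'),
        )
```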
Versions
DATABASES = {
    'clickhouse': {
        'ENGINE': 'clickhouse_backend.backend',
        'NAME': 'demo',
        'HOST': '127.0.0.1',
        "PORT": '9000',
        "USER": "default",
        "PASSWORD": "demo",
        'OPTIONS': {
            "connections_min": 40,
            "connections_max": 100,
            "settings": {
                "mutations_sync": 1,
            },
        },
    }
}
clickhouse_backend.driver.pool.ClickhousePool() got multiple values for keyword argument 'connections_min'
Thank you for this great package! Superb work.
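"got multiple values for keyword argument" generally means the same parameter reaches a function both positionally and via **kwargs, here presumably because connections_min from OPTIONS collides with a positional argument when the pool is constructed. A plain-Python illustration with a hypothetical signature (not the real ClickhousePool):

```python
def make_pool(connections_min=10, connections_max=20, **settings):
    """Hypothetical stand-in for a pool constructor (not the real signature)."""
    return connections_min, connections_max

options = {"connections_min": 40, "connections_max": 100}
error = None
try:
    # The same name arrives positionally AND inside **options:
    make_pool(40, **options)
except TypeError as exc:
    error = exc
print(error)  # ... got multiple values for argument 'connections_min'
```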
Describe the bug
I am running an raw SQL statement to insert records to one table based on records from another table (INSERT...SELECT). I am passing no data to the execute() statement, but because the statement starts with "INSERT INTO", the backend's client class identifies it as an insert statement that needs data, so it calls process_insert_query
. This calls receive_sample_block
, expecting ClickHouse to send a DATA packet. But since ClickHouse does not need any data for the insert, a PROGRESS packet is sent instead. The result is an exception:
Traceback (most recent call last):
File "zzz/lib/python3.8/site-packages/clickhouse_driver/dbapi/cursor.py", line 111, in execute
response = execute(
File "zzz/lib/python3.8/site-packages/clickhouse_backend/driver/client.py", line 36, in execute
rv = self.process_insert_query(
File "zzz/lib/python3.8/site-packages/clickhouse_driver/client.py", line 596, in process_insert_query
sample_block = self.receive_sample_block()
File "zzz/lib/python3.8/site-packages/clickhouse_driver/client.py", line 623, in receive_sample_block
raise errors.UnexpectedPacketFromServerError(message)
clickhouse_driver.errors.UnexpectedPacketFromServerError: Code: 102. Unexpected packet from server localhost:9000 (expected Data, Exception, Log or TableColumns, got Progress)
The fix should be to call process_ordinary_query when the user did not provide any data to insert, even if the statement begins with "INSERT INTO".
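The proposed dispatch can be sketched as a pure function (a hypothetical helper illustrating the fix, not the backend's actual code):

```python
def needs_insert_data(sql, params):
    """Return True only when the statement is an INSERT that expects
    client-side data blocks (proposed fix: require params as well)."""
    is_insert = sql.lstrip().lower().startswith("insert")
    return is_insert and params is not None

# INSERT ... SELECT with no params goes down the ordinary-query path:
print(needs_insert_data("INSERT INTO t SELECT * FROM s", None))     # False
print(needs_insert_data("INSERT INTO t (a) VALUES", [(1,), (2,)]))  # True
```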
Versions
Describe the bug
Because Django creates a new connection object for every thread, the connection pool does not take effect.
To Reproduce
Open django console.
import threading
from django.db import connection

def print_connection():
    connection.ensure_connection()
    print(f"thread {threading.current_thread().name}: {repr(connection.connection)}")

print_connection()
t = threading.Thread(target=print_connection)
t.start()
t.join()
# thread MainThread: <connection object at 0x7f303d62d480; closed: False>
# thread Thread-62 (print_connection): <connection object at 0x7f303d5e3400; closed: False>
Expected behavior
Share the connection object between threads so that the connection pool takes effect.
Versions
Hi! Does this package support Django REST Framework?
Describe the bug
I get the following error when running python manage.py migrate --database clickhouse:
DB::Exception: There's no column 'django_migrations.deleted' in table 'django_migrations': While processing django_migrations.deleted: While processing SELECT django_migrations.id, django_migrations.app, django_migrations.name, django_migrations.applied, django_migrations.deleted FROM django_migrations WHERE NOT django_migrations.deleted. Stack trace:
To Reproduce
I do not know what changed; I just ran python manage.py migrate --database clickhouse.
Expected behavior
No errors.
Versions
SELECT version() query.

When connecting to a distributed cluster in production, the database and tables are created only on the local node and are not synced to the other nodes or their distributed tables; the same applies to inserted data.
Describe the bug
Using models.ReplicatedReplacingMergeTree engine reported an error:
AttributeError: 'ReplicatedReplacingMergeTree' object has no attribute 'expressions'
To Reproduce
class Student(models.ClickhouseModel):
    name = models.StringField()
    address = models.StringField()
    score = models.Int8Field()

    class Meta:
        engine = models.ReplicatedReplacingMergeTree(
            "/clickhouse/tables/{uuid}/{shard}",
            "{replica}",
            order_by="id"
        )
        cluster = "cluster"
Expected behavior
The same behavior as when using models.ReplicatedMergeTree.
Versions
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/app/./manage.py", line 22, in <module>
main()
File "/app/./manage.py", line 18, in main
execute_from_command_line(sys.argv)
File "/app/pkgs/django/core/management/__init__.py", line 446, in execute_from_command_line
utility.execute()
File "/app/pkgs/django/core/management/__init__.py", line 440, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/app/pkgs/django/core/management/base.py", line 402, in run_from_argv
self.execute(*args, **cmd_options)
File "/app/pkgs/django/core/management/base.py", line 448, in execute
output = self.handle(*args, **options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/core/management/base.py", line 96, in wrapped
res = handle_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/core/management/commands/migrate.py", line 376, in handle
emit_post_migrate_signal(
File "/app/pkgs/django/core/management/sql.py", line 52, in emit_post_migrate_signal
models.signals.post_migrate.send(
File "/app/pkgs/django/dispatch/dispatcher.py", line 176, in send
return [
^
File "/app/pkgs/django/dispatch/dispatcher.py", line 177, in <listcomp>
(receiver, receiver(signal=self, sender=sender, **named))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/contrib/auth/management/__init__.py", line 51, in create_permissions
create_contenttypes(
File "/app/pkgs/django/contrib/contenttypes/management/__init__.py", line 142, in create_contenttypes
ContentType.objects.using(using).bulk_create(cts)
File "/app/pkgs/django/db/models/query.py", line 816, in bulk_create
returned_columns = self._batched_insert(
^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/db/models/query.py", line 1825, in _batched_insert
self._insert(
File "/app/pkgs/django/db/models/query.py", line 1791, in _insert
return query.get_compiler(using=using).execute_sql(returning_fields)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/clickhouse_backend/models/sql/compiler.py", line 150, in execute_sql
cursor.execute(sql, params)
File "/app/pkgs/django/db/backends/utils.py", line 102, in execute
return super().execute(sql, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/db/backends/utils.py", line 67, in execute
return self._execute_with_wrappers(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/db/backends/utils.py", line 80, in _execute_with_wrappers
return executor(sql, params, many, context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/django/db/backends/utils.py", line 84, in _execute
with self.db.wrap_database_errors:
File "/app/pkgs/django/db/utils.py", line 91, in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
File "/app/pkgs/django/db/backends/utils.py", line 89, in _execute
return self.cursor.execute(sql, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pkgs/clickhouse_backend/driver/connection.py", line 105, in execute
super().execute(operation, parameters)
File "/app/pkgs/clickhouse_driver/dbapi/cursor.py", line 117, in execute
raise OperationalError(orig)
django.db.utils.OperationalError: Code: 53. Type mismatch in VALUES section. Repeat query with types_check=True for detailed info. Column id: 'i' format requires -2147483648 <= number <= 2147483647
My base model:
class UUIDBasedModel(models.ClickhouseModel):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)

    class Meta:
        abstract = True
The table is created in the database, but this exception is raised.
cityHash64 is the most common function used in a Distributed table to shard data based on the sharding key.
ClickhouseManager is importable from clickhouse_backend.models, and its settings query method is available (its deconstruct method was also changed). JSON support is now available in ClickHouse.
Is it possible to use the "Integration Engines" from https://clickhouse.com/docs/en/engines/table-engines ?
For example:
ENGINE = PostgreSQL('host:port', 'database', 'table', 'user', 'password'[, `schema`]);
There are two places where field.null is handled:
if new_rel.field.null:
    rel_type = "Nullable(%s)" % rel_type
if field.null:
    sql = "Nullable(%s)" % sql
As a result, I get Nullable(Nullable(UUID)), which turns into:
DB::Exception: Nested type Nullable(UUID) cannot be inside Nullable type.
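A simple guard (my sketch of a possible fix, not the project's actual patch) would make the wrapping idempotent so the two code paths cannot double-wrap the type:

```python
def make_nullable(db_type):
    """Wrap a ClickHouse type in Nullable(...) at most once."""
    if db_type.startswith("Nullable("):
        return db_type  # already nullable; avoid Nullable(Nullable(...))
    return "Nullable(%s)" % db_type
```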
CREATE TABLE test_db.shard_table on cluster default
(
`cmo_id` String,
`year` UInt16,
`code` LowCardinality(String),
`clp_id` AggregateFunction(uniq, Int64),
`plp_id` AggregateFunction(uniq, Int64)
)
ENGINE = AggregatingMergeTree
PARTITION BY year
ORDER BY (code, cmo_id, year)
SETTINGS index_granularity = 8192
Hi @jayvynl, is there any way to define the table above as an unmanaged model using the package?
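For what it's worth, the usual Django answer is an unmanaged model over the existing table. The following is only a sketch under my assumptions (it cannot run outside a configured project): the field class names are guessed from this package's conventions, and I am not certain it exposes a field type for AggregateFunction columns, so those two are left out.

```
class ShardTable(models.ClickhouseModel):
    cmo_id = models.StringField()
    year = models.UInt16Field()
    code = models.StringField()
    # clp_id / plp_id use AggregateFunction(uniq, Int64); I do not know of a
    # matching field type in this package, so they are omitted in this sketch.

    class Meta:
        managed = False          # Django will not create or alter this table
        db_table = "shard_table"
```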
I see there is a branch with LowCardinality and timezone support. Is it ready for merge?
Hi,
I am just starting to use ClickHouse with Django 3.2.18 and Python 3.8.7, but it did not work.
I added this to my DATABASES setting:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': APP_CODE + '-dev',
        'USER': 'root',
        'PASSWORD': 'jlpay@159357',
        'HOST': '172.20.4.90',
        'PORT': 13306,
        "init_command": "SET foreign_key_checks = 0;"
    },
    'clickhouse': {
        'ENGINE': 'clickhouse_backend.backend',
        'NAME': 'default',
        'HOST': '172.20.4.90',
        'USER': 'admin',
        'PASSWORD': 'xxxxxx',
        'TEST': {
            'fake_transaction': True
        }
    }
}
DATABASE_ROUTERS = ['dbrouters.ClickHouseRouter']
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
and when I import the following in databaseroutes.py:
from clickhouse_backend.models import ClickhouseModel
the following error occurs:
Exception has occurred: AppRegistryNotReady
Apps aren't loaded yet.
File "D:\vscode\opsmanage\opsmanage\conf\dev.py", line 41, in <module>
from clickhouse_backend.models import ClickhouseModel
File "D:\vscode\opsmanage\opsmanage\settings.py", line 13, in <module>
_module = __import__(DJANGO_CONF_MODULE, globals(), locals(), ['*'])
File "D:\vscode\opsmanage\manage.py", line 27, in <module>
execute_from_command_line(sys.argv)
Awaiting your reply, thanks.
I want to use this code, but the Window() function is not working:
exclude_next = Q(value=F('next_value')) | Q(value=F('second_next_value'))
exclude_prev = Q(value=F('prev_value')) | Q(value=F('second_prev_value'))
queryset = (
    ValueChart.objects.filter(
        signal=signal.id,
        date_time__range=(
            validated_data.get('date_time_before', None),
            validated_data.get('date_time_after', None),
        ),
    )
    .annotate(
        next_value=Window(expression=Lead('value')),
        second_next_value=Window(expression=Lead('value', 2)),
        prev_value=Window(Lag('value')),
        second_prev_value=Window(Lag('value', 2)),
    )
    .order_by('date_time')
    .exclude(exclude_next, exclude_prev)
    .annotate(x=F('date_time'), y=F('value'))
    .order_by('date_time')
    .values('x', 'y')
)
Describe the bug
inspectdb doesn't work with the latest backend version; on version 1.1.1 it definitely worked.
(At that time I had a different version of the ClickHouse server without a cluster; maybe that's the problem.)
To Reproduce
run python manage.py inspectdb --database=clickhouse
and get
# This is an auto-generated Django model module.
# You'll have to do the following manually to clean this up:
# * Rearrange models' order
# * Make sure each model has one field with primary_key=True
# * Make sure each ForeignKey and OneToOneField has `on_delete` set to the desired behavior
# * Remove `managed = False` lines if you wish to allow Django to create, modify, and delete the table
# Feel free to rename the models, but don't rename db_table values or field names.
from django.db import models
without any models
Expected behavior
models should be generated
Versions
It seems a new version (1.1.4) was released, but the changelog does not describe it.
Describe the bug
An error occurs when trying to apply migrations to another database.
To Reproduce
I have 3 databases in my project:
default - pgsql
external - pgsql
clickhouse - clickhouse
I added clickhouse recently. Before adding it, there were models for the external pgsql database, and migrations applied without problems. After adding clickhouse, I added several models for it, including a field of type DateTimeField.
I dropped the data from the external database and wanted to reapply the old migrations with python manage.py migrate --database=external, but got this traceback.
Expected behavior
Migrations should apply without errors.
Versions
Traceback (most recent call last):
File "/usr/lib/python3.11/runpy.py", line 198, in _run_module_as_main
return _run_code(code, main_globals, None,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/runpy.py", line 88, in _run_code
exec(code, run_globals)
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/__main__.py", line 39, in <module>
cli.main()
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 430, in main
run()
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 284, in run_file
runpy.run_path(target, run_name="__main__")
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 321, in run_path
return _run_module_code(code, init_globals, run_name,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 135, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/home/fleapse/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 124, in _run_code
exec(code, run_globals)
File "/home/fleapse/Projects/django_comments/manage.py", line 22, in <module>
main()
File "/home/fleapse/Projects/django_comments/manage.py", line 18, in main
execute_from_command_line(sys.argv)
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/__init__.py", line 446, in execute_from_command_line
utility.execute()
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/__init__.py", line 440, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/base.py", line 402, in run_from_argv
self.execute(*args, **cmd_options)
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/base.py", line 448, in execute
output = self.handle(*args, **options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/base.py", line 96, in wrapped
res = handle_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/core/management/commands/migrate.py", line 349, in handle
post_migrate_state = executor.migrate(
^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/db/migrations/executor.py", line 107, in migrate
self.recorder.ensure_schema()
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/clickhouse_backend/patch/migrations.py", line 142, in ensure_schema
editor.create_model(self.Migration)
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 444, in create_model
sql, params = self.table_sql(model)
^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 216, in table_sql
definition, extra_params = self.column_sql(model, field)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 348, in column_sql
field_db_params = field.db_parameters(connection=self.connection)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/django/db/models/fields/__init__.py", line 823, in db_parameters
type_string = self.db_type(connection)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/clickhouse_backend/models/fields/base.py", line 76, in db_type
self._check_backend(connection)
File "/home/fleapse/.cache/pypoetry/virtualenvs/django-comments-CMNQPpe7-py3.11/lib/python3.11/site-packages/clickhouse_backend/models/fields/base.py", line 70, in _check_backend
raise ImproperlyConfigured(
django.core.exceptions.ImproperlyConfigured: DateTimeField must only be used with django clickhouse backend.
class BaseRouter:
    external_route_app_labels = {"external"}
    clickhouse_route_app_labels = {"clickhouse"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.external_route_app_labels:
            return "external"
        if model._meta.app_label in self.clickhouse_route_app_labels:
            return "clickhouse"
        return "default"

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.external_route_app_labels:
            return "external"
        if model._meta.app_label in self.clickhouse_route_app_labels:
            return "clickhouse"
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == "external":
            return db == "external"
        if app_label == "clickhouse":
            return db == "clickhouse"
        if db == "external" or db == "clickhouse":
            return False
        return None
Describe the bug
A database error is thrown when updating a JSONField.
backend | Internal Server Error: /
backend | Traceback (most recent call last):
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/dbapi/cursor.py", line 111, in execute
backend | response = execute(
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_backend/driver/client.py", line 53, in execute
backend | rv = self.process_ordinary_query(
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/client.py", line 571, in process_ordinary_query
backend | return self.receive_result(with_column_types=with_column_types,
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/client.py", line 204, in receive_result
backend | return result.get_result()
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/result.py", line 50, in get_result
backend | for packet in self.packet_generator:
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/client.py", line 220, in packet_generator
backend | packet = self.receive_packet()
backend | File "/usr/local/lib/python3.8/site-packages/clickhouse_driver/client.py", line 237, in receive_packet
backend | raise packet.exception
backend | clickhouse_driver.errors.ServerException: Code: 386.
backend | DB::Exception: There is no supertype for types Object('json'), Tuple(key String) because some of them are Tuple and some of them are not: While processing _CAST(if(id = 1729526377540485120, _CAST(map('key', 'new_value'), 'Object(\'json\')'), content), 'Object(\'json\')'). Stack trace:
backend |
backend | 0. DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000000c2f4404 in /usr/bin/clickhouse
backend | 1. ? @ 0x000000000acbdf98 in /usr/bin/clickhouse
backend | 2. ? @ 0x00000000103ee170 in /usr/bin/clickhouse
backend | 3. std::shared_ptr<DB::IDataType const> DB::getLeastSupertype<(DB::LeastSupertypeOnError)0>(std::vector<std::shared_ptr<DB::IDataType const>, std::allocator<std::shared_ptr<DB::IDataType const>>> const&) @ 0x00000000103eddb0 in /usr/bin/clickhouse
backend | 4. ? @ 0x00000000099c48e0 in /usr/bin/clickhouse
backend | 5. ? @ 0x0000000008078d64 in /usr/bin/clickhouse
backend | 6. DB::IFunctionOverloadResolver::getReturnTypeWithoutLowCardinality(std::vector<DB::ColumnWithTypeAndName, std::allocator<DB::ColumnWithTypeAndName>> const&) const @ 0x000000000fe90280 in /usr/bin/clickhouse
backend | 7. DB::IFunctionOverloadResolver::getReturnType(std::vector<DB::ColumnWithTypeAndName, std::allocator<DB::ColumnWithTypeAndName>> const&) const @ 0x000000000fe8fee0 in /usr/bin/clickhouse
backend | 8. DB::IFunctionOverloadResolver::build(std::vector<DB::ColumnWithTypeAndName, std::allocator<DB::ColumnWithTypeAndName>> const&) const @ 0x000000000fe90b04 in /usr/bin/clickhouse
backend | 9. DB::ActionsDAG::addFunction(std::shared_ptr<DB::IFunctionOverloadResolver> const&, std::vector<DB::ActionsDAG::Node const*, std::allocator<DB::ActionsDAG::Node const*>>, String) @ 0x000000001055500c in /usr/bin/clickhouse
backend | 10. DB::ScopeStack::addFunction(std::shared_ptr<DB::IFunctionOverloadResolver> const&, std::vector<String, std::allocator<String>> const&, String) @ 0x000000001071dc2c in /usr/bin/clickhouse
backend | 11. ? @ 0x0000000010726c14 in /usr/bin/clickhouse
backend | 12. DB::ActionsMatcher::visit(DB::ASTFunction const&, std::shared_ptr<DB::IAST> const&, DB::ActionsMatcher::Data&) @ 0x0000000010720194 in /usr/bin/clickhouse
backend | 13. DB::ActionsMatcher::visit(DB::ASTFunction const&, std::shared_ptr<DB::IAST> const&, DB::ActionsMatcher::Data&) @ 0x0000000010720cdc in /usr/bin/clickhouse
backend | 14. ? @ 0x0000000010716ba0 in /usr/bin/clickhouse
backend | 15. DB::ExpressionAnalyzer::getRootActions(std::shared_ptr<DB::IAST> const&, bool, std::shared_ptr<DB::ActionsDAG>&, bool) @ 0x00000000106f9b14 in /usr/bin/clickhouse
backend | 16. DB::MutationsInterpreter::prepareMutationStages(std::vector<DB::MutationsInterpreter::Stage, std::allocator<DB::MutationsInterpreter::Stage>>&, bool) @ 0x0000000010f2d6c0 in /usr/bin/clickhouse
backend | 17. DB::MutationsInterpreter::prepare(bool) @ 0x0000000010f29828 in /usr/bin/clickhouse
backend | 18. DB::MutationsInterpreter::MutationsInterpreter(DB::MutationsInterpreter::Source, std::shared_ptr<DB::StorageInMemoryMetadata const>, DB::MutationCommands, std::vector<String, std::allocator<String>>, std::shared_ptr<DB::Context const>, DB::MutationsInterpreter::Settings) @ 0x0000000010f23a60 in /usr/bin/clickhouse
backend | 19. DB::MutationsInterpreter::MutationsInterpreter(std::shared_ptr<DB::IStorage>, std::shared_ptr<DB::StorageInMemoryMetadata const>, DB::MutationCommands, std::shared_ptr<DB::Context const>, DB::MutationsInterpreter::Settings) @ 0x0000000010f234f0 in /usr/bin/clickhouse
backend | 20. DB::InterpreterAlterQuery::executeToTable(DB::ASTAlterQuery const&) @ 0x0000000010d33d14 in /usr/bin/clickhouse
backend | 21. ? @ 0x0000000011179854 in /usr/bin/clickhouse
backend | 22. DB::executeQuery(String const&, std::shared_ptr<DB::Context>, bool, DB::QueryProcessingStage::Enum) @ 0x00000000111764e4 in /usr/bin/clickhouse
backend | 23. DB::TCPHandler::runImpl() @ 0x0000000011d5ddd8 in /usr/bin/clickhouse
backend | 24. DB::TCPHandler::run() @ 0x0000000011d70ae4 in /usr/bin/clickhouse
backend | 25. Poco::Net::TCPServerConnection::start() @ 0x0000000012a275e4 in /usr/bin/clickhouse
backend | 26. Poco::Net::TCPServerDispatcher::run() @ 0x0000000012a28b00 in /usr/bin/clickhouse
backend | 27. Poco::PooledThread::run() @ 0x0000000012be98bc in /usr/bin/clickhouse
backend | 28. Poco::ThreadImpl::runnableEntry(void*) @ 0x0000000012be7184 in /usr/bin/clickhouse
backend | 29. start_thread @ 0x0000000000007624 in /usr/lib/aarch64-linux-gnu/libpthread-2.31.so
backend | 30. ? @ 0x00000000000d149c in /usr/lib/aarch64-linux-gnu/libc-2.31.so
backend |
backend |
backend | During handling of the above exception, another exception occurred:
backend |
backend | Traceback (most recent call last):
To Reproduce
Update the JSON field of a model and save it.
I created a simple docker-compose project to demonstrate; check app/event/views.py for the code.
Expected behavior
Should update the model successfully
Versions
Describe the bug
I observed a case where, when the query string is long, the package takes much longer to return data. I ran a profiler and confirmed that the database returns the result almost instantly (under 500 ms), but Django takes another 30-35 seconds.
The raw SQL query string is 575,316 characters long.
With a shorter query string it works fine.
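Not from the thread, but a common mitigation when raw SQL grows this large (often a huge IN (...) list inlined into the statement) is to split the parameter list and issue several smaller queries; a generic sketch:

```python
def chunked(seq, size):
    """Yield successive slices of seq so each generated SQL statement stays small."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# Usage idea (hypothetical model/field names):
# for batch in chunked(big_id_list, 10_000):
#     results.extend(MyModel.objects.filter(id__in=batch))
```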
To Reproduce
Trigger the API with a long query string (approximately more than 575,316 characters).
Expected behavior
Django should return the response within 1-2 seconds.
Versions
SELECT version()
query: 23.12.1.956
Describe the bug
I get this bug after running migrate; the migrations do not contain changes related to any ClickhouseModel.
To Reproduce
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    },
    "clickhouse": {
        "ENGINE": "clickhouse_backend.backend",
        "NAME": os.environ["CLICKHOUSE_NAME"],
        "HOST": os.environ["CLICKHOUSE_HOST"],
        "USER": os.environ["CLICKHOUSE_USER"],
        "PASSWORD": os.environ["CLICKHOUSE_PASSWORD"],
        "PORT": os.environ["CLICKHOUSE_PORT"],
    },
}
DATABASE_ROUTERS = ["configs.settings.dbrouters.ClickHouseRouter"]
The ClickHouseRouter is taken from the documentation with no changes.
Expected behavior
Migrations should run successfully.
Versions
SELECT version()
query.
I hope the translation of my text is correct.
Hi all.
I have a problem with the clickhouse-driver, clickhouse-pool, and django-clickhouse-backend packages (latest versions). The problem is that when connecting to the database, an unexpected 'driver' keyword argument is passed to the Connection class by clickhouse-backend. Please tell me how this can be fixed?
I did not touch the clickhouse-backend root folders.
clickhouse-pool==0.5.3
django-clickhouse-backend==1.1.6
clickhouse-driver==0.2.6
infi.clickhouse-orm==2.1.3
Bug: root@520015ac72a2:/project# python manage.py migrate --database clickhouse
Traceback (most recent call last):
File "/project/manage.py", line 22, in <module>
main()
File "/project/manage.py", line 18, in main
execute_from_command_line(sys.argv)
File "/usr/local/lib/python3.12/site-packages/django/core/management/__init__.py", line 442, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python3.12/site-packages/django/core/management/__init__.py", line 436, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 412, in run_from_argv
self.execute(*args, **cmd_options)
File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 458, in execute
output = self.handle(*args, **options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 106, in wrapper
res = handle_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/core/management/commands/migrate.py", line 117, in handle
executor = MigrationExecutor(connection, self.migration_progress_callback)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/migrations/executor.py", line 18, in __init__
self.loader = MigrationLoader(self.connection)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/migrations/loader.py", line 58, in __init__
self.build_graph()
File "/usr/local/lib/python3.12/site-packages/django/db/migrations/loader.py", line 235, in build_graph
self.applied_migrations = recorder.applied_migrations()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/migrations/recorder.py", line 89, in applied_migrations
if self.has_table():
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/patch/migrations.py", line 69, in has_table
with self.connection.cursor() as cursor:
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/utils/asyncio.py", line 26, in inner
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/base.py", line 316, in cursor
return self._cursor()
^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/base.py", line 292, in _cursor
self.ensure_connection()
File "/usr/local/lib/python3.12/site-packages/django/utils/asyncio.py", line 26, in inner
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/base.py", line 275, in ensure_connection
self.connect()
File "/usr/local/lib/python3.12/site-packages/django/utils/asyncio.py", line 26, in inner
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/base.py", line 256, in connect
self.connection = self.get_new_connection(conn_params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/django/utils/asyncio.py", line 26, in inner
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/backend/base.py", line 215, in get_new_connection
conn = Database.connect(**conn_params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/driver/__init__.py", line 38, in connect
return Connection(
^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/driver/connection.py", line 115, in __init__
self.pool = ClickhousePool(
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/driver/pool.py", line 11, in __init__
super().__init__(**kwargs)
File "/usr/local/lib/python3.12/site-packages/clickhouse_pool/pool.py", line 54, in __init__
self._connect()
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/driver/pool.py", line 18, in _connect
client = Client(**self.connection_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/clickhouse_backend/driver/client.py", line 20, in __init__
super().__init__(*args, **kwargs)
File "/usr/local/lib/python3.12/site-packages/clickhouse_driver/client.py", line 130, in __init__
self.connections = deque([Connection(*args, **kwargs)])
^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Connection.__init__() got an unexpected keyword argument 'driver'
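To see which connection parameter is being rejected before it ever reaches clickhouse_driver, a generic helper (my own sketch, not part of either package) can split keyword arguments against a callable's signature:

```python
import inspect

def split_kwargs(func, kwargs):
    """Return (accepted, rejected) kwargs based on func's signature.

    If func takes **kwargs itself, everything is accepted.
    """
    params = inspect.signature(func).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs), {}
    accepted = {k: v for k, v in kwargs.items() if k in params}
    rejected = {k: v for k, v in kwargs.items() if k not in params}
    return accepted, rejected
```

Printing the rejected dict for the Connection constructor would surface the stray 'driver' key; the proper fix is still pinning mutually compatible package versions.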
Hello, one more question I'd like to ask. I want to use RunPython or RunSQL in migrations, like this:
def forwards(apps, schema_editor):
    # other processing
    pass

class Migration(migrations.Migration):
    initial = True
    dependencies = [
    ]
    operations = [
        # migrations.CreateModel ...
        migrations.RunPython(forwards),
    ]
There are two entries in DATABASES in the Django settings: postgres (default) and clickhouse.
When running python manage.py migrate, migrations.RunPython is executed; when running python manage.py migrate --database clickhouse, only the migrations.CreateModel part runs and migrations.RunPython is not executed.
I'd like to know whether this is simply unsupported?
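I'm not certain why RunPython is skipped here, but the standard Django pattern is to have the RunPython callable itself check which database it is running against via schema_editor.connection.alias (a sketch with a stub for testing, not your actual migration):

```python
def forwards(apps, schema_editor):
    """Run the data migration only against the intended database alias."""
    if schema_editor.connection.alias != "clickhouse":
        return "skipped"
    # clickhouse-specific processing would go here
    return "ran"
```

Django calls this once per `migrate --database=<alias>` run, so guarding on the alias keeps a clickhouse-only backfill from executing against postgres.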