dbt-msft / dbt-synapse
dbt adapter for Azure Synapse Dedicated SQL Pools
Home Page: https://dbt-msft.github.io/dbt-msft-docs/docs/dbt-synapse/overview
License: MIT License
We still need SQL login functionality. SQL Authentication is not supported by Microsoft Fabric, so are there any options to get this working?
Running an incremental model calls the macro alter_column_type,
which uses the incorrect syntax (possibly brought over from SQL Server?) of alter table <table_name> add column <column_name> <type>,
when it should be alter table <table_name> add <column_name> <type>.
Here is the log output (model/column names changed):
2021-09-23 16:00:50.330909 (Thread-3): Changing col type from character varying(20) to character varying(255) in table <table_name>
2021-09-23 16:00:50.343175 (Thread-3): Using synapse connection <model_name>.
2021-09-23 16:00:50.344859 (Thread-3): On <model_name>:
alter table <table_name> add column <alter_column> character varying(255);
update <table_name> set <alter_column> = <column_name>;
alter table <table_name> drop column <column_name> cascade;
alter table <table_name> rename column <alter_column> to <column_name>
2021-09-23 16:01:01.595936 (Thread-3): Database error: ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Parse error at line: 2, column: 66: Incorrect syntax near 'column'. (103010) (SQLExecDirectW)")
2021-09-23 16:01:01.599486 (Thread-3): On <model_name>: ROLLBACK
2021-09-23 16:01:01.602760 (Thread-3): On <model_name>: Close
2021-09-23 16:01:01.605614 (Thread-3): Error running SQL: macro alter_column_type
Possibly related to #59.
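For reference, a hedged sketch of what a Synapse-specific override of this macro could look like. The dispatch name follows dbt's `<adapter>__<macro>` convention, and the `sp_rename` step is an assumption (the buggy SQL above used `rename column`, which is not valid T-SQL); this is not the adapter's actual implementation:

```sql
{% macro synapse__alter_column_type(relation, column_name, new_column_type) %}
  {# Synapse T-SQL accepts "add <col> <type>", not "add column <col> <type>" #}
  {%- set tmp_column = column_name ~ '__dbt_alter' -%}
  {% call statement('alter_column_type') %}
    alter table {{ relation }} add {{ tmp_column }} {{ new_column_type }};
    update {{ relation }} set {{ tmp_column }} = {{ column_name }};
    alter table {{ relation }} drop column {{ column_name }};
    exec sp_rename '{{ relation }}.{{ tmp_column }}', '{{ column_name }}', 'COLUMN';
  {% endcall %}
{% endmacro %}
```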
Minor version v1.7 is targeted for final release on October 26, 2023. As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment so that users of your adapter can make use of this new minor version.
dbt-labs/dbt-core#8307 is an open discussion with more detailed information. If you have questions, please put them there!
Hi, I've been struggling to get dbt-synapse set up for a while but can't crack it.
When I run dbt debug, it fails with:
Running with dbt=0.19.1
dbt version: 0.19.1
python version: 3.9.5
python path: .../bin/python
os info: macOS-11.2.3-x86_64-i386-64bit
Using profiles.yml file at .../.dbt/profiles.yml
Using dbt_project.yml file at .../dbt_project.yml
Configuration:
profiles.yml file [OK found and valid]
dbt_project.yml file [ERROR invalid]
Project loading failed for the following reason:
Runtime Error
Invalid config version: 1, expected 2
Error encountered in .../dbt_project.yml
Required dependencies:
- git [OK found]
Connection:
server: ...
database: ...
schema: ...
port: 1433
UID: ...
client_id: None
authentication: sql
encrypt: False
trust_cert: False
Connection test: OK connection ok
Could not load dbt_project.yml
Describe the bug
When using case-sensitive collations, automate_dv.sat incremental updates fail because "latest_records" is declared in lower case and then referenced in upper case in the left join. This results in an invalid object error (on Synapse).
Environment
dbt version: dbt-core 1.3.5, dbt-synapse 1.3.2
automate-dv version: 0.10.1
Database/Platform: Azure Synapse Dedicated SQL Pool, Collation = SQL_Latin1_General_CP1_CS_AS
Compiled SQL fragment
latest_records AS (
SELECT b.account_links_hk, b.account_links_hashdiff, b.accl_seqno, b.accl_last_change_date, b.accl_subacc_no, b.accl_last_change_by, b.accl_soc_seqno, b.accl_holder_pos, b.accl_soc_control, b.accl_dispensing_notice, b.accl_brn_code, b.accl_account_no, b.accl_old_no, b.accl_extract_required, b.accl_gact_type, b.accl_cust_seqno, b.accl_cust_usage, b.account_links_nk, b.account_and_subacc_no_nk, b.customer_nk, b.end_date, b.deleted, b.active, b.account_and_subacc_no_hk, b.customer_hk, b.start_date, b.load_datetime, b.source
FROM (......
records_to_insert AS (
SELECT frin.account_links_hk, frin.account_links_hashdiff, frin.accl_seqno, frin.accl_last_change_date, frin.accl_subacc_no, frin.accl_last_change_by, frin.accl_soc_seqno, frin.accl_holder_pos, frin.accl_soc_control, frin.accl_dispensing_notice, frin.accl_brn_code, frin.accl_account_no, frin.accl_old_no, frin.accl_extract_required, frin.accl_gact_type, frin.accl_cust_seqno, frin.accl_cust_usage, frin.account_links_nk, frin.account_and_subacc_no_nk, frin.customer_nk, frin.end_date, frin.deleted, frin.active, frin.account_and_subacc_no_hk, frin.customer_hk, frin.start_date, frin.load_datetime, frin.source
FROM first_record_in_set AS frin
LEFT JOIN LATEST_RECORDS lr
ON lr.account_links_hk = frin.account_links_hk
AND lr.account_links_hashdiff = frin.account_links_hashdiff
WHERE lr.account_links_hashdiff IS NULL
UNION
SELECT ........etc
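The mismatch can be reproduced in isolation; under a case-sensitive collation the two spellings are different identifiers (a minimal sketch, not taken from automate-dv itself):

```sql
-- database collation: SQL_Latin1_General_CP1_CS_AS
with latest_records as (
    select 1 as account_links_hk
)
select *
from LATEST_RECORDS;  -- fails: invalid object name 'LATEST_RECORDS'
```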
We've just published the release cut of dbt-core 1.2.0, dbt-core 1.2.0rc1 (PyPI | GitHub release notes).
dbt-labs/dbt-core#5468 is an open discussion with more detailed information, and dbt-labs/dbt-core#5474 is for keeping track of the community's progress on releasing 1.2.0.
Below is a checklist of work that would enable a successful 1.2.0 release of your adapter.
BaseDocsGenerate and BaseDocsGenReferences
dbt-labs/dbt-core#5432 might make it into the second release cut in the next week, in which case you might also want to:
I am trying to get this adapter working with a serverless pool. When trying to run the example project (after dbt init, and changing table to view), the view gets created correctly in Synapse, but I get the following error.
Running with dbt=0.19.0
Found 1 model, 0 tests, 0 snapshots, 0 analyses, 184 macros, 0 operations, 0 seed files, 0 sources, 0 exposures
12:13:55 | Concurrency: 1 threads (target='dev')
12:13:55 |
12:13:55 | 1 of 1 START view model dbo.test..................................... [RUN]
12:13:55 | 1 of 1 ERROR creating view model dbo.test............................ [ERROR in 0.35s]
12:13:55 |
12:13:55 | Finished running 1 view model in 0.90s.
Completed with 1 error and 0 warnings:
Database Error in model test (models/example/test.sql)
('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near 'dbo'. (102) (SQLExecDirectW)")
compiled SQL at target/run/my_new_project/models/example/test.sq
I tried this on both a Windows machine and a Linux machine. The compiled SQL is as follows:
create view "dbo"."test__dbt_tmp" as
/*
Welcome to your first dbt model!
Did you know that you can also configure models directly within SQL files?
This will override configurations stated in dbt_project.yml
Try changing "table" to "view" below
*/
with source_data as (
select 1 as id
union all
select null as id
)
select *
from source_data
/*
Uncomment the line below to remove records with null `id` values
*/
-- where id is not null
Still, the view seems to have been created correctly.
Is there anything I'm missing, or anything I could do to further debug this?
This Stack Overflow question and this dbt Slack thread bring up the issue.
To reiterate here: using the full three-part name is not an issue with an Azure Synapse Dedicated SQL pool (ASDSP). However, when trying to use SQL Server Data Tools in conjunction with ASDSP, two users report an error and asked if the database name can be dropped. TBD whether this would cause an issue for existing dbt-synapse users.
Out of all this I was able to generate the views from the SELECT SQL statements against the database, but not the tables.
Note: in dbt_project.yml I clearly defined the materialization as table to perform the action.
When you create an incremental table, this is generally a two-step process:
For both of these the adapter breaks the work up into a few steps:
For the incremental steps, the temp table is created with the index and distribution settings of the final target table.
The best-practice advice for transient data is to use a HEAP. We should test whether using this for the intermediate table would make the process faster: currently there is a slow write into a CCI (possibly ordered), which is then read and slowly written into the second CCI.
I think this needs testing, but it will potentially be a large speed-up for users:
Example step that needs change:
CREATE TABLE "database"."schema"."users_table_name__dbt_tmp"
WITH(
DISTRIBUTION = HASH(HashColumn),
CLUSTERED COLUMNSTORE INDEX ORDER(OrderColumn)
)
AS (SELECT * FROM [schema].[users_table_name__dbt_tmp_temp_view])
Should be:
CREATE TABLE "database"."schema"."users_table_name__dbt_tmp"
WITH(
DISTRIBUTION = /*Unsure here - could take the dist or do RR, RR can load the DMS*/
HEAP
)
AS (SELECT * FROM [schema].[users_table_name__dbt_tmp_temp_view])
Hi,
I know this is clearly stated in the README, but I would love to use the opt-in hard-deletes functionality that is available in dbt 0.19. Currently, executing dbt run with dbt 0.19 results in:
Database Error
('42000', '[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]USE statement is not supported to switch between databases. Use a new connection to connect to a different Database. (104455) (SQLExecDirectW)')
I am happy to investigate and contribute, but I would love some pointers to get started :).
Cheers,
Marvin
I am facing the below issue, can anyone help?
I am dockerizing dbt, and below is the Dockerfile:
FROM python:3.8.5
RUN apt-get update -y && \
    apt-get install --no-install-recommends -y -q \
    git libpq-dev python-dev && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
The profile name matches in both profiles.yml and dbt_project.yml, and profiles.yml is good.
In dbt_project.yml I am using version: 1.0.0 and config-version: 2.
When I run locally with the image built from the Dockerfile, everything works fine:
docker run -v github/profiles.yml/:/root/.dbt/profiles.yml dbt debug --profiles-dir /root/.dbt
(github/profiles.yml is the path of the new profiles.yml mounted at run time in docker run.)
But when I run the same command in a Codefresh pipeline:
docker run -v github/profiles.yml/:/root/.dbt/profiles.yml dbt debug --profiles-dir /root/.dbt
I get the below error:
using profiles.yml file at /root/.dbt/profiles.yml
using dbt_project.yml file at /root/.dbt/profiles.yml
Configuration:
profiles.yml file [ERROR invalid]
dbt_project.yml [OK found and valid]
Profile loading failed for the following reason:
Runtime Error
Could not find profile named 'cdm'
Can you help me understand why I face this issue with profiles.yml in Codefresh, whereas locally it works fine?
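One thing worth checking (an assumption based on the symptoms, not confirmed by the logs): docker run -v treats a relative host path like github/profiles.yml as a named-volume name rather than a bind mount, so in a CI runner the mount can come up empty and the profile is never found. Using an absolute host path avoids that ambiguity, for example:

```shell
# bind-mount profiles.yml with an absolute host path
docker run -v "$(pwd)/github/profiles.yml:/root/.dbt/profiles.yml" \
  dbt debug --profiles-dir /root/.dbt
```

The working directory in Codefresh differs from your local machine, which would explain why the same command behaves differently in the pipeline.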
@jtcohen6 is this something we need to do for this adapter and dbt-sqlserver? Or is it already taken care of by dbt-adapter-tests?
The common way to know the version of dbt and installed adapters is the command dbt --version.
However, for the dbt-synapse and dbt-fabric adapters there is no such information at all.
Is it possible to add that information there?
Here is the log of installation/re-installation and the output of the version command:
root@819a2d99aa78:/# pip install -U dbt-synapse
Requirement already satisfied: dbt-synapse in /usr/local/lib/python3.11/site-packages (1.7.0)
Requirement already satisfied: dbt-fabric~=1.7.4 in /usr/local/lib/python3.11/site-packages (from dbt-synapse) (1.7.4)
Requirement already satisfied: dbt-core~=1.7.2 in /usr/local/lib/python3.11/site-packages (from dbt-fabric~=1.7.4->dbt-synapse) (1.7.10)
Requirement already satisfied: pyodbc<5.1.0,>=4.0.35 in /usr/local/lib/python3.11/site-packages (from dbt-fabric~=1.7.4->dbt-synapse) (5.0.1)
Requirement already satisfied: azure-identity>=1.12.0 in /usr/local/lib/python3.11/site-packages (from dbt-fabric~=1.7.4->dbt-synapse) (1.15.0)
Requirement already satisfied: azure-core<2.0.0,>=1.23.0 in /usr/local/lib/python3.11/site-packages (from azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (1.30.1)
Requirement already satisfied: cryptography>=2.5 in /usr/local/lib/python3.11/site-packages (from azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (42.0.5)
Requirement already satisfied: msal<2.0.0,>=1.24.0 in /usr/local/lib/python3.11/site-packages (from azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (1.27.0)
Requirement already satisfied: msal-extensions<2.0.0,>=0.3.0 in /usr/local/lib/python3.11/site-packages (from azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (1.1.0)
Requirement already satisfied: agate~=1.7.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.7.1)
Requirement already satisfied: Jinja2<4,>=3.1.3 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.1.3)
Requirement already satisfied: mashumaro~=3.9 in /usr/local/lib/python3.11/site-packages (from mashumaro[msgpack]~=3.9->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.12)
Requirement already satisfied: logbook<1.6,>=1.5 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.5.3)
Requirement already satisfied: click<9,>=8.0.2 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (8.1.7)
Requirement already satisfied: networkx<4,>=2.3 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.2.1)
Requirement already satisfied: colorama<0.5,>=0.3.9 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.4.6)
Requirement already satisfied: pathspec<0.12,>=0.9 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.11.2)
Requirement already satisfied: isodate<0.7,>=0.6 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.6.1)
Requirement already satisfied: sqlparse<0.5,>=0.2.3 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.4.4)
Requirement already satisfied: dbt-extractor~=0.5.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.5.1)
Requirement already satisfied: minimal-snowplow-tracker~=0.0.2 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.0.2)
Requirement already satisfied: dbt-semantic-interfaces~=0.4.2 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.4.4)
Requirement already satisfied: jsonschema>=3.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (4.21.1)
Requirement already satisfied: packaging>20.9 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (24.0)
Requirement already satisfied: protobuf<5,>=4.0.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (4.25.3)
Requirement already satisfied: pytz>=2015.7 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2024.1)
Requirement already satisfied: pyyaml>=6.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (6.0.1)
Requirement already satisfied: typing-extensions>=3.7.4 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (4.10.0)
Requirement already satisfied: cffi<2.0.0,>=1.9 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.16.0)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.6)
Requirement already satisfied: requests<3.0.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.31.0)
Requirement already satisfied: urllib3~=1.0 in /usr/local/lib/python3.11/site-packages (from dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.26.18)
Requirement already satisfied: Babel>=2.0 in /usr/local/lib/python3.11/site-packages (from agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.14.0)
Requirement already satisfied: leather>=0.3.2 in /usr/local/lib/python3.11/site-packages (from agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.4.0)
Requirement already satisfied: parsedatetime!=2.5,>=2.1 in /usr/local/lib/python3.11/site-packages (from agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.6)
Requirement already satisfied: python-slugify>=1.2.1 in /usr/local/lib/python3.11/site-packages (from agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (8.0.4)
Requirement already satisfied: pytimeparse>=1.1.5 in /usr/local/lib/python3.11/site-packages (from agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.1.8)
Requirement already satisfied: six>=1.11.0 in /usr/local/lib/python3.11/site-packages (from azure-core<2.0.0,>=1.23.0->azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (1.16.0)
Requirement already satisfied: pycparser in /usr/local/lib/python3.11/site-packages (from cffi<2.0.0,>=1.9->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.21)
Requirement already satisfied: importlib-metadata~=6.0 in /usr/local/lib/python3.11/site-packages (from dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (6.11.0)
Requirement already satisfied: more-itertools<11.0,>=8.0 in /usr/local/lib/python3.11/site-packages (from dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (10.2.0)
Requirement already satisfied: pydantic<3,>=1.10 in /usr/local/lib/python3.11/site-packages (from dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.6.4)
Requirement already satisfied: python-dateutil~=2.0 in /usr/local/lib/python3.11/site-packages (from dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.9.0.post0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.11/site-packages (from Jinja2<4,>=3.1.3->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.1.5)
Requirement already satisfied: attrs>=22.2.0 in /usr/local/lib/python3.11/site-packages (from jsonschema>=3.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (23.2.0)
Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.11/site-packages (from jsonschema>=3.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2023.12.1)
Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.11/site-packages (from jsonschema>=3.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.33.0)
Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.11/site-packages (from jsonschema>=3.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.18.0)
Requirement already satisfied: msgpack>=0.5.6 in /usr/local/lib/python3.11/site-packages (from mashumaro[msgpack]~=3.9->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.0.8)
Requirement already satisfied: PyJWT<3,>=1.0.0 in /usr/local/lib/python3.11/site-packages (from PyJWT[crypto]<3,>=1.0.0->msal<2.0.0,>=1.24.0->azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (2.8.0)
Requirement already satisfied: portalocker<3,>=1.0 in /usr/local/lib/python3.11/site-packages (from msal-extensions<2.0.0,>=0.3.0->azure-identity>=1.12.0->dbt-fabric~=1.7.4->dbt-synapse) (2.8.2)
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.11/site-packages (from requests<3.0.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.3.2)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.11/site-packages (from requests<3.0.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2024.2.2)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.11/site-packages (from importlib-metadata~=6.0->dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (3.18.1)
Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/lib/python3.11/site-packages (from pydantic<3,>=1.10->dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (0.6.0)
Requirement already satisfied: pydantic-core==2.16.3 in /usr/local/lib/python3.11/site-packages (from pydantic<3,>=1.10->dbt-semantic-interfaces~=0.4.2->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (2.16.3)
Requirement already satisfied: text-unidecode>=1.3 in /usr/local/lib/python3.11/site-packages (from python-slugify>=1.2.1->agate~=1.7.0->dbt-core~=1.7.2->dbt-fabric~=1.7.4->dbt-synapse) (1.3)
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
root@819a2d99aa78:/# dbt --version
Core:
- installed: 1.7.10
- latest: 1.7.10 - Up to date!
Plugins:
root@819a2d99aa78:/#
Hi.
I'm running some test cases evaluating dbt using the Synapse provider. I came across this error when adding a new column:
Test Case - Add column NEW_COLUMN
{{ config(materialized='incremental'
, index='clustered index(OLDACNUM)'
, dist='REPLICATE'
, on_schema_change='sync_all_columns' ) }}
SELECT TOP (1000)
[RecordCreateRunId]
,[DEBTORNUM]
,[OLDACNUM]
,[CREDCOLACTION]
,[LASTSTMTDATE]
,[CUST_TYPE]
,[LETTERDATE]
,[CHARGE_CODE]
,[NUM_REMINDERS]
,[PAYMENT_CODE]
,[CYCLE_GROUP]
,ODS_START_DATE
,1 AS NEW_COLUMN
FROM [DBO].[AR_DEBTOR]
{% if is_incremental() %}
-- this filter will only be applied on an incremental run
WHERE ODS_START_DATE > (select max(ODS_START_DATE) from {{ this }})
{% endif %}
if object_id ('"DBT"."STG_DEBTORS_NEW__dbt_tmp_temp_view"','V') is not null
begin
drop view "DBT"."STG_DEBTORS_NEW__dbt_tmp_temp_view"
end
if object_id ('"DBT"."STG_DEBTORS_NEW__dbt_tmp"','U') is not null
begin
drop table "DBT"."STG_DEBTORS_NEW__dbt_tmp"
end
EXEC('create view DBT.STG_DEBTORS_NEW__dbt_tmp_temp_view as
SELECT TOP (1000)
[RecordCreateRunId]
,[DEBTORNUM]
,[OLDACNUM]
,[CREDCOLACTION]
,[LASTSTMTDATE]
,[CUST_TYPE]
,[LETTERDATE]
,[CHARGE_CODE]
,[NUM_REMINDERS]
,[PAYMENT_CODE]
,[CYCLE_GROUP]
,ODS_START_DATE
,1 AS NEW_COLUMN
FROM [DBO].[AR_DEBTOR]
-- this filter will only be applied on an incremental run
WHERE ODS_START_DATE > (select max(ODS_START_DATE) from "DBNAME"."DBT"."STG_DEBTORS_NEW")
');
CREATE TABLE "DBT"."STG_DEBTORS_NEW__dbt_tmp"
WITH(
DISTRIBUTION = REPLICATE,
clustered index(OLDACNUM)
)
AS (SELECT * FROM DBT.STG_DEBTORS_NEW__dbt_tmp_temp_view)
if object_id ('"DBT"."STG_DEBTORS_NEW__dbt_tmp_temp_view"','V') is not null
begin
drop view "DBT"."STG_DEBTORS_NEW__dbt_tmp_temp_view"
end
...
03:51:52.439447 [debug] [Thread-1 (]:
In "DBNAME"."DBT"."STG_DEBTORS_NEW":
Schema changed: True
Source columns not in target: [<Column NEW_COLUMN (int)>]
Target columns not in source: []
New column types: []
03:51:52.479450 [debug] [Thread-1 (]: SQLServer adapter: Using synapse connection "model.dbtPOC.STG_DEBTORS_NEW".
03:51:52.480445 [debug] [Thread-1 (]: SQLServer adapter: On model.dbtPOC.STG_DEBTORS_NEW:
alter table "DBNAME"."DBT"."STG_DEBTORS_NEW"
add column NEW_COLUMN int
03:51:52.955452 [debug] [Thread-1 (]: SQLServer adapter: Database error: ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Parse error at line: 5, column: 20: Incorrect syntax near 'column'. (103010) (SQLExecDirectW)")
03:51:52.956517 [debug] [Thread-1 (]: On model.dbtPOC.STG_DEBTORS_NEW: ROLLBACK
03:51:52.958450 [debug] [Thread-1 (]: On model.dbtPOC.STG_DEBTORS_NEW: Close
03:51:52.963454 [debug] [Thread-1 (]: finished collecting timing info
03:51:52.964453 [debug] [Thread-1 (]: Database Error in model STG_DEBTORS_NEW (models\DataVault\CHANGE\STG_DEBTORS_NEW.sql)
('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Parse error at line: 5, column: 20: Incorrect syntax near 'column'. (103010) (SQLExecDirectW)")
03:51:52.968452 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '96d48387-ac3a-4327-a899-1ccb95fb9418', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001CFFD4F1270>]}
03:51:52.970454 [error] [Thread-1 (]: 1 of 1 ERROR creating incremental model DBT.STG_DEBTORS_NEW .................... [ERROR in 6.11s]
03:51:52.977454 [debug] [Thread-1 (]: Finished running node model.dbtPOC.STG_DEBTORS_NEW
alter table "DBNAME"."DBT"."STG_DEBTORS_NEW"
add column NEW_COLUMN int
should be:
alter table "DBNAME"."DBT"."STG_DEBTORS_NEW" add NEW_COLUMN int
I would suggest a fix, but I couldn't find a SQL Server or Synapse macro specifically for "add column". Is this just defaulting to the default provider?
Versions: dbt=1.1.2, sqlserver=1.1.0, synapse=1.1.0
original dbt Slack thread in #dbt-synapse
Think I found a bug in the 0.19.2 version of dbt-synapse.
When making a snapshot with a newly detected column, it will try this query:
alter table <db>.<schema>.<table> add COLUMN <name> <datatype>;
This should be:
alter table <db>.<schema>.<table> add <name> <datatype>;
In v0.19.2 you'll receive this error:
Database error: ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server] [SQL Server]Parse error at line: 2, column: 75: Incorrect syntax near 'column'. (103010) (SQLExecDirectW)")
Compilation Error
dbt found two macros named "synapse__get_columns_in_relation" in the project
"dbt_synapse".
To fix this error, rename or remove one of the following macros:
- macros\adapters\columns.sql
- macros\adapters.sql
Hi team, when I run dbt run or dbt compile in the CLI I see this error for the synapse profile.
Moving over from dbt-labs/dbt-core#2843.
This is a minor bug, but it does make seeds much less useful. I have not tested on a different adapter, but the folks at core seemed confident it was an adapter problem.
Describe the bug
Running dbt seed -s mycsv leads to this error:
('22018', "[22018] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Conversion failed when converting the varchar value 'None' to data type int. (245) (SQLExecDirectW)")
Steps To Reproduce
The below .csv fails with an error:
Id, col
1,1
2,NULL
3,3
The below .csv fails with an error:
Id, col
1,1
2,
3,3
The below .csv passes:
Id, col
1,1
2,2
3,3
Expected behavior
Blank columns that Python interprets as None should be interpreted as NULLs by the database.
System information
Which database are you using dbt with?
dbt-synapse adapter
The output of dbt --version:
installed version: 0.18.0
latest version: 0.18.1
The operating system you're using:
Windows
The output of python --version:
Python 3.8.6
For each of these examples, give a snippet of the method to demonstrate how "diverse" the syntax can get, using the CSVs in https://dbtmsftblob.blob.core.windows.net/jaffle-shop.
We also learned that the second option isn't supported for Azure SQL, only SQL Server.
This seems to be assumed behavior in dbt-utils and has come up in dbt-msft/tsql-utils#17.
dbt-sqlserver uses the ? placeholder, but we're using row[column], and I don't understand the difference at all. I must have copied it from somewhere...
The real challenge comes from Synapse not supporting SQL Server's table value constructor. So we need quoted lists?? Do dates still need to be quoted???
More context from a Stack Overflow question I asked:
An empty string, on the other hand, evaluates to 0, which SQL Server implicitly treats as an integer representing the number of days since 1900-01-01.
Add an elif condition after line 27: if the agate column type is Text or Date and the data is an empty string, replace it with NULL.
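A minimal sketch of that elif in plain Python. The function name and the string-based type check are illustrative only; the real adapter inspects agate column-type objects when binding seed rows:

```python
def bind_seed_value(value, column_type_name):
    """Sketch: convert a parsed seed cell into a bind parameter.

    Empty strings in Text/Date columns become None, so the database
    receives NULL instead of coercing '' to 0 (i.e. 1900-01-01 for dates)
    or failing to convert the string 'None' to an int.
    """
    if value is None:
        return None
    # hypothetical elif: treat empty strings in text/date columns as NULL
    elif column_type_name in ("Text", "Date") and str(value) == "":
        return None
    return value
```

With something like this in place, the blank cell in the second failing seed above would bind as NULL rather than as a literal string.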
Hello,
With versions dbt-core==1.0.0, dbt-sqlserver==1.0.0, dbt-synapse==1.0.1, when I run a dbt command such as dbt snapshot, I get the error:
20:32:03 Running with dbt=1.0.0
20:32:04 Partial parse save file not found. Starting full parse.
20:32:04 Encountered an error:
Compilation Error
dbt found two macros named "synapse__drop_relation" in the project
"dbt_synapse".
To fix this error, rename or remove one of the following macros:
- macros/adapters/relation.sql
- macros/adapters.sql
I do not get this error with versions dbt-core==0.20.2, dbt-sqlserver==0.20.1, dbt-synapse==0.20.0. I'm using Python version 3.9.9.
Vanilla error:
Running with dbt=0.19.1
Error importing adapter: No module named 'dbt.adapters.sqlserver'
Encountered an error:
No module named 'dbt.adapters.sqlserver'
Detailed error if run with dbt -d run:
python39\lib\site-packages\dbt\adapters\synapse\connections.py <module>
from dbt.adapters.sqlserver import (SQLServerConnectionManager,
Is this just something that needs adding to setup.py?
synapse__get_limit_subquery() wraps the provided SQL string (sql) in a subquery. If that SQL has a CTE, the output of synapse__get_limit_subquery will throw the below error, because CTEs are not supported within subqueries:
('42000', "[42000] [Microsoft]
[ODBC Driver 18 for SQL Server][SQL Server]
Parse error at line: 9, column: 5:
Incorrect syntax near 'ID'. (103010) (SQLExecDirectW)")
This is especially a problem with the dbt Cloud IDE which will, by default, always try to only fetch the first 500 rows when previewing and compiling models.
I thought at first that fabric__get_limit_subquery() would work, but OFFSET is not supported in Synapse.
dbt-synapse/dbt/include/synapse/macros/adapters/show.sql
Lines 16 to 22 in 7d414cd
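The failure mode can be shown with a minimal example; T-SQL does not allow a WITH clause inside a derived table, which is exactly the shape the limit macro produces (this sketch only illustrates the emitted pattern, it is not the macro's literal output):

```sql
-- shape of what the limit macro emits when the model SQL has a CTE:
select top (500) *
from (
    with source_data as (select 1 as id)
    select id from source_data
) as limited;  -- parse error: a CTE is not valid inside a subquery
```

A possible direction (untested) is to inject the TOP into the outermost SELECT of the model SQL rather than wrapping the whole statement, since Synapse dedicated pools support neither CTEs inside subqueries nor OFFSET/FETCH.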
Encountering a generic error when running on Linux; it works fine on Windows with the same setup:
{
"code": "Z002",
"data": {
"e": "'dbt_sqlserver://macros/adapters.sql'"
},
"invocation_id": "f68f1432-d313-4ee6-be64-b7dc85b4e863",
"level": "error",
"log_version": 1,
"msg": "Encountered an error:\n'dbt_sqlserver://macros/adapters.sql'",
"node_info": {},
"pid": 3687,
"thread_name": "MainThread",
"ts": "2022-03-09T01:08:33.663686Z",
"type": "log_line"
}
dbt version: 1.0.3
python version: 3.9.10
python path: /opt/hostedtoolcache/Python/3.9.10/x64/bin/python
os info: Linux-5.13.0-1014-azure-x86_64-with-glibc2.31
Using profiles.yml file at /home/vsts/.dbt/profiles.yml
Using dbt_project.yml file at /home/vsts/work/1/s/dbt_project.yml
Configuration:
profiles.yml file [OK found and valid]
dbt_project.yml file [OK found and valid]
installed version: 1.0.3
latest version: 1.0.3
Up to date!
Plugins:
Debug works fine against the target server, but running anything else (docs generate, run, test, compile) just gives me the macros/adapters.sql error and quits. The target is Synapse, using MS ODBC Driver 17.
This is due to the presence of two identical 'types' in the adapters.
On my side dbt defaults to using the synapse connection; I'm not sure why.
I suspect this would be fixed with an updated type such as SqlServerSynapse, but I'd have to read and understand the code better to be sure.
Actual error:
('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near 'DISTRIBUTION'. (102) (SQLExecDirectW)")
Hi guys, I'm new to dbt. I'm trying to run a "dbt init" project and edit the profile for my database. I can run "dbt debug", but when I run "dbt compile" or "dbt run" it shows this error:
Encountered an error:
Compilation Error
dbt found two macros named "synapse__snapshot_merge_sql" in the project
"dbt_synapse".
To fix this error, rename or remove one of the following macros:
- macros/materializations/snapshots/snapshot_merge.sql
- macros/materializations/snapshot/snapshot_merge.sql
I'm using dbt-core-1.0.3
dbt-extractor-0.4.0
dbt-sqlserver-1.0.0
dbt-synapse-1.0.1
v0.20.0 removes the need for this repeated dispatching... right @jtcohen6? I could be overgeneralizing the macro dispatch changes to also include adapter-level dispatch changes. My understanding is that we wouldn't need the below anymore.
Minor version v1.6
is targeted for final release on July 27, 2023. As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment to ensure users of your adapter can make use of this new minor version.
dbt-labs/dbt-core#7958 is an open discussion with more detailed information. If you have questions, please put them there!
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.6.0 release of your adapter:
1.7.0
FYI, dbt-core==1.7.0
is expected to be released on October 12, 2023 in time for Coalesce, the annual analytics engineering conference!
The latest version of dbt Core, dbt-core==1.4.0, was published on January 25, 2023 (PyPI | Github). In fact, a patch, dbt-core==1.4.1 (PyPI | Github), was also released on the same day.
dbt-labs/dbt-core#6624 is an open discussion with more detailed information. If you have questions, please put them there! dbt-labs/dbt-core#6849 is for keeping track of the community's progress on releasing 1.4.0
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.4.0 release of your adapter.
FYI, dbt-core==1.5.0
is expected to be released at the end of April. Please plan on allocating more effort to upgrade support compared to previous minor versions. Expect to hear more in the middle of April.
At a high level, expect much greater adapter test coverage (a very good thing!) and some likely heavy renaming and restructuring as the API-ification of dbt-core is now well underway. See https://github.com/dbt-labs/dbt-core/milestone/82 for more information.
There are important files that Microsoft projects should all have that are not present in this repository. A pull request has been opened to add the missing file(s). When the PR is merged, this issue will be closed automatically.
Microsoft teams can learn more about this effort and share feedback within the open source guidance available internally.
If you have a snapshot table with a certain distribution and/or index type and you change one of those later on through a config, nothing happens.
However, these cannot be changed after the table was created so we'd have to drop them and recreate the table. This is also something we'd probably not want to do since in the case of snapshots, we would have data loss.
We've just published the release cut of dbt-core 1.2.0, dbt-core 1.2.0rc1
(PyPI | GitHub release notes).
dbt-labs/dbt-core#5468 is an open discussion with more detailed information, and dbt-labs/dbt-core#5474 is for keeping track of the community's progress on releasing 1.2.0
Below is a checklist of work that would enable a successful 1.2.0 release of your adapter.
BaseDocsGenerate
and BaseDocsGenReferences
dbt-labs/dbt-core#5432 might make it into the second release cut in the next week, in which case you might also want to:
I've shut down the Azure subscription that was hosting our infrastructure for both dbt-sqlserver and dbt-synapse. The reason being that it was on my personal credit card.
Given the forthcoming 1.4.0 releases for both adapters, it's a requirement that we can run these integration tests against the required changes.
The GitHub Action workflows are still defined and configured properly, the remaining work is:
This CSV fails to materialize because of line 63, where last_name=O'Gready, with the following error:
Expected but did not find a closing quotation mark after the character string '.
Here's the query. I escaped the inner ' because it also broke the GH syntax highlighting.
I'm sure the other adapters have a way to escape the quote delimiters within strings.
insert into "dbtsynapseci"."dbt_external_tables_integration_tests_synapse"."data_people" (
id, first_name, last_name, email, ip_address, created_at, is_active
)
SELECT '60','Anthia','Tolland','[email protected]','124.60.13.101','2016-02-06 14:38:37','True' UNION ALL
SELECT '61','Annamarie','Pipworth','[email protected]','53.219.191.107','2017-06-13 08:29:04','True' UNION ALL
SELECT '62','Price','O\'Gready','[email protected]','131.188.180.57','2016-09-28 08:44:38','' UNION ALL
SELECT '63','Sergei','Cicero','[email protected]','100.97.16.84','2017-10-02 15:58:45','' UNION ALL
SELECT '64','Dolorita','Lilion','[email protected]','150.43.252.51','2017-09-06 12:39:46','True' UNION ALL
SELECT '65','Perrine','Peetermann','[email protected]','93.27.202.229','2017-07-08 08:49:37','' UNION ALL
SELECT '66','Frieda','Gemelli','[email protected]','20.21.177.102','2016-04-18 05:58:59','' UNION ALL
SELECT '67','Webster','Tully','[email protected]','61.55.62.136','2016-02-18 11:01:23','True' UNION ALL
SELECT '68','Clara','Dadd','[email protected]','67.84.203.36','2017-06-10 22:20:50','' UNION ALL
SELECT '69','Gardener','Clarkin','[email protected]','211.175.17.92','2017-11-27 23:33:42','True'
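For reference, T-SQL escapes a literal single quote inside a string by doubling it, not with a backslash. A corrected version of the offending row might look like this (sketch only; the backslash form above is what the generated insert currently emits):

```sql
-- T-SQL doubles single quotes inside string literals; O\'Gready is not valid.
SELECT '62','Price','O''Gready','[email protected]','131.188.180.57','2016-09-28 08:44:38',''
```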
The latest version of dbt Core, dbt-core==1.5.0rc1, was published on April 13, 2023 (PyPI | Github).
dbt-labs/dbt-core#7213 is an open discussion with more detailed information. If you have questions, please put them there!
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.5.0 release of your adapter.
1.6.0
FYI, dbt-core==1.6.0
is expected to be released at the end of July, with a release cut at least two weeks prior.
Minor version v1.8
is targeted for final release within dbt Core on May 9, 2024.
As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment to ensure users of your adapter can make use of this new minor version.
As of dbt-core v1.8.0
, we no longer need to encourage you to release a new minor version anytime we do. After following the linked upgrade guide, we guarantee your adapter will be forward compatible with all future minor versions of dbt-core
(at least until v2.0
which is not yet planned).
Another major win: you can now make your adapter truly SemVer compliant, as you can release new versions of your adapter without needing to wait for a new dbt-core release. You can actually follow your own release cadence.
dbt-labs/dbt-core#9798 is an open discussion with more detailed information. If you have questions, please put them there!
When you add this macro as a post-hook, you get maximum recursion depth exceeded while calling a Python object
when running dbt compile
When running incremental models on Synapse replicated tables, an error is thrown:
[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Option 'REPLICATE User Temp Table' is not supported in this version of SQL Server. (104458) (SQLExecDirectW)
It appears the fix is to override the sqlserver__make_temp_relation macro and remove the '#' ~ (temp-table prefix) from {% set tmp_identifier = '#' ~ base_relation.identifier ~ suffix %}.
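A minimal sketch of such an override; the incorporate() arguments are an assumption based on current dbt-sqlserver versions, so check the upstream macro before copying:

```sql
-- Sketch only: same as the upstream macro but without the leading '#',
-- so the relation is a regular table instead of a session temp table,
-- which REPLICATE-distributed targets cannot create.
{% macro sqlserver__make_temp_relation(base_relation, suffix) %}
    {% set tmp_identifier = base_relation.identifier ~ suffix %}
    {% do return(base_relation.incorporate(path={"identifier": tmp_identifier})) %}
{% endmacro %}
```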
I'm looking to configure tables created from seed files with a HEAP index and REPLICATE distribution, adding this to the seed config does not seem to work like it does for models.
version: 2
seeds:
- name: country_codes
config:
index: HEAP
dist: REPLICATE
The index and distribution of the resulting table can be retrieved using following SQL queries:
SELECT i.name AS index_name
,i.type_desc
FROM sys.indexes AS i
INNER JOIN sys.data_spaces AS ds ON i.data_space_id = ds.data_space_id
WHERE is_hypothetical = 0 AND i.index_id <> 0
AND i.object_id = OBJECT_ID('main.country_codes');
GO
SELECT
OBJECT_SCHEMA_NAME( object_id ) schemaName,
OBJECT_NAME( object_id ) tableName,
*
FROM sys.pdw_table_distribution_properties
WHERE OBJECT_NAME( object_id ) = 'country_codes'
Outcome
The table is created with a CLUSTERED COLUMNSTORE index and ROUND_ROBIN distribution, which are the defaults
Expected outcome
The created table has a HEAP index and REPLICATE distribution
A workaround is to treat the seeded table as a staging table and recreate a new table using a model with the desired configuration.
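That workaround could look like the following model, reusing the index/dist config keys that work for models (the model name is hypothetical):

```sql
-- Hypothetical models/country_codes_replicated.sql: rebuild the seeded
-- table with the desired index and distribution.
{{ config(
    materialized='table',
    index='HEAP',
    dist='REPLICATE'
) }}

select * from {{ ref('country_codes') }}
```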
Hi,
I want to load my parquet file from Azure Data Lake Gen2/Azure Blob Storage to the dedicated pool in Azure Synapse.
I tried to use
{{ config(materialized='table') }}
with my_table as (
SELECT
TOP 100 *
FROM
OPENROWSET(
BULK 'myparquetfile_url',
FORMAT = 'PARQUET'
) AS [result]
)
select * from my_table
but no luck.
It shows this error
('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Parse error at line: 7, column: 9: Incorrect syntax near 'OPENROWSET'. (103010) (SQLExecDirectW)")
So, how can I load parquet files from blob storage to a dedicated pool using dbt?
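OPENROWSET is a serverless-pool feature; dedicated pools ingest files with COPY INTO or external tables instead. Below is a hedged sketch using a run-operation macro; the table name, storage URL, and macro name are all placeholders:

```sql
-- Hypothetical macro: load a parquet file into a dedicated-pool table
-- with COPY INTO, invoked via `dbt run-operation load_my_parquet`.
{% macro load_my_parquet() %}
    {% set sql %}
        copy into dbt.my_table
        from 'https://myaccount.blob.core.windows.net/mycontainer/myfile.parquet'
        with (file_type = 'PARQUET')
    {% endset %}
    {% do run_query(sql) %}
{% endmacro %}
```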
Do you have a plan to implement a feature that allows access to Azure Synapse serverless pools as well? Currently this seems impossible, because the serverless pool is based on external tables.
testing 123
Hi.
I'm running some test cases evaluating dbt using the Synapse provider. I came across this error when adding a new column:
Test Case - Rename Column
{{ config(materialized='incremental'
, index='clustered index(OLDACNUM)'
, dist='REPLICATE'
, on_schema_change='sync_all_columns' ) }}
SELECT TOP (1000)
[RecordCreateRunId]
,[DEBTORNUM]
,[OLDACNUM]
,[CREDCOLACTION]
,[LASTSTMTDATE]
,[CUST_TYPE] AS CustomerType
,[LETTERDATE]
,[CHARGE_CODE]
,[NUM_REMINDERS]
,[PAYMENT_CODE]
,[CYCLE_GROUP]
,ODS_START_DATE
,1 AS NEW_COLUMN
FROM [DBO].[DEBTOR]
{% if is_incremental() %}
-- this filter will only be applied on an incremental run
WHERE ODS_START_DATE > (select max(ODS_START_DATE) from {{ this }})
{% endif %}
Incremental update
...
05:14:39.102035 [debug] [Thread-1 (]: SQLServer adapter: SQL status: OK in 0.04 seconds
05:14:39.120042 [debug] [Thread-1 (]:
In "DBNAME"."DBT"."STG_DEBTORS_NEW":
Schema changed: True
Source columns not in target: [<Column CustomerType (character varying(2))>]
Target columns not in source: [<Column CUST_TYPE (character varying(2))>]
New column types: []
...
alter table "DBNAME"."DBT"."STG_DEBTORS_NEW"
add column CustomerType character varying(2)
,
drop column CUST_TYPE
Failed with Error Incorrect syntax near 'column'
Offending code was:
alter table "DBNAME"."DBT"."STG_DEBTORS_NEW"
add column CustomerType character varying(2)
,
drop column CUST_TYPE
should be:
alter table "DBT"."STG_DEBTORS_NEW" add CustomerType varchar(2);
alter table "DBT"."STG_DEBTORS_NEW" drop column CUST_TYPE;
I would suggest a fix, but I couldn't find a SQL Server or Synapse macro specifically for "rename column". Is this just defaulting to the default provider? Happy to submit a PR if someone can point me in the right direction.
Versions: dbt==1.1.2, sqlserver==1.1.0, synapse==1.1.0
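Until this is fixed upstream, one could sketch a project-level override that emits Synapse-compatible statements. The macro name and dispatch details below are assumptions, so verify against the installed adapter:

```sql
-- Sketch only: ADD without the COLUMN keyword, and one statement per
-- change instead of a comma-joined clause.
{% macro synapse__alter_relation_add_remove_columns(relation, add_columns, remove_columns) %}
    {% for column in add_columns %}
        {% do run_query('alter table ' ~ relation ~ ' add ' ~ column.name ~ ' ' ~ column.data_type) %}
    {% endfor %}
    {% for column in remove_columns %}
        {% do run_query('alter table ' ~ relation ~ ' drop column ' ~ column.name) %}
    {% endfor %}
{% endmacro %}
```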
The latest release cut for 1.3.0, dbt-core==1.3.0rc2
was published on October 3, 2022 (PyPI | Github). We are targeting releasing the official cut of 1.3.0 in time for the week of October 16 (in time for Coalesce conference).
We're trying to establish the following precedent w.r.t. minor versions:
Partner adapter maintainers release their adapter's minor version within four weeks of the initial RC being released. Given the delay on our side in notifying you, we'd like to set a target date of November 7 (four weeks from today) for maintainers to release their minor version.
| Timeframe | Date (intended) | Date (actual) | Event |
|---|---|---|---|
| D - 3 weeks | Sep 21 | Oct 10 | dbt Labs informs maintainers of upcoming minor release |
| D - 2 weeks | Sep 28 | Sep 28 | core 1.3 RC is released |
| Day D | October 12 | Oct 12 | core 1.3 official is published |
| D + 2 weeks | October 26 | Nov 7 | dbt-adapter 1.3 is published |
dbt-labs/dbt-core#6011 is an open discussion with more detailed information, and dbt-labs/dbt-core#6040 is for keeping track of the community's progress on releasing 1.3.0
Below is a checklist of work that would enable a successful 1.3.0 release of your adapter.