gajus / slonik
A Node.js PostgreSQL client with runtime and build time type safety, and composable SQL.
License: Other
Given the following connection URL:
postgres://user:[email protected]?ssl=true&sslcert=/path/to/db.cert&sslkey=/path/to/db.key
I get the following error:
ConnectionError: connection requires a valid client certificate
It appears Slonik does not recognize the sslcert and sslkey parameters. After some investigation, I believe this is related to iceddev/pg-connection-string#8
I see a few options:
Currently only the queryId is available in the interceptor query execution context. connectionId and transactionId would come in handy when observing connection- or transaction-specific queries.
It looks like Slonik is using process.hrtime.bigint to track query execution time. Unfortunately, this function is not available until Node.js 10.7.0, since V8 didn't have bigints for a while.
I did a fair chunk of work converting my code to use Slonik (which is great), but I'm running on AWS Lambda, which only supports Node 8 at the latest. Currently that means I can't use Slonik.
Would it be possible to switch to the non-bigint version, process.hrtime, which uses a tuple instead? That may not work, I suppose, as it would change the interface for interceptors.
"connect" identifies a new connection, "acquire" identifies connection being acquired from an existing connection pool.
Provide a feature equivalent to https://github.com/vitaly-t/pg-promise#configurable-transactions.
Currently, setupTypeParsers is executed for every new connection.
slonik/src/factories/createConnection.js
Line 64 in b180ebb
It should be enough to cache parsers for the entire pool.
A connection URI is the most convenient way to store database configuration. However, as witnessed by submitted examples, others use object-based configuration. The latter requires manually constructing the connection URI, e.g.
const connectString = `postgresql://${encodeURIComponent(config.user)}:${encodeURIComponent(config.password)}@${config.host}:${config.port}/${config.database}`
This obscures problems, such as a missing part of the configuration, until late in execution.
A solution would be a simple utility that constructs the connection string and warns about missing fragments of the configuration.
I'm adapting code that was using node-postgres directly, so I'm already used to getting Date objects for my timestamp fields. I wanted to keep this behavior, so I tried to provide an empty array for typeParsers, as the README suggests.
But it looks like passing an empty array still causes the default preset of type parsers to be used.
I was able to work around this for now by choosing a type I don't plan to use (xml in my case) and configuring a dummy type parser for it.
Does slonik have any way or would there be interest in implementing an abstraction over sending read queries to read replicas and write queries to a master server? I know you aren't a fan of ORMs (based on the medium article) but this is a feature which happens to be supported by Sequelize: http://docs.sequelizejs.com/manual/installation/usage.html#read-replication
There appears to be currently no way to configure them only for a particular instance of pg.
Waiting for an answer to this issue:
Redevelop the guard using Symbol.
sql.assignmentList converts object keys to snake-case, e.g.
await connection.query(sql`
UPDATE foo
SET ${sql.assignmentList({
barBaz: sql.raw('to_timestamp($1)', ['qux']),
quuxQuuz: sql.raw('to_timestamp($1)', ['corge'])
})}
`);
Produces:
{
sql: 'UPDATE foo SET bar_baz = to_timestamp($1), quux_quuz = to_timestamp($2)',
values: [
'qux',
'corge'
]
}
This behaviour might sometimes be undesirable.
Add a way to override this behaviour.
Please add TS definitions.
I noticed that, while most of the query methods are available directly on the pool object, insert is not. Is this intentional?
If just doing a single insert, and thus not needing transactions or anything, it would be practical to just be able to call
await pool.insert(sql`
INSERT INTO example
(name, created)
VALUES (${name}, CURRENT_TIMESTAMP)
`)
instead of what's currently required:
pool.connect(async (connection) => {
await connection.insert(sql`
INSERT INTO example
(name, created)
VALUES (${name}, CURRENT_TIMESTAMP)
`)
})
I may have missed something, but is there a reason that getting a connection is required for insert, but not for the other query methods?
Hey,
I was thinking about what should be the canonical way to partially update a table.
For example, we have the following table:
books
- id
- name
- description
- author
I want to partially update a row in the books table, e.g. update name and description, but not author.
Maybe it goes against the principle of the lib, namely:
Discourages ad-hoc dynamic generation of SQL.
but I think it's really a common problem that users need to solve.
const bookId = 1;
const newName = "New Book Name";
const newDescription = "New Book Description";
const newAuthor = "New Book Author";
const updateNameStatement = sql`name = ${newName}`;
const updateDescriptionStatement = sql`description = ${newDescription}`;
const updateAuthorStatement = sql`author = ${newAuthor}`;
const updateBookSql = sql`UPDATE books
SET
  ${updateNameStatement},
  ${updateDescriptionStatement},
  ${updateAuthorStatement}
WHERE id = ${bookId};`;
The most obvious way is to use sql.raw, but then you need to take care of the placeholder index yourself when generating the query. A better solution is to construct it using the sql tag literal, for example:
const bookId = 1;
const newName = "New Book Name";
const newDescription = "New Book Description";
const updateNameStatement = sql`name = ${newName}`;
const updateDescriptionStatement = sql`description = ${newDescription}`;
const updateStatements = [updateNameStatement, updateDescriptionStatement];
const updateBookSqlWithoutWhere = updateStatements.reduce(
(updateQuery, currentUpdateStatement, index) => {
if (index === updateStatements.length - 1) {
return sql`${updateQuery} ${currentUpdateStatement}`;
}
return sql`${updateQuery} ${currentUpdateStatement},`;
},
sql`UPDATE books SET`
);
const updateBookSql = sql`${updateBookSqlWithoutWhere} WHERE id = ${bookId};`;
I think it's better than the sql.raw approach defined in the docs. Do you see anything grossly wrong with this approach?
Note: I am not currently using the library; I'm using sequelize with mostly raw SQL queries, but am considering migrating to your library. :)
Not an issue: just wanted to ask why was Flow considered here (versus using TypeScript). I'm just collecting some data related to trends that'll be public.
The way that Slonik is using tagged template literals to generate code fragments and safely bind parameter values is particularly exciting. Therefore, I am exploring the feasibility of allowing the community to bring their own code-generating extensions that integrate into the Slonik template language. These extensions would allow abstracting business-specific fragments of code (e.g. dynamic WHERE conditions for a particular business requirement); big development teams can use this pattern to restrict any use of sql.raw in the main codebase. This enables separating integration and code generation tests.
The primary considerations: the template language disallows arbitrary sql.raw tokens, but allows custom extensions. Something along the lines of:
const seatingAreaId = await connection.oneFirst(sql`
-- @cache-time 10 seconds
SELECT id
FROM seating_area
WHERE
seating_plan_id = ${seatingPlanId} AND
fuid = ${seat.seatingAreaFuid}
`);
Is there any method for closing all active database connections?
When using jest with the following test:
it("test", async () => {
const slonik = createPool(
`postgres://postgres:[email protected]:7002/postgres`
);
await slonik.connect(async conn => {
await conn.many(sql`SELECT * FROM "users"`);
});
});
I get the following console output:
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshots: 0 total
Time: 5.968s, estimated 6s
Ran all test suites.
Jest did not exit one second after the test run has completed.
It seems like it is currently not possible to close the connections of a pg.Pool instance. If there is no way to do this, would you accept a pull request that implements this functionality?
I am doing a lot of round(extract(epoch from intervalField)).
It would be handy if Slonik defaulted to seconds.
Depends on: bendrucker/postgres-interval#23
await connection.query(sql`
UPDATE
task
SET
result = ?,
ended_at = NOW()
WHERE
id = ${task.id}
`, [
JSON.stringify(results && results.data)
]);
Is executed as result = task.id and id = results.data.
I've been struggling for a while with this error:
'use strict';
const config = require('config');
const slonik = require('slonik');
const createPool = slonik.createPool;
const sql = slonik.sql;
const pool = createPool(config.pgConnectionUri);
const good = async () =>
await pool.query(sql`
insert into ff.users (name, password, email, fk_role) values
('jon doe', 'none', '[email protected]', 'admin')`);
const bad = async () => {
const query = `
insert into ff.users (name, password, email, fk_role) values
('down jones', 'none', '[email protected]', 'admin')`;
return await pool.query(sql`${query}`);
};
good(); // inserts ok
bad(); // error: syntax error at or near "$1"
I thought those two ways to make a query would be equivalent.
If I copy the console.log(query) result and paste it into my psql REPL, the query works fine, so it took me a good amount of time to realize the sql template doesn't work that way.
Two questions:
My original idea was on this line:
const users = [ {name: '', password: '', email: '', fk_role: ''}, ...];
const userValues = () => users.map(u => `('${u.name}', '${u.password}', '${u.email}', '${u.fk_role}')`).join(',');
const query = `insert into ff.users (name, password, email, fk_role) values ${userValues()}`;
pool.query(sql`${query}`);
Which is a nice way to build SQL strings.
The word Slonik is a diminutive of slon (elephant) in Russian, like kitty is to cat. :)
Coveralls are failing with an error:
/home/travis/build/gajus/slonik/node_modules/coveralls/bin/coveralls.js:18
throw err;
^
Bad response: 422 {"message":"Couldn't find a repository matching this job.","error":true}
The command "nyc report --reporter=text-lcov | coveralls" exited with 1.
https://travis-ci.org/gajus/slonik/jobs/495612385#L767
This appears to be an issue with coveralls.io.
Temporarily disabled reporting of coverage.
I am currently working on a new project that uses Flow's strict mode pretty much everywhere. If I use slonik from these files, it warns about slonik not being in strict mode (this can be fixed by using strict-local instead). I wanted to suggest switching slonik to strict mode. Projects like graphql-js are already in strict mode. If you agree, I would be glad to attempt this change and send a PR.
Thanks for the awesome work with slonik and postloader!
await connection
.query(sql`
INSERT INTO sale
(
product_id,
person_id,
price
)
VALUES ${sql.valueList(values)}
`);
with the following values:
[
[
1,
1,
100
],
[
2,
1,
100
],
[
3,
1,
100
]
]
Currently, this would result in a generated query such as:
INSERT INTO sale
(
product_id,
person_id,
price
)
VALUES
(
$1,
$2,
$3
),
(
$4,
$5,
$6
),
(
$7,
$8,
$9
)
and bound values:
[1, 1, 100, 2, 1, 100, 3, 1, 100]
Instead, we could detect the duplicate values and bind them to the same positional parameters.
The desired result would be:
INSERT INTO sale
(
product_id,
person_id,
price
)
VALUES
(
$1,
$1,
$2
),
(
$3,
$1,
$2
),
(
$4,
$1,
$2
)
and bound values:
[1, 100, 2, 3]
The cache interceptor should allow caching the results of a particular query for a defined period of time.
My current thinking is to have cache middleware that would look for comments in the query to pick which queries to cache, e.g.
const maybeId = await connection.maybeOneFirst(sql`
-- @cache-duration transaction
SELECT id
FROM cinema_foreign_seat_type
WHERE
cinema_id = ${cinemaId} AND
fuid = ${fuid}
`);
The first version of the cache interceptor should simply cache query results for the duration of a transaction.
There seems to be some information lost in a .catch block in JavaScript somewhere:
setup:
create table if not exists person(id integer primary key, name text);
running this in pgAdmin:
insert into person(name) values ('bob');
outputs:
ERROR: null value in column "id" violates not-null constraint
DETAIL: Failing row contains (null, bob).
SQL state: 23502
running the same query with slonik:
slonik.query(sql`insert into person(name) values ('bob');`)
outputs no column name, and no information about the value I tried to insert:
NotNullIntegrityConstraintViolationError: Query violates a not NULL integrity constraint.
Is there a way of having all errors include the full text that came back from the db engine?
Hi all!
I would love to manage my migrations using Slonik.
What do you think of adding migration management?
Great library, by the way!
If I try to insert an array column, I get error: type "varchar[][]" does not exist
await connection.query(sql`
INSERT INTO xx (id, txt)
SELECT *
FROM ${sql.unnest(
[
[1, ['"', "'"]],
[2, ["'", '"']],
],
[
'int4',
'varchar[]'
]
)}
`)
It would be useful to have an interceptor that tracks the execution times of all queries for the duration of the connection and summarises them at the end of the connection. As we do most data processing in batches, this would enable quickly identifying the most time-consuming queries of a specific job.
I find myself aliasing every column name to camelCase format, e.g.
connection.one(sql`
SELECT
id,
access_key "accessKey",
secret_access_key "secretAccessKey"
FROM application
WHERE access_key = ${key}
`);
It would therefore be convenient to have a configuration option that automates the conversion.
Often I need to get just the first column of a result. In that case, I need to iterate through the results to collect just the first column, e.g.
// @flow
import type {
  DatabaseConnectionType
} from 'mightyql';

const getCountryIds = async (
  connection: DatabaseConnectionType,
  code: string
): Promise<Array<number>> => {
  const rows = await connection
    .any('SELECT id FROM country WHERE code = ? LIMIT 2', [
      code
    ]);

  return rows.map((row) => {
    return row.id;
  });
};
This is a fairly repetitive pattern. It would be nice to abstract it in a way compatible with one, any and many.
This is primarily a question of how to expose this functionality. Should it simply be oneColumn, anyColumn and manyColumn?
Slonik is great, BTW.
The only issue is when I am trying to use it to connect to a Heroku database from AWS Lambda.
Every attempt results in SSL off, meaning the connection fails (I have tried adding ?sslmode=require to the connection string with no luck).
If I use pg with the ssl: true config, then I can successfully connect to the DB. Is there any way to have this in Slonik?
Is there a simple start-to-finish example somewhere? I tried the script below using typescript, but I'm getting Cannot find name 'sql'
import { createPool } from 'slonik'
const pool = createPool('postgres://')
function main(asyncMain) {
asyncMain()
.then(exitCode => {
process.exit(exitCode || 0)
})
.catch(err => {
console.log(err)
process.exit(1)
})
}
main(async () => {
pool.connect(async (connection) => {
console.log('hello')
console.log(await connection.query(sql`SELECT 1`));
});
})
Where should I specify my connection parameters BTW? Either a connectstring or host/username/etc?
Hi - I may be missing something but I don't see any way to build a query like this in the docs:
insert into post(tags) values('{"foo", "bar", "a \"quote-containing\" string"}');
where tags is an array field like text[] (e.g. created with create table post(id serial primary key, tags text[]))
I'd like to be able to insert array values from variables with something like:
const myValues = [myValue1, myValues2, myValue3]
slonik.query(sql`insert into post(tags) values(${sql.arrayValues(myValues)})`)
which would generate the first query.
Right now, in the absence of sql.arrayValues, I'm having to use
sql.raw(myValues.map(v => JSON.stringify(v)).join(', '))
which doesn't feel right at all.
sql.valueList produces error: bind message supplies 3 parameters, but prepared statement "" requires 0
sql.array is a completely different syntax that (I think) doesn't apply to insert.
identifierList actually works for most strings, but this is probably luck, since these are values, not identifiers, and it breaks for strings with quotes in them.
Now that we have sql tagged template literals, there is a limited use case for "bare" string queries.
I propose that "bare" string query, such as:
await connection.query('SELECT ?');
would throw an error.
Instead, a query should be created using the sql tagged template literal, e.g.
await connection.query(sql`SELECT ?`);
We can add an sql.raw method to allow importing of raw queries, e.g.
await connection.query(sql.raw('SELECT 1'));
The use case for the latter is importing queries stored in a file and dynamically generating queries.
await pool.connect(async (connection0) => {
await pool.connect(async (connection1) => {
const backendProcessId = await connection1.oneFirst(sql`SELECT pg_backend_pid()`);
setTimeout(() => {
connection0.query(sql`SELECT pg_terminate_backend(${backendProcessId})`)
}, 2000);
try {
await connection1.query(sql`SELECT pg_sleep(30)`);
} catch (error) {
}
console.log('user should not be able to get here; the connection() object is invalid');
});
});
e.g.
connection.query`INSERT INTO reservation_ticket (reservation_id, ticket_id) VALUES ${values}`;
Hi!
First of all, brilliant package.
I was wondering if it'd be possible to access the log output?
My use case: I want to log all queries made through slonik.
SELECT '$1'
QueryPointResultType is resulting in loads of errors throughout the codebase warning about attempts to cast an object into a string, e.g.
node_modules/mightyql/dist/types.js.flow:21
21: type QueryPointResultType = {|
^ object type. This type cannot be added to
78: const fuzzyAddressMatch = _.first(fuseUsingAddress.search(venueDetails.street + ', ' + venueDetails.postcode));
^^^^^^^^^^^^^^^^^^^^^^^^^^ string. See: src/bin/commands/update-venue-google-place-id.js:78
node_modules/mightyql/dist/types.js.flow:21
21: type QueryPointResultType = {|
^ object type. This type cannot be added to
78: const fuzzyAddressMatch = _.first(fuseUsingAddress.search(venueDetails.street + ', ' + venueDetails.postcode));
^^^^ string. See: src/bin/commands/update-venue-google-place-id.js:78
Found 3 errors
venueDetails is an instance of QueryResultRowType.
This is a valid error... technically venueDetails.street can be an object. Though guarding against every possible case like this adds way too much boilerplate code.
Must avoid the likes of:
await connection.query('GRANT SELECT ON ALL TABLES IN SCHEMA public TO "' + user + '"');
await connection.query('GRANT EXECUTE ON ALL FUNCTIONS IN SCHEMA public TO "' + user + '"');
The current behaviour is that 0 will disable the timeout.
This can lead to unexpected behaviour, e.g. #63
Instead of using 0, use a constant exported from Slonik to signal intent to disable the timeout.