
slonik's People

Contributors

adevine, alxndrsn, aravindanve, aust1nz, bdbernardy, ctusch, dependabot[bot], dwickern, freedomben, gajus, indeyets, issa-tseng, janpaepke, kallberg, mgreenw, mik546, mmkal, ncurbez, nem035, nponiros, oguimbal, ovstetun, paulovieira, pimm, sebald, skarbovskiy, skaterdad, tdavis, thecodeboss, yoganlava


slonik's Issues

Unable to specify SSL cert and key

Given the following connection URL:

postgres://user:[email protected]?ssl=true&sslcert=/path/to/db.cert&sslkey=/path/to/db.key

I get the following error:

ConnectionError: connection requires a valid client certificate

It appears Slonik does not recognize the sslcert and sslkey parameters. After some investigation, I believe this is related to iceddev/pg-connection-string#8
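For context, a hedged sketch of the node-postgres configuration that the sslcert and sslkey URL parameters are meant to map onto (pg's ssl option accepts standard TLS options; the host and file paths are illustrative):

const fs = require('fs');
const pg = require('pg');

// What sslcert / sslkey in the URL are expected to translate into:
const pool = new pg.Pool({
  host: 'example.com',
  user: 'user',
  ssl: {
    cert: fs.readFileSync('/path/to/db.cert'),
    key: fs.readFileSync('/path/to/db.key')
  }
});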

I see a few options:

Node 8: process.hrtime.bigint is not a function

It looks like Slonik is using process.hrtime.bigint to track query execution time. Unfortunately, this function is not available until Node.js 10.7.0, since earlier V8 versions did not support BigInt.

I did a fair chunk of work converting my code to use Slonik (which is great), but I'm running on AWS Lambda, which only supports Node 8 at the latest. Currently that means I can't use Slonik.

Would it be possible to switch to the non-bigint process.hrtime, which returns a tuple instead? That may not work, I suppose, as I think it would change the interface for interceptors.
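For reference, a minimal sketch of the difference between the two Node APIs (not Slonik's actual implementation):

// Node >= 10.7.0: nanosecond-resolution bigint timestamp.
const startBigint = process.hrtime.bigint();
// ... execute query ...
const elapsedNanoseconds = process.hrtime.bigint() - startBigint;

// Node 8 compatible: [seconds, nanoseconds] tuple.
const startTuple = process.hrtime();
// ... execute query ...
const [seconds, nanoseconds] = process.hrtime(startTuple);
const elapsedMilliseconds = seconds * 1e3 + nanoseconds / 1e6;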

Add a helper function to construct connection URI

A connection URI is the most convenient way to store database configuration. However, as witnessed by submitted examples, others use object-based configuration. The latter requires manually constructing the connection URI, e.g.

const connectString = `postgresql://${encodeURIComponent(config.user)}:${encodeURIComponent(config.password)}@${config.host}:${config.port}/${config.database}`

#43 (comment)

This obscures problems such as a missing part of the configuration until late in execution.

A solution would be a simple utility to construct the connection string and warn about missing fragments of the configuration.
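A hedged sketch of what such a utility could look like (the name createConnectionUri and the required-field list are illustrative, not an existing Slonik API):

// Hypothetical helper; not part of Slonik.
const createConnectionUri = (config) => {
  for (const field of ['user', 'password', 'host', 'port', 'database']) {
    if (config[field] === undefined || config[field] === '') {
      throw new Error('Missing connection configuration: ' + field);
    }
  }

  return 'postgresql://' +
    encodeURIComponent(config.user) + ':' +
    encodeURIComponent(config.password) + '@' +
    config.host + ':' + config.port + '/' +
    encodeURIComponent(config.database);
};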

Cannot disable default type parsers

I'm adapting code that was using node-postgres directly, so I'm already used to getting Date objects for my timestamp fields. I wanted to keep this behavior, so I tried to provide an empty array for typeParsers as the README suggests.

But it looks like passing an empty array still causes the default preset of type parsers to be used.

I was able to work around this for now by choosing a type I don't plan to use (xml in my case) and configuring a dummy type parser for it.
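For context, a hedged sketch of the two configurations being described (the exact shape of a type parser object is assumed from the README's description, and the connection URI is illustrative):

const { createPool } = require('slonik');

// What the README suggests should disable all type parsing:
const pool = createPool('postgres://user:password@localhost/mydb', {
  typeParsers: []
});

// The workaround described above: register a dummy parser for a type
// that is never used, so the default preset is not applied.
const poolWithWorkaround = createPool('postgres://user:password@localhost/mydb', {
  typeParsers: [
    {
      name: 'xml',
      parse: (value) => value
    }
  ]
});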

Allow overriding sql.assignmentList snake-case normalisation

sql.assignmentList converts object keys to snake-case, e.g.

await connection.query(sql`
  UPDATE foo
  SET ${sql.assignmentList({
    barBaz: sql.raw('to_timestamp($1)', ['qux']),
    quuxQuuz: sql.raw('to_timestamp($1)', ['corge'])
  })}
`);

Produces:

{
  sql: 'UPDATE foo SET bar_baz = to_timestamp($1), quux_quuz = to_timestamp($2)',
  values: [
    'qux',
    'corge'
  ]
}

This behaviour may sometimes be undesirable.

Add a way to override this behaviour.
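A hypothetical sketch of what an override could look like (the second argument and the normalizeIdentifier option shown here do not exist; they only illustrate the request):

// Hypothetical API; not implemented.
await connection.query(sql`
  UPDATE foo
  SET ${sql.assignmentList({
    barBaz: sql.raw('to_timestamp($1)', ['qux'])
  }, {
    // Identity normalizer: keep keys exactly as provided.
    normalizeIdentifier: (key) => key
  })}
`);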

Insert not available on pool?

I noticed that while most of the query methods are available directly on the pool object, insert is not. Is this intentional?

If just doing a single insert, and thus not needing transactions or anything, it would be practical to just be able to call

      await pool.insert(sql`
        INSERT INTO example
          (name, created)
          VALUES (${name}, CURRENT_TIMESTAMP)
      `)

instead of what's currently required:

      pool.connect(async (connection) => {
        await connection.insert(sql`
          INSERT INTO example
            (name, created)
            VALUES (${name}, CURRENT_TIMESTAMP)
        `)
      })

I may have missed something; is there a reason that getting a connection is required for insert, but not for the other query methods?

Configurable `pg.Pool`

Hi! Great work on the library 👍

Are you open to allowing pg.Pool to be passed in as an argument to createPool? In my case, that would allow me to share the same connection pool between TypeORM and Slonik.
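A hypothetical sketch of the request (createPool does not currently accept an existing pool; the pgPool option name is invented to illustrate the idea):

// Hypothetical; illustrates the request only.
const pg = require('pg');
const { createPool } = require('slonik');

const pgPool = new pg.Pool({
  connectionString: process.env.DATABASE_URL
});

const slonikPool = createPool(process.env.DATABASE_URL, {
  // Reuse an existing pg.Pool (e.g. the one shared with TypeORM)
  // instead of creating a new one.
  pgPool
});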

Canonical way of partially updating a table

Hey,
I was thinking about what should be the canonical way to partially update a table.

Setup

For example, we have the following table:

books
- id
- name
- description
- author

What do I want to do?

I want to partially update a row from the books table, e.g. update name and description, but not the author.

Maybe it goes against the principle of the lib, namely:

Discourages ad-hoc dynamic generation of SQL.

but I think it's really a common problem that users need to solve.

How can you do it if you update all columns?

const bookId = 1;
const newName = "New Book Name";
const newDescription = "New Book Description";
const newAuthor = "New Book Author";
const updateNameStatement = sql`name = ${newName}`;
const updateDescriptionStatement = sql`description = ${newDescription}`;
const updateAuthorStatement = sql`author = ${newAuthor}`;

const updateBookSql = sql`UPDATE books
SET
  ${updateNameStatement},
  ${updateDescriptionStatement},
  ${updateAuthorStatement}
WHERE id = ${bookId};`;

How can you do it if you update a subset of columns?

The most obvious way is to use sql.raw, but you need to take care of the placeholder indexes yourself when generating the query.
A better solution is to construct it using the sql tag literal, for example:

const bookId = 1;
const newName = "New Book Name";
const newDescription = "New Book Description";
const updateNameStatement = sql`name = ${newName}`;
const updateDescriptionStatement = sql`description = ${newDescription}`;
const updateStatements = [updateNameStatement, updateDescriptionStatement];

const updateBookSqlWithoutWhere = updateStatements.reduce(
  (updateQuery, currentUpdateStatement, index) => {
    if (index === updateStatements.length - 1) {
      return sql`${updateQuery} ${currentUpdateStatement}`;
    }

    return sql`${updateQuery} ${currentUpdateStatement},`;
  },
  sql`UPDATE books SET`
);

const updateBookSql = sql`${updateBookSqlWithoutWhere} WHERE id = ${bookId};`;

I think it's better than the sql.raw approach described in the docs. Do you see anything grossly wrong with this approach?

Note: I am not currently using the library; I'm using Sequelize with mostly raw SQL queries, but am considering migrating to your library. :)

Question: Flow rationale

Not an issue: I just wanted to ask why Flow was chosen here (versus TypeScript). I'm collecting some data related to trends that will be public.

Create interceptor for processing custom tagged template value expressions

The way that Slonik uses tagged template literals to generate code fragments and safely bind parameter values is particularly exciting. Therefore, I am exploring the feasibility of allowing the community to bring their own code-generating extensions that integrate into the Slonik template language. These extensions would make it possible to abstract business-specific fragments of code (e.g. dynamic WHERE conditions for a particular business requirement); big development teams can use this pattern to restrict any use of sql.raw in the main codebase. This also enables separating integration tests from code-generation tests.

The primary considerations:

  • Design of value expression that does not compromise the strict type nature of the current tags.
  • Client configuration that restricts use of sql.raw tokens, but allows custom extensions.
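A hedged sketch of the kind of extension this would enable: a team-owned helper that assembles a dynamic WHERE fragment from plain sql fragments, so application code never touches sql.raw (the helper name and filter shape are illustrative, not a Slonik API):

const { sql } = require('slonik');

// Illustrative team-owned helper; not a Slonik API.
const buildSeatFilter = (filter) => {
  const conditions = [];

  if (filter.seatingPlanId !== undefined) {
    conditions.push(sql`seating_plan_id = ${filter.seatingPlanId}`);
  }

  if (filter.fuid !== undefined) {
    conditions.push(sql`fuid = ${filter.fuid}`);
  }

  // Combine the fragments without resorting to sql.raw.
  return conditions.reduce((combined, condition, index) => {
    return index === 0 ? condition : sql`${combined} AND ${condition}`;
  }, sql`TRUE`);
};

// Usage in application code:
// sql`SELECT id FROM seating_area WHERE ${buildSeatFilter({seatingPlanId, fuid})}`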

Implement cache middleware

Something along the lines of:

const seatingAreaId = await connection.oneFirst(sql`
  -- @cache-time 10 seconds
  SELECT id
  FROM seating_area
  WHERE
    seating_plan_id = ${seatingPlanId} AND
    fuid = ${seat.seatingAreaFuid}
`);

How to do a graceful shutdown (close open connection)

Is there any method for closing active database connections?

When using jest with the following test:

it("test", async () => {
  const slonik = createPool(
    `postgres://postgres:[email protected]:7002/postgres`
  );
  await slonik.connect(async conn => {
    await conn.many(sql`SELECT * FROM "users"`);
  });
});

I get the following console output:

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        5.968s, estimated 6s
Ran all test suites.
Jest did not exit one second after the test run has completed.

It seems like it is currently not possible to close the connections of a pg.Pool instance. If there is no such method, would you accept a pull request that implements this functionality?

Wrong parameter order

await connection.query(sql`
  UPDATE
    task
  SET
    result = ?,
    ended_at = NOW()
  WHERE
    id = ${task.id}
`, [
  JSON.stringify(results && results.data)
]);

Is executed as result = task.id and id = results.data.
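For reference, a hedged sketch of the same query with both values interpolated through the tagged template instead of a separate values array, which is how Slonik binds parameters elsewhere in these examples:

await connection.query(sql`
  UPDATE
    task
  SET
    result = ${JSON.stringify(results && results.data)},
    ended_at = NOW()
  WHERE
    id = ${task.id}
`);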

Non-intuitive use of template strings: "error: syntax error at or near $1"

I've been struggling for a while with this error:

'use strict';

const config = require('config');
const slonik = require('slonik');

const createPool = slonik.createPool;
const sql = slonik.sql;

const pool = createPool(config.pgConnectionUri);

const good = async () =>
  await pool.query(sql`
    insert into ff.users (name, password, email, fk_role) values
    ('jon doe', 'none', '[email protected]', 'admin')`);

const bad = async () => {
  const query = `
    insert into ff.users (name, password, email, fk_role) values
    ('down jones', 'none', '[email protected]', 'admin')`;
  return await pool.query(sql`${query}`);
};

good(); // inserts ok
bad(); // error: syntax error at or near \"$1\"\n

I thought those two ways to make a query would be equivalent.

If I copy the console.log(query) output and paste it into my psql REPL, the query works fine, so it took me a good amount of time to realize the sql template doesn't work that way.

Two questions:

  1. Could the error message be improved?
  2. I read "Inserting large number of rows", but that approach is awkward (the $1, $2, etc. pattern is hard to map mentally). Is there a plan to include a way to pass objects to queries? Or maybe to accept the bad case in my example?

My original idea was on this line:

const users = [ {name: '', password: '', email: '', fk_role: ''}, ...];
const userValues = () => users.map(u => `('${u.name}', '${u.password}', '${u.email}', '${u.fk_role}')`).join(',');
const query = `insert into ff.users (name, password, email, fk_role) values ${userValues()}`;
pool.query(sql`${query}`);

This is a nice way to build SQL strings.
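For what it's worth, a hedged sketch of how a fully pre-built query string can be passed through, using the sql.raw helper described in "Do not allow use of bare strings" below. This bypasses parameter binding entirely, so it is only appropriate when the string contains no untrusted input (the email value below is illustrative):

const bad = async () => {
  // The query string is fully built ahead of time, so wrap it with
  // sql.raw instead of interpolating it into the sql tag (which binds
  // the whole string as a single $1 value).
  const query = `
    insert into ff.users (name, password, email, fk_role) values
    ('down jones', 'none', 'down@example.com', 'admin')`;

  return await pool.query(sql.raw(query));
};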

Why slonik?

The word slonik is a diminutive of slon (elephant) in Russian, like "kitty" )

Re-enable coveralls

Coveralls is failing with an error:

/home/travis/build/gajus/slonik/node_modules/coveralls/bin/coveralls.js:18
        throw err;
        ^
Bad response: 422 {"message":"Couldn't find a repository matching this job.","error":true}
The command "nyc report --reporter=text-lcov | coveralls" exited with 1.

https://travis-ci.org/gajus/slonik/jobs/495612385#L767

This appears to be an issue with coveralls.io.

Temporarily disabled reporting of coverage.

8e6c7c2

Flow strict mode

I am currently working on a new project that uses Flow's strict mode pretty much everywhere. If I use slonik from these files, Flow warns about slonik not being in strict mode (this can be worked around by using strict-local instead). I wanted to suggest switching slonik to strict mode. Projects like graphql-js are already in strict mode. If you agree, I am glad to try to make this change and send a PR.
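For reference, the difference is just the file-level pragma; a minimal sketch of the two options:

// @flow strict
// The file is checked with strict rules and its dependencies
// must also be typed as strict.

// @flow strict-local
// The file itself is checked with strict rules, but its dependencies
// are not required to be strict.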

Thanks for the awesome work with slonik and postloader!

Avoiding sending the same value multiple times

await connection
  .query(sql`
    INSERT INTO sale
    (
      product_id,
      person_id,
      price
    )
    VALUES ${sql.valueList(values)}
  `);

and the values are:

[
  [
    1,
    1,
    100
  ],
  [
    2,
    1,
    100
  ],
  [
    3,
    1,
    100
  ]
]

Currently, this would result in a generated query such as:

INSERT INTO sale
(
  product_id,
  person_id,
  price
)
VALUES
(
  $1,
  $2,
  $3
),
(
  $4,
  $5,
  $6
),
(
  $7,
  $8,
  $9
)

and bound values:

[1, 1, 100, 2, 1, 100, 3, 1, 100]

Instead, we could detect the duplicate values and bind them to the same positional parameters.

The desired result would be:

INSERT INTO sale
(
  product_id,
  person_id,
  price
)
VALUES
(
  $1,
  $1,
  $2
),
(
  $3,
  $1,
  $2
),
(
  $4,
  $1,
  $2
)

and bound values:

[1, 100, 2, 3]

Develop a cache interceptor

The cache interceptor should allow caching the results of a particular query for a defined period of time.

My current thinking is to have cache middleware that would look for comments in the query to pick which queries to cache, e.g.

const maybeId = await connection.maybeOneFirst(sql`
  -- @cache-duration transaction
  SELECT id
  FROM cinema_foreign_seat_type
  WHERE
    cinema_id = ${cinemaId} AND
    fuid = ${fuid}
`);

The first version of the cache interceptor should simply cache query results for the duration of a transaction.

Errors lose information

There seems to be some information lost in a .catch block in JavaScript somewhere.

Setup:

create table if not exists person(id integer primary key, name text);

running this in pgAdmin:

insert into person(name) values ('bob');

outputs:

ERROR:  null value in column "id" violates not-null constraint
DETAIL:  Failing row contains (null, bob).
SQL state: 23502

running the same query with slonik:

slonik.query(sql`insert into person(name) values ('bob');`)

outputs no column name, and no information about the value I tried to insert:

NotNullIntegrityConstraintViolationError: Query violates a not NULL integrity constraint.

Is there a way of having all errors include the full text that came back from the db engine?

Migration management with Slonik

Hi all! 👋

I would love to manage my migrations using Slonik.

What do you think of adding migration management?

Great library, by the way!

Develop an interceptor for benchmarking connection queries

It would be useful to have an interceptor that tracks the execution times of all the queries for the duration of the connection and summarises them at the end of the connection. As we do most data processing in batches, this would make it possible to quickly identify the most time-consuming queries of a specific job.
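A minimal sketch of the idea, written as a manual wrapper rather than the interceptor API (whose hooks are not shown here); the timing uses process.hrtime.bigint as mentioned in the Node 8 issue above:

// Illustrative wrapper; an interceptor would hook into the
// connection lifecycle instead.
const timings = [];

const timedQuery = async (connection, query) => {
  const start = process.hrtime.bigint();

  try {
    return await connection.query(query);
  } finally {
    const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
    timings.push({ sql: query.sql, elapsedMs });
  }
};

// At the end of the connection, summarise the slowest queries.
const summarise = () => {
  return timings.sort((a, b) => b.elapsedMs - a.elapsedMs).slice(0, 10);
};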

Design an interface for returning the first column

Often I need to get just the first column of a result. In that case, I need to iterate through the rows to collect just the first column, e.g.

// @flow

import type {
  DatabaseConnectionType
} from 'mightyql';

const getCountryIds = async (
  connection: DatabaseConnectionType,
  code: string
): Promise<Array<number>> => {
  const rows = await connection
    .any('SELECT id FROM country WHERE code = ? LIMIT 2', [
      code
    ]);

  return rows.map((row) => {
    return row.id;
  });
};

This is a fairly repetitive pattern. It would be nice to abstract it in a way compatible with one, any and many.

This is primarily a question of how to expose this functionality. Should it simply be oneColumn, anyColumn and manyColumn?

How to force SSL Mode?

Slonik is great BTW.

The only issue is when I am trying to use it to connect to a Heroku database from AWS Lambda.

Every attempt results in SSL being off, meaning the connection fails (I have tried adding ?sslmode=require to the connection string with no luck).

If I use pg with the ssl: true config then I can successfully connect to the DB. Is there any way to have this in Slonik?
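For context, a hedged sketch of the plain pg configuration being referred to (the option names are pg's, not Slonik's; DATABASE_URL is the usual Heroku environment variable and the details are illustrative):

const pg = require('pg');

// The configuration that connects successfully from Lambda to Heroku Postgres:
const pool = new pg.Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: true
});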

Simple example?

Is there a simple start-to-finish example somewhere? I tried the script below using TypeScript, but I'm getting Cannot find name 'sql'.

import { createPool } from 'slonik'

const pool = createPool('postgres://')

function main(asyncMain) {
  asyncMain()
    .then(exitCode => {
      process.exit(exitCode || 0)
    })
    .catch(err => {
      console.log(err)
      process.exit(1)
    })
}

main(async () => {
  pool.connect(async (connection) => {
    console.log('hello')
    console.log(await connection.query(sql`SELECT 1`));
  });
})

Where should I specify my connection parameters, by the way? As a connection string, or as host/username/etc.?
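For reference, a hedged sketch of the fix for the compile error: sql is a named export of slonik and needs to be imported alongside createPool, and the connection parameters go into the URI passed to createPool (illustrative credentials below):

import { createPool, sql } from 'slonik'

// Connection parameters are encoded in the connection URI.
const pool = createPool('postgres://user:password@localhost:5432/mydb')

pool.connect(async (connection) => {
  console.log(await connection.query(sql`SELECT 1`))
})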

Inserting array values

Hi - I may be missing something but I don't see any way to build a query like this in the docs:

insert into post(tags) values('{"foo", "bar", "a \"quote-containing\" string"}');

where tags is an array field like text[] (e.g. created with create table post(id serial primary key, tags text[]))

I'd like to be able to insert array values from variables with something like:

const myValues = [myValue1, myValues2, myValue3]
slonik.query(sql`insert into post(tags) values(${sql.arrayValues(myValues)})`)

which would generate the first query.

Right now in the absence of sql.arrayValues I'm having to use

sql.raw(myValues.map(v => JSON.stringify(v)).join(', '))

which doesn't feel right at all.

  • sql.valueList produces error: bind message supplies 3 parameters, but prepared statement "" requires 0
  • sql.array is a completely different syntax that (I think) doesn't apply to insert.
  • identifierList actually works for most strings, but this is probably luck, since these are values, not identifiers, and it breaks for strings with quotes in them.
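A hedged sketch of how sql.array (mentioned above) could be applied to this insert, assuming it takes the values and a member type (e.g. 'text') and binds the whole array as a single typed parameter:

const tags = ['foo', 'bar', 'a "quote-containing" string'];

// Binds the array as one parameter, e.g. $1::text[].
await slonik.query(sql`
  insert into post (tags)
  values (${sql.array(tags, 'text')})
`);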

Do not allow use of bare strings

Now that we have sql tagged template literals, there is a limited use case for "bare" string queries.

I propose that "bare" string query, such as:

await connection.query('SELECT ?');

would throw an error.

Instead, query should be created using sql tagged template literal, e.g.

await connection.query(sql`SELECT ?`);

We can add sql.raw method to allow importing of raw queries, e.g.

await connection.query(sql.raw('SELECT 1'));

The use case for the latter is importing queries stored in a file and dynamically generating queries.

Reject connection promise if backend is terminated

await pool.connect(async (connection0) => {
  await pool.connect(async (connection1) => {
    const backendProcessId = await connection1.oneFirst(sql`SELECT pg_backend_pid()`);

    setTimeout(() => {
      connection0.query(sql`SELECT pg_terminate_backend(${backendProcessId})`)
    }, 2000);

    try {
      await connection1.query(sql`SELECT pg_sleep(30)`);
    } catch (error) {

    }

    console.log('user should not be able to get here; the connection() object is invalid');
  });
});

Access logger output

Hi!

First of all, brilliant package.

I was wondering if it'd be possible to access the log output?
My use case: I want to log all queries done through slonik.

Decide how to handle QueryPointResultType QueryResultRowType subtype

QueryPointResultType is resulting in loads of errors throughout the codebase, warning about attempts to cast an object into a string, e.g.

node_modules/mightyql/dist/types.js.flow:21
 21: type QueryPointResultType = {|
                                 ^ object type. This type cannot be added to
 78:     const fuzzyAddressMatch = _.first(fuseUsingAddress.search(venueDetails.street + ', ' + venueDetails.postcode));
                                                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^ string. See: src/bin/commands/update-venue-google-place-id.js:78

node_modules/mightyql/dist/types.js.flow:21
 21: type QueryPointResultType = {|
                                 ^ object type. This type cannot be added to
 78:     const fuzzyAddressMatch = _.first(fuseUsingAddress.search(venueDetails.street + ', ' + venueDetails.postcode));
                                                                                         ^^^^ string. See: src/bin/commands/update-venue-google-place-id.js:78


Found 3 errors

venueDetails is an instance of QueryResultRowType.

This is a valid error... technically, venueDetails.street can be an object. However, guarding against every possible case like this adds far too much boilerplate code.
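For illustration, a hedged sketch of the kind of guard that would be required at every usage site (the check itself is ordinary JavaScript, not a Slonik API):

// venueDetails is a row returned from a query.
// The kind of guard needed at every usage site to satisfy Flow:
const street = typeof venueDetails.street === 'string' ? venueDetails.street : '';
const postcode = typeof venueDetails.postcode === 'string' ? venueDetails.postcode : '';

const address = street + ', ' + postcode;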

Add a way to escape identifiers

Must avoid the likes of:

await connection.query('GRANT SELECT ON ALL TABLES IN SCHEMA public TO "' + user + '"');
await connection.query('GRANT EXECUTE ON ALL FUNCTIONS IN SCHEMA public TO "' + user + '"');
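A hedged sketch of what an identifier-escaping helper could look like at the call site (the sql.identifier name and signature are an assumption here, not confirmed by this issue):

// Assumed helper; escapes the identifier instead of concatenating it.
await connection.query(sql`
  GRANT SELECT ON ALL TABLES IN SCHEMA public TO ${sql.identifier([user])}
`);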

Use constants to disable timeouts

The current behaviour is that 0 will disable the timeout.

This can lead to unexpected behaviour, e.g. #63

Instead of using 0, use a constant exported from Slonik to signal intent to disable the timeout.
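A hypothetical sketch of the proposal (both the DISABLE_TIMEOUT constant and the idleTimeout option shown here are illustrative, not existing exports):

// Hypothetical API; illustrates the proposal only.
import { createPool, DISABLE_TIMEOUT } from 'slonik';

const pool = createPool(connectionUri, {
  // Explicit intent, instead of the magic value 0.
  idleTimeout: DISABLE_TIMEOUT
});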
