liberapay / postgres.py

A nice PostgreSQL client library
Home Page: https://postgres-py.readthedocs.org/
License: MIT License
I want to have fine-grained control over statement_timeout.
It'd be nice to have a built-in API for lazily loading foreign records. See #22 for a pre-loading implementation.
Postgres >= 9.2 can parse URLs itself, so we should check versions and pass through.
We should also be smart about not including host in the connection string at all if it's not set in the URL, because that affects whether Postgres uses a Unix or TCP socket. See https://botbot.me/freenode/gittip/msg/10532161/.
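A minimal sketch of that URL handling, using only the standard library; `url_to_dsn_kwargs` is a hypothetical helper name, not part of postgres.py's API:

```python
from urllib.parse import urlparse, unquote

def url_to_dsn_kwargs(url):
    """Parse a postgres:// URL into psycopg2-style connection kwargs.

    Omits 'host' entirely when the URL has none, so libpq falls back
    to its default Unix-domain socket instead of forcing TCP.
    Hypothetical helper; not postgres.py's actual code.
    """
    parts = urlparse(url)
    kwargs = {}
    if parts.hostname:
        kwargs['host'] = unquote(parts.hostname)
    if parts.port:
        kwargs['port'] = parts.port
    if parts.username:
        kwargs['user'] = unquote(parts.username)
    if parts.password:
        kwargs['password'] = unquote(parts.password)
    dbname = parts.path.lstrip('/')
    if dbname:
        kwargs['dbname'] = unquote(dbname)
    return kwargs
```

With `postgres:///mydb` this yields only `{'dbname': 'mydb'}`, leaving the socket choice to libpq.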
I have a column of a custom type, and I want the child to hold a reference to its parent. I want to be able to do:

    foo = Participant.from_username('foo')
    foo.email.send(message)

... and inside send have access to foo.username, etc.
From @zwn at 935feba#commitcomment-5532271:

I think the default should never be raised. Instead another param called mayberaises should be added to make it clear at the call site that the function raises.
Often you are fine with zero or one result but more than one would indicate a bug. Right now you can do:
    try:
        rec = db.one(QUERY)
    except TooFew:
        rec = None
    if rec is None:
        pass
    else:
        pass
But it might be nicer to be able to do:
    rec = db.one_or_zero(QUERY)
    if rec is None:
        pass
    else:
        pass
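A sketch of how one_or_zero could wrap the existing behavior, assuming a db object whose one() raises TooFew on zero rows (both names are stand-ins here, not the library's actual internals):

```python
class TooFew(Exception):
    """Raised when a query returns fewer rows than expected
    (stand-in for postgres.py's exception of the same name)."""

def one_or_zero(db, sql, params=()):
    # Return the single record, or None when there are no results.
    # More than one row still propagates as an error, preserving the
    # "zero or one is fine, two is a bug" behavior described above.
    try:
        return db.one(sql, params)
    except TooFew:
        return None
```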
If I have a table in a schema other than public, psycopg2 will check for the form 'schema.type' but postgres.py does not, so a model with typname 'type' will pass:

    n = self.one( "SELECT count(*) FROM pg_type WHERE typname=%s"
                , (typname,)
                 )

but it will fail in psycopg2.extras because of the missing schema name.
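One possible fix, sketched here, is to honor a schema qualification by joining pg_namespace; the helper name and exact SQL are assumptions, not postgres.py's actual code:

```python
def typname_check_query(typname):
    """Build a (sql, params) pair counting matching types, honoring
    an optional 'schema.type' qualification the way psycopg2's
    register_composite does.  Hypothetical sketch."""
    if '.' in typname:
        schema, name = typname.split('.', 1)
        sql = ("SELECT count(*) FROM pg_type t"
               " JOIN pg_namespace n ON n.oid = t.typnamespace"
               " WHERE t.typname = %s AND n.nspname = %s")
        return sql, (name, schema)
    return "SELECT count(*) FROM pg_type WHERE typname = %s", (typname,)
```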
This works:
    try:
        db.run('...')
    except IntegrityError:
        db.run('...')
This doesn't:
    with db.get_cursor() as c:
        try:
            c.run('...')
        except IntegrityError:
            c.run('...')
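The cursor version fails because the first error aborts the enclosing transaction, so every later statement on that cursor errors out until a rollback. One workaround is a savepoint wrapper, sketched here under the assumption that the cursor exposes a run(sql) method like postgres.py's (the helper itself is hypothetical):

```python
import contextlib
import itertools

_sp_ids = itertools.count()

@contextlib.contextmanager
def savepoint(cursor):
    """Run a block inside SAVEPOINT / RELEASE, rolling back to the
    savepoint on error so the outer transaction stays usable."""
    name = "sp_%d" % next(_sp_ids)
    cursor.run("SAVEPOINT %s" % name)
    try:
        yield
    except Exception:
        cursor.run("ROLLBACK TO SAVEPOINT %s" % name)
        raise
    else:
        cursor.run("RELEASE SAVEPOINT %s" % name)
```

The failing pattern then becomes: wrap the first c.run in `with savepoint(c):` and catch the IntegrityError outside it.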
Push to PyPI and figure out versioning on RTD.
But make it configurable as part of Postgres instantiation, and also per-call.
We ran into an issue over on gratipay/gratipay.com#1583: psycopg2 caches information about composite types, so if you alter the type in the database the cache becomes stale, and the next time you request a record of the composite type you get a DataError. From @dvarrazzo:
The composite cache is local to where register_composite was called (globally, the connection, or the cursor). You can avoid the problem using it in a restricted scope (e.g. on a cursor, and dropping the cursor after the type is changed). Alternatively just rerun register_composite with the same arguments after changing the type.
And over on http://psycopg.lighthouseapp.com/projects/62710/tickets/183:
Caching of the attribute is necessary but it's local to the scope you have registered the composite.
Avoid using register_composite with global=True: use it on the connection or on the cursor, and dispose the connection or the cursor after the data type is changed. Alternatively just call register_composite again.
How do we eliminate this race condition?
"which is a psycopg2 compatible but it works on PyPy"
@dstufft in unlogged IRC
Our simple API: run / one / rows. one and rows aren't parallel. :-( We should either change one to row or rows to many. I like row better because it keeps the alliteration, and one and many feel ORM-ish to me.

Since we've already released 1.0.0 we've got a backwards-compatibility issue. Since we're so early in the project (I released 1.0.0 about 8 or 9 hours ago), I think we can get away with aliasing row as one but not documenting it.
@kennethreitz has launched Records, "SQL for Humans™". Given the relative star count (~1,500 in 6 days vs. 36 in 3 years), it seems that the Python world now has a de facto DB-API 3.0—and it's not ours. :o)
Records is database-agnostic (cf. kennethreitz/records#41). So far it seems to me that there's value in continuing with Postgres.py because of the ORM layer. The proposal here is to change our top-level run / one / all API to conform to what @kennethreitz has come up with:

    row_iterator = db.query('select * from foo')
    row_list = row_iterator.all()

So we'd essentially rename run to query, and chain all (and one, which Records doesn't have yet?) onto the result.
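A rough sketch of what that chained result object could look like; the class and method names are assumptions modeled on the Records API, not an actual implementation:

```python
class RowIterator:
    """Lazy result wrapper: query() would return one of these,
    with .all() and .one() chained onto it.  Hypothetical sketch."""

    def __init__(self, rows):
        self._rows = iter(rows)

    def __iter__(self):
        return self._rows

    def all(self):
        # Drain the iterator into a concrete list.
        return list(self._rows)

    def one(self):
        # Records doesn't have this (yet?); enforce exactly one row.
        rows = self.all()
        if len(rows) != 1:
            raise ValueError("expected exactly one row, got %d" % len(rows))
        return rows[0]
```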
Should make this configurable, actually.
    with db.get_cursor(SQL) as cursor:
        pass

The get_ there is clunky. Let's name these like constructors:

    with db.Cursor(SQL) as cursor:
        pass
It's redundant and cluttersome.
    >>> with db.get_cursor() as cursor:
    ...     cursor.execute("select * from foo")
    ...
    >>> cursor
    <cursor object at 0x10027d350; closed: 0>
Needs to change the version in:

    setup.py
    docs/conf.py
    postgres/__init__.py
Eventually we need a test matrix with multiple versions of Python, psycopg2, and PostgreSQL. Let's start with one cell.
CC0 is problematic outside the U.S. SQLite solves the problem by charging people who care $1,000. We should just drop back to MIT for the path of least resistance.
I tried once and hit some bug. For that matter make sure we can use all of the cursor attributes.
Makes for better docs. (query, vars) is what psycopg2 uses under the hood; its docs use (operation, parameters).
Why don't we have this? It seems to me that a column name can be checked with a simple regexp, and a placeholder different from %s (e.g. %c) could be used to insert a column name into a query.
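A minimal sketch of the validation half of that idea, assuming a plain-identifier regexp; the helper name is hypothetical and this is not a substitute for real identifier quoting:

```python
import re

_IDENT_RE = re.compile(r'^[A-Za-z_][A-Za-z0-9_]*$')

def check_column(name):
    """Validate a column name before interpolating it where a
    hypothetical %c placeholder would go; raises on anything that
    isn't a plain identifier."""
    if _IDENT_RE.match(name) is None:
        raise ValueError("unsafe column name: %r" % (name,))
    return name
```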
This is probably more of a psycopg2 issue, but let's discuss it here first, especially since we're considering porting to asyncpg (#58).
Postgres.all() and Postgres.one() support the back_as argument, but the same methods on cursors don't, which for example prevents doing (cursor or db).all(..., back_as=dict).
I am mostly interested in the info that this can actually run on Python 3.
I would submit a patch, but I don't know what maturity level you consider this to be.
These names are taken from DB-API 2.0, but the semantics differ enough that using the same names is confusing. Specifically, in DB-API you always call execute before you call fetchone or fetchall.
postgres.py is a great library. one_or_zero is a bad method name. It does not return 0. one_or_none would be better. Or maybe just one: it's a nice reflection of all. Or maybe Maybe, if you want to give Haskellites warm-fuzzies (if they get warm-fuzzies). Or if you really like one_or_zero then I'll open a ticket to rename all to all_or_nothing. ;-)
Reference: gratipay/gratipay.com#3282
Doesn't work:

    db.one("SELECT 1 FROM participants LIMIT 1")

Works:

    db.one("SELECT 1 AS something FROM participants LIMIT 1")

Postgres returns ?column? as the column name if it can't find one. Here's the error thrown on our side:

    ValueError: Type names and field names can only contain alphanumeric characters and underscores: '?column?'
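That ValueError comes from namedtuple, the default row type; namedtuple's rename flag would sidestep it by substituting positional names. A sketch of that possible fix (the function name is hypothetical):

```python
from collections import namedtuple

def make_record_type(field_names):
    """Build a row type, letting namedtuple replace invalid names
    like Postgres's '?column?' with positional ones (_0, _1, ...)
    instead of raising ValueError."""
    return namedtuple('Record', field_names, rename=True)
```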
E.g., in docs/conf.py.
Easier to inspect and work with a list. If you're returning enough data to warrant a generator then drop down to a lower level.
The following returns an object of whatever type bar is: we dereference bar when it's the only field requested.

    bar = db.one('select foo from bar')

But what about this?

    bar = db.one('select foo from bar', back_as=dict)

Right now the dereferencing trumps the back_as, but @seanlinsley, at least, expected the back_as to bind tighter. I can see that. Anyone else?
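For comparison, here's what the "back_as binds tighter" ordering would look like as a tiny post-processing step, assuming rows arrive as dicts of column name to value (helper name hypothetical):

```python
def postprocess(row, back_as=None):
    """Apply back_as first; only dereference a single column when
    no back_as was given.  Sketch of the proposed binding order,
    not postgres.py's actual code."""
    if back_as is not None:
        return back_as(row)
    if len(row) == 1:
        # Single field requested and no back_as: hand back the value.
        return next(iter(row.values()))
    return row
```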
Right now get_cursor takes args and actually does an execute. It shouldn't, it should just return the cursor. This will make PostgresCursorContextManager simpler.
asyncpg is a database interface library designed specifically for PostgreSQL and Python/asyncio. asyncpg is an efficient, clean implementation of PostgreSQL server binary protocol for use with Python's asyncio framework.
Now that gratipay/gratipay.com#1320 is landed.
I have a model that I'd like to register for multiple types. The use case is making a type that basically inherits from another type (though afaict Postgres doesn't implement type inheritance yet), overriding a foreign key field to actually subselect the record from the other table. By registering a single composite for multiple such types I could determine at query time whether I'd end up with the subselect or not.
That will make it easier to interchange db and cursor objects. Cropped up at gratipay/gratipay.com#2729 (comment). Similar to #39.
If you do:

    db.one("create table foo (bar text)")

you'll get a "TooFew: Got -1 rows instead of 1" error, but the table will indeed be created. What would it be like to turn autocommit off and only commit db.one transactions when they return exactly one row? What are the performance characteristics of autocommit?
That would allow postgres.Postgres() to connect to the default database (localhost, current user).
I think Robbelard discusses something like this in his talk.