
dbr's Issues

how to insert a SQL expression like NOW()

e.g.:

InsertInto("user").
    Columns("create_date").
    Values("NOW()").Exec()

How can I do something like this?
The nowSentinel seems to use the application server's time for this.
It would be good to be able to use the database's NOW() function instead, to ensure time consistency.
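
A possible workaround until expressions are supported by the builder: write that one statement as raw SQL so the database evaluates NOW() itself. This is only a sketch; it assumes a *dbr.Session named sess connected to MySQL and reuses the table and column from the example above.

// Sketch: let the database evaluate NOW() by sending the statement as raw SQL.
// Assumes sess is a *dbr.Session and that "log" is imported.
_, err := sess.InsertBySql(
	"INSERT INTO user (create_date) VALUES (NOW())",
).Exec()
if err != nil {
	log.Fatal(err)
}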

Load into map[string]interface{}

Allow Load to accept a map[string]interface{},
where the keys are column names and the values are column values.

result := make(map[string]interface{})
count, err := s.Select("a", "b").From("table1").Where("a=1").Load(&result)
// expected result contains 2 items, a and b with appropriate column values.
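
Until something like this is supported, one workaround is to drop down to database/sql for that query (dbr.Connection embeds a *sql.DB). The helper below is hypothetical, not part of dbr; it scans the first row into a map keyed by column name.

// loadRowIntoMap is a hypothetical helper built on database/sql.
// Assumes import "database/sql".
func loadRowIntoMap(db *sql.DB, query string, args ...interface{}) (map[string]interface{}, error) {
	rows, err := db.Query(query, args...)
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	cols, err := rows.Columns()
	if err != nil {
		return nil, err
	}
	if !rows.Next() {
		return nil, sql.ErrNoRows
	}
	// Scan every column into an interface{} slot, then key the slots by column name.
	values := make([]interface{}, len(cols))
	ptrs := make([]interface{}, len(cols))
	for i := range values {
		ptrs[i] = &values[i]
	}
	if err := rows.Scan(ptrs...); err != nil {
		return nil, err
	}
	result := make(map[string]interface{}, len(cols))
	for i, c := range cols {
		result[c] = values[i]
	}
	return result, rows.Err()
}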

Sqlite Support

Since the sqlite3 SQL dialect is basically compatible with MySQL, it would be good if it were allowed in the Open method, even if not officially supported or mentioned in the docs.

Table name alias with JOIN statements

I'd like to use table aliases with JOIN statements.

// no AS
sess.Select("*").From("suggestions").LeftJoin("people", "people.suggestion_id = suggestions.id")

// AS only in From => success
sess.Select("*").From("suggestions AS s").LeftJoin("people", "people.suggestion_id = s.id")

// AS in both From and LeftJoin => failure
sess.Select("*").From("suggestions AS s").LeftJoin("people AS p", "p.suggestion_id = s.id")
// Error 1146: Table 'mydb.people AS p' doesn't exist

It would be nice if this were possible. Or is there another good way to do it?
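
If your dbr version exposes dbr.I, one possible workaround is to pass an aliased identifier instead of the "people AS p" string, so the builder quotes the table name and the alias separately; this is a sketch and assumes that API is available.

// Sketch: aliased identifiers instead of "table AS alias" strings.
// Assumes a dbr version that provides dbr.I(...).As(...).
sess.Select("*").
	From(dbr.I("suggestions").As("s")).
	LeftJoin(dbr.I("people").As("p"), "p.suggestion_id = s.id")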

Time Parsing from DB

Hi Folks,

I like what you are doing here with this DB Wrapper, but I am having an issue with it. It seems that I may be missing something.

I'm selecting from a table with a 'timestamp' field. The struct into which that time is going contains this:

LastModified time.Time

However, all of my dates are incorrect. They come out as:

LastModified:0001-01-01 00:00:00 +0000 UTC

Your docs don't mention time fields, so I assumed this would be a no-brainer, but they all come back as what looks to be the zero value.

Thanks for your code, and your help in this matter.
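
If this is the go-sql-driver/mysql driver, it is usually a DSN issue rather than a dbr one: without parseTime=true the driver returns timestamp columns as []byte, so the time.Time field can end up at its zero value. A sketch of the DSN (credentials and database name are placeholders):

// parseTime=true makes the MySQL driver return DATE/DATETIME/TIMESTAMP
// columns as time.Time instead of []byte.
conn, err := dbr.Open("mysql",
	"user:password@tcp(127.0.0.1:3306)/mydb?parseTime=true", nil)
if err != nil {
	log.Fatal(err)
}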

feature request: .Set from a structure

Implement a variant of func (b *UpdateStmt) Set(column string, value interface{}) *UpdateStmt that takes a struct as the value and selects the appropriate struct field, as Load() does.

This way you could easily implement a Set(columns []string, value interface{}) that updates multiple columns from the same struct; a sketch follows below.
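
A reflection-based sketch of what such a helper could look like, built on the existing Set; setColumns is a hypothetical name, and the db-tag convention and the *dbr.UpdateStmt type are assumptions about the setup.

// setColumns is a hypothetical helper: for each requested column it finds the
// struct field tagged `db:"<column>"` and forwards it to Set.
// Assumes imports "reflect" and "github.com/gocraft/dbr".
func setColumns(stmt *dbr.UpdateStmt, columns []string, record interface{}) *dbr.UpdateStmt {
	v := reflect.Indirect(reflect.ValueOf(record))
	t := v.Type()
	for _, col := range columns {
		for i := 0; i < t.NumField(); i++ {
			if t.Field(i).Tag.Get("db") == col {
				stmt = stmt.Set(col, v.Field(i).Interface())
				break
			}
		}
	}
	return stmt
}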

Allow using a map type for IN statements

It would be useful to allow a map type in IN statements, with the map keys used as the values, much like iterating over a dict in Python.
Instead of manually converting the map to a slice, the client could then pass the map as-is.
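
Until that is supported, a small helper can pull the keys out of any map so they can be passed as a slice; mapKeys is a hypothetical name, not part of dbr.

// mapKeys is a hypothetical helper that collects the keys of any map into a
// []interface{} suitable for an IN condition. Assumes import "reflect".
func mapKeys(m interface{}) []interface{} {
	v := reflect.ValueOf(m)
	keys := make([]interface{}, 0, v.Len())
	for _, k := range v.MapKeys() {
		keys = append(keys, k.Interface())
	}
	return keys
}

// Usage sketch:
// ids := map[int64]struct{}{1: {}, 2: {}, 3: {}}
// sess.Select("*").From("user").Where("id IN ?", mapKeys(ids))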

Delete with Join

It would be great if dbr could add support for delete statements like this:

DELETE posts
FROM posts
INNER JOIN projects ON projects.project_id = posts.project_id
WHERE projects.client_id = ?
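
Until the builder supports joins in DELETE, one workaround is to send the statement as raw SQL; the sketch below assumes your dbr version provides DeleteBySql (part of the *BySql family) and that clientID holds the bound value.

// Sketch of a raw-SQL workaround; sess is a *dbr.Session, clientID is a placeholder.
_, err := sess.DeleteBySql(
	"DELETE posts FROM posts "+
		"INNER JOIN projects ON projects.project_id = posts.project_id "+
		"WHERE projects.client_id = ?", clientID,
).Exec()
if err != nil {
	log.Fatal(err)
}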

A new connection is opened every time

package main

import (
	"database/sql"
	"encoding/json"
	"log"
	"os"

	_ "github.com/go-sql-driver/mysql"
	"github.com/gocraft/dbr"
	"github.com/gocraft/health"
	"gopkg.in/kataras/iris.v6"
	"gopkg.in/kataras/iris.v6/adaptors/httprouter"
	"gopkg.in/kataras/iris.v6/middleware/logger"
)

func main() {
	app := iris.New()
	app.Adapt(iris.DevLogger())

	// Adapt the "httprouter", faster,
	// but it has limits on named path parameters' validation,
	// you can adapt "gorillamux" if you need regexp path validation!
	app.Adapt(httprouter.New())
	customLogger := logger.New(logger.Config{
		// Status displays status code
		Status: true,
		// IP displays request's remote address
		IP: true,
		// Method displays the http method
		Method: true,
		// Path displays the request path
		Path: true,
	})

	app.Use(customLogger)
	app.HandleFunc("GET", "/", func(ctx *iris.Context) {
		ctx.Writef("hello world\n")
	})
	app.Get("/msg", GetMsg)

	app.Listen(":4000")
}

func GetMsg(ctx *iris.Context) {
	var stream = health.NewStream()
	stream.AddSink(&health.WriterSink{os.Stdout})

	// create a connection (e.g. "postgres", "mysql", or "sqlite3")
	conn, err := dbr.Open("mysql", "root:@/demo", stream)
	// conn.SetMaxIdleConns(10)
	// conn.SetMaxOpenConns(40)
	if err != nil {
		panic(err)
	}

	// create a session for each business unit of execution (e.g. a web request or goworkers job)
	sess := conn.NewSession(nil)
	sess.Exec("USE demo")
	// get a record
	var msg Msg
	msg.M = "What a nice emoji"
	tx, _ := sess.Begin()

	defer tx.RollbackUnlessCommitted()

	tx.InsertInto("msg").Columns("m").Record(msg)
	i, e := tx.Select("id", "m").From("msg").
		Where("m like ?", "What%").
		OrderBy("id").
		Limit(1).Load(&msg)
	if e != nil {
		panic(e)
	}
	tx.Commit()
	log.Println(i)

	// JSON-ready, with dbr.Null* types serialized like you want
	json.Marshal(&msg)
	ctx.JSON(iris.StatusOK, msg)
}

type Msg struct {
	ID NullInt64
	M  string
}

type NullInt64 struct {
	sql.NullInt64
}

func (ni NullInt64) MarshalJSON() ([]byte, error) {
	if ni.Valid {
		return json.Marshal(ni.Int64)
	}
	return json.Marshal(nil)
}
func (ni *NullInt64) UnmarshalJSON(data []byte) error {
	// Unmarshalling into a pointer will let us detect null
	var x *int64
	if err := json.Unmarshal(data, &x); err != nil {
		return err
	}
	if x != nil {
		ni.Valid = true
		ni.Int64 = *x
	} else {
		ni.Valid = false
	}
	return nil
}

170506 14:59:56 1833 Connect root@localhost on
1833 Query SELECT @@max_allowed_packet
1833 Query USE demo
1833 Query START TRANSACTION
1833 Query SELECT id, m FROM msg WHERE (m like 'What%') ORDER BY id LIMIT 1
1833 Query COMMIT
170506 15:00:03 1834 Connect root@localhost on
1834 Query SELECT @@max_allowed_packet
1834 Query USE demo
1834 Query START TRANSACTION
1834 Query SELECT id, m FROM msg WHERE (m like 'What%') ORDER BY id LIMIT 1
1834 Query COMMIT
170506 15:00:35 1835 Connect root@localhost on demo
1835 Query SELECT @@max_allowed_packet
1835 Query USE demo
1835 Query START TRANSACTION
1835 Query SELECT id, m FROM msg WHERE (m like 'What%') ORDER BY id LIMIT 1
1835 Query COMMIT
1836 Connect root@localhost on demo
1836 Query SELECT @@max_allowed_packet
1836 Query USE demo
1836 Query START TRANSACTION
1836 Query SELECT id, m FROM msg WHERE (m like 'What%') ORDER BY id LIMIT 1
1836 Query COMMIT
1837 Connect root@localhost on demo
1837 Query SELECT @@max_allowed_packet
1837 Query USE demo
1837 Query START TRANSACTION
1837 Query SELECT id, m FROM msg WHERE (m like 'What%') ORDER BY id LIMIT 1
1837 Query COMMIT
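
The log above is what you would expect from the handler shown: dbr.Open is called inside GetMsg, so every request builds a fresh connection pool. The usual pattern is to open the connection once at startup and create only a Session per request; a sketch, reusing the names from the example above:

// Sketch: open the pool once, reuse it, and create only sessions per request.
var conn *dbr.Connection // shared connection pool

func main() {
	var err error
	conn, err = dbr.Open("mysql", "root:@/demo", nil)
	if err != nil {
		panic(err)
	}
	conn.SetMaxIdleConns(10)
	conn.SetMaxOpenConns(40)
	// ... set up the router and register GetMsg as before ...
}

func GetMsg(ctx *iris.Context) {
	sess := conn.NewSession(nil) // cheap: borrows from the pooled connections
	// ... run the queries with sess as before ...
}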

JOIN with multiple conditions

When I run the code below,

sess.Select("*").From("user").
LeftJoin("image", dbr.And(dbr.Eq("user.image_id", "image.id"), dbr.Eq("user.company_image_id", "image.id")))

.ToSql() produces:

SELECT * FROM user LEFT JOIN `image` ON ?

I expected:

SELECT * FROM user LEFT JOIN `image` ON user.image_id = image.id AND user.company_image_id = image.id
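
As a workaround, the whole ON clause can be passed as a plain string instead of a condition builder, so nothing is turned into a placeholder; a sketch:

// Sketch: pass the ON clause as one string.
sess.Select("*").From("user").
	LeftJoin("image", "user.image_id = image.id AND user.company_image_id = image.id")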

Thanks.

Time format into DB

When I insert a record with

type DBMetrics struct {
	Time time.Time `db:"Time"`
}
location, _ := time.LoadLocation("Europe/Berlin")
cur := time.Now()
m := &DBMetrics{}
m.Time = time.Date(cur.Year(), cur.Month(), cur.Day(), cur.Hour(), 0, 0, 0, location)

I want to get '2017-03-29 14:57:00 +0200 CEST',
but I get '2017-03-29 12:57:00.000000'
because the value is stored as UTC.
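
With go-sql-driver/mysql this is usually controlled by the DSN rather than by dbr: the loc parameter sets the location the driver uses when converting time.Time values, and parseTime=true returns timestamps as time.Time. A sketch (credentials and database name are placeholders):

// loc tells the MySQL driver which time zone to use for time.Time conversion.
dsn := "user:password@tcp(127.0.0.1:3306)/mydb?parseTime=true&loc=Europe%2FBerlin"
conn, err := dbr.Open("mysql", dsn, nil)
if err != nil {
	log.Fatal(err)
}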

Question: When to close Session/Connection?

I initialize a Connection instance as a global variable and attach a new Session to each request through middleware.

Since I didn't find it in the docs: should I close the session (via the .Close() method)? If yes, what is the proper way? Maybe after each request? If not, does leaving it open have any impact on performance or cause a memory leak?

Thanks a lot

Postgres last insert id

There does not seem to be a way of getting the record id of the last inserted record(s).

The correct way would be to append "RETURNING id" to the query and then scan the ids; however, I don't see how that's possible currently.

Also, when using InsertBuilder.Record, an id field is only populated when its type is int64. Why not allow any kind of integer?
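
One workaround until the insert builder exposes RETURNING is to issue the statement through SelectBySql, so the returned id can be loaded; a sketch, with the table, columns, and bound variables as placeholders:

// Sketch: run INSERT ... RETURNING through the select path and scan the id.
var id int64
_, err := sess.SelectBySql(
	"INSERT INTO users (user_name, password) VALUES (?, ?) RETURNING id",
	userName, password,
).Load(&id)
if err != nil {
	log.Fatal(err)
}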

error on simple select

panic: you need to pass in the address of a struct

games := []Game{}

sess := Db.NewSession(nil)
err := sess.Select("name").From("game").LoadStruct(&games)
if err != nil {

}

Game is just a simple struct : type Game struct {

How to append SQL?

I use the RETURNING keyword a lot in Postgres:

insert into users (user_name, password)
values ($1, $2)
returning id, created_at

ToSQL()

I might be missing this in the GoDocs, but can anyone tell me whether there is a ToSql() function to get the query string and the []interface{} args that could be passed to db.Query() directly?

Add support for []byte

Please add support for inserting []byte, e.g. inserting a bcrypt password hash.

I got an error trying to insert using both the Values and Record methods:

user_test.go:37: Error 1241: Operand should contain 1 column(s)

ToSql output
2015/05/02 23:58:49 INSERT INTO users (`username`,`password`,`email`,`mobile_no`,`first_name`,`last_name`,`active`,`user_type`,`admin`,`domain_id`,`created_by`) VALUES (?,?,?,?,?,?,?,?,?,?,?) [alex4 [36 50 97 36 49 48 36 86 110 115 70 120 113 100 75 106 66 67 122 101 52 110 56 107 108 67 88 49 101 48 112 117 86 55 53 90 86 70 70 72 52 116 100 118 46 48 78 81 104 115 67 120 114 70 102 120 84 70 77 71] [email protected] {{08060737120 true}} Alex Adeyemo false 1 false 1 System]

Thanks for the good work.

Postgres support

dbr looks very interesting, and I'd love if it supported Postgres.

My general plan was to run the test suite with DBR_TEST_DRIVER="postgres" and add the github.com/lib/pq import to dbr_test.go and work through until the tests pass. Do you feel that's a good approach? How comprehensive do you feel the tests are?

Right off the bat I am hitting SQL syntax problems in the installFixtures() function in dbr_test.go. Looking through the codebase it looks like it was mostly designed with a single driver in mind. How would you like to split out the driver-specific parts of the codebase?

Adding support for REPLACE INTO

Hi,

Do you have anything to share about REPLACE INTO support?

I see MySQL and SQLite support it; Postgres uses a slightly different approach, as it needs INSERT INTO ... ON CONFLICT DO NOTHING.

I have many-to-many tables that could benefit from this.
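
Until there is first-class support, REPLACE INTO can be sent as raw SQL on MySQL/SQLite; a sketch, with the link table and columns as placeholders:

// Sketch: REPLACE INTO via raw SQL for a many-to-many link table.
_, err := sess.InsertBySql(
	"REPLACE INTO post_tags (post_id, tag_id) VALUES (?, ?)", postID, tagID,
).Exec()
if err != nil {
	log.Fatal(err)
}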

Select fails if database name and table name match

Spent a while going crazy trying to figure out why my simple test app wouldn't work. Finally figured out that if the database and the table have the same name, queries do not work.

func main() {
    conn, _ := dbr.Open("mysql", "root:root@tcp(127.0.0.1:3306)/clients", nil)

    sess := conn.NewSession(nil)

    var clients []Client

    _, err := sess.Select("*").From("clients").Load(&clients)

    if err != nil {
        panic(err)
    }
}

panic: Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'clients' at line 1

Case affects return value of `Id`

Case 1: Id is returned correctly

type User struct {
  Id uint64
}

Case 2: ID is returned as 0

type User struct {
  ID uint64
}
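
A workaround in the meantime is to make the column mapping explicit with a db struct tag, so the result no longer depends on how the field name is cased:

// Explicit mapping: the column name is taken from the tag, not the field name.
type User struct {
	ID uint64 `db:"id"`
}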

(Sorry for the brevity as I'm in a bit of a time crunch right now. I wanted to get this up there.)

Thanks!

(PS: I'll see if I can look into a fix later.)

Some tweaks to the package interface

Hello, thank you for this library. It was the first library that suited me (except some parts). I decided to fork this library and tweak it to fit my needs. I have just released it but it is a work in progress:
https://github.com/mibk/ql.

I forked it because I wanted to break the API of the dbr package. Now I am wondering whether some of the changes might interest you (especially the Query method and type, quoting of columns using [col_name], the All and One methods, and String methods for the builders). The README lists the main differences from your package; other changes can be read from the commit log. If you like any of it, I could prepare a pull request.

Ideally we would share the same goal and could simply merge it all, but that would probably have to be a dbr 2, since the BC break wouldn't be pleasant for current users.

dbr.InsertInto struct fields auto mapping

Some DB libraries automatically map struct field names to DB column names, so something like

type A struct {
  Id int
  Name string
  LongField string
}

will simply map to Columns("id", "name", "long_field").
This is much less verbose than manually adding the field names to Columns(), where changing the struct requires the calling code to be updated as well. A rough sketch of such a mapping follows below.
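
The sketch below shows how such a mapping could be derived with reflection; columnsOf and toSnake are hypothetical helpers, not part of dbr, and the db-tag fallback is an assumption about the setup.

// columnsOf is a hypothetical helper that derives column names from struct
// fields: it prefers a `db` tag and otherwise converts the field name to
// snake_case. Assumes imports "reflect", "strings", "unicode".
func columnsOf(v interface{}) []string {
	t := reflect.Indirect(reflect.ValueOf(v)).Type()
	cols := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		if tag := f.Tag.Get("db"); tag != "" {
			cols = append(cols, tag)
			continue
		}
		cols = append(cols, toSnake(f.Name))
	}
	return cols
}

func toSnake(s string) string {
	var b strings.Builder
	for i, r := range s {
		if unicode.IsUpper(r) {
			if i > 0 {
				b.WriteByte('_')
			}
			r = unicode.ToLower(r)
		}
		b.WriteRune(r)
	}
	return b.String()
}

// Usage sketch:
// a := A{Id: 1, Name: "x", LongField: "y"}
// sess.InsertInto("a").Columns(columnsOf(a)...).Record(a).Exec()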

support for ON DUPLICATE KEY UPDATE

Does dbr support ON DUPLICATE KEY UPDATE?
Or can I use UpdateBySql and pass an "INSERT INTO ... ON DUPLICATE KEY UPDATE ..." query to perform it?

multiple update on transaction?

Ideally it would be good to have dbr generate queries with the values injected in place of the '?' placeholders, so that the queries could be appended together and passed to tx.Tx.Exec(query) for batch updates.
Currently the generated query still retains the '?' placeholders, which is not very useful for this; see the sketch below.
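
If you need the fully interpolated SQL (values substituted for the '?'), the statement can be built into a buffer and run through InterpolateForDialect, the same pattern the UPSERT example further down uses; a sketch for a single update, assuming MySQL and the dialect subpackage:

// Sketch: build the statement, then interpolate the placeholders so the
// resulting string can be collected into a batch.
// Assumes import "github.com/gocraft/dbr/dialect".
buf := dbr.NewBuffer()
stmt := tx.Update("user").Set("name", "Taro").Where("id = ?", 1)
if err := stmt.Build(dialect.MySQL, buf); err != nil {
	log.Fatal(err)
}
query, err := dbr.InterpolateForDialect(buf.String(), buf.Value(), dialect.MySQL)
if err != nil {
	log.Fatal(err)
}
// query now contains literal values instead of '?' and can be appended to a batch.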

INSERT INTO Foreign Key

Hello,

I'm not sure if here is the right place to post question, but my question seems to be also a missing feature, so I'll add it here. Please tell me if this is not the right place.

I'd like to do an insert with a foreign key. Let's say my struct (and thus, SQL table) is like this:

type Post struct {
    Id        int64        `db:"id"`
    Title     string       `db:"title"`
    Content   string       `db:"content"`
    Author    *Contributor `db:"author"`
}

type Contributor struct {
    Id        int64     `db:"id"`
    Name      string    `db:"name"`
    Password  string    `db:"password"`
}

And I want to INSERT a new row into Post, which of course has a foreign-key field, Author. Here's how I currently do it, which fails, and I can't find anything about this in the readme:

// get title, content, contributor, etc

newPost := &models.Post{
    Title:   title,
    Content: content,
    Author:    contributor,
    Published: true,
    Created:   created,
}

_, err := c.Trx.InsertInto("posts").
    Columns("title", "content", "author", "published", "created").
    Record(newPost).Exec()

This gives me the error "trying to interpolate invalid value into query". I can't really find the root cause, but as far as I can tell it is because of the Author field, which holds a struct instead of, say, an int64.

I'd still like to keep a reference to the author object in my struct for other purposes. Is there any way I can do this right now, aside from changing the struct?

If there's no such feature right now, can I try to implement it and submit a pull-request?
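
One common workaround, assuming the posts table stores the author's id in a plain column, is to keep the object for application use but map only a scalar foreign-key field to the database; a sketch (the db:"-" skip tag and the field names are assumptions):

// Sketch: expose a scalar author-id field to dbr and keep the object
// out of the column mapping.
type Post struct {
	Id       int64        `db:"id"`
	Title    string       `db:"title"`
	Content  string       `db:"content"`
	AuthorId int64        `db:"author"`
	Author   *Contributor `db:"-"`
}

// ...
newPost := &Post{Title: title, Content: content, Author: contributor, AuthorId: contributor.Id}
_, err := c.Trx.InsertInto("posts").
	Columns("title", "content", "author").
	Record(newPost).Exec()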

Thanks before.

Allow custom row parsers.

Sometimes I want to run custom code for each row returned from the database. The following modification of load.go works for me: I can pass a RowLoader as an argument to SelectBuilder.Load().

type Scanner interface {
    Scan(dest ...interface{}) error
}

type RowLoader interface {
    LoadRow(Scanner) error
}

type ColumnsLoader interface {
    LoadColumns([]string) error
}

// Load loads any value from sql.Rows
func Load(rows *sql.Rows, value interface{}) (int, error) {
    defer rows.Close()

    column, err := rows.Columns()
    if err != nil {
        return 0, err
    }
    if l, ok := value.(ColumnsLoader); ok {
        err = l.LoadColumns(column)
        if err != nil {
            return 0, err
        }
    }
    if rl, ok := value.(RowLoader); ok {
        count := 0
        for rows.Next() {
            err := rl.LoadRow(rows)
            if err != nil {
                rows.Close()
                return 0, err
            }
            count++
        }
        return count, nil
    }
    ...

Support timezones other than UTC

Right now interpolate.go treats all time.Time values as UTC. I'd like dbr to honor the loc value from the DSN settings accordingly, to meet our needs.

Why quoteIdent automatically?

It prevents simple things like builder.LeftJoin("table AS t") from working. I need to join a table to itself and had to fork the repository just to remove the automatic quoteIdent. Is there a way I'm missing?

Null type Marshallers only work with Ptr

I submit this issue knowing that 2.0 is going to encourage people to implement their own Null type handling, but there was mention that there would be a subpackage to start from, so this issue should potentially be addressed in 2.0.

The MarshalJSON implementations for all the Nullable types work only with a pointer receiver. See here: https://github.com/gocraft/dbr/blob/master/types.go#L44

If you create a struct like this:

type User struct {
  Name string
  Company NullString
}

Marshalling to JSON will result in something like this:

{
  "name": "Some Name",
  "company": {"String": "Widget Corp.", "Valid": true}
}

If Company *NullString were used, everything works as expected, but initializing a model becomes considerably more complex. The example code for dbr doesn't even do that; it uses all the Null types as plain value structs.

The fix is simple: change the pointer receivers on those implementations to value receivers.

func (n NullString) MarshalJSON() ([]byte, error) vs func (n *NullString) MarshalJSON() ([]byte, error)

I can submit a PR that fixes the above.

As far as I can tell, this does not break anything in the tests, but anyone who had a problem with the dbr.Null types may have implemented their own MarshalJSON with different behaviour; of course, they are overriding it, so that would continue to work.

Practice: MySQL UPSERT

Hi, could you suggest a better practice for executing a MySQL UPSERT?

my approach:

func sample() {
    conn, err := dbr.Open("mysql", USER+":"+PASSWORD+"@tcp("+HOST+":"+PORT+")/"+DB, nil)
    if err != nil {
        fmt.Println(err)
    }
    sess := conn.NewSession(nil)

    buf := dbr.NewBuffer()

    type Sample struct {
        ID      string // PRIMARY KEY
        Name    string
        Address string
    }
    sample := Sample{"g", "Taro", "Tokyo"}

    stmt := sess.InsertInto("test_upsert").Columns("id", "name", "address").Record(sample)
    stmt.Build(dialect.MySQL, buf)

    stmt2 := sess.UpdateBySql(" ON DUPLICATE KEY UPDATE name = ?, address = ?", sample.Name, sample.Address) // of course it is not necessary to use UpdateBySql method.
    stmt2.Build(dialect.MySQL, buf)

    query, interpolateErr := dbr.InterpolateForDialect(buf.String(), buf.Value(), dialect.MySQL)
    if interpolateErr != nil {
        fmt.Println(interpolateErr)
    } else {
        fmt.Println(query) // INSERT INTO `test_upsert` (`id`,`name`,`address`) VALUES ('g','Taro','Tokyo') ON DUPLICATE KEY UPDATE name = 'Taro', address = 'Tokyo'

    }

    if result, insertErr := sess.InsertBySql(query).Exec(); insertErr != nil {
        fmt.Println(insertErr)
    } else {
        fmt.Println(result.RowsAffected())
    }
}

Convert public structs into Interfaces

I believe this could be a breaking change, but it comes with some great perks.

First one is testability. Interfaces let us create a mock subpackage that people can use to mock out all the query types in dbr. Right now we use github.com/DATA-DOG/go-sqlmock but building testing straight into dbr would make everything so much happier.

Mocking an empty dbr Connection or Session also becomes more portable in other packages' tests.

If you happen to be using the new GO15VENDOREXPERIMENT you will potentially encounter an issue with passing dbr structs between two different packages: Go identifies struct types by their full package path, while interfaces are satisfied by implementation. We hit this recently using glide and worked around it with the flatten feature it now has.

Interfaces are the "golang" way for publicly exposed functionality. Public structs often implement an interface in stdlib. It allows people to compose how they use dbr.Session in a way that matches how they decompose their problems.

Enough rambling from me. Thoughts? If this piques any interest, we could prepare a PR with the changes for the next major release.

dbr recommendation

I would just like to give a shout-out to this library; I'm genuinely puzzled that more people aren't using it.

Most Go ORM solutions do not support batch inserts except through raw SQL queries.

for-loop insert of 7 rows of data: 1.28 s*
batch insert using the dbr insert builder: 128 ms*

Awesome work, guys!

  • * The DB is in a GCE instance within the same city, called from localhost.
