Go favors simplicity, and it's pretty common to work with a database via its driver directly, without any ORM. That approach provides great control and efficiency in your queries, but there is a problem: you need to manually iterate over database rows and scan data from all columns into the corresponding destinations. This can be error-prone, verbose and just tedious. scany aims to solve this problem. It allows developers to scan complex data from a database into Go structs and other composite types with just one function call, without bothering with rows iteration.
scany isn't limited to any specific database. It integrates with database/sql, so any database with a database/sql driver is supported.
It also works with the pgx library native interface.
Apart from the out-of-the-box support, scany can be easily extended to work with almost any database library.
Note that scany isn't an ORM. First of all, it works only in one direction: it scans data into Go objects from the database, but it can't build database queries based on those objects. Secondly, it doesn't know anything about relations between objects, e.g. one to many or many to many.
- Custom database column names via struct tags
- Reusing structs via nesting or embedding
- Support for NULLs and custom types
- Omitted struct fields
- Support for maps and Go primitive types as the destination, apart from structs
```shell
go get github.com/georgysavva/scany
```
```go
package main

import (
	"context"
	"database/sql"

	_ "github.com/lib/pq" // registers the "postgres" driver for database/sql

	"github.com/georgysavva/scany/sqlscan"
)

type User struct {
	ID    string
	Name  string
	Email string
	Age   int
}

func main() {
	ctx := context.Background()
	db, _ := sql.Open("postgres", "example-connection-url")

	var users []*User
	sqlscan.Select(ctx, db, &users, `SELECT id, name, email, age FROM users`)
	// users variable now contains data from all rows.
}
```
Use the sqlscan package to work with the database/sql standard library.
```go
package main

import (
	"context"

	"github.com/georgysavva/scany/pgxscan"
	"github.com/jackc/pgx/v4/pgxpool"
)

type User struct {
	ID    string
	Name  string
	Email string
	Age   int
}

func main() {
	ctx := context.Background()
	db, _ := pgxpool.Connect(ctx, "example-connection-url")

	var users []*User
	pgxscan.Select(ctx, db, &users, `SELECT id, name, email, age FROM users`)
	// users variable now contains data from all rows.
}
```
Use the pgxscan package to work with the pgx library native interface.
Use the dbscan package to work with an abstract database; it can be integrated with any library that has a concept of rows. This package implements core scany features and contains all the logic. Both sqlscan and pgxscan use dbscan internally.
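To give a feel for what "a concept of rows" means, here is a hypothetical, heavily reduced sketch of the kind of interface such an integration has to satisfy, with an in-memory fake standing in for a real driver. The `Rows` interface and `scanStruct` function below are illustrative only; dbscan's real interface also covers errors, closing, and much richer mapping.

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// Rows is a minimal abstraction over a database result set: column
// names, iteration, and scanning the current row.
type Rows interface {
	Columns() ([]string, error)
	Next() bool
	Scan(dest ...interface{}) error
}

type User struct {
	ID   string
	Name string
}

// scanStruct fills dst (a pointer to a struct) from the current row by
// matching column names against field names, ignoring case and "_".
func scanStruct(rows Rows, dst interface{}) error {
	cols, err := rows.Columns()
	if err != nil {
		return err
	}
	v := reflect.ValueOf(dst).Elem()
	t := v.Type()
	dests := make([]interface{}, len(cols))
	for i, col := range cols {
		found := false
		for j := 0; j < t.NumField(); j++ {
			if strings.EqualFold(strings.ReplaceAll(col, "_", ""), t.Field(j).Name) {
				dests[i] = v.Field(j).Addr().Interface()
				found = true
				break
			}
		}
		if !found {
			return fmt.Errorf("no field for column %q", col)
		}
	}
	return rows.Scan(dests...)
}

// fakeRows serves in-memory data through the Rows interface.
type fakeRows struct {
	cols []string
	data [][]interface{}
	pos  int
}

func (r *fakeRows) Columns() ([]string, error) { return r.cols, nil }
func (r *fakeRows) Next() bool                 { r.pos++; return r.pos <= len(r.data) }
func (r *fakeRows) Scan(dest ...interface{}) error {
	for i, d := range dest {
		reflect.ValueOf(d).Elem().Set(reflect.ValueOf(r.data[r.pos-1][i]))
	}
	return nil
}

func main() {
	rows := &fakeRows{
		cols: []string{"id", "name"},
		data: [][]interface{}{{"1", "Alice"}, {"2", "Bob"}},
	}
	var users []User
	for rows.Next() {
		var u User
		if err := scanStruct(rows, &u); err != nil {
			panic(err)
		}
		users = append(users, u)
	}
	fmt.Println(users)
}
```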
Comparison with sqlx
- sqlx only works with the database/sql standard library. scany isn't limited to database/sql; it also supports the pgx native interface and can be extended to work with any database library independent of database/sql
- In terms of scanning and mapping abilities, scany provides all features of sqlx
- scany has a simpler API and much fewer concepts, so it's easier to start working with
scany supports Go 1.13 and higher.
- Add the ability to set a custom function to translate a struct field name to a column name, instead of the default snake-case translation
- Allow using a custom separator for embedded struct prefixes, instead of the default "."
The easiest way to run the tests is:
```shell
go test ./...
```
scany runs a CockroachDB server to execute its tests. It will download, cache and run the CockroachDB binary for you. It's very convenient since the only requirement to run the tests is an internet connection. Alternatively, you can download the CockroachDB binary yourself and pass the path to the binary into tests:
```shell
go test ./... -cockroach-binary cockroach
```
This project uses golangci-lint v1.38.0.
To run the linter locally do the following:
- Install the golangci-lint program
- In the project root, run `golangci-lint run`
Every feature request or question is appreciated. Don't hesitate. Just post an issue or PR.
This project is licensed under the terms of the MIT license.