Commit

Start writing a good readme.

georgysavva committed Jul 7, 2020
1 parent 1b27028 commit 781c17d
Showing 6 changed files with 113 additions and 101 deletions.
129 changes: 76 additions & 53 deletions README.md
@@ -2,100 +2,123 @@

[![GoDoc](https://img.shields.io/badge/pkg.go.dev-doc-blue)](http://pkg.go.dev/github.com/georgysavva/scany)
[![Build Status](https://travis-ci.com/georgysavva/scany.svg?branch=master)](https://travis-ci.com/georgysavva/scany)
[![Go Report Card](https://goreportcard.com/badge/github.com/georgysavva/scany)](https://goreportcard.com/report/github.com/georgysavva/scany)
[![codecov](https://codecov.io/gh/georgysavva/scany/branch/master/graph/badge.svg)](https://codecov.io/gh/georgysavva/scany)

Library for scanning data from a database into Go structs and more.

## Overview

Go favors simplicity, and it's pretty common to work with a database via driver directly without any ORM.
It provides great control and efficiency in your queries, but there is a problem:
you need to manually iterate over database rows and scan data from all columns into a corresponding destination.
This can be error-prone, verbose, and tedious.

Scany aims to solve this problem:
it allows developers to scan complex data from a database into Go structs and other composite types
with just one function call, without bothering with rows iteration.

Scany isn't limited to any specific database. It integrates with standard `database/sql` library,
so any database with `database/sql` driver is supported.
It also works with [pgx](https://github.com/jackc/pgx), a library specific to PostgreSQL.
Apart from supporting `database/sql` and `pgx` out of the box,
Scany can be easily extended to work with any database library.

## Install

```
go get github.com/georgysavva/scany
```

## How to use with `database/sql`

```go
package main

import (
    "context"
    "database/sql"

    "github.com/georgysavva/scany/sqlscan"
)

type User struct {
    UserID string
    Name   string
    Email  string
    Age    int
}

func main() {
    ctx := context.Background()
    db, _ := sql.Open("postgres", "example-connection-url")

    var users []*User
    sqlscan.QueryAll(ctx, &users, db, `SELECT user_id, name, email, age FROM users`)
    // users variable now contains data from all rows.
}
```

Use [`sqlscan`](https://pkg.go.dev/github.com/georgysavva/scany/sqlscan)
package to work with `database/sql` standard library.

## How to use with `pgx`

```go
package main

import (
    "context"

    "github.com/jackc/pgx/v4/pgxpool"

    "github.com/georgysavva/scany/pgxscan"
)

type User struct {
    UserID string
    Name   string
    Email  string
    Age    int
}

func main() {
    ctx := context.Background()
    db, _ := pgxpool.Connect(ctx, "example-connection-url")

    var users []*User
    pgxscan.QueryAll(ctx, &users, db, `SELECT user_id, name, email, age FROM users`)
    // users variable now contains data from all rows.
}
```

Use [`pgxscan`](https://pkg.go.dev/github.com/georgysavva/scany/pgxscan)
package to work with `pgx` library.

## How to use with other database libraries

Use [`dbscan`](https://pkg.go.dev/github.com/georgysavva/scany/dbscan) package that works with an abstract database,
and can be integrated with any library.
This particular package implements core scany features and contains all the logic.
Both `sqlscan` and `pgxscan` use `dbscan` internally.
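To get a feel for what such an integration involves, here is a toy, stdlib-only sketch. It declares a `Rows` interface with the same method set as `dbscan.Rows` (reproduced here for illustration; see the dbscan docs for the authoritative definition) and a trivial in-memory implementation of the kind an adapter for a new database library would provide.

```go
package main

import "fmt"

// Rows mirrors the shape of dbscan's Rows interface, reproduced here
// for illustration only.
type Rows interface {
    Close() error
    Err() error
    Next() bool
    Columns() ([]string, error)
    Scan(dest ...interface{}) error
}

// sliceRows is a toy in-memory implementation backed by [][]string --
// the kind of adapter you would write to hook a new database library
// into dbscan.
type sliceRows struct {
    cols []string
    data [][]string
    pos  int
}

func (r *sliceRows) Close() error               { return nil }
func (r *sliceRows) Err() error                 { return nil }
func (r *sliceRows) Columns() ([]string, error) { return r.cols, nil }

// Next advances the cursor and reports whether a row is available.
func (r *sliceRows) Next() bool {
    r.pos++
    return r.pos <= len(r.data)
}

// Scan copies the current row's column values into the destinations.
func (r *sliceRows) Scan(dest ...interface{}) error {
    row := r.data[r.pos-1]
    if len(dest) != len(row) {
        return fmt.Errorf("expected %d destinations, got %d", len(row), len(dest))
    }
    for i, d := range dest {
        *d.(*string) = row[i]
    }
    return nil
}

func main() {
    var rows Rows = &sliceRows{
        cols: []string{"user_id", "name"},
        data: [][]string{{"1", "alice"}, {"2", "bob"}},
    }
    for rows.Next() {
        var id, name string
        if err := rows.Scan(&id, &name); err != nil {
            panic(err)
        }
        fmt.Println(id, name)
    }
}
```

Once a library's rows are wrapped in a type satisfying this interface, all of dbscan's scanning machinery works on top of it.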

## Supported Go versions

Scany supports Go 1.13 and higher.

## Tests

The only thing you need to run tests locally is an internet connection;
it's required to download and cache the database binary.
Just type `go test ./...` inside the scany root directory and let the code do the rest.

## Contributing

Every feature request or question is really appreciated. Don't hesitate, just post an issue or PR.

## License

38 changes: 16 additions & 22 deletions dbscan/doc.go
@@ -1,16 +1,7 @@
// Package dbscan allows scanning data from abstract database rows into Go structs and more.
/*
dbscan works with abstract Rows and doesn't depend on any specific database or library.
If a type implements the Rows interface, it can leverage the full functionality of this package.
Scanning into struct
@@ -22,10 +13,11 @@ The main feature of dbscan is the ability to scan row data into a struct.
Email string
}
// Query rows from the database that implement dbscan.Rows interface.
var rows dbscan.Rows
var users []*User
dbscan.ScanAll(&users, rows)
// users variable now contains data from all rows.
By default, to get the corresponding column, dbscan translates the field name to snake case.
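The snake-case translation can be sketched roughly like this (a simplified, stdlib-only illustration; the real implementation lives inside the dbscan package and may cover more edge cases):

```go
package main

import (
    "fmt"
    "unicode"
)

// toSnakeCase is a simplified sketch of dbscan's default field-name
// translation: an underscore is inserted before an upper-case rune that
// starts a new word, and everything is lower-cased. Acronym runs like
// "ID" are kept together.
func toSnakeCase(s string) string {
    var out []rune
    runes := []rune(s)
    for i, r := range runes {
        if unicode.IsUpper(r) {
            // A new word starts here if the previous rune is lower-case,
            // or if the next rune is lower-case (end of an acronym run).
            if i > 0 && (unicode.IsLower(runes[i-1]) ||
                (i+1 < len(runes) && unicode.IsLower(runes[i+1]))) {
                out = append(out, '_')
            }
            r = unicode.ToLower(r)
        }
        out = append(out, r)
    }
    return string(out)
}

func main() {
    fmt.Println(toSnakeCase("FirstName")) // first_name
    fmt.Println(toSnakeCase("Email"))     // email
    fmt.Println(toSnakeCase("UserID"))    // user_id
}
```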
@@ -104,26 +96,28 @@ Scanning into map
Apart from scanning into structs, dbscan can handle maps;
in that case, it uses the column name as the map key and the column data as the map value, for example:
// Query rows from the database that implement dbscan.Rows interface.
var rows dbscan.Rows
var results []map[string]interface{}
dbscan.ScanAll(&results, rows)
// results variable now contains data from all rows.
The map type isn't limited to map[string]interface{};
it can be any map with a string key, e.g. map[string]string or map[string]int,
if all column values have the same specific type.
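Conceptually, the map case boils down to pairing each column name with the value scanned from that column. A minimal stdlib-only sketch (the function name `scanRowToMap` is illustrative, not dbscan's actual API):

```go
package main

import "fmt"

// scanRowToMap sketches what dbscan does for map destinations: pair
// each column name with the value scanned from that column.
func scanRowToMap(cols []string, vals []interface{}) (map[string]interface{}, error) {
    if len(cols) != len(vals) {
        return nil, fmt.Errorf("got %d columns but %d values", len(cols), len(vals))
    }
    m := make(map[string]interface{}, len(cols))
    for i, c := range cols {
        m[c] = vals[i]
    }
    return m, nil
}

func main() {
    m, _ := scanRowToMap(
        []string{"user_id", "age"},
        []interface{}{"bob", 25},
    )
    fmt.Println(m["user_id"], m["age"]) // bob 25
}
```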
Scanning into other types
If the destination is neither a struct nor a map, dbscan handles it as a single-column scan:
dbscan ensures that rows contain exactly one column and scans the destination from that column, for example:
// Query rows from the database that implement dbscan.Rows interface.
var rows dbscan.Rows
var results []string
dbscan.ScanAll(&results, rows)
// results variable now contains data from the single column of all rows.
Duplicate columns
20 changes: 8 additions & 12 deletions doc.go
@@ -1,17 +1,13 @@
// Package scany is a set of packages for scanning data from a database into Go structs and more.
/*
scany contains the following packages:
sqlscan package works with database/sql standard library.
pgxscan package works with github.com/jackc/pgx/v4 library.
dbscan package works with an abstract database, and can be integrated with any library.
This particular package implements core scany features and contains all the logic.
Both sqlscan and pgxscan use dbscan internally.
*/
package scany
6 changes: 3 additions & 3 deletions pgxscan/doc.go
@@ -1,7 +1,7 @@
// Package pgxscan allows scanning data into Go structs and other composite types,
// when working with pgx library.
/*
Essentially, pgxscan is a wrapper around github.com/georgysavva/scany/dbscan package.
It contains adapters and proxy functions that are meant to connect github.com/jackc/pgx/v4
with dbscan functionality. pgxscan mirrors all capabilities provided by dbscan.
See dbscan docs to get familiar with all concepts and features.
9 changes: 4 additions & 5 deletions sqlscan/doc.go
@@ -1,8 +1,7 @@
// Package sqlscan allows scanning data into Go structs and other composite types,
// when working with database/sql library.
/*
Essentially, sqlscan is a wrapper around github.com/georgysavva/scany/dbscan package.
It contains adapters and proxy functions that are meant to connect database/sql
with dbscan functionality. sqlscan mirrors all capabilities provided by dbscan.
See dbscan docs to get familiar with all concepts and features.
@@ -19,7 +18,7 @@ it's as simple as this:
Age int
}
db, _ := sql.Open("postgres", "example-connection-url")
// Use QueryAll to query multiple records.
var users []*User
12 changes: 6 additions & 6 deletions sqlscan/example_test.go
@@ -14,7 +14,7 @@ func ExampleQueryAll() {
Age int
}

db, _ := sql.Open("postgres", "example-connection-url")

var users []*User
if err := sqlscan.QueryAll(
@@ -33,7 +33,7 @@ func ExampleQueryOne() {
Age int
}

db, _ := sql.Open("postgres", "example-connection-url")

var user User
if err := sqlscan.QueryOne(
@@ -53,7 +53,7 @@ func ExampleScanAll() {
}

// Query *sql.Rows from the database.
db, _ := sql.Open("postgres", "example-connection-url")
rows, _ := db.Query(`SELECT user_id, name, email, age FROM users`)

var users []*User
@@ -72,7 +72,7 @@ func ExampleScanOne() {
}

// Query *sql.Rows from the database.
db, _ := sql.Open("postgres", "example-connection-url")
rows, _ := db.Query(`SELECT user_id, name, email, age FROM users WHERE id='bob'`)

var user User
@@ -91,7 +91,7 @@ func ExampleRowScanner() {
}

// Query *sql.Rows from the database.
db, _ := sql.Open("postgres", "example-connection-url")
rows, _ := db.Query(`SELECT user_id, name, email, age FROM users`)

// Make sure rows are always closed.
@@ -118,7 +118,7 @@ func ExampleScanRow() {
}

// Query *sql.Rows from the database.
db, _ := sql.Open("postgres", "example-connection-url")
rows, _ := db.Query(`SELECT user_id, name, email, age FROM users`)

// Make sure rows are always closed.
