A powerful command-line tool to export SQL database tables to CSV files. Supports MySQL, PostgreSQL, and SQLite databases with an interactive interface and concurrent exports.
- 🗃️ Support for multiple SQL database types:
  - MySQL/MariaDB
  - PostgreSQL
  - SQLite
- 🔌 Multiple connection methods:
  - Interactive connection details input
  - Connection string/URL support
  - SQL dump file import
- 📊 Interactive table selection with row count display
- ⚡ Concurrent export of multiple tables
- 🚀 Efficient handling of large tables through batch processing
- 🛠️ User-friendly command-line interface
- 🔒 Secure password handling
```bash
go install github.com/gossterrible/sql2csv/cmd/sql2csv@latest
```
Download the latest binary for your platform from the releases page.
```bash
# Clone the repository
git clone https://github.com/gossterrible/sql2csv.git
cd sql2csv

# Install dependencies
go mod download

# Build the binary
go build ./cmd/sql2csv

# Run tests
go test ./...
```
Simply run the tool and follow the interactive prompts:
```bash
sql2csv
```
```
$ sql2csv
? Select connection type: Direct Connection
? Select database type: postgres
? Enter database host: localhost
? Enter database port: 5432
? Enter database user: myuser
? Enter database password: ****
? Enter database name: mydb
```
```
$ sql2csv
? Select connection type: Connection String
? Select database type: mysql
? Enter connection string: myuser:mypass@tcp(localhost:3306)/mydb
```
```
$ sql2csv
? Select connection type: SQL Dump File
? Enter SQL dump file path: ./dump.sql
? Select original database type: postgres
```
```
output_directory/
├── users.csv
├── products.csv
└── orders.csv
```
- Supports all MySQL data types
- Default port: 3306
- Connection string format: `user:password@tcp(host:port)/dbname`
- Required permissions: SELECT on target tables
- Supports all PostgreSQL data types
- Default port: 5432
- Connection string format: `postgresql://user:password@host:port/dbname`
- SSL mode disabled by default
- Required permissions: SELECT on target tables and schema information
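Given the prompt values from the examples above, the two connection-string formats can be assembled programmatically. A minimal Go sketch — `mysqlDSN` and `postgresURL` are illustrative helper names, not part of sql2csv's API:

```go
package main

import (
	"fmt"
	"net/url"
)

// mysqlDSN builds a go-sql-driver/mysql DSN: user:password@tcp(host:port)/dbname.
func mysqlDSN(user, password, host string, port int, db string) string {
	return fmt.Sprintf("%s:%s@tcp(%s:%d)/%s", user, password, host, port, db)
}

// postgresURL builds a lib/pq connection URL with SSL disabled,
// matching the default noted above.
func postgresURL(user, password, host string, port int, db string) string {
	u := url.URL{
		Scheme:   "postgresql",
		User:     url.UserPassword(user, password),
		Host:     fmt.Sprintf("%s:%d", host, port),
		Path:     "/" + db,
		RawQuery: "sslmode=disable",
	}
	return u.String()
}

func main() {
	fmt.Println(mysqlDSN("myuser", "mypass", "localhost", 3306, "mydb"))
	fmt.Println(postgresURL("myuser", "mypass", "localhost", 5432, "mydb"))
}
```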
- Supports all SQLite data types
- Connection format: Path to database file
- No additional server setup needed
- Read permissions required on the database file
The tool implements several optimizations for handling large datasets:
- Batch processing to minimize memory usage
- Concurrent table exports using goroutines
- Efficient CSV writing with buffering
- Connection pooling for better resource utilization
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- Go 1.16 or later
- CGO enabled (required for SQLite support)
- Access to test databases (MySQL, PostgreSQL, SQLite) for integration testing
This project is licensed under the MIT License - see the LICENSE file for details.
- go-sql-driver/mysql - MySQL driver
- lib/pq - PostgreSQL driver
- mattn/go-sqlite3 - SQLite driver
- survey - Interactive prompts