A blazing-fast, local-first application for querying CSV, JSON, Parquet, and SQLite files instantly. Zero cloud dependencies, zero setup—just drop your files and run SQL.
- Instant Queries — DuckDB's columnar engine handles millions of rows in milliseconds
- Multi-Format Support — CSV, JSON, JSONL, Parquet, SQLite
- 100% Local — No data ever leaves your machine
- Virtualized Grid — Smoothly scroll through millions of rows
- Resizable Columns — Drag column headers to adjust width
- Monaco Editor — Full SQL editor with syntax highlighting
- Export to CSV — Export query results directly to file
- Dark Mode — Easy on the eyes for long analysis sessions
```mermaid
graph TB
    subgraph Frontend["Frontend (React + TypeScript)"]
        Monaco["Monaco Editor<br/>SQL Input"]
        Grid["Virtualized Grid<br/>Resizable Columns"]
        Store["Zustand Store<br/>State Management"]
    end
    subgraph Bridge["Tauri IPC"]
        Invoke["invoke() API"]
    end
    subgraph Backend["Backend (Rust)"]
        Commands["Tauri Commands"]
        DBManager["DatabaseManager<br/>Mutex&lt;Connection&gt;"]
    end
    subgraph DuckDB["DuckDB Engine"]
        CSV["read_csv_auto()"]
        JSON["read_json_auto()"]
        Parquet["read_parquet()"]
        SQLite["ATTACH SQLite"]
    end
    subgraph Files["Local Files"]
        F1["*.csv"]
        F2["*.json"]
        F3["*.parquet"]
        F4["*.sqlite"]
    end
    Monaco --> Store
    Store --> Invoke
    Grid --> Store
    Invoke --> Commands
    Commands --> DBManager
    DBManager --> DuckDB
    CSV --> F1
    JSON --> F2
    Parquet --> F3
    SQLite --> F4
```
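The frontend reaches the Rust backend through Tauri's `invoke()` API. A minimal sketch of the payload the store would pass to the `execute_query` command (the parameter names `query`, `limit`, and `offset` come from the command reference; the `QueryPayload` type and `buildQueryPayload` helper are illustrative assumptions):

```typescript
// Shape of the arguments sent over Tauri IPC for execute_query.
// Field names match the command reference; the type is illustrative.
type QueryPayload = { query: string; limit: number; offset: number };

// Build an offset-paginated payload: page 0 starts at row 0.
function buildQueryPayload(
  query: string,
  page: number,
  pageSize = 500,
): QueryPayload {
  return { query, limit: pageSize, offset: page * pageSize };
}

// In the app this would be passed to Tauri's invoke(), e.g.:
//   const result = await invoke("execute_query", buildQueryPayload(sql, page));
const payload = buildQueryPayload("SELECT * FROM my_file", 2);
console.log(payload); // { query: "SELECT * FROM my_file", limit: 500, offset: 1000 }
```

Offset-based pagination keeps the IPC payload small: the grid requests only the window of rows it is about to display.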
| Component | Choice | Reason |
|---|---|---|
| Database | DuckDB | Columnar storage, zero-copy file reading, analytical SQL optimized |
| Backend | Rust + Tauri | Memory safety, native performance, small binary size |
| Grid | @tanstack/react-virtual | Only renders visible rows—O(1) instead of O(n) |
| Editor | Monaco | VSCode-quality editing, SQL syntax highlighting |
| State | Zustand | Minimal boilerplate, TypeScript-first |
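Virtualization is what keeps the grid smooth at millions of rows: render cost scales with the viewport, not the dataset. A sketch of the window computation (function and parameter names are hypothetical, not @tanstack/react-virtual's actual API):

```typescript
// Compute which rows intersect the viewport, plus a small overscan
// buffer above and below to avoid blank rows while scrolling.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  overscan = 5,
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const end = Math.min(
    totalRows,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan,
  );
  return { start, end };
}

// A million rows, but only ~30 are mounted at any moment.
const r = visibleRange(35_000, 600, 32, 1_000_000);
console.log(r.end - r.start); // 30 — independent of totalRows
```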
- Rust (latest stable)
- Node.js v18+
- Platform-specific dependencies:
  - Windows: Visual Studio Build Tools (C++ workload)
  - macOS: Xcode Command Line Tools
  - Linux: `build-essential`, `libgtk-3-dev`, `libwebkit2gtk-4.1-dev`
```bash
# Clone the repository
git clone https://github.com/your-username/data-playground.git
cd data-playground

# Install dependencies
npm install

# Start development server
npm run tauri dev

# Build optimized release
npm run tauri build
```

The installer will be created in `src-tauri/target/release/bundle/`.
- Drag & Drop: Drop any supported file onto the drop zone.
- Click to Browse: Click the drop zone to open the native file picker (recommended for large files or network drives).
Supported formats:
- CSV — Auto-detected delimiters and headers
- JSON — Supports nested objects (flattened)
- JSONL/NDJSON — Line-delimited JSON
- Parquet — Direct columnar reading
- SQLite — All tables imported as views
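Per the architecture diagram, each format maps to a DuckDB table function (SQLite instead goes through `ATTACH`). A sketch of that dispatch (the `readerFor` helper is illustrative, not part of the app's Rust backend):

```typescript
// Map a file extension to the DuckDB table function that reads it.
// read_csv_auto / read_json_auto / read_parquet are real DuckDB
// functions; this helper itself is an assumption about the internals.
function readerFor(path: string): string {
  const ext = path.split(".").pop()?.toLowerCase() ?? "";
  switch (ext) {
    case "csv":
      return `read_csv_auto('${path}')`;
    case "json":
    case "jsonl":
    case "ndjson":
      return `read_json_auto('${path}')`;
    case "parquet":
      return `read_parquet('${path}')`;
    // SQLite files are not read via a table function; the backend
    // ATTACHes them and exposes their tables as views.
    default:
      throw new Error(`Unsupported extension: ${ext}`);
  }
}

console.log(readerFor("sales.parquet")); // read_parquet('sales.parquet')
```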
Use the Monaco-powered SQL editor to write queries:
```sql
-- Query your data
SELECT * FROM my_file WHERE column > 100 ORDER BY date DESC;

-- Join multiple files
SELECT a.*, b.value
FROM file1 a
JOIN file2 b ON a.id = b.id;

-- Aggregate analysis
SELECT category, COUNT(*), AVG(price)
FROM sales
GROUP BY category;
```

Keyboard shortcuts:
- `Ctrl+Enter` / `Cmd+Enter` — Execute query
Click "Export CSV" to save query results to a file. The export uses DuckDB's streaming COPY command for maximum efficiency.
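The export wraps the current query in DuckDB's `COPY (...) TO` statement, which streams results straight to disk. A hedged sketch of building that statement (the `exportStatement` helper is an assumption about the app's internals; the `COPY` syntax itself is real DuckDB SQL):

```typescript
// Build a DuckDB COPY statement that streams a query's results to CSV.
// COPY (…) TO '…' (FORMAT CSV, HEADER) is standard DuckDB syntax.
function exportStatement(query: string, filePath: string): string {
  // Strip a trailing semicolon so the query nests inside COPY (...).
  const inner = query.trim().replace(/;$/, "");
  return `COPY (${inner}) TO '${filePath}' (FORMAT CSV, HEADER)`;
}

console.log(exportStatement("SELECT * FROM sales;", "/tmp/out.csv"));
// COPY (SELECT * FROM sales) TO '/tmp/out.csv' (FORMAT CSV, HEADER)
```

Because DuckDB writes the file itself, result rows never have to cross the IPC boundary into the frontend.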
```
data-playground/
├── src/                  # React frontend
│   ├── components/       # UI components
│   ├── store.ts          # Zustand state
│   └── App.tsx           # Main layout
├── src-tauri/            # Rust backend
│   ├── src/
│   │   ├── database.rs   # DuckDB integration
│   │   ├── error.rs      # Error handling
│   │   └── lib.rs        # Tauri commands
│   └── Cargo.toml        # Rust deps
├── package.json
└── vite.config.ts
```
| Command | Parameters | Returns |
|---|---|---|
| `execute_query` | `query`, `limit`, `offset` | Paginated results + metadata |
| `load_file` | `file_path`, `table_name` | Table schema |
| `get_tables` | — | List of loaded tables |
| `export_to_csv` | `query`, `file_path` | Row count |
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
MIT License — see LICENSE for details.