A self-hostable visual workflow automation platform for data processing, analytics, and integration tasks.
Dxsh is a microservices-based platform that enables users to create, execute, and monitor data processing workflows through an intuitive visual interface. The system supports various data sources, processing nodes, and output destinations, making it suitable for ETL operations, data analytics, and automation tasks.
Build complex data pipelines with an intuitive drag-and-drop interface. Connect nodes, configure parameters, and execute workflows in real-time.
Create customizable dashboards with real-time data visualization. Share dashboards publicly or embed them in your applications.
Generate secure embed tokens to share dashboards and widgets externally with fine-grained access control.
The platform consists of five core microservices:
- Workflow Engine (Port 8000)
  - Core workflow execution engine
  - Handles workflow orchestration and node execution
  - Manages workflow state and execution history
- API Gateway (Port 8001)
  - Central routing service for all API requests
  - Handles authentication and request forwarding
  - Provides unified API interface
- Dashboard Service (Port 8002)
  - Manages dashboard configurations and widgets
  - Handles dashboard data aggregation
  - Provides embed token management
- Workflow Frontend (Port 3000)
  - React-based workflow builder interface
  - Visual workflow design and configuration
  - Real-time workflow execution monitoring
- Dashboard Frontend (Port 3001)
  - React-based dashboard visualization interface
  - Widget management and configuration
  - Embeddable dashboard and widget views
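A quick way to confirm that all five services are reachable after startup is to poll the documented ports (a minimal sketch; it only checks that each service responds and assumes nothing about dedicated health endpoints):

```python
import requests

SERVICES = {
    "Workflow Engine": "http://localhost:8000",
    "API Gateway": "http://localhost:8001",
    "Dashboard Service": "http://localhost:8002",
    "Workflow Frontend": "http://localhost:3000",
    "Dashboard Frontend": "http://localhost:3001",
}

for name, url in SERVICES.items():
    try:
        # Any HTTP response (even a 404) means the service is listening.
        status = requests.get(url, timeout=5).status_code
        print(f"{name}: HTTP {status}")
    except requests.RequestException as exc:
        print(f"{name}: unreachable ({exc})")
```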
- Visual drag-and-drop workflow design
- Extensive library of processing nodes
- Real-time execution monitoring
- Parameter configuration and validation
- Workflow templates and versioning
- Input Nodes: HTTP requests, database queries, file uploads, web scraping
- Processing Nodes: Data transformation, filtering, aggregation, calculations
- ML/AI Nodes: GPT integration, text analysis, predictions
- Output Nodes: Database storage, file exports, API calls, notifications
- Customizable dashboard layouts
- Multiple widget types (charts, tables, metrics, text)
- Real-time data updates
- Embeddable dashboards and widgets
- Token-based public sharing
- JWT-based authentication
- Role-based access control
- API rate limiting
- Token-based embed authentication
- CORS configuration for embedding
- Docker and Docker Compose
- Node.js 18+ (for local development)
- Python 3.9+ (for local development)
- PostgreSQL 13+ (or use Docker container)
- Clone the repository:

  ```bash
  git clone https://github.com/markm39/dxsh.git
  cd dxsh
  ```

- Run the development script:

  ```bash
  ./start-dev.sh
  ```

This will start all services locally. Access the applications at:
- Workflow Builder: http://localhost:3000
- Dashboard Interface: http://localhost:3001
- API Documentation: http://localhost:8001/docs
Each service requires specific environment variables. Key variables include:
- DATABASE_URL: PostgreSQL connection string
- JWT_SECRET: Secret key for JWT token generation
- CORS_ORIGINS: Allowed origins for CORS
- API_BASE_URL: Base URL for API services
- FRONTEND_URL: URL for frontend services
See individual service .env.example files for complete configuration options.
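As a rough illustration of how a service consumes these variables (the fallback values below are placeholders, not Dxsh defaults):

```python
import os

DATABASE_URL = os.environ["DATABASE_URL"]  # e.g. postgresql://dxsh:secret@db:5432/dxsh
JWT_SECRET = os.environ["JWT_SECRET"]      # must be identical across services
CORS_ORIGINS = os.environ.get(
    "CORS_ORIGINS", "http://localhost:3000,http://localhost:3001"
).split(",")
API_BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8001")
FRONTEND_URL = os.environ.get("FRONTEND_URL", "http://localhost:3000")
```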
The system uses PostgreSQL for data persistence. On first run, the database schema is automatically created through SQLAlchemy migrations.
For production deployments, ensure you:
- Use a dedicated PostgreSQL instance
- Configure proper backup strategies
- Set secure database credentials
- Enable SSL for database connections
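A production-style SQLAlchemy engine configuration reflecting these recommendations might look like this (a sketch only; the connection URL, SSL mode, and pool sizes are assumptions, not project defaults):

```python
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql://dxsh_user:change-me@db.internal:5432/dxsh?sslmode=require",
    pool_size=10,        # reuse connections instead of opening one per request
    max_overflow=20,     # temporary extra connections under load
    pool_pre_ping=True,  # discard dead connections before handing them out
)
```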
- Access the Workflow Builder at http://localhost:3000
- Click "New Workflow" to create a workflow
- Drag nodes from the sidebar to the canvas
- Connect nodes by dragging from output to input ports
- Configure each node by clicking on it
- Save and execute the workflow
- Access the Dashboard interface at http://localhost:3001
- Create a new dashboard or edit existing ones
- Add widgets and configure data sources
- Arrange widgets using the grid layout
- Save and share dashboards
- Navigate to Settings > Embed Tokens
- Create a new embed token for a dashboard or widget
- Configure security settings (domains, expiration)
- Use the generated embed code in your application
Example embed code:
```html
<iframe
  src="http://localhost:3001/embed/dashboard/123?token=your-token-here"
  width="100%"
  height="600"
  frameborder="0"
>
</iframe>
```

All API endpoints require JWT authentication except public embed endpoints.
```
# Login
POST /api/v1/auth/login
{
  "username": "user@example.com",
  "password": "password"
}

# Returns
{
  "access_token": "jwt-token-here",
  "token_type": "bearer"
}
```
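For example, a client can obtain the token and reuse it on subsequent requests (a minimal sketch using Python's requests library; the gateway URL comes from the Quick Start section, and the standard Bearer header scheme is assumed):

```python
import requests

API = "http://localhost:8001"  # API Gateway

resp = requests.post(
    f"{API}/api/v1/auth/login",
    json={"username": "user@example.com", "password": "password"},
)
resp.raise_for_status()
token = resp.json()["access_token"]

# Send this header with every authenticated request.
headers = {"Authorization": f"Bearer {token}"}
```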
```
# List workflows
GET /api/v1/workflows

# Create workflow
POST /api/v1/workflows
{
  "name": "My Workflow",
  "description": "Process data",
  "nodes": [...],
  "connections": [...]
}

# Execute workflow
POST /api/v1/workflows/{id}/execute
```

```
# List dashboards
GET /api/v1/dashboards

# Create dashboard
POST /api/v1/dashboards
{
  "name": "Analytics Dashboard",
  "description": "Main analytics",
  "layout": {...}
}

# Get dashboard data
GET /api/v1/dashboards/{id}/data
```

For complete API documentation, visit /docs on the API Gateway service.
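Putting these endpoints together, a minimal client session might look like the following (a sketch; the response shapes, the id field, and the dashboard ID 123 are illustrative assumptions):

```python
import requests

API = "http://localhost:8001"
headers = {"Authorization": "Bearer <jwt-token-here>"}  # from the login call above

# List workflows and execute the first one (assumes a JSON list of objects with an "id").
workflows = requests.get(f"{API}/api/v1/workflows", headers=headers).json()
if workflows:
    wf_id = workflows[0]["id"]
    run = requests.post(f"{API}/api/v1/workflows/{wf_id}/execute", headers=headers)
    print("Execution started:", run.status_code)

# Fetch aggregated data for a dashboard.
data = requests.get(f"{API}/api/v1/dashboards/123/data", headers=headers).json()
print(data)
```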
See the Node Development Guide for instructions on creating custom workflow nodes.
See the Widget Development Guide for instructions on building custom dashboard widgets.
Run tests for each service:
```bash
# Backend services
cd services/workflow-engine
python -m pytest

# Frontend services
cd services/workflow-frontend
npm test
```

- Python: Follow PEP 8, use Black formatter
- TypeScript/React: Use ESLint and Prettier
- Commit messages: Follow conventional commits
- Security
  - Use HTTPS for all services
  - Configure proper CORS policies
  - Rotate JWT secrets regularly
  - Implement rate limiting
- Performance
  - Use Redis for caching
  - Configure database connection pooling
  - Implement horizontal scaling for services
  - Use CDN for static assets
- Monitoring
  - Set up logging aggregation
  - Configure health checks
  - Implement metrics collection
  - Set up alerting
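As an illustration of the Redis caching recommendation above, dashboard data could be cached with a short TTL along these lines (a sketch only; the Redis location, key naming, and TTL are assumptions, not part of Dxsh):

```python
import json

import redis
import requests

cache = redis.Redis(host="localhost", port=6379, db=0)
API = "http://localhost:8001"
HEADERS = {"Authorization": "Bearer <jwt-token-here>"}

def get_dashboard_data(dashboard_id: int) -> dict:
    """Return dashboard data, serving from Redis when a fresh copy exists."""
    key = f"dashboard:{dashboard_id}:data"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    data = requests.get(
        f"{API}/api/v1/dashboards/{dashboard_id}/data", headers=HEADERS
    ).json()
    cache.setex(key, 60, json.dumps(data))  # cache for 60 seconds
    return data
```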
Each microservice can be scaled independently:
```bash
docker-compose -f docker-compose.microservices.yml up -d --scale workflow-engine=3
```

Common issues:

- Database Connection Errors
  - Verify PostgreSQL is running
  - Check DATABASE_URL configuration
  - Ensure the database exists
- CORS Errors
  - Update CORS_ORIGINS in the API Gateway
  - Verify frontend URLs are correct
- Authentication Failures
  - Check that JWT_SECRET matches across services
  - Verify token expiration settings
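If logins succeed but other services still reject the token, the token can be checked against the secret a service is configured with using PyJWT (a sketch; the HS256 algorithm is an assumption about how Dxsh signs its tokens):

```python
import jwt  # PyJWT

token = "<jwt-token-here>"
secret = "<value-of-JWT_SECRET>"

try:
    # Raises if the signature does not match the secret or the token has expired.
    claims = jwt.decode(token, secret, algorithms=["HS256"])
    print("Token verifies with this secret:", claims)
except jwt.InvalidTokenError as exc:
    print("Token does not verify:", exc)
```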
See the Troubleshooting Guide for more solutions.
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Submit a pull request
See CONTRIBUTING.md for detailed guidelines.
This project is licensed under the MIT License. See LICENSE file for details.
- Documentation: docs/
- Issues: GitHub Issues
- Discussions: GitHub Discussions


