A cryptocurrency exchange platform with advanced features, including real-time exchange rates, an OTC (Over-the-Counter) market, wallet management, and a secure verification process.
- Real-Time Exchange Rates: Display exchange rate trends between various national currencies and cryptocurrencies.
- News Section: Integrates the latest cryptocurrency news from various sources, providing links and summaries for quick access to market updates.
- Active Orders: Shows all active buy and sell orders.
- Buy Orders: Users can view all orders offering cryptocurrencies for sale.
- Sell Orders: Users can view all orders seeking to buy cryptocurrencies.
- Initiating Trades: Users can click on any order to view detailed information and proceed with transactions.
- Balance Display: Shows the user's account balance on the platform.
- Transaction History: Lists all historical transactions of the user, with filters to view bought or sold orders.
- Transaction Security Code: Users set and manage their own transaction security code for transaction confirmation.
- Customer Service Window: Provides instant help to resolve any platform-related issues.
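The Real-Time Exchange Rates feature above is delivered over WebSockets (Django Channels is listed in the tech stack below). The snippet that follows is only a minimal illustrative sketch of what such a consumer could look like; the class name, the "exchange_rates" group, and the "rate.update" event type are assumptions, not code taken from this repository.

```python
# Illustrative sketch only -- names below are hypothetical, not from this project.
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class RatesConsumer(AsyncJsonWebsocketConsumer):
    group_name = "exchange_rates"  # hypothetical broadcast group for rate updates

    async def connect(self):
        # Join the broadcast group so this socket receives pushed rate updates
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    async def rate_update(self, event):
        # Called for messages sent to the group with {"type": "rate.update", ...}
        await self.send_json(event["data"])
```

A periodic Celery task (Celery is also in the stack) could fetch fresh rates and push them to such a group with `channel_layer.group_send`.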
You can set up the project locally using either Docker (recommended for consistent environments) or manual Python commands.
- Install Docker and Docker Compose.
- Clone the repository:

      git clone https://github.com/username/project.git
      cd project

Before starting, ensure your database settings are correctly configured:
- Check the database settings in `.env.docker`:
  - POSTGRES_DB
  - POSTGRES_USER
  - POSTGRES_PASSWORD
  - POSTGRES_HOST
  - POSTGRES_PORT
- If you need to modify these settings, update them in:
  - `.env.docker` for the Docker environment
  - `.env.local` for local development
  - `docker-compose.yml` (ensure the database service configuration matches)

Note: If you change database credentials, ensure they match across all configuration files.
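For reference, the sketch below shows one common way these variables end up in Django's `DATABASES` setting, assuming python-dotenv (listed in the tech stack). The fallback values shown (database `crypto_exchange_1`, user `postgres`, host `db`) match the commands used elsewhere in this guide, but the project's actual `settings.py` may differ.

```python
# settings.py (sketch) -- illustrative only; the real project settings may differ.
import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file copied from .env.docker or .env.local

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.getenv("POSTGRES_DB", "crypto_exchange_1"),
        "USER": os.getenv("POSTGRES_USER", "postgres"),
        "PASSWORD": os.getenv("POSTGRES_PASSWORD", ""),
        "HOST": os.getenv("POSTGRES_HOST", "db"),
        "PORT": os.getenv("POSTGRES_PORT", "5432"),
    }
}
```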
- For Docker, copy `.env.docker` to `.env`:

      cp .env.docker .env

- Start the containers:

      docker-compose down -v        # Clean start by removing volumes
      docker-compose up -d --build  # Build and start containers in detached mode

- Verify containers are running:

      docker-compose ps  # All containers should show status "running"

- Wait for the database to be ready:

      docker-compose logs db  # Check if PostgreSQL is ready to accept connections

- Apply migrations in the correct order:

      # First, apply Django's default migrations
      docker-compose exec web python manage.py migrate auth
      docker-compose exec web python manage.py migrate contenttypes
      docker-compose exec web python manage.py migrate admin
      docker-compose exec web python manage.py migrate sessions

      # Then, apply third-party app migrations
      docker-compose exec web python manage.py migrate django_celery_beat

      # Finally, apply all project app migrations
      docker-compose exec web python manage.py migrate authentication
      docker-compose exec web python manage.py migrate wallet
      docker-compose exec web python manage.py migrate market
      docker-compose exec web python manage.py migrate news
      docker-compose exec web python manage.py migrate rates
      docker-compose exec web python manage.py migrate support

      # Alternative: apply all migrations at once
      docker-compose exec web python manage.py migrate

- Verify that the tables were created:

      docker-compose exec db psql -U postgres -d crypto_exchange_1 -c '\dt'

- Create a superuser:

      # First, ensure the database is ready and migrations are applied
      docker-compose logs db                                                 # Verify database is running
      docker-compose exec db psql -U postgres -d crypto_exchange_1 -c '\dt'  # Verify tables exist

      # Then create the superuser
      docker-compose exec web python manage.py createsuperuser

- Access the application:
  - Backend: http://localhost:8000
  - Swagger Docs: http://localhost:8000/swagger/
- Stop the containers:

      docker-compose down

- Restart the containers:

      docker-compose down
      docker-compose up -d

To set up the project manually without Docker:

- Install Python 3.9+ and PostgreSQL.
- Set up a virtual environment:

      python -m venv venv
      source venv/bin/activate  # On Windows, use venv\Scripts\activate

- Clone the repository:

      git clone https://github.com/username/project.git
      cd project

- For local development, copy `.env.local` to `.env`:

      cp .env.local .env

Note: This configuration uses "localhost" as the PostgreSQL host for local development.
- Install dependencies:

      pip install -r requirements.txt

- Apply migrations and create a superuser:

      python manage.py migrate
      python manage.py createsuperuser

- Run the development server:

      python manage.py runserver

- Access the application:
  - Backend: http://127.0.0.1:8000
  - Swagger Docs: http://127.0.0.1:8000/swagger/
- Run all tests (inside Docker):

      docker-compose exec web python manage.py test

- Check test coverage:

      docker-compose exec web coverage run manage.py test
      docker-compose exec web coverage report

- Run all tests (without Docker):

      python manage.py test

- Run tests for specific apps:

      python manage.py test authentication
      python manage.py test wallet

- Generate test coverage report:

      coverage run manage.py test
      coverage report
      coverage html  # Generates an HTML report

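The tech stack lists Factory Boy for test data generation. The sketch below shows the general pattern only; the factory and test names are hypothetical and are not taken from the project's test suite.

```python
# tests/test_factories.py (illustrative sketch -- names are hypothetical)
import factory
from django.contrib.auth import get_user_model
from django.test import TestCase


class UserFactory(factory.django.DjangoModelFactory):
    """Builds users with unique usernames for tests."""

    class Meta:
        model = get_user_model()

    username = factory.Sequence(lambda n: f"user{n}")
    email = factory.LazyAttribute(lambda obj: f"{obj.username}@example.com")


class UserFactoryTests(TestCase):
    def test_factory_creates_persisted_user(self):
        user = UserFactory()
        self.assertTrue(get_user_model().objects.filter(pk=user.pk).exists())
```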
For production deployment:

- Secure environment variables with a service like AWS Secrets Manager or create a secure `.env` file.
- Update `docker-compose.prod.yml` for production settings.
- Deploy the application:

      docker-compose -f docker-compose.prod.yml up --build

Other notable features:

- WebSocket: Real-time updates for exchange rates.
- Swagger/OpenAPI: Interactive API documentation.
- Email Notifications: Alerts and notifications for user activities.
- Security: JWT-based authentication and transaction security code.
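As an example of the JWT-based authentication flow, the sketch below obtains a token and calls a protected endpoint with `requests`. The endpoint paths (`/api/token/`, `/api/wallet/balance/`) and the credentials are assumptions for illustration only; consult the Swagger docs at `/swagger/` for the actual routes.

```python
# Illustrative only -- endpoint paths and credentials below are assumptions;
# see http://localhost:8000/swagger/ for the real API routes.
import requests

BASE_URL = "http://localhost:8000"

# 1. Obtain an access token (hypothetical token endpoint)
resp = requests.post(
    f"{BASE_URL}/api/token/",
    json={"username": "demo", "password": "demo-password"},
)
resp.raise_for_status()
access_token = resp.json()["access"]

# 2. Call a protected endpoint with the Bearer token (hypothetical path)
balance = requests.get(
    f"{BASE_URL}/api/wallet/balance/",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(balance.status_code, balance.json())
```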
For questions or assistance, contact tatyoko28@gmail.com.
The project uses two different configuration files depending on the execution mode:
For local development (without Docker):

- Ensure PostgreSQL is running locally on your machine
- Copy `.env.local` to `.env`:

      cp .env.local .env

  This configuration uses:
  - POSTGRES_HOST=localhost
  - REDIS_URL=redis://localhost:6379/0
For Docker:

- Copy `.env.docker` to `.env`:

      cp .env.docker .env

  This configuration uses:
  - POSTGRES_HOST=db (Docker service name)
  - REDIS_URL=redis://redis:6379/0 (Docker service name)
- Never use 'db' as the host when running locally with `py manage.py runserver`
- Never use 'localhost' as the host when running with Docker
- If you switch between Docker and local development, copy the matching file again:

      # For local development
      cp .env.local .env

      # For Docker
      cp .env.docker .env

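For context, REDIS_URL typically feeds both Celery and Channels (both listed in the tech stack below). The sketch that follows shows one common wiring and assumes the channels-redis package; it is illustrative and may not match the project's actual settings.

```python
# settings.py (sketch) -- illustrative wiring of REDIS_URL; actual settings may differ.
import os

REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")

# Celery uses Redis as both broker and result backend
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL

# Channels uses a Redis-backed channel layer (requires the channels-redis package)
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {"hosts": [REDIS_URL]},
    },
}
```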
The project is built with the following technologies:

- Python 3.9+: Core programming language
- Django 4.2+: Web framework
- Django REST Framework: API development
- Celery: Asynchronous task processing
- Channels: WebSocket support
- PostgreSQL: Primary database
- Redis: Caching and message broker
- JWT: Token-based authentication
- Argon2: Password hashing
- Django CORS Headers: Cross-Origin Resource Sharing
- Django Rate Limit: API rate limiting
- Pytest: Testing framework
- Factory Boy: Test data generation
- Coverage: Code coverage reporting
- Flake8: Code linting
- Black: Code formatting
- isort: Import sorting
- Swagger/OpenAPI: API documentation
- drf-yasg: Swagger generator
- AWS S3: Cloud storage
- Pillow: Image processing
- Whitenoise: Static file serving
- Sentry: Error tracking
- Django Debug Toolbar: Development debugging
- django-celery-beat: Periodic task scheduling
- Docker: Containerization
- Docker Compose: Multi-container orchestration
- Python Dotenv: Environment variables management