ManhwaSearch is a self-hosted web application for scraping, managing, and reading manga and manhwa. It features a modern, responsive Single Page Application (SPA) frontend and a Python backend with a background scheduler for automated updates.
- Manga & Manhwa Scraping: Automatically scrape details, chapters, and images from supported websites (e.g., mangaread.org).
- Reading Interface: A built-in reader with Single Page and All Pages modes.
- Favorites Management: Mark titles as favorites to automatically keep them updated.
- Background Scheduler: Periodically checks for updates to favorites and fetches new recommendations.
- Manual Scraping Control: Trigger immediate scrapes for specific chapters or entire manga directly from the UI.
- Responsive UI: Optimized for both desktop and mobile devices.
- Configuration: Customizable settings for scraping intervals and website management.
- Modular Architecture: Designed for easy extension with new scraper modules.
```
ManhwaSearch/
├── backend/              # Python Flask backend
│   ├── app.py            # Main application entry point & API routes
│   ├── scheduler.py      # Background task scheduler
│   ├── Dockerfile        # Backend container definition
│   └── ...
├── frontend/             # Node.js Express frontend
│   ├── server.js         # Express server & API proxy
│   ├── Dockerfile        # Frontend container definition
│   └── public/           # Static assets
├── config/
│   └── settings.json     # Application settings
└── docker-compose.yml    # Orchestration for full-stack deployment
```
- Docker & Docker Compose (Recommended)
- Or for manual setup:
- Python 3.8+
- Node.js 14+ & npm
The easiest way to run ManhwaSearch is using Docker Compose, which sets up the database, backend, and frontend networking automatically.
1. Start the application — run the following command in the project root:

   ```bash
   docker-compose up -d
   ```

2. Access the app:
   - Frontend: Open http://localhost:3000 in your browser.
   - Backend API: Runs on http://localhost:5000 (handled internally).

3. Stop the application:

   ```bash
   docker-compose down
   ```
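For orientation, a `docker-compose.yml` for this layout might look roughly like the sketch below. The service names, build contexts, and volume path are assumptions for illustration, not the project's actual file:

```yaml
# Hypothetical sketch; adapt to the real docker-compose.yml in the repo.
services:
  backend:
    build: ./backend
    ports:
      - "5000:5000"
    volumes:
      - ./config:/app/config   # persists settings.json across restarts
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - backend
```

The volume mapping is what lets changes to `config/settings.json` survive container rebuilds.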
If you prefer running without Docker, follow these steps:
Navigate to the backend directory and install the required Python packages:
```bash
cd backend
pip install -r requirements.txt
```

Start the backend server:

```bash
# From project root
python backend/app.py
```

The backend API will start on http://127.0.0.1:5000.

In a separate terminal, navigate to the frontend directory and install dependencies:

```bash
cd frontend
npm install
```

Start the frontend server:

```bash
node server.js
```

The frontend will be accessible at http://localhost:3000.
The application is configured via config/settings.json. These settings persist even when running in Docker (via volume mapping).
```jsonc
{
  "scraping": {
    "interval_hours": 8,                  // How often the background scraper runs
    "max_chapters_per_manga": 1,          // How many new chapters to scrape images for automatically
    "num_recommendations_per_genre": 5,   // Number of recommendations to fetch
    "grab_all_chapters_favorites": false  // If true, scrapes images for ALL chapters of favorites (intensive)
  },
  "websites": [
    {
      "name": "mangaread",
      "url": "https://www.mangaread.org/",
      "enabled": true
    }
  ],
  "ai_scraper": {
    "enabled": false,
    "api_key": ""
  }
}
```

Note: the comments above are annotations for this README only; strict JSON does not support comments, so omit them in the actual `settings.json`.

To add support for a new website:

- Create a new Python file in `backend/scraper/` (e.g., `mysite.py`).
- Implement a class inheriting from `ScraperBase`.
- Register the new class in `backend/scheduler.py` in the `SCRAPER_CLASSES` dictionary.
- Add an entry to the `websites` list in `config/settings.json`.
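The steps above can be sketched as follows. The README does not show `ScraperBase`'s actual interface, so the base class and its `fetch_chapters` hook here are illustrative stand-ins, and the parsing logic is a toy example using only the standard library:

```python
from html.parser import HTMLParser

# Illustrative stand-in for the project's ScraperBase; the real base
# class in backend/scraper/ may define different method names.
class ScraperBase:
    name = "base"

    def fetch_chapters(self, html: str) -> list[str]:
        raise NotImplementedError

class _LinkExtractor(HTMLParser):
    """Collects the text content of <a> tags (e.g., chapter titles)."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.titles.append(data.strip())

class MySiteScraper(ScraperBase):
    """Hypothetical scraper that would live in backend/scraper/mysite.py."""
    name = "mysite"

    def fetch_chapters(self, html: str) -> list[str]:
        parser = _LinkExtractor()
        parser.feed(html)
        return parser.titles
```

Registration would then be a single entry in `SCRAPER_CLASSES` (e.g., `SCRAPER_CLASSES["mysite"] = MySiteScraper`), with a matching `websites` entry in `config/settings.json`.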
- Home Page: Browse a list of scraped titles. Use the search bar to filter.
- Reading: Click "View Chapters" on any card. Select a chapter to read.
- Favorites: Click the star icon on any manga card to add it to your favorites. The scheduler prioritizes updates for these titles.
- Manual Scraping:
- Full Manga: On the manga details page, click "Scrape All Chapters" to queue a full update.
- Specific Chapter: In the chapter list, click the "Scrape" button next to a chapter to fetch its images immediately.
- Docker Issues: If the containers fail to start, check the logs with `docker-compose logs -f`.
- Images not loading: Some websites block hotlinking. Ensure the backend has successfully scraped the images (check console logs).
- Port Conflicts: Ensure ports 5000 (backend) and 3000 (frontend) are free on your host machine.
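One quick way to check whether those ports are free is a small standard-library Python snippet (a sketch for convenience, not part of the project):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. some process is already bound to the port.
        return sock.connect_ex((host, port)) == 0

for port in (5000, 3000):
    if port_in_use(port):
        print(f"Port {port} is taken; stop the conflicting process first.")
```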
MIT