Stop checking TrustPilot, BBB, and Yelp separately. Review Hound scrapes them all, flags negative reviews, and emails you before customers start talking.
Why? Bad reviews spread. A 1-star complaint on Yelp can sit for days before you notice. Review Hound catches them within hours.
- One command, three sources: `reviewhound scrape --all` hits TrustPilot, BBB, and Yelp
- Sentiment scoring: Flags negative reviews automatically so you know what needs attention (see the sketch after this list)
- Web dashboard: See all your businesses, ratings, and trends in one place
- Email alerts: Get notified when someone leaves a bad review
- CLI or web: Use whichever fits your workflow
- Scheduled scraping: Set it and forget it—runs every few hours
- CSV export: Pull data out for spreadsheets or reporting
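
Under the hood, sentiment scoring uses TextBlob (see `analysis/sentiment.py` in the project layout below). A minimal sketch of how polarity can map to labels; the function name and threshold values are illustrative assumptions, not the project's actual cutoffs:

```python
from textblob import TextBlob  # pip install textblob

def score_review(text: str) -> tuple[float, str]:
    """Return (polarity, label); polarity is in [-1.0, 1.0]."""
    polarity = TextBlob(text).sentiment.polarity
    # Thresholds below are illustrative assumptions.
    if polarity < -0.1:
        return polarity, "negative"
    if polarity > 0.1:
        return polarity, "positive"
    return polarity, "neutral"

print(score_review("Terrible service, never coming back."))
# prints something like (-1.0, 'negative')
```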
In the web dashboard you can:

- Track all your businesses at a glance with ratings, sentiment breakdowns, and trend indicators
- See individual business metrics with rating trends and quick actions
- Browse and filter reviews by source and sentiment, with CSV export support
- Configure API keys for Google Places and Yelp Fusion, plus sentiment analysis tuning
Install it:

```
pip install reviewhound
```

That's it. Now run the web dashboard:

```
reviewhound web
# → Starting web dashboard at http://127.0.0.1:5000
```

Or use the CLI directly:

```
reviewhound add "Acme Corp" --trustpilot "https://trustpilot.com/review/acme.com"
reviewhound scrape --all
reviewhound list
```

Or run it with Docker:

```
git clone https://github.com/jonmartin721/review-hound.git
cd review-hound
docker-compose up -d
# → Access at http://localhost:5000
```

Or install from source:

```
git clone https://github.com/jonmartin721/review-hound.git
cd review-hound
pip install -e .
reviewhound web
```

Add a business to track:

```
# Add with TrustPilot URL
reviewhound add "Acme Corp" --trustpilot "https://www.trustpilot.com/review/acme.com"
# Add with multiple sources
reviewhound add "Acme Corp" \
--trustpilot "https://www.trustpilot.com/review/acme.com" \
--bbb "https://www.bbb.org/..." \
--yelp "https://www.yelp.com/biz/acme-corp"
```
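
`add` stores each business and its per-source URLs in SQLite via SQLAlchemy (`models.py`). A plausible sketch of such a model; the table and column names are assumptions, not the project's actual schema:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Business(Base):
    """One tracked business and its per-source review URLs."""
    __tablename__ = "businesses"  # table/column names are assumptions

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    trustpilot_url = Column(String)
    bbb_url = Column(String)
    yelp_url = Column(String)
```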
Scrape reviews:

```
# Scrape one business
reviewhound scrape "Acme"
# → Scraped 47 reviews from 3 sources
# Scrape everything (grab coffee, this takes a minute)
reviewhound scrape --all
# → Scraped 203 reviews across 5 businesses
```
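
Each source scraper fetches review pages and parses out the review blocks. A generic sketch of that pattern with requests and BeautifulSoup; both libraries and the CSS selector are illustrative assumptions, not necessarily what the bundled scrapers in `scrapers/` use:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def fetch_review_texts(url: str) -> list[str]:
    """Fetch one review page and extract the review text blocks."""
    resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # ".review-text" is a made-up selector; the real ones live in scrapers/*.py.
    return [el.get_text(strip=True) for el in soup.select(".review-text")]
```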
List businesses and view reviews:

```
# List all businesses
reviewhound list
# View reviews for a business
reviewhound reviews 1 --limit 50
# Filter by sentiment
reviewhound reviews 1 --sentiment negative
# View statistics
reviewhound stats 1
```

Export review data:

```
# Export to CSV
reviewhound export 1 -o acme_reviews.csv
```
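
The export is a plain CSV, so it drops straight into a spreadsheet or a short script. For example, counting reviews per sentiment label; the `sentiment` column name is an assumption, so check the header row of your own export:

```python
import csv
from collections import Counter

# Tally sentiment labels in an exported CSV.
with open("acme_reviews.csv", newline="", encoding="utf-8") as f:
    counts = Counter(row["sentiment"] for row in csv.DictReader(f))

print(counts)  # e.g. Counter({'positive': 30, 'negative': 12, 'neutral': 5})
```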
Set up email alerts:

```
# Configure alerts for negative reviews
reviewhound alert 1 alerts@company.com --threshold 3.0
# List alert configurations
reviewhound alerts
```
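
Alerts go out as plain SMTP email using the settings from `.env` (see `alerts/email.py`). A rough standard-library sketch of that send path; the function name and message wording are assumptions:

```python
import os
import smtplib
from email.message import EmailMessage

def send_alert(to_addr: str, business: str, rating: float, body: str) -> None:
    """Email a negative-review alert using SMTP_* settings from the environment."""
    msg = EmailMessage()
    msg["Subject"] = f"Review Hound: new {rating:.1f}-star review for {business}"
    msg["From"] = os.environ["SMTP_FROM"]
    msg["To"] = to_addr
    msg.set_content(body)

    with smtplib.SMTP(os.environ["SMTP_HOST"], int(os.environ["SMTP_PORT"])) as smtp:
        smtp.starttls()  # port 587 expects STARTTLS
        smtp.login(os.environ["SMTP_USER"], os.environ["SMTP_PASSWORD"])
        smtp.send_message(msg)
```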
Run on a schedule:

```
# Run scheduler (scrapes every 6 hours by default)
reviewhound watch
# Custom interval
reviewhound watch --interval 2
# Run web dashboard with scheduler
reviewhound web --with-scheduler
```
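
`watch` is built on APScheduler (`scheduler.py`). The core idea fits in a few lines; the job function here is a placeholder, not the project's actual entry point:

```python
from apscheduler.schedulers.blocking import BlockingScheduler

def scrape_all_businesses() -> None:
    """Placeholder for the real scrape-everything job."""
    print("scraping all businesses...")

scheduler = BlockingScheduler()
# Mirrors the default SCRAPE_INTERVAL_HOURS=6 from the config below.
scheduler.add_job(scrape_all_businesses, "interval", hours=6)
scheduler.start()  # blocks until interrupted (Ctrl+C)
```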
Create a .env file in the project root:

```
# Database
DATABASE_PATH=data/reviews.db
# Scraping
REQUEST_DELAY_MIN=2.0
REQUEST_DELAY_MAX=4.0
MAX_PAGES_PER_SOURCE=3
# Scheduler
SCRAPE_INTERVAL_HOURS=6
# Email Alerts (optional)
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=your-email@gmail.com
SMTP_PASSWORD=your-app-password
SMTP_FROM=alerts@yourdomain.com
# Web Dashboard
FLASK_SECRET_KEY=change-this-in-production
FLASK_DEBUG=false
```
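
These values are read at startup by `config.py`. A minimal sketch of how such loading typically works, assuming python-dotenv (whether Review Hound uses that exact library is an assumption):

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # pulls .env from the working directory into os.environ

# Names match the .env above; the fallback defaults are assumptions.
DATABASE_PATH = os.getenv("DATABASE_PATH", "data/reviews.db")
REQUEST_DELAY_MIN = float(os.getenv("REQUEST_DELAY_MIN", "2.0"))
REQUEST_DELAY_MAX = float(os.getenv("REQUEST_DELAY_MAX", "4.0"))
SCRAPE_INTERVAL_HOURS = int(os.getenv("SCRAPE_INTERVAL_HOURS", "6"))
```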
The web dashboard provides:

- Dashboard: Overview of all businesses with sentiment bars and ratings
- Business Detail: Individual business stats, rating trends, and recent reviews
- Reviews Page: Filterable list of all reviews with pagination
- One-Click Scraping: Trigger scrapes directly from the UI
Access it at http://localhost:5000 after starting with `reviewhound web`.
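
`web/app.py` is a Flask application factory. A minimal sketch of that pattern; the route and its body are placeholders, not the real dashboard:

```python
import os

from flask import Flask

def create_app() -> Flask:
    """Application factory: build and configure the Flask app."""
    app = Flask(__name__)
    app.secret_key = os.getenv("FLASK_SECRET_KEY", "change-this-in-production")

    @app.route("/")
    def dashboard():
        # The real app renders templates from reviewhound/web/templates/.
        return "Review Hound dashboard"

    return app

if __name__ == "__main__":
    create_app().run(host="127.0.0.1", port=5000)
```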
Project layout:

```
review-hound/
├── reviewhound/
│ ├── __init__.py
│ ├── __main__.py
│ ├── cli.py # CLI commands
│ ├── config.py # Configuration
│ ├── database.py # Database setup
│ ├── models.py # SQLAlchemy models
│ ├── scheduler.py # APScheduler setup
│ ├── scrapers/
│ │ ├── base.py # Abstract scraper
│ │ ├── trustpilot.py
│ │ ├── bbb.py
│ │ └── yelp.py
│ ├── analysis/
│ │ └── sentiment.py # TextBlob analysis
│ ├── alerts/
│ │ └── email.py # SMTP alerts
│ └── web/
│ ├── app.py # Flask factory
│ ├── routes.py # Web routes
│ ├── templates/
│ └── static/
├── tests/
├── data/ # SQLite database
├── exports/ # CSV exports
├── Dockerfile
├── docker-compose.yml
└── requirements.txt
```
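
The three scrapers share one interface through `scrapers/base.py`. A plausible shape for that abstract base, honoring the `REQUEST_DELAY_MIN`/`REQUEST_DELAY_MAX` settings; class and method names are assumptions, not the actual API:

```python
import random
import time
from abc import ABC, abstractmethod

class BaseScraper(ABC):
    """Shared plumbing: polite randomized delays plus a common entry point."""

    def __init__(self, delay_min: float = 2.0, delay_max: float = 4.0):
        self.delay_min = delay_min
        self.delay_max = delay_max

    def pause(self) -> None:
        # Sleep a random interval between page fetches (REQUEST_DELAY_MIN/MAX).
        time.sleep(random.uniform(self.delay_min, self.delay_max))

    @abstractmethod
    def scrape(self, url: str, max_pages: int = 3) -> list[dict]:
        """Fetch up to max_pages of reviews; each source parses its own HTML."""
```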
To set up a development environment:

```
# Clone and install with dev dependencies
git clone https://github.com/jonmartin721/review-hound.git
cd review-hound
pip install -e ".[dev]"
# Run tests
pytest tests/ -v
# Run with debug mode
reviewhound web --debug
```
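
Tests live under `tests/` and run with pytest. A sketch of what a sentiment test might look like, reusing the hypothetical `score_review` helper sketched earlier (the real module's API may differ):

```python
# tests/test_sentiment.py (path and import are illustrative assumptions)
from reviewhound.analysis.sentiment import score_review

def test_negative_review_is_flagged():
    _, label = score_review("Terrible service, never coming back.")
    assert label == "negative"
```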
Next steps:

- Set up email alerts: `reviewhound alert 1 you@email.com`
- Run the scheduler for hands-off monitoring: `reviewhound watch`
- Found a bug? Open an issue
Web scraping may violate some websites' Terms of Service. Use responsibly and respect rate limits.
MIT License - see LICENSE file for details.



