Struggling with the 120-result limit on Google Maps? GMapHunter shatters that barrier.
GMapHunter is a lead generation tool engineered to extract comprehensive business data from Google Maps, aiming for complete coverage of a city. Unlike standard searches, which cap results at roughly 120 entries, GMapHunter uses a smart Grid Search Algorithm to subdivide target cities into micro-zones (~2 km²), bypassing that limit and harvesting thousands of high-value leads (e.g., dentists, restaurants, clinics) in a single run.
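The grid subdivision described above can be sketched as follows. This is a minimal illustration, not GMapHunter's actual `gmap_url_creator.py` logic: the function name, bounding-box numbers, and 2 km cell size are all illustrative assumptions.

```python
import math

def make_grid(lat_min, lat_max, lng_min, lng_max, cell_km=2.0):
    """Split a city's bounding box into ~cell_km-wide zones and return
    the centre coordinate of each zone (one Maps search per zone)."""
    # ~111 km per degree of latitude; longitude degrees shrink with latitude
    lat_step = cell_km / 111.0
    mid_lat = (lat_min + lat_max) / 2
    lng_step = cell_km / (111.0 * math.cos(math.radians(mid_lat)))
    centers = []
    lat = lat_min
    while lat < lat_max:
        lng = lng_min
        while lng < lng_max:
            centers.append((round(lat + lat_step / 2, 6),
                            round(lng + lng_step / 2, 6)))
            lng += lng_step
        lat += lat_step
    return centers

# Istanbul-ish bounding box (illustrative numbers only)
cells = make_grid(40.96, 41.20, 28.60, 29.30)
print(len(cells))
```

Each returned centre becomes the viewpoint for one micro-zone search, so a city-wide query turns into a few hundred small queries whose results sum to far more than a single 120-capped search.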
Built for scale and resilience, it leverages advanced TLS fingerprinting to emulate legitimate browser traffic, sustaining long scraping runs while minimizing the risk of IP bans. Whether you are building a B2B database or analyzing market density, GMapHunter delivers raw, actionable data.
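A minimal sketch of TLS-fingerprinted fetching with `curl_cffi` (the library the project uses). The `fetch` and `backoff_delays` helpers and the retry policy are my own illustration, not GMapHunter's actual code; `curl_cffi` is imported lazily so the pure backoff logic can run without the dependency installed.

```python
import time

def backoff_delays(retries=4, base=1.0, cap=30.0):
    """Exponential backoff schedule (capped) for failed requests."""
    return [min(cap, base * 2 ** i) for i in range(retries)]

def fetch(url, impersonate="chrome", retries=4):
    """Fetch a page while presenting a real browser's TLS fingerprint."""
    from curl_cffi import requests  # pip install curl_cffi

    for delay in backoff_delays(retries):
        resp = requests.get(url, impersonate=impersonate, timeout=30)
        if resp.status_code == 200:
            return resp.text
        time.sleep(delay)  # back off before retrying on 429/5xx etc.
    raise RuntimeError(f"giving up on {url}")
```

`impersonate="chrome"` (or `"safari"`) tells `curl_cffi` to replicate that browser's TLS/JA3 handshake, which is what anti-bot systems fingerprint — plain `requests` or `urllib` traffic is trivially distinguishable at this layer.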
- 📍 Precision Grid Search: Automatically generates a coordinate grid over a target city to perform granular searches, ensuring no business is missed.
- 🛡️ Advanced Anti-Detect: Powered by `curl_cffi`, matching real browser TLS fingerprints (Chrome/Safari) to bypass bot protections effortlessly.
- 🐳 Docker Native: Fully containerized application. Deploy anywhere with a single command, keeping your host system clean.
- 💾 Smart Data Persistence: Uses PostgreSQL with UPSERT logic to handle millions of records while preventing duplicates automatically.
- 📊 Business-Ready Exports: Clean, structured Excel export functionality via Pandas, ready for your CRM or sales pipeline.
- 🤖 CI/CD Automation: Integrated with GitHub Actions for scheduled, completely autonomous data scraping tasks.
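The UPSERT logic mentioned above boils down to SQL's `INSERT ... ON CONFLICT`. The project uses PostgreSQL; this sketch runs on stdlib `sqlite3` (which supports the same `ON CONFLICT` clause since version 3.24) so it works without a database server. The table and column names are assumptions, not the project's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the PostgreSQL connection
conn.execute("""
    CREATE TABLE businesses (
        place_id TEXT PRIMARY KEY,   -- stable per-business identifier
        name     TEXT,
        phone    TEXT,
        rating   REAL
    )""")

upsert = """
    INSERT INTO businesses (place_id, name, phone, rating)
    VALUES (?, ?, ?, ?)
    ON CONFLICT (place_id) DO UPDATE SET
        name   = excluded.name,
        phone  = excluded.phone,
        rating = excluded.rating
"""

conn.execute(upsert, ("abc123", "Smile Dental", "+90 212 000 0000", 4.6))
# Same place seen again in another grid cell -> updated, not duplicated
conn.execute(upsert, ("abc123", "Smile Dental Clinic", "+90 212 000 0000", 4.7))

rows = conn.execute("SELECT name, rating FROM businesses").fetchall()
print(rows)  # -> [('Smile Dental Clinic', 4.7)]
```

Because overlapping grid cells inevitably return the same business twice, keying on a stable identifier and upserting keeps the table duplicate-free without any pre-insert existence checks.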
The system follows a streamlined Data Pipeline architecture:
- Grid Generation: `gmap_url_creator.py` calculates a grid of coordinates based on the target city's viewport.
- Data Ingestion: `gmap_data_fetcher.py` sends requests using TLS fingerprinting.
- Parsers: Raw HTML/JSON is parsed to extract metadata (Name, Phone, Website, Address, Rating).
- Storage Layer: Validated data is pushed to PostgreSQL (NeonDB). Duplicates are rejected or updated.
- Analytics/Export: `export_to_excel.py` pulls from the DB and generates formatted reports.
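As an illustration of the Parsers step in the pipeline above, one record might be normalized like this. The raw payload keys here are hypothetical — Google's response format is undocumented, and GMapHunter's real parsers map it internally.

```python
def parse_business(record: dict) -> dict:
    """Pull the fields the storage layer keeps from one raw result record.

    Key names on the input side are hypothetical stand-ins for whatever
    the fetched HTML/JSON actually contains.
    """
    return {
        "name": record.get("title"),
        "phone": record.get("phone"),
        "website": record.get("website"),
        "address": record.get("address"),
        "rating": float(record["rating"]) if record.get("rating") else None,
    }

raw = {"title": "Smile Dental", "phone": "+90 212 000 0000",
       "address": "Istanbul", "rating": "4.6"}
print(parse_business(raw))
```

Normalizing to a fixed dict shape here is what lets the storage layer stay simple: every record downstream has the same columns, with `None` for anything the listing didn't expose.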
This project is optimized for Docker, so no local Python environment setup is required. The only prerequisites are:
- Docker & Docker Compose
- Clone the repository

  ```bash
  git clone https://github.com/oguzcankaraman/GMapHunter.git
  cd GMapHunter
  ```

- Configure Environment

  Create a `.env` file with your database credentials:

  ```env
  DATABASE_URL=postgresql://user:password@host:port/dbname
  ```

- Run with Docker

  ```bash
  # Build and run the scraper container
  docker-compose up --build
  ```

- Export Data

  ```bash
  # Execute the export script inside the container
  docker-compose run scraper python export_to_excel.py
  ```
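For reference, the export step might look roughly like this in pandas. The data, column names, and sorting are invented, and `to_csv` stands in for the project's `to_excel` call so the sketch runs without `openpyxl` installed.

```python
import pandas as pd

# Rows as they might come back from the database (illustrative data only)
rows = [
    {"name": "Smile Dental", "phone": "+90 212 000 0000", "rating": 4.7},
    {"name": "City Clinic",  "phone": "+90 216 000 0000", "rating": 4.2},
]

df = pd.DataFrame(rows).sort_values("rating", ascending=False)

# The real script writes Excel instead:
#   df.to_excel("leads.xlsx", index=False)   # requires openpyxl
print(df.to_csv(index=False))
```

Either output format gives a flat, de-duplicated sheet that imports directly into a CRM or sales pipeline.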
This project is for educational and research purposes only.
- Respect Google Maps' Terms of Service.
- Do not use this tool for spamming or harassment.
- The author is not responsible for any misuse of this software or any consequences resulting from its use.


