A Chrome extension that extracts event information from webpages and creates calendar entries. It uses an Ollama LLM running locally in Docker for event extraction.
- 🧠 Extract events from webpages using local LLM (Ollama)
- 🔍 Scrape multiple sites from a configuration
- 📅 Generate iCalendar (.ics) files from extracted events
- 🔄 Sync with your preferred calendar app
- 🛠️ Fully configurable through the extension UI
```
EventScraper/
├── manifest.json          # Chrome extension manifest
├── html/                  # HTML files for extension UI
├── css/                   # CSS styles for extension
├── js/                    # JavaScript for extension logic
│   ├── background.js      # Extension background script
│   ├── content.js         # Content script for page interaction
│   ├── popup.js           # Popup UI logic
│   └── contentScripts/    # Additional content scripts
└── server/                # Server-side Docker components
    ├── docker-compose.yml # Docker services configuration
    ├── Dockerfile         # API server Dockerfile
    ├── app.py             # FastAPI server for LLM integration
    └── requirements.txt   # Python dependencies
```
- Clone this repository
- Open Chrome and go to `chrome://extensions/`
- Enable "Developer mode" in the top right
- Click "Load unpacked" and select the EventScraper folder
- The extension is now installed in your browser
- Navigate to the server directory
- Start the Docker containers:

  ```bash
  docker-compose up -d
  ```

- The server will be available at `http://localhost:8000` (see the quick check below)
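As a quick sanity check that the stack came up, you can request FastAPI's built-in interactive docs page, which is served at `/docs` by default (it may be disabled in `app.py`). A minimal Python sketch:

```python
import requests

# Smoke test: FastAPI serves its Swagger UI at /docs unless disabled.
# A 200 response means the API server container is reachable on port 8000.
resp = requests.get("http://localhost:8000/docs", timeout=5)
print("API server up:", resp.status_code == 200)
```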
- Click the extension icon to open the popup
- To extract events from the current page, click "Extract Events from Page"
- To scrape multiple sites, click "Scrape from Config"
- View extracted events by clicking "View Saved Events"
- Generate a calendar file with "Generate Calendar"
The extension supports two LLM options:
- Ollama (Local): Uses a local Ollama instance running in Docker
- OpenAI: Uses OpenAI's API (requires API key)
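To illustrate what switching providers means on the wire, here is a hedged Python sketch of the two request shapes. The model names and prompt are placeholders, and the extension's actual requests (handled in `server/app.py` and the extension scripts) may differ:

```python
import os
import requests

PROMPT = "List every event (title, date, time, location) in the text below.\n\n"

def extract_with_ollama(page_text: str) -> str:
    # Ollama's local HTTP API; the model name is an assumption.
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": PROMPT + page_text, "stream": False},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["response"]

def extract_with_openai(page_text: str) -> str:
    # OpenAI's chat completions API; requires OPENAI_API_KEY in the environment.
    r = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": PROMPT + page_text}],
        },
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]
```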
Set your preferred timezone for calendar events in the Settings panel.
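The configured timezone ends up in the `TZID` parameter of each event's `DTSTART`/`DTEND`. Below is a minimal Python sketch of a timezone-aware `VEVENT`; it is not the extension's actual generator and omits fields such as `UID` and `DTSTAMP` that a complete .ics file should carry:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def event_to_vevent(title: str, start: datetime, end: datetime, tzname: str) -> str:
    # Hypothetical helper: formats one event as an iCalendar VEVENT
    # with the configured timezone carried in the TZID parameter.
    fmt = "%Y%m%dT%H%M%S"
    return "\r\n".join([
        "BEGIN:VEVENT",
        f"SUMMARY:{title}",
        f"DTSTART;TZID={tzname}:{start.strftime(fmt)}",
        f"DTEND;TZID={tzname}:{end.strftime(fmt)}",
        "END:VEVENT",
    ])

tz = ZoneInfo("Europe/Berlin")
print(event_to_vevent(
    "Community Meetup",
    datetime(2024, 6, 1, 18, 0, tzinfo=tz),
    datetime(2024, 6, 1, 20, 0, tzinfo=tz),
    "Europe/Berlin",
))
```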
The Chrome extension uses Manifest V3 and follows standard extension development practices.
The server component uses FastAPI and Ollama running in Docker:
- Ollama container: Provides the LLM service
- API server: Interfaces between the extension and Ollama
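The following is a hedged sketch of the kind of endpoint `app.py` exposes. The route name, request schema, and model are assumptions rather than the actual implementation; inside docker-compose the Ollama host is typically the service name rather than `localhost`:

```python
import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Assumes a docker-compose service named "ollama"; adjust to match docker-compose.yml.
OLLAMA_URL = "http://ollama:11434/api/generate"

class PageContent(BaseModel):
    url: str
    text: str

@app.post("/extract")  # hypothetical route name
def extract(page: PageContent) -> dict:
    # Forward the scraped page text to the local model and hand the raw
    # completion back to the extension for parsing.
    r = requests.post(
        OLLAMA_URL,
        json={
            "model": "llama3",  # assumption; the configured model may differ
            "prompt": f"Extract all events (title, start, end, location) from:\n{page.text}",
            "stream": False,
        },
        timeout=120,
    )
    r.raise_for_status()
    return {"url": page.url, "events_raw": r.json()["response"]}
```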
- Chrome browser
- Docker and Docker Compose
- Python 3.9+ (for server development)