A web application that tracks and displays Devin's credit usage and limits by scraping the relevant data once a day.
- Automated scraping of Devin credit usage data from:
  - Main usage page: https://app.devin.ai/settings/usage (for Available ACUs)
  - History page: https://app.devin.ai/settings/usage?tab=history (for Session, Created At, and ACUs Used)
- Daily scheduled updates
- Web interface to view current and historical usage data
- Server-side login window for authentication
- Admin-only manual scrape functionality
- Public view for non-admin users
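
The daily update amounts to a scheduled job that runs the scraper once a day. The sketch below illustrates one way this could be wired up using the third-party `schedule` package and a placeholder `run_scrape()` function; both are assumptions for illustration, not the project's actual scheduling code.

```python
# Minimal sketch of a daily scrape schedule. The `schedule` package and the
# run_scrape() placeholder are illustrative assumptions; the project's real
# scraper entry point is src/scraper/run.py.
import time

import schedule


def run_scrape():
    # Placeholder for the real scraping logic (Selenium-based in this project).
    print("Scraping Devin usage and history pages...")


schedule.every().day.at("06:00").do(run_scrape)

while True:
    schedule.run_pending()
    time.sleep(60)
```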
The Devin platform uses email confirmation codes for authentication instead of passwords. The application provides two ways to handle this:
The first option is a two-step login through the web interface:
- Enter your user ID (email address)
- Enter the confirmation code sent to your email
- Use these credentials for scraping
This approach allows you to:
- Manually trigger scrapes after logging in (admin only)
- View your credit usage data without modifying environment variables
- Stay logged in for the duration of your browser session
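
A minimal sketch of what such a two-step flow can look like in Flask is shown below; the route names, form fields, and inline HTML are illustrative assumptions, not the application's actual implementation.

```python
# Sketch of a two-step login flow in Flask. Route names, form fields, and the
# inline HTML are illustrative assumptions, not the application's actual code.
from flask import Flask, redirect, request, session, url_for

app = Flask(__name__)
app.secret_key = "replace-with-FLASK_SECRET_KEY"  # from .env in the real app

LOGIN_FORM = '<form method="post"><input name="user_id" placeholder="Email"><button>Next</button></form>'
CODE_FORM = '<form method="post"><input name="code" placeholder="Confirmation code"><button>Log in</button></form>'


@app.route("/login", methods=["GET", "POST"])
def login():
    if request.method == "POST":
        # Step 1: store the email (user ID); Devin e-mails a confirmation code.
        session["user_id"] = request.form["user_id"]
        return redirect(url_for("confirm"))
    return LOGIN_FORM


@app.route("/confirm", methods=["GET", "POST"])
def confirm():
    if request.method == "POST":
        # Step 2: keep the confirmation code for this browser session only.
        session["confirmation_code"] = request.form["code"]
        return redirect(url_for("index"))
    return CODE_FORM


@app.route("/")
def index():
    return f"Logged in as {session.get('user_id', 'anonymous')}"


if __name__ == "__main__":
    app.run(debug=True)
```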
The second option is to keep using environment variables for automated scraping:
- Set `USER_ID` in your `.env` file
- Update `DEVIN_CONFIRMATION_CODE` with the latest code before running the scraper
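
As a rough sketch, the scraper can pick these values up with `python-dotenv` (the loading mechanism shown here is an assumption; the variable names match the ones above):

```python
# Sketch: load scraper credentials from .env (assumes python-dotenv is used).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

user_id = os.getenv("USER_ID")
confirmation_code = os.getenv("DEVIN_CONFIRMATION_CODE")

if not user_id or not confirmation_code:
    raise SystemExit("USER_ID and DEVIN_CONFIRMATION_CODE must be set in .env")
```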
- Non-Admin Users: Can view all credit usage data without logging in
- Admin Users: Can log in and access additional features like manual scraping
- Admin status is configured via the `ADMIN_USER` setting in the `.env` file
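
One way admin-only actions such as the manual scrape could be gated on that setting is sketched below; the `admin_required` decorator and the `/scrape` route are illustrative assumptions, not the application's actual code.

```python
# Sketch: gate an admin-only endpoint on the ADMIN_USER setting. The decorator
# and route name are illustrative assumptions.
import os
from functools import wraps

from flask import Flask, abort

app = Flask(__name__)

IS_ADMIN = os.getenv("ADMIN_USER", "false").lower() == "true"


def admin_required(view):
    @wraps(view)
    def wrapped(*args, **kwargs):
        if not IS_ADMIN:
            abort(403)  # non-admin users can only view data
        return view(*args, **kwargs)
    return wrapped


@app.route("/scrape", methods=["POST"])
@admin_required
def manual_scrape():
    # Trigger the scraper here (the real logic lives in src/scraper/).
    return {"status": "scrape started"}
```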
- Python 3.8+
- Chrome browser (for Selenium WebDriver)
- ChromeDriver
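
The scraper drives Chrome through Selenium, which is why Chrome and a matching ChromeDriver are required. A minimal, illustrative sketch of starting a headless session (not the project's actual driver setup, which lives in `src/scraper/`):

```python
# Sketch: start a headless Chrome session with Selenium and load the usage page.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")  # run without a visible browser window

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://app.devin.ai/settings/usage")
    print(driver.title)
finally:
    driver.quit()
```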
- Clone the repository:

  ```bash
  git clone https://github.com/codeforjapan/devinwork.git
  cd devinwork
  ```

- Create a virtual environment and install dependencies:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  pip install -r requirements.txt
  ```
- Configure the application:
  - Copy `.env.example` to `.env`
  - Update the configuration values in `.env`, including:
    - `ORGANIZATION_NAME`: Your organization name (displayed in the UI)
    - `USER_ID`: Your Devin account email
    - `ADMIN_USER`: Set to "true" to enable admin features
    - `FLASK_SECRET_KEY`: A secure random key for session management
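
  One way to generate a suitable `FLASK_SECRET_KEY` (a suggestion, not a project requirement) is Python's `secrets` module:

  ```python
  # Print a random hex string suitable for use as FLASK_SECRET_KEY.
  import secrets

  print(secrets.token_hex(32))
  ```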
- Start the web server:

  ```bash
  python src/web/app.py
  ```

- Access the web interface at http://localhost:5000

- To manually run the scraper:

  ```bash
  python src/scraper/run.py
  ```
- `src/scraper/`: Contains the web scraping code
- `src/web/`: Contains the web server code
- `static/`: Static assets for the web interface
- `templates/`: HTML templates for the web interface
- `data/`: Storage for scraped data