feat: adding more logs to crawler! #11
Walkthrough
The pull request modifies the `WebsiteETL.extract` method, adding a logging statement that reports the number of items returned by each `crawl(url)` call.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant ETL as WebsiteETL.extract
    participant Crawler as crawl(url)
    participant Logger as Log System
    ETL->>Crawler: Call crawl(url)
    Crawler-->>ETL: Return data list
    ETL->>Logger: Log info ("Extracted {len(data)} items")
    ETL->>ETL: Extend extracted_data with data
```
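The flow above can be sketched in plain Python. This is a minimal illustration only: the `crawl` stub, the method signature, and the return shapes are assumptions, not the repository's actual code.

```python
import logging

def crawl(url: str) -> list[dict]:
    # Hypothetical stand-in for the real crawler (crawlee_client.crawl);
    # returns a list of scraped items for the given URL.
    return [{"url": url, "text": "example page content"}]

class WebsiteETL:
    def extract(self, urls: list[str]) -> list[dict]:
        extracted_data: list[dict] = []
        for url in urls:
            data = crawl(url)                             # ETL ->> Crawler
            logging.info(f"Extracted {len(data)} items")  # ETL ->> Logger
            extracted_data.extend(data)                   # ETL ->> ETL
        return extracted_data
```

Each loop iteration mirrors one arrow in the diagram: call the crawler, log the item count, then accumulate the results.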
Actionable comments posted: 0
🧹 Nitpick comments (1)
hivemind_etl/website/website_etl.py (1)
55-57: LGTM! Good addition of logging. The changes effectively enhance visibility into the extraction process by capturing the number of items extracted from each URL. This aligns perfectly with the PR objective of adding more logs to the crawler.
A minor suggestion: consider improving the log message phrasing to something like "{len(data)} items extracted from {url}" for better readability and grammar.

```diff
- logging.info(f"{len(data)} data is extracted.")
+ logging.info(f"{len(data)} items extracted from {url}.")
```
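For reference, the suggested phrasing can be exercised with a small helper. The `log_extraction` function name is hypothetical and not part of the PR; it only demonstrates the message format the review proposes.

```python
import logging

logging.basicConfig(level=logging.INFO)

def log_extraction(data: list, url: str) -> str:
    # Suggested phrasing: report both the item count and the source URL,
    # so per-URL extraction volumes can be correlated in the logs.
    message = f"{len(data)} items extracted from {url}."
    logging.info(message)
    return message
```

Including the URL in the message means each log line is self-describing even when many URLs are crawled concurrently.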
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
hivemind_etl/website/website_etl.py (1 hunks)
🧰 Additional context used
🧬 Code Definitions (1)
hivemind_etl/website/website_etl.py (1)
hivemind_etl/website/crawlee_client.py (1)
crawl (lines 78-116)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: ci / test / Test
- GitHub Check: ci / lint / Lint