
Commit

clean some stuff
YoongiKim committed May 23, 2023
1 parent f43b032 commit 12c868a
Showing 2 changed files with 0 additions and 16 deletions.
14 changes: 0 additions & 14 deletions README.md
@@ -1,17 +1,3 @@
# Changes on this Fork
updated: 2023.03.02 / ChromeDriver 110.0.5481.178

- Fixed bugs in `google` and `google_full`
- Added a `transparent` argument that filters for transparent images only (Google only)
- Fixed a bug where `limit` did not work in `google_full`
- Fixed a bug where multiprocess workers did not exit when `Ctrl+C` was pressed

### Best usage of this Fork
```
python3 main.py --google true --transparent true --naver false --full true #[--no_gui false] [--limit 100]
```
And now you can stop the program at any time by pressing `Ctrl+C`!
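The `Ctrl+C` fix listed above can be sketched as a pool that is terminated on `KeyboardInterrupt`. This is a minimal standalone sketch, not the repository's actual code: it uses a thread pool for brevity (the project uses multiple processes), and `crawl` is a hypothetical placeholder for the per-keyword scraping work.

```python
from multiprocessing.pool import ThreadPool

def crawl(keyword):
    # Hypothetical placeholder for per-keyword scraping work.
    return "done: " + keyword

def run_all(keywords, workers=4):
    pool = ThreadPool(workers)
    try:
        results = pool.map(crawl, keywords)
    except KeyboardInterrupt:
        # Without this, workers keep running after Ctrl+C and the
        # program never exits.
        pool.terminate()
        pool.join()
        raise
    pool.close()
    pool.join()
    return results
```

`pool.terminate()` stops outstanding workers immediately, so the process exits instead of hanging on unfinished tasks.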

# AutoCrawler
Google, Naver multiprocess image crawler (High Quality & Speed & Customizable)

2 changes: 0 additions & 2 deletions collect_links.py
@@ -132,7 +132,6 @@ def google(self, keyword, add_url=""):
except ElementNotVisibleException:
pass

# photo_grid_boxes = self.browser.find_elements(By.XPATH, '//div[@class="bRMDJf islir"]')
photo_grid_boxes = self.browser.find_elements(By.XPATH, '//div[@class=" bRMDJf islir"]')

print('Scraping links')
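The one-character difference between the two selectors above (a leading space in the class string) shows how brittle exact `@class` matching is. A standalone illustration using `xml.etree.ElementTree` rather than Selenium; the markup snippet is invented, while the class strings come from the diff:

```python
import xml.etree.ElementTree as ET

# Invented markup imitating a Google Images grid cell; note the
# leading space inside the class attribute value.
html = '<body><div class=" bRMDJf islir">thumb</div></body>'
root = ET.fromstring(html)

# Exact attribute match: the selector without the leading space
# silently matches nothing.
old = root.findall('.//div[@class="bRMDJf islir"]')

# The updated selector matches the attribute value exactly, space included.
new = root.findall('.//div[@class=" bRMDJf islir"]')
```

In real Selenium code a `contains(@class, 'islir')` predicate would survive such whitespace changes, at the cost of possibly matching extra nodes.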
@@ -220,7 +219,6 @@ def google_full(self, keyword, add_url="", limit=100):

NUM_MAX_RETRY = 30
NUM_MAX_SCROLL_PATIENCE = 100
# while True:
for _ in range(limit):
try:
xpath = '//div[@id="islsp"]//div[@class="v4dQwb"]'
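The hunk above replaces an unbounded `while True:` with a loop bounded by `limit`, so retries cannot spin forever and `limit` is actually honored. The pattern, stripped of Selenium (`fetch_once` and `make_fetcher` are hypothetical stand-ins, not functions from the repository):

```python
NUM_MAX_RETRY = 30  # same retry bound as in the diff above

def collect(limit, fetch_once):
    """Collect up to `limit` items; each gets at most NUM_MAX_RETRY attempts."""
    links = []
    for _ in range(limit):  # bounded, unlike the old `while True:`
        for _attempt in range(NUM_MAX_RETRY):
            item = fetch_once()
            if item is not None:
                links.append(item)
                break
        else:
            # Every retry failed: give up instead of looping forever.
            break
    return links

def make_fetcher(values):
    # Hypothetical test helper: yields values one by one, then None forever.
    it = iter(values)
    return lambda: next(it, None)
```

The inner `for`/`else` gives each item a bounded number of retries, and the outer `range(limit)` caps the total, which is exactly what the removed `while True:` could not guarantee.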
