Merge pull request #53 from Ryuk-me/dev
fix: 1337x magnet issue on render & aws (#50)
Ryuk-me authored Jul 3, 2023
2 parents facfaac + 952ccc3 commit 751615d
Showing 25 changed files with 114 additions and 72 deletions.
7 changes: 7 additions & 0 deletions .dockerignore
@@ -0,0 +1,7 @@
# created by virtualenv automatically
__pycache__
api-py
.env
function.*
.vscode
.github
4 changes: 1 addition & 3 deletions Dockerfile
@@ -1,8 +1,6 @@
FROM python:3.8
ADD requirements.txt requirements.txt
ADD main.py main.py
ADD okteto-stack.yaml okteto-stack.yaml
RUN pip install -r requirements.txt
EXPOSE 8080
COPY . .
CMD ["python3", "main.py"]
CMD ["python" ,"main.py"]
43 changes: 18 additions & 25 deletions README.md
@@ -47,7 +47,7 @@ $ pip install -r requirements.txt
$ python main.py

# To access API Open any browser/API Testing tool & move to the given URL
$ localhost:8080
$ localhost:8009

```

@@ -251,7 +251,7 @@ $ localhost:8080
<summary style='font-size: 15px'><span style='font-size: 20px;font-weight:bold;'>Supported sites list</span></summary>
<p>

> [`api/v1/sites`](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/sites)
> [`api/v1/sites`](https://torrent-api-py-nx0x.onrender.com/api/v1/sites)
</p>
</details>
@@ -261,7 +261,7 @@ $ localhost:8080
<summary style='font-size: 15px'><span style='font-size: 20px;font-weight:bold;'>Search</span></summary>
<p>

> [`api/v1/search`](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/search)
> [`api/v1/search`](https://torrent-api-py-nx0x.onrender.com/api/v1/search)
| Parameter | Required | Type | Default | Example |
| :-------: | :------: | :-----: | :-----: | :------------------------------------------------------: |
@@ -340,9 +340,9 @@ $ localhost:8080

<pre>Here <b>limit = 5</b> will get 5 results from each site.</pre>

> [api/v1/all/search?query=avengers](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/search?query=avengers)
> [api/v1/all/search?query=avengers](https://torrent-api-py-nx0x.onrender.com/api/v1/all/search?query=avengers)
> [api/v1/all/search?query=avengers&limit=5](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/search?query=avengers&limit=5)
> [api/v1/all/search?query=avengers&limit=5](https://torrent-api-py-nx0x.onrender.com/api/v1/all/search?query=avengers&limit=5)
</pre>
</details>
@@ -359,9 +359,9 @@ $ localhost:8080
| :-------: | :------: | :-----: | :-----: | :---------------------------: |
| limit || integer | Default | `api/v1/all/trending?limit=2` |

> [api/v1/all/trending](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/trending)
> [api/v1/all/trending](https://torrent-api-py-nx0x.onrender.com/api/v1/all/trending)
> [api/v1/all/trending?limit=2](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/trending?limit=2)
> [api/v1/all/trending?limit=2](https://torrent-api-py-nx0x.onrender.com/api/v1/all/trending?limit=2)
</p>
</details>
@@ -378,9 +378,9 @@ $ localhost:8080
| :-------: | :------: | :-----: | :-----: | :-------------------------: |
| limit || integer | Default | `api/v1/all/recent?limit=2` |

> [api/v1/all/recent](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/recent)
> [api/v1/all/recent](https://torrent-api-py-nx0x.onrender.com/api/v1/all/recent)
> [api/v1/all/recent?limit=2](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/all/recent)
> [api/v1/all/recent?limit=2](https://torrent-api-py-nx0x.onrender.com/api/v1/all/recent)
</p>
</details>
@@ -389,7 +389,7 @@ $ localhost:8080

## Want to Try api ?

> [api/v1/search?site=1337x&query=eternals](https://6howpgdphtrt752l4oksuieqyu0qcqfu.lambda-url.ap-southeast-2.on.aws/api/v1/search?site=1337x&query=eternals)
> [api/v1/search?site=1337x&query=eternals](https://torrent-api-py-nx0x.onrender.com/api/v1/search?site=1337x&query=eternals)
<details open>
<summary> See response</summary>
@@ -421,7 +421,7 @@ $ localhost:8080
"[TGx]Downloaded from torrentgalaxy.to .txt (0.7 KB)"
],
"poster": "https://lx1.dyncdn.cc/cdn/02/0251ab7772c031c1130bc92810758cd4.jpg",
"magnet": "magnet:?xt=urn:btih:20F8D7C2942B143E6E2A0FB5562CDE7EE1B17822&dn=Eternals.2021.1080p.WEBRip.1600MB.DD5.1.x264-GalaxyRG&tr=udp%3A%2F%2Fopen.stealth.si%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.tiny-vps.com%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Ftracker.torrent.eu.org%3A451%2Fannounce&tr=udp%3A%2F%2Fexplodie.org%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.cyberia.is%3A6969%2Fannounce&tr=udp%3A%2F%2Fipv4.tracker.harry.lu%3A80%2Fannounce&tr=udp%3A%2F%2Fp4p.arenabg.com%3A1337%2Fannounce&tr=udp%3A%2F%2Ftracker.birkenwald.de%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.moeking.me%3A6969%2Fannounce&tr=udp%3A%2F%2Fopentor.org%3A2710%2Fannounce&tr=udp%3A%2F%2Ftracker.dler.org%3A6969%2Fannounce&tr=udp%3A%2F%2F9.rarbg.me%3A2970%2Fannounce&tr=https%3A%2F%2Ftracker.foreverpirates.co%3A443%2Fannounce&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.internetwarriors.net%3A1337%2Fannounce&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969%2Fannounce&tr=udp%3A%2F%2Fcoppersurfer.tk%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.zer0day.to%3A1337%2Fannounce",
"magnet": "magnet:?xt=urn:btih:20F8D7C2942B143E6E2A0FB5562CDE7EE1B17822&dn=Eternals.2021.1080p.WEBRip.1600MB.DD5.1.x264-GalaxyRG&tr=udp://open.stealth.si:80/announce&tr=udp://tracker.tiny-vps.com:6969/announce&tr=udp://tracker.opentrackr.org:1337/announce&tr=udp://tracker.torrent.eu.org:451/announce&tr=udp://explodie.org:6969/announce&tr=udp://tracker.cyberia.is:6969/announce&tr=udp://ipv4.tracker.harry.lu:80/announce&tr=udp://p4p.arenabg.com:1337/announce&tr=udp://tracker.birkenwald.de:6969/announce&tr=udp://tracker.moeking.me:6969/announce&tr=udp://opentor.org:2710/announce&tr=udp://tracker.dler.org:6969/announce&tr=udp://9.rarbg.me:2970/announce&tr=https://tracker.foreverpirates.co:443/announce&tr=udp://tracker.opentrackr.org:1337/announce&tr=http://tracker.openbittorrent.com:80/announce&tr=udp://opentracker.i2p.rocks:6969/announce&tr=udp://tracker.internetwarriors.net:1337/announce&tr=udp://tracker.leechers-paradise.org:6969/announce&tr=udp://coppersurfer.tk:6969/announce&tr=udp://tracker.zer0day.to:1337/announce",
"hash": "20F8D7C2942B143E6E2A0FB5562CDE7EE1B17822"
}
],
@@ -437,19 +437,6 @@ Note : Due to CPU limitations Repl will take much more time than Heroku.

---

# How to Host On Repl.it

```sh
> Fork this repo
> Import repo from github in repl
> Command : python main.py
> Install Requirements manually !very important
> And Run Your repl

Note : Due to CPU limitations Repl will take much more time than Heroku.
```

---
## Donations

<p> If you feel like showing your appreciation for this project, then how about buying me a coffee?</p>
@@ -458,6 +445,12 @@ Note : Due to CPU limitations Repl will take much more time than Heroku.

---

#### You can fork the repo and deploy on VPS or deploy it on Heroku :)
## DEPLOY

<a href="https://render.com/deploy?repo=https://github.com/Ryuk-me/Torrent-Api-py">
<img src="https://render.com/images/deploy-to-render-button.svg" alt="Deploy to Render" />
</a>

</br>

[![Deploy](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy)
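The `magnet` field change in the README's sample response is the visible face of this commit's core fix: tracker parameters in magnet URIs are no longer left percent-encoded. A minimal stdlib sketch of that transformation, using one tracker URL taken from the diff above (illustrative only — the actual fix lives in the scraper code, not in a decoding step like this):

```python
from urllib.parse import unquote

# One percent-encoded tracker parameter, as it appeared before this commit:
encoded = "tr=udp%3A%2F%2Fopen.stealth.si%3A80%2Fannounce"

# Decoded form, matching the new sample response in the README:
decoded = unquote(encoded)
print(decoded)  # tr=udp://open.stealth.si:80/announce
```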
16 changes: 16 additions & 0 deletions constants/base_url.py
@@ -0,0 +1,16 @@
X1337 = "https://1337x.to"
TGX = "https://torrentgalaxy.to"
TORLOCK = "https://www.torlock.com"
PIRATEBAY = "https://thepiratebay10.org"
NYAASI = "https://nyaa.si"
ZOOQLE = "https://zooqle.com"
KICKASS = "https://kickasstorrents.to"
BITSEARCH = "https://bitsearch.to"
MAGNETDL = "https://www.magnetdl.com"
LIBGEN = "https://libgen.is"
YTS = "https://yts.mx"
LIMETORRENT = "https://www.limetorrents.pro"
TORRENTFUNK = "https://www.torrentfunk.com"
GLODLS = "https://glodls.to"
TORRENTPROJECT = "https://torrentproject2.com"
YOURBITTORRENT = "https://yourbittorrent.com"
2 changes: 2 additions & 0 deletions constants/headers.py
@@ -0,0 +1,2 @@
HEADER_AIO = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.67"}
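This new file centralizes the User-Agent that each scraper previously embedded inline. A hedged sketch of the pattern (`build_headers` is a hypothetical helper, not part of the commit; the header value is copied from the diff):

```python
# Shared header constant, as introduced in constants/headers.py. Scrapers
# now import this instead of hard-coding their own User-Agent dicts.
HEADER_AIO = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
                  "(KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.67"
}


def build_headers(extra=None):
    """Hypothetical helper: copy the shared headers, merging per-request extras.

    Copying first keeps HEADER_AIO itself immutable across requests.
    """
    headers = dict(HEADER_AIO)
    if extra:
        headers.update(extra)
    return headers
```

In the commit itself the scrapers simply pass `headers=HEADER_AIO` to `session.get(...)`; a helper like this would only matter if per-request headers ever needed to vary.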
6 changes: 6 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,6 @@
version: '3.9'
services:
api-py:
build: .
ports:
- "8009:8009"
6 changes: 2 additions & 4 deletions helper/html_scraper.py
@@ -1,16 +1,14 @@
import asyncio
from .asyncioPoliciesFix import decorator_asyncio_fix

from constants.headers import HEADER_AIO

class Scraper:
@decorator_asyncio_fix
async def _get_html(self, session, url):
try:
async with session.get(
url,
headers={
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.106 Safari/537.36"
},
headers=HEADER_AIO,
) as r:
return await r.text(encoding="ISO-8859-1")
except:
2 changes: 1 addition & 1 deletion main.py
@@ -36,4 +36,4 @@
handler = Mangum(app)

if __name__ == "__main__":
uvicorn.run(app, host="0.0.0.0", port=8080)
uvicorn.run(app, host="0.0.0.0", port=8009)
9 changes: 9 additions & 0 deletions render.yaml
@@ -0,0 +1,9 @@
services:
# A Docker web service
- type: web
name: torrent-api-py
runtime: python
plan: free
autoDeploy: true
buildCommand: pip install -r requirements.txt
startCommand: uvicorn main:app --host 0.0.0.0 --port 8009
4 changes: 2 additions & 2 deletions torrents/bitsearch.py
@@ -3,11 +3,11 @@
import aiohttp
from bs4 import BeautifulSoup
from helper.html_scraper import Scraper

from constants.base_url import BITSEARCH

class Bitsearch:
def __init__(self):
self.BASE_URL = "https://bitsearch.to"
self.BASE_URL = BITSEARCH
self.LIMIT = None

def _parser(self, htmls):
4 changes: 2 additions & 2 deletions torrents/glodls.py
@@ -2,11 +2,11 @@
import aiohttp
from bs4 import BeautifulSoup
from helper.html_scraper import Scraper

from constants.base_url import GLODLS

class Glodls:
def __init__(self):
self.BASE_URL = "https://glodls.to"
self.BASE_URL = GLODLS
self.LIMIT = None

def _parser(self, htmls):
7 changes: 4 additions & 3 deletions torrents/kickass.py
@@ -5,17 +5,18 @@
from bs4 import BeautifulSoup
from helper.asyncioPoliciesFix import decorator_asyncio_fix
from helper.html_scraper import Scraper

from constants.base_url import KICKASS
from constants.headers import HEADER_AIO

class Kickass:
def __init__(self):
self.BASE_URL = "https://kickasstorrents.to"
self.BASE_URL = KICKASS
self.LIMIT = None

@decorator_asyncio_fix
async def _individual_scrap(self, session, url, obj):
try:
async with session.get(url) as res:
async with session.get(url,headers=HEADER_AIO) as res:
html = await res.text(encoding="ISO-8859-1")
soup = BeautifulSoup(html, "html.parser")
try:
7 changes: 4 additions & 3 deletions torrents/libgen.py
@@ -4,18 +4,19 @@
from bs4 import BeautifulSoup
from helper.asyncioPoliciesFix import decorator_asyncio_fix
from helper.html_scraper import Scraper

from constants.base_url import LIBGEN
from constants.headers import HEADER_AIO

class Libgen:
def __init__(self):
self.BASE_URL = "https://libgen.is"
self.BASE_URL = LIBGEN
self.LIMIT = None

@decorator_asyncio_fix
async def _individual_scrap(self, session, url, obj, sem):
async with sem:
try:
async with session.get(url) as res:
async with session.get(url,headers=HEADER_AIO) as res:
html = await res.text(encoding="ISO-8859-1")
soup = BeautifulSoup(html, "html.parser")
try:
6 changes: 4 additions & 2 deletions torrents/limetorrents.py
@@ -5,17 +5,19 @@
from bs4 import BeautifulSoup
from helper.asyncioPoliciesFix import decorator_asyncio_fix
from helper.html_scraper import Scraper
from constants.base_url import LIMETORRENT
from constants.headers import HEADER_AIO


class Limetorrent:
def __init__(self):
self.BASE_URL = "https://www.limetorrents.pro"
self.BASE_URL = LIMETORRENT
self.LIMIT = None

@decorator_asyncio_fix
async def _individual_scrap(self, session, url, obj):
try:
async with session.get(url) as res:
async with session.get(url,headers=HEADER_AIO) as res:
html = await res.text(encoding="ISO-8859-1")
soup = BeautifulSoup(html, "html.parser")
try:
4 changes: 2 additions & 2 deletions torrents/magnet_dl.py
@@ -5,11 +5,11 @@
import cloudscraper
import requests
from bs4 import BeautifulSoup

from constants.base_url import MAGNETDL

class Magnetdl:
def __init__(self):
self.BASE_URL = "https://www.magnetdl.com"
self.BASE_URL = MAGNETDL
self.LIMIT = None

def _parser(self, htmls):
4 changes: 2 additions & 2 deletions torrents/nyaa_si.py
@@ -3,11 +3,11 @@
import aiohttp
from bs4 import BeautifulSoup
from helper.html_scraper import Scraper

from constants.base_url import NYAASI

class NyaaSi:
def __init__(self):
self.BASE_URL = "https://nyaa.si"
self.BASE_URL = NYAASI
self.LIMIT = None

def _parser(self, htmls):
4 changes: 2 additions & 2 deletions torrents/pirate_bay.py
@@ -3,11 +3,11 @@
import aiohttp
from bs4 import BeautifulSoup
from helper.html_scraper import Scraper

from constants.base_url import PIRATEBAY

class PirateBay:
def __init__(self):
self.BASE_URL = "https://thepiratebay10.org"
self.BASE_URL = PIRATEBAY
self.LIMIT = None

def _parser(self, htmls):
7 changes: 4 additions & 3 deletions torrents/torlock.py
@@ -5,17 +5,18 @@
from bs4 import BeautifulSoup
from helper.asyncioPoliciesFix import decorator_asyncio_fix
from helper.html_scraper import Scraper

from constants.base_url import TORLOCK
from constants.headers import HEADER_AIO

class Torlock:
def __init__(self):
self.BASE_URL = "https://www.torlock2.com"
self.BASE_URL = TORLOCK
self.LIMIT = None

@decorator_asyncio_fix
async def _individual_scrap(self, session, url, obj):
try:
async with session.get(url) as res:
async with session.get(url,headers=HEADER_AIO) as res:
html = await res.text(encoding="ISO-8859-1")
soup = BeautifulSoup(html, "html.parser")
try:
9 changes: 4 additions & 5 deletions torrents/torrentProject.py
@@ -5,11 +5,12 @@
from bs4 import BeautifulSoup
from helper.asyncioPoliciesFix import decorator_asyncio_fix
from helper.html_scraper import Scraper

from constants.base_url import TORRENTPROJECT
from constants.headers import HEADER_AIO

class TorrentProject:
def __init__(self):
self.BASE_URL = "https://torrentproject2.com"
self.BASE_URL = TORRENTPROJECT
self.LIMIT = None

@decorator_asyncio_fix
@@ -18,9 +19,7 @@ async def _individual_scrap(self, session, url, obj, sem):
try:
async with session.get(
url,
headers={
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36",
},
headers=HEADER_AIO,
) as res:
html = await res.text(encoding="ISO-8859-1")
soup = BeautifulSoup(html, "html.parser")
