🔱 ParamSpider

Mining URLs from Dark Corners of Web Archives for Bug Hunting / Fuzzing / Further Probing

📖 About 🏗️ Installation 🛠 Usage 🚀 Examples 🤝 Contributing 🥷🏻 Social 💀 Fun


ParamSpider lets you fetch URLs related to any domain, or a list of domains, from the Wayback archives. It filters out "boring" URLs, letting you focus on the ones that matter most.


📖 About

  • Finds parameters from the web archives of the entered domain.
  • Finds parameters from subdomains as well.
  • Supports excluding URLs with specific extensions.
  • Saves the output in a clean, organized manner.
  • Mines the parameters from web archives (without interacting with the target host).
  • New features added:
  • Scanning subdomains of a target domain or a list of target domains.
  • Saving combined output for a domain list, with per-domain URLs as well as combined URLs.
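The core idea behind the features above, dropping "boring" static-asset URLs and normalizing every parameter value to a placeholder, can be sketched in a few lines of Python. This is a simplified illustration, not ParamSpider's actual implementation; the function name and the extension list are assumptions:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Extensions typically treated as "boring" for parameter mining (assumed list).
BORING_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".css", ".svg", ".woff", ".ico"}

def clean_urls(urls, placeholder="FUZZ"):
    """Keep only parameterized URLs with interesting extensions,
    replace every query value with a placeholder, and de-duplicate."""
    seen, results = set(), []
    for url in urls:
        parsed = urlparse(url)
        if any(parsed.path.lower().endswith(ext) for ext in BORING_EXTENSIONS):
            continue  # static asset, skip
        params = parse_qsl(parsed.query, keep_blank_values=True)
        if not params:
            continue  # no parameters to fuzz
        query = urlencode([(key, placeholder) for key, _ in params])
        cleaned = urlunparse(parsed._replace(query=query))
        if cleaned not in seen:  # two URLs differing only in values collapse to one
            seen.add(cleaned)
            results.append(cleaned)
    return results

urls = [
    "https://example.com/search?q=test&page=2",
    "https://example.com/logo.png?v=3",           # filtered: boring extension
    "https://example.com/search?q=other&page=9",  # duplicate after cleaning
]
print(clean_urls(urls))
# ['https://example.com/search?q=FUZZ&page=FUZZ']
```

Normalizing values to a placeholder is what makes de-duplication effective: archives often hold thousands of copies of the same endpoint that differ only in parameter values.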

📥 Installation

To install paramspider, follow these steps:

git clone https://github.com/PushkraJ99/ParamSpider
cd ParamSpider
pip install .
paramspider -h

Or try this one-liner, handy for VPS users (regular Linux users can use it too):

pip install git+https://github.com/PushkraJ99/ParamSpider

If, after installation, you can't find where paramspider was installed, try:

whereis paramspider

If you are using Kali Linux and get a "paramspider not found" error, try this command:

sudo cp paramspider-location /usr/local/bin/

💀 Usage

To use paramspider, run:

paramspider -d domain.com

🚀 Examples

Here are a few examples of how to use paramspider:

  • Discover URLs for a single domain
   paramspider -d domain.com
  • Discover URLs for a single domain, including subdomains
   paramspider -d domain.com --subs


  • Save output URLs for a single domain to a file
  paramspider -d domain.com --subs -o fuzz.txt


  • Discover URLs for multiple domains from a file
  paramspider -l list.txt
  • Discover URLs for multiple domains, including subdomains, from a file
  paramspider -l list.txt --subs


  • Stream URLs to the terminal
  paramspider -d domain.com -s
  • Route requests through a web proxy
  paramspider -d domain.com --proxy '127.0.0.1:7890'
  • Add a custom placeholder for URL parameter values (default: "FUZZ")
  paramspider -d domain.com -p '"><h1>reflection</h1>'
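The placeholder is what makes the output easy to feed into other tools: every occurrence can be swapped for a payload before sending requests. A minimal sketch of that substitution, assuming a paramspider-style URL with FUZZ placeholders (the payload and URL here are illustrative):

```python
from urllib.parse import quote

# Hypothetical payload and a URL as paramspider would emit it.
payload = '"><h1>reflection</h1>'
url = "https://example.com/search?q=FUZZ&page=FUZZ"

# URL-encode the payload, then substitute it for every placeholder.
fuzzed = url.replace("FUZZ", quote(payload, safe=""))
print(fuzzed)
# https://example.com/search?q=%22%3E%3Ch1%3Ereflection%3C%2Fh1%3E&page=%22%3E%3Ch1%3Ereflection%3C%2Fh1%3E
```

Dedicated fuzzers such as ffuf do this substitution themselves, which is why "FUZZ" is a convenient default keyword.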

🤝 Contributing

Contributions are welcome! If you'd like to contribute to paramspider, please follow these steps:

  1. Fork the repository.
  2. Create a new branch.
  3. Make your changes and commit them.
  4. Submit a pull request.

🫣 Author

Github


🥷🏻 UPGRADED BY :)

Github LinkedIn Twitter Medium Instagram


🤗 JUST 4 FUN

✨ Stargazers

Stargazers repo roster for @PushkraJ99/ParamSpider

☢️ Forkers

Forkers repo roster for @PushkraJ99/ParamSpider


Visitor Count

