Recon Methodology

Little Intro

I'm Quinten Van Ingh, an application security specialist, and in my spare time I love to hunt for bugs. I only just started with bug bounty (4 weeks ago) on HackerOne and, like most of you, I want to share my resources and other findings. I consider myself to be in the beginner phase of the bug bounty sector, but I try to learn every day. So if you have any suggestions, advice, tips, tricks, or tools, let me know!

This document is based on my own research, but mostly on Jhaddix's talk The Bug Hunter's Methodology v3.

Why am I sharing this? Everything I've learned comes from people like Jhaddix. They share all the knowledge they have to give other hackers the opportunity to grow.

Stay posted, because I'll keep updating this page!

Sub-domain enumeration

We can split this up in two different categories:

  • Vertical sub-domain enumeration
  • Horizontal sub-domain enumeration

Vertical sub-domain enumeration means finding sub-domains of the main domain itself, for example: www.google.com, dev.google.com, maps.google.com.

Horizontal sub-domain enumeration means finding other domains which are also used by the organization behind the main domain. For example: snapchat.com, snap.com, spectacles.com.

Vertical sub-domain enumeration

The tools that can be used to perform vertical sub-domain enumeration can also be split into two categories.

  1. Sub-domain brute-forcing
  2. Looking for sub-domains via logging, search engines, ...

Sub-domain bruteforcing tools

Tools to use for these:

gobuster -m dns -u $TARGET.com -t 100 -w all.txt
./subbrute.py /root/work/bin/all.txt $TARGET.com | ./bin/massdns -r resolvers.txt -t A -a -o -w massdns_output.txt
./scripts/ct.py example.com | ./bin/massdns -r lists/resolvers.txt -t A -o S -w results.txt

The all.txt file is a collection of all the different wordlists used by the various sub-domain brute-forcing tools. You can find it here: Jason Haddix's subdomain compilation.

On the massdns GitHub page you can see that subbrute.py and ct.py are already included in the massdns project itself. The default subbrute.py list (names.txt) is a good one, but it is also included in all.txt.
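
If you don't have massdns set up yet, getting it running is roughly the following (a sketch; check the project's own README for the current build instructions):

git clone https://github.com/blechschmidt/massdns.git
cd massdns
make
# subbrute.py and ct.py live in scripts/, wordlists and resolvers in lists/, the binary ends up in bin/
ls scripts/ lists/ bin/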

To avoid typing all these commands over and over again every time you have a new project, make a simple bash script like I did. For example:

subbrute.sh

#!/bin/bash
./massdns/scripts/subbrute.py ./massdns/lists/names.txt $1 | ./massdns/bin/massdns -r ./massdns/lists/resolvers.txt -t A -o S -w results/$1.sub.txt

subbrute-big.sh (note the use of all.txt)

#!/bin/bash
./massdns/scripts/subbrute.py ./all.txt $1 | ./massdns/bin/massdns -r ./massdns/lists/resolvers.txt -t A -o S -w results/$1.sub.txt

ct.sh

#!/bin/bash
./massdns/scripts/ct.py $1 | ./massdns/bin/massdns -r ./massdns/lists/resolvers.txt -t A -o S -w results/$1.ct.txt

Another possibility is to add these functions/commands to your .bash_profile.
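
A minimal sketch of what that could look like (the ~/tools/massdns paths are assumptions; adjust them to wherever you installed massdns):

# In ~/.bash_profile: wrap the subbrute + massdns pipeline in a shell function.
subbrute() {
    ~/tools/massdns/scripts/subbrute.py ~/tools/massdns/lists/names.txt "$1" \
        | ~/tools/massdns/bin/massdns -r ~/tools/massdns/lists/resolvers.txt -t A -o S -w "results/$1.sub.txt"
}

After reloading your profile (source ~/.bash_profile) you can run subbrute google.com from anywhere.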

The $1 is where the domain goes. So, for example, to execute subbrute.sh:

./subbrute.sh google.com

This will write all the discovered sub-domains to the results directory as google.com.sub.txt.

By default, massdns will provide all the A records and CNAME records it finds for the sub-domains in the results (which is awesome). More on this later.
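
As a quick illustration (the file name follows the subbrute.sh example above), you can split those records out of the simple-text output like this:

# massdns -o S output looks like: "www.google.com. A <ip-address>"
# Keep the A records (hostname -> IP) and the CNAME records in separate files.
awk '$2 == "A"'     results/google.com.sub.txt > results/google.com.a.txt
awk '$2 == "CNAME"' results/google.com.sub.txt > results/google.com.cname.txt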

Note: there are many more tools which have the option to perform brute-force attacks against a certain domain.

Looking for sub-domains via logging, search engines, ...

Tools which can be used:

Next to tools, there are also sites which can be used to find some sub-domains.

Now, I know that most of the sites above are also integrated into the tools. Just provide your API keys and play with them.

Horizontal sub-domain enumeration

Look for reserved IP blocks and do reverse lookups on those IPs. This way you can find other domains used by an organization, as well as sub-domains. Below you'll find some useful links:

Note that this can easily be scripted, for example https://reverse.report/search?q=test.com, where you replace "test.com" with a parameter you provide to your script. Also check whether the sites above have an API available, so you can easily look up domains and automate this.
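
A minimal sketch of such a script (it just saves the raw response; how you parse it depends on the site's output format):

#!/bin/bash
# Usage: ./reverse.sh test.com
# Fetches the reverse.report results for the given domain and stores them for later review.
mkdir -p results
curl -s "https://reverse.report/search?q=$1" -o "results/$1.reverse.html"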

Acquisitions

Next to vertical and horizontal subdomain enumeration, Jhaddix also mentioned "acquisitions" in his talk:

Enter the company or a person into the search bar at the top of the page and look at the acquisitions.

Linked Discovery

For this you'll need BurpSuite. I highly recommend watching how this works on Jhaddix's Twitter account. I'll try to explain the steps below:

  • Turn off passive scanning (Scanner tab --> turn off passive scanning)
  • Set forms to auto-submit (Spider tab --> Options --> Application login: handle as ordinary forms). BE CAREFUL WITH EMAIL FORMS!!!
  • Set the scope to advanced control and use a string from the target name (Target tab --> enable "Use advanced scope control" --> match on a keyword, not a normal FQDN, e.g. "tesla" in the host field)
  • Show only in-scope items.
  • Walk and browse through the website.
  • Then spider all hosts recursively.

With this you'll also find some extra sub-domains.

CSP-Header

Always go through the CSP header. In it you can also find domains/sub-domains.
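
For example, something like this pulls the header and lists the host names it references (www.example.com is a placeholder; some sites only send the header on a full GET, so swap -I for -s -D - -o /dev/null if needed):

# Grab the Content-Security-Policy header and extract the domain-like entries from it.
curl -s -I https://www.example.com \
    | grep -i '^content-security-policy' \
    | grep -oE '[a-zA-Z0-9*][a-zA-Z0-9.*-]*\.[a-z]{2,}' \
    | sort -u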

What now?

So now you have a list of sub-domains and, from some tools, the IPs as well. From here we can perform several steps.

Portscanner

Gather the IP addresses from all the (sub-)domains and throw them into masscan or any other port scanner.

I prefer using Masscan because it's really fast. The command provided by Jhaddix:

masscan -p1-65535 -iL $TARGET_LIST --max-rate 100000 -oG $TARGET_OUTPUT

To gather the IPs of the (sub-)domains, there is a script available in Jhaddix's talk. I'll add it later.
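
In the meantime, a minimal sketch (not the script from the talk) that resolves a plain list of domains into a de-duplicated IP list for masscan:

#!/bin/bash
# Usage: ./resolve.sh domains.txt
# Resolves every domain in the input file and writes the unique IPv4 addresses to target_ips.txt.
while read -r domain; do
    dig +short A "$domain"
done < "$1" | grep -E '^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$' | sort -u > target_ips.txt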

Look through the ports and see if there are old versions or services which should not be open to the public. To perform a brute-force attack against those services, Jhaddix recommended a nice tool: brutespray.

Note: the output file of masscan needs to be in .gnmap or .xml format (nmap -oG).

Example of brutespray:

 python brutespray.py --file nmap.gnmap -U /usr/share/wordlist/user.txt -P /usr/share/wordlist/pass.txt --threads 5 --hosts 5

Screenshots

Get all the domains into one file and make sure every domain is listed only once (the sort/uniq commands in Linux), as shown below. Then provide the list to a tool which will take a screenshot of every domain, for example EyeWitness or Aquatone.
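
For example, assuming the results/ layout from the scripts earlier:

# Merge the per-tool result files, strip the trailing dot massdns adds, and keep each domain only once.
cat results/*.sub.txt results/*.ct.txt | awk '{sub(/\.$/, "", $1); print $1}' | sort -u > all_domains.txt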

An example of EyeWitness usage:

sudo ./EyeWitness.py -f test.txt  --headless --prepend-https

The --prepend-https option makes sure it takes a screenshot of both port 80 (HTTP) and port 443 (HTTPS).

When you've used aquatone-discover to gather sub-domains, you can use aquatone-gather to take screenshots of those sub-domains. Make sure you run aquatone-scan first.

401/403 response

After all the screenshots are made, go through them and search for the ones which gave you a 401/403 response. Copy these into a separate list. Then use the Wayback Machine and check whether you can find some directories. Maybe the organization forgot to put the right permissions on certain directories or files. Tools you can use for this:
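
You can also query the Wayback Machine's CDX API directly; a sketch (sub.example.com is a placeholder):

# List archived URLs for the sub-domain and look for interesting directories behind the 401/403.
curl -s "http://web.archive.org/cdx/search/cdx?url=sub.example.com/*&fl=original&collapse=urlkey" \
    | sort -u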

Another tip is to run a directory brute-forcing tool against them. There are several tools for this:
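
As a sketch, using the same (older) gobuster syntax as the DNS example earlier (the URL and the wordlist path are placeholders):

gobuster -m dir -u https://sub.example.com -w /usr/share/wordlists/dirb/common.txt -t 50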

There are many wordlists you can use here:

Identify

So you got your screenshots from all the sub-domains but don't know where to start? Look for the ones which are custom made (ASP.NET,...). Tools which can help you to identify these are:

Keep these domains in a list and start looking at them when your whole recon phase is done.

Linkfinder

Another tip provided by Jhaddix and several other infosec people: always check the JavaScript code. You may well find new endpoints, hardcoded credentials, a hardcoded JWT signing key, ... Especially when they make use of a JavaScript-based CMS.

Tools to make your life easier (e.g. LinkFinder, JSParser):

Do you need to manually feed every JavaScript file into these tools? Not when using BurpSuite:

  • In BurpSuite, go to the Target tab.
  • Right click on the sub-domain you want to analyze the JS from.
  • Click on Engagement tools.
  • Select Find scripts.
  • CTRL + A to select all the scripts Burp has found.
  • Right click and select "Copy selected URLs".
  • Paste them into a file and run a command like uniq to remove the duplicates.
  • Paste all the URLs into LinkFinder/JSParser.
  • Enjoy ;)

Via these tools, you can easily find new endpoints.
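
For example, running LinkFinder over the copied script URLs could look like this (the file name and URL are placeholders; check LinkFinder's README for the exact options of your version):

# Analyze a single JavaScript file and print the discovered endpoints to the terminal.
python linkfinder.py -i https://sub.example.com/static/app.js -o cli

# Or loop over the whole list of script URLs copied from Burp (one URL per line).
while read -r url; do
    python linkfinder.py -i "$url" -o cli
done < script_urls.txt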

Subdomain takeover

When you have a master list of all the sub-domains, you can start looking for sub-domain takeovers. You can feed your list of sub-domains to a tool like SubOver. A highly recommended resource is https://github.com/EdOverflow/can-i-take-over-xyz. There you'll find the default error messages from several services which can indicate a sub-domain takeover.

When you are using aquatone, you can use aquatone-takeover.
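
A quick manual check for a single candidate (shop.example.com is a placeholder) is to look at the CNAME and compare the response against the fingerprints listed on can-i-take-over-xyz:

# Where does the sub-domain point, and what does the error page say?
dig +short CNAME shop.example.com
curl -sk https://shop.example.com | head -n 20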

AWS - Buckets

To find buckets of an organization:

Another tip: just because a bucket is not publicly readable doesn't mean the permissions to write or delete files in the bucket are correctly configured. Always test writing into the bucket.
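
With the AWS CLI that test could look roughly like this (target-bucket is a placeholder; only do this on buckets that are in scope, and clean up after yourself):

# Try listing, then writing and removing a harmless proof-of-concept file, unauthenticated.
aws s3 ls s3://target-bucket --no-sign-request
echo "poc $(date)" > poc.txt
aws s3 cp poc.txt s3://target-bucket/poc.txt --no-sign-request
aws s3 rm s3://target-bucket/poc.txt --no-sign-request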

GitHub

GitHub is great for finding things (credentials, keys, endpoints, services, APKs/IPAs, ...) belonging to an organization. You can find these by using so-called GitHub dorks. A highly recommended resource/tool:

I also highly suggest checking the commit history of a repository.

To be honest, I still need to collect more resources and tools for this.

WAF

To check which WAF is used on a certain sub-domain, I make use of wafw00f.
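
Usage is as simple as (the URL is a placeholder):

wafw00f https://sub.example.com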

To bypass a WAF, you'll need the origin IP of the web server. This can be obtained in several ways:

A good friend/colleague of mine wrote a tool for this which I can only recommend:

Another technique that can be used to obtain the original IP address:

When the website has a "subscribe" functionality, or a built-in feature which sends you an email, check the mail headers and look for the IP.

Once you've got several IPs, you can test them with a simple curl command:

curl --silent --fail -H "Host: www.test.com" http://$IP_YOU_HAVE_FOUND

When the HTML returned by the curl command is the same as the one you see on www.test.com, you've got a WAF bypass.

You can also just use the IP as a URL, or change your hosts file (or the hostname resolution settings in Burp).

Thank you

First of all, I want to thank bug bounty platforms like Bugcrowd and HackerOne for organizing events and giving people like me the opportunity to learn and enter the bug bounty community. Things like LevelUp 0x02 by Bugcrowd really help people like me.

Secondly, I want to thank all the security researchers who share their knowledge and the tools they have written. Also thanks to all the speakers at the Bugcrowd LevelUp 0x02 event.
