Route Crawler is a CLI application built with Node.js that scans a given website and lists every route it finds. It uses axios to make HTTP requests and jsdom to parse the returned HTML, extracting and listing all hyperlinks present on a webpage.
- Scans a given website for all routes.
- Handles dynamic routes and JavaScript-driven navigation.
- Outputs the total number of routes found.
- Provides a list of all discovered routes.
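The fetch-and-extract flow described above can be sketched roughly as follows. This is a simplified illustration, not the project's actual implementation: the helper names (`toRoute`, `collectRoutes`) are invented for this example, and it keeps only same-origin links.

```javascript
// Sketch of one crawl step: fetch a page with axios, parse it with jsdom,
// and collect the unique same-origin routes it links to.

// Resolve a raw href against the page URL and keep only same-origin routes.
// Pure helper built on Node's global URL class; returns the pathname, or
// null for external or unparsable links.
function toRoute(href, pageUrl) {
  try {
    const resolved = new URL(href, pageUrl);
    return resolved.origin === new URL(pageUrl).origin
      ? resolved.pathname
      : null;
  } catch {
    return null;
  }
}

// Fetch one page and return the unique routes found in its <a href> tags.
// axios and jsdom are required lazily so the pure helper above works even
// without the dependencies installed.
async function collectRoutes(url) {
  const axios = require("axios");
  const { JSDOM } = require("jsdom");

  const { data: html } = await axios.get(url);
  const dom = new JSDOM(html);
  const routes = new Set();
  for (const a of dom.window.document.querySelectorAll("a[href]")) {
    const route = toRoute(a.getAttribute("href"), url);
    if (route !== null) routes.add(route);
  }
  return [...routes];
}
```

A full crawler would feed each discovered route back into `collectRoutes`, tracking a visited set, until no new routes appear.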
- Clone the repository:

  ```bash
  git clone https://github.com/RizkyZaki/route-crawler.git
  cd route-crawler
  ```
- Install the required dependencies:

  ```bash
  npm i
  ```
- Run the main script:

  ```bash
  npm start crawl
  ```
Contributions and suggestions are highly appreciated. If you would like to contribute to this project, please open an issue or submit a pull request.
Distributed under the MIT License. For more information, see LICENSE.
© 2024 zch. Built with love for the developer community.