- Added DNS randomization flag
- Added No-Wait option (no delay between two requests)
- Switched the Internet connectivity check from ICMP ping to a TCP handshake
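
A minimal sketch of what a TCP-handshake connectivity check can look like in bash, using the shell's `/dev/tcp` support; the target host and port are illustrative, not necessarily what partyloud uses:

```bash
checkConnection() {
    # Succeeds only if the TCP three-way handshake completes,
    # which also works on networks that block ICMP ping
    timeout 2 bash -c '>/dev/tcp/www.google.com/443' 2>/dev/null
}

checkConnection || { echo "Unable to connect to network"; exit 1; }
```
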
- Moved all functions into the ./src directory
- Minor fixes
- Added HTTP/HTTPS proxy option (Thx to acbekoac)
- Added flags to specify separate URL-list and blocklist files
- Added disclaimer as suggested by Samuel First
- Minor code fixes (Thx to Samuel First)
- Minor UI improvements
- HTTP requests are now atomic
- You can now stop partyloud via CTRL-C
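
A minimal sketch of how CTRL-C handling is typically wired up in bash, via a SIGINT trap; the cleanup steps are illustrative assumptions:

```bash
cleanup() {
    echo "Stopping partyloud..."
    for pid in $(jobs -p); do   # every engine started in the background
        kill "$pid" 2>/dev/null
    done
    exit 0
}
trap cleanup INT                # run cleanup when the user presses CTRL-C
```
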
- On startup a software check is executed; if something is missing, the user is notified (see the sketch below)
- On startup a connection check is executed; partyloud will exit if it is unable to connect to the network
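
A sketch of how the startup software check could be done in bash; the list of required tools is an assumption:

```bash
for cmd in curl grep shuf; do   # hypothetical dependency list
    if ! command -v "$cmd" >/dev/null 2>&1; then
        echo "Error: required tool '$cmd' is missing"
        exit 1
    fi
done
```
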
- Added compression option to cURL requests
- Added max-time (aka timeout) option to cURL requests
- Changed header options in cURL requests to filter out everything but text (see the sketch below)
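
An illustrative request combining the three cURL options above; the URL, timeout value, and header contents are assumptions, not partyloud's exact invocation:

```bash
curl --silent \
     --compressed \
     --max-time 10 \
     --header "Accept: text/html" \
     "https://example.com"
```
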
- Changes in UI:
- Everything fits in 80 cols
- Every request made will be displayed
- Changed error recovery mechanism
- Changed HTML parsing mechanism
- Changed badwords wordlist
- Changed URL list
- Changed URL list location (partyloud.conf)
- Removed process number option
- Re-added user-defined number of threads
- upper bound = 24
- lower bound = 1
- UserAgent is now generated by the generateUserAgent function (see the sketch after the lists below):
- OS List
- Windows 10
- Windows 8.1
- Windows 8
- Windows 7
- MacOS Mojave
- MacOS High Sierra
- MacOS Sierra
- MacOS El Capitan
- Linux (generic)
- Browser List
- Mozilla Firefox 50 - 66
- Google Chrome 56 - 73
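
A hypothetical sketch of what generateUserAgent could look like in bash; the OS token strings are assumptions, while the browser version ranges come from the lists above:

```bash
generateUserAgent() {
    local os=(
        "Windows NT 10.0; Win64; x64"      # Windows 10
        "Windows NT 6.1; Win64; x64"       # Windows 7
        "Macintosh; Intel Mac OS X 10_14"  # MacOS Mojave
        "X11; Linux x86_64"                # Linux (generic)
    )
    local system="${os[RANDOM % ${#os[@]}]}"
    if (( RANDOM % 2 )); then
        local v=$(( RANDOM % 17 + 50 ))    # Firefox 50 - 66
        echo "Mozilla/5.0 ($system; rv:${v}.0) Gecko/20100101 Firefox/${v}.0"
    else
        local v=$(( RANDOM % 18 + 56 ))    # Chrome 56 - 73
        echo "Mozilla/5.0 ($system) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/${v}.0.0.0 Safari/537.36"
    fi
}
```
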
- Added Help screen
- Fixed a bug that caused a division by zero during Engine execution
- Fixed a bug that caused a file named "1" to be generated
- Minor changes in UI
- Each partyloud engine now waits a pseudo-random amount of time before making a new request, to avoid triggering anti-DDoS mechanisms (Thx to Ale Sala)
- The wait time is calculated with the formula: wait time = guessed word count * reading speed [seconds/word]
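
A sketch of the formula above in bash; the 0.3 s/word reading speed (roughly 200 words per minute) and the wc -w word guess are illustrative assumptions:

```bash
guessedWaitTime() {
    local words
    words=$(echo "$1" | wc -w)             # guess the word count of the page text
    awk -v w="$words" 'BEGIN { printf "%.1f", w * 0.3 }'
}

sleep "$(guessedWaitTime "$pageText")"     # pause before the next request
```
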
- Added User Agents to cURL requests in order to improve traffic randomness
- Changed error recovery mechanism (if an HTTP request fails, a backup URL is now used)
- Fixed a bash 3.2 bug in the URL selection mechanism
- Fixed a bug related to wc -l
- Minor changes in UI
- The internal Engine is now complete and operational
- cURL is now used to generate pseudo-random requests
- HTML response is now parsed using grep
- Bad URLs are now filtered using a wordlist mechanism (the wordlist is located in a file named badwords; see the sketch below)
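
An illustrative version of the pipeline described above: fetch a page with cURL, extract links with grep, and drop any URL matching an entry in the badwords file; the URL and the link regex are assumptions:

```bash
curl --silent "https://example.com" \
    | grep -Eo 'https?://[A-Za-z0-9./?=_-]+' \
    | grep -v -f badwords \
    | sort -u
```
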
- The number of sub-processes is now fixed at 7
- noisy.py and Python are no longer required to run the script
- Disabled user-defined number of processes
- Started migration from noisy.py to the internal Engine
- Major UI improvements
- Initial Alpha
- Added a while loop to start a user-defined number of noisy.py processes
- Added a minimal UI