Hello, I'm looking for some software like a web scraper / parser that parses results from search engines, and a spider / crawler to extract all HTTP links from domains / sites. Which ones do you recommend? Thanks
Google - Gscraper (works fine without a proxy for simple queries not including syntax like "site:/inurl:?etc"; you can get about 200k-400k results from one IP). With proxies it also works fine with syntax.
A-parser - a better way to parse all search engines and more (https://a-parser.com/wiki/special/pages), $199 for a lifetime Pro license.
Alternative - Scrapebox. https://transfer.sh/aUUPJ/Gscraper_Patched_Fully_WORKING.exe
A good solution if you're looking for a security-audit tool - http://www.spiderfoot.net/
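If you just need the "extract all HTTP links from a site" part and don't want a full tool, it's a few lines of stdlib Python. A minimal sketch (the `example.com` URL and the sample HTML are placeholders; a real crawler would fetch pages with `urllib.request` or `requests` and loop over the collected links):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute http(s) links from <a href="..."> tags."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative hrefs against the page URL
                    url = urljoin(self.base_url, value)
                    if url.startswith(("http://", "https://")):
                        self.links.append(url)

# Sample page; in practice this would be the fetched response body
html = '<a href="/about">About</a> <a href="https://example.com/x">X</a>'
parser = LinkExtractor("https://example.com")
parser.feed(html)
print(parser.links)
# → ['https://example.com/about', 'https://example.com/x']
```

To turn this into a spider, keep a `set` of visited URLs and only follow links whose host matches the start domain.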
The tool github.com/1N3/BlackWidow crawls POST params, but it's not very effective. Does anyone know a similar but better alternative? Thanks
What do you mean by "not effective"? For fuzzing POST parameters, use tools built for fuzzing/SQLi/etc.: Burp / Zenproxy / Sqlmap. Your question is about crawling, and it would help if you said what result you actually want. Otherwise you could contribute to BlackWidow and add a couple of lines of code: https://github.com/1N3/BlackWidow/blob/4b19bdcc2f642bcfdef4f9a15eaa125f55e91c9d/blackwidow#L57 Another option, if you're looking for a tool that crawls URLs/parameters better for later auditing in other tools, is http://htcap.org/
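For context on what "crawling POST params" amounts to: the crawler has to parse each page's `<form>` elements and record the method, action, and input names, which you then hand to a fuzzer like Sqlmap or Burp. A minimal stdlib sketch of that extraction step (the sample login form is made up for illustration):

```python
from html.parser import HTMLParser

class FormExtractor(HTMLParser):
    """Collect (method, action, parameter names) for each <form> on a page."""

    def __init__(self):
        super().__init__()
        self.forms = []
        self._current = None  # form being parsed, if any

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form":
            self._current = {
                "method": (a.get("method") or "get").lower(),
                "action": a.get("action", ""),
                "params": [],
            }
        elif self._current is not None and tag in ("input", "textarea", "select"):
            # Named fields become the POST/GET parameters
            if a.get("name"):
                self._current["params"].append(a["name"])

    def handle_endtag(self, tag):
        if tag == "form" and self._current is not None:
            self.forms.append(self._current)
            self._current = None

# Hypothetical page with one POST form
html = '''<form method="POST" action="/login">
<input name="user"><input name="pass" type="password">
</form>'''
fe = FormExtractor()
fe.feed(html)
print(fe.forms)
# → [{'method': 'post', 'action': '/login', 'params': ['user', 'pass']}]
```

This skips JavaScript-generated forms, which is exactly where static crawlers fall short; htcap uses a headless browser to cover those.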