
Grawler is a tool written in PHP that automates the use of Google dorks. It comes with a web interface that runs the dorks, scrapes the results, and stores them in a file. Version 1.0 is more powerful than ever, with support for multiple proxies. ( Read more in Features )
More: https://github.com/A3h1nt/Grawler
General info
Grawler aims to automate the task of using Google dorks through a web interface. The main idea is to provide a simple yet powerful tool that anyone can use; what sets Grawler apart in its category is its feature set.
Features
- The biggest issue faced by tools that automate Google dorks is CAPTCHA. With Grawler, CAPTCHA is no longer a problem: it ships with a proxy feature that supports three different proxy services.
- Supported proxies ( each of the mentioned proxy services requires you to sign up for an API key, with no credit card information, and gives you around one thousand free API calls )
- Grawler now supports two different modes.
- Automatic Mode : Automatic mode now comes with many different dork files and supports multiple proxies to deliver a smooth experience.
- Manual Mode : Manual mode has become more powerful with the Depth feature, which lets you select the number of result pages to scrape. The proxy feature is also supported in manual mode.
- Dorks are now grouped into the following categories:
- Error Messages
- Extension
- Java
- JavaScript
- Login Panels
- .Net
- PHP
- SQL Injection (7 different files with different dorks)
- A My_dorks file where users can add their own dorks.
- API keys for proxies are validated before being added to the file.
- Manual mode allows users to go up to depth 4, but I'd recommend using depth 2 or 3 because the best results are usually on the initial pages.
- Grawler comes with its own guide to learning Google dorks.
- The results are stored in a file ( the filename must be specified with a .txt extension ).
- URL scraping is better than ever, with no garbage URLs at all.
- Grawler supports three different search engines (Bing, Google, Yahoo), so if one blocks you, another is available.
- Multiple proxies combined with multiple search engines deliver the best experience yet.
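The Depth feature described above boils down to plain result-page pagination. As a minimal illustration (this is not Grawler's actual code), Google's web results advance ten at a time via the `start` query parameter, so a depth of n corresponds to n page URLs:

```python
# Minimal sketch of how a "depth" setting maps to search-result pages.
# This is an illustration, not Grawler's implementation; Google pages
# its web results ten at a time via the "start" query parameter.
from urllib.parse import quote_plus

def result_page_urls(query: str, depth: int) -> list[str]:
    base = "https://www.google.com/search"
    q = quote_plus(query)  # URL-encode the dork, e.g. ':' becomes '%3A'
    return [f"{base}?q={q}&start={page * 10}" for page in range(depth)]

# Depth 2 = scrape the first two result pages:
pages = result_page_urls("inurl:login", 2)
```

Scraping only the first two or three pages is also why the recommended depth below stays low: relevance drops off quickly after the initial pages.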
Setup
- Download the ZIP file
- Download and install the XAMPP server
- Move the Grawler folder into the htdocs folder in XAMPP
- Navigate to http://localhost/grawler
- Results will be stored in the same directory
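The steps above can be sketched as shell commands. The XAMPP path below is the typical Linux default, not taken from the project docs, and the download and service commands are left as comments since they require network access and root:

```shell
# Assumed default XAMPP location on Linux; adjust for your install.
XAMPP_HOME="${XAMPP_HOME:-/opt/lampp}"

# 1. Download the ZIP from https://github.com/A3h1nt/Grawler and unzip it.
# 2. Move the unpacked folder into XAMPP's htdocs so Apache serves it:
#      mv grawler "$XAMPP_HOME/htdocs/grawler"
# 3. Start the XAMPP stack, then open the web interface:
#      sudo "$XAMPP_HOME/lampp" start
echo "Browse to http://localhost/grawler; results are written to $XAMPP_HOME/htdocs/grawler"
```

On Windows the htdocs folder usually lives under the XAMPP install directory (e.g. C:\xampp\htdocs); the same move-then-browse flow applies.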
Demo
Contribute
- Report Bugs
- Add more effective Google dorks (ones that actually work)
- Work on portability
- Suggestions
Contact Tool Developer:
Questions? Contact the tool developer on Twitter: A3h1nt
Problems? Visit the tool's main page for help: https://github.com/A3h1nt/Grawler
Author

- Hakin9 is a monthly magazine dedicated to hacking and cybersecurity. In every edition, we focus on different approaches to showcase various techniques, both defensive and offensive. This knowledge will help you understand how the most popular attacks are performed and how to protect your data from them. Our tutorials, case studies, and online courses will prepare you for upcoming threats in the cybersecurity world. We collaborate with many individuals, universities, and public institutions, as well as with companies such as Xento Systems, CATO Networks, EY, CIPHER Intelligence LAB, redBorder, TSG, and others.