Booter - Bots & Crawlers Manager is both a preventative measure against, and a treatment for, damage caused by crawlers and bots.
The plugin builds on a number of existing mechanisms that crawlers and bots already recognize, and takes them one step further, working intelligently and almost completely automatically.
For the plugin to function correctly, you must follow the instructions and enter some data manually (a step left to a human being to avoid errors).
At the prevention level
- Booter lets you create and manage an advanced, dynamic robots.txt file (see the example after this list).
- View a 404 error log to spot the most common bad links.
- Block bad bots that cause high server load through very frequent page crawls, or that are used to probe for security vulnerabilities (a sketch of this kind of filtering follows the list).
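For illustration, the rules in a dynamic robots.txt typically combine per-bot directives like the ones below. The bot names, paths, and sitemap URL are examples only, not the plugin's output:

```
# Default rules for all crawlers.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Slow down an aggressive but legitimate crawler.
User-agent: SemrushBot
Crawl-delay: 10

# Shut a misbehaving bot out entirely.
User-agent: MJ12bot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```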
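User-agent blocking itself usually amounts to matching the request's User-Agent header against a blocklist and refusing the request before any expensive work is done. A minimal Python sketch of the idea, with an illustrative blocklist (this is not the plugin's actual code):

```python
import re

# Illustrative patterns; a real blocklist would be maintained separately.
BAD_BOT_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"MJ12bot", r"AhrefsBot", r"sqlmap")
]

def is_bad_bot(user_agent: str) -> bool:
    """Return True when the User-Agent matches a known bad bot."""
    return any(p.search(user_agent or "") for p in BAD_BOT_PATTERNS)

# A server would consult this before doing any real work, e.g.:
#   if is_bad_bot(request.headers.get("User-Agent", "")):
#       respond with HTTP 403 and stop processing the request
```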
At the treatment level
- Booter lets you limit the number of requests from crawlers and bots; when a client exceeds the specified number of requests per minute, its requests are rejected for a specified period of time (see the rate-limit sketch after this list).
- Reject unwanted links in the fastest way possible: not by merely blocking them, but by returning the HTTP status code that makes search engines forget them (see the second sketch after this list).
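The rate limit described above can be pictured as a per-client counter over a one-minute window: once the counter passes the threshold, requests are rejected (commonly with 429 Too Many Requests) until a cool-down expires. A simplified Python sketch under those assumptions, not the plugin's implementation:

```python
import time
from collections import defaultdict

LIMIT_PER_MINUTE = 60   # example threshold
BLOCK_SECONDS = 600     # example rejection period

window_start = defaultdict(float)   # start of each client's current window
request_count = defaultdict(int)    # requests seen in that window
blocked_until = defaultdict(float)  # cool-down expiry per client

def allow_request(client_ip: str) -> bool:
    """Fixed-window limiter: False means reject (e.g. with HTTP 429)."""
    now = time.time()
    if now < blocked_until[client_ip]:
        return False                   # still inside the rejection period
    if now - window_start[client_ip] >= 60:
        window_start[client_ip] = now  # open a fresh one-minute window
        request_count[client_ip] = 0
    request_count[client_ip] += 1
    if request_count[client_ip] > LIMIT_PER_MINUTE:
        blocked_until[client_ip] = now + BLOCK_SECONDS
        return False
    return True
```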
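The status code matters because 403 or 404 leaves search engines retrying a URL for a long time, while 410 Gone declares it permanently removed, so crawlers drop it from the index much sooner. A sketch of matching request paths against rejected patterns (the prefixes are illustrative):

```python
REJECTED_PREFIXES = ("/old-shop/", "/outdated-forum/")  # examples only

def rejection_status(path: str):
    """Return 410 for a rejected path, or None to serve it normally."""
    if any(path.startswith(prefix) for prefix in REJECTED_PREFIXES):
        return 410  # "Gone": tells search engines to forget the URL
    return None
```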
Instructions for use when treating existing damage
- Activate the plugin.
- Enable the 404 error log option.
- Set the access rate limit.
- Watch the 404 log and try to find the common URL parts that repeat most often (a small sketch for doing this automatically follows the list).
- Enter the common parts on the "reject links" page and make sure the rejection code is 410.
- Clear the 404 error log.
- Repeat the process once every few hours until the 404 error log remains blank.
- Check the status of your website's index coverage every few days.
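Finding the common parts in the 404 log can be done by eye, but on a large log it helps to count leading path segments. A small sketch, assuming the log can be exported as a plain list of 404'd paths:

```python
from collections import Counter

def common_prefixes(paths, depth=1):
    """Count 404'd paths by their first `depth` path segments."""
    counts = Counter()
    for path in paths:
        segments = [s for s in path.split("/") if s]
        if segments:
            counts["/" + "/".join(segments[:depth]) + "/"] += 1
    return counts

# Three of four hits share /old-shop/, so that is the pattern to enter
# on the "reject links" page with rejection code 410.
log = ["/old-shop/item-1", "/old-shop/item-2", "/old-shop/cart", "/misc"]
print(common_prefixes(log).most_common(2))
# [('/old-shop/', 3), ('/misc/', 1)]
```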