A lightweight package that blocks unwanted crawlers, AI agents, scrapers, and high‑frequency visitors while allowing legitimate bots such as Googlebot and Bingbot to pass.
You can install the package via Composer:

```bash
composer require xqus/bad-bot
```

You can publish the config file with:

```bash
php artisan vendor:publish --tag="bad-bot-config"
```

Optionally, you can publish the views with:

```bash
php artisan vendor:publish --tag="bad-bot-views"
```

Add the middleware you want to run to your `bootstrap/app.php` file (see the official Laravel documentation):
```php
->withMiddleware(function (Middleware $middleware): void {
    $middleware->appendToGroup('web', [
        xqus\BadBot\Middleware\BadBotMiddleware::class, // blocks marked IP addresses
        xqus\BadBot\Middleware\ThrottleMiddleware::class, // blocks noisy bots, but allows whitelisted bots (Google, Bing, etc.)
        xqus\BadBot\Middleware\UserAgentMiddleWare::class, // blocks bots based on user agent
    ]);
})
```

To build a new robots.txt, run:

```bash
php artisan badbot:update-txt
```

Be aware that this command overwrites your current `public/robots.txt` file.
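If you don't want the middleware to run on every request in the `web` group, you can instead attach it to individual routes using standard Laravel route middleware. This is a minimal sketch; the routes and `BlogController` are hypothetical examples, not part of the package:

```php
use Illuminate\Support\Facades\Route;
use xqus\BadBot\Middleware\ThrottleMiddleware;
use xqus\BadBot\Middleware\UserAgentMiddleWare;

// Hypothetical example: only throttle and filter by user agent on
// public-facing routes, leaving other routes untouched.
Route::middleware([ThrottleMiddleware::class, UserAgentMiddleWare::class])
    ->group(function () {
        Route::get('/blog', [BlogController::class, 'index']);
        Route::get('/blog/{slug}', [BlogController::class, 'show']);
    });
```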
By default, an exception extending `Symfony\Component\HttpKernel\Exception\HttpException` is thrown when a request is blocked. This renders the default HTTP error page for that status code.

If you want to handle blocked requests differently, the following exceptions exist:

- `xqus\BadBot\Exceptions\RequestRateLimitedException`
- `xqus\BadBot\Exceptions\UserAgentBlockedException`
- `xqus\BadBot\Exceptions\IpAddressBlockedException`
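For example, you could catch one of these exceptions with Laravel's exception handler in `bootstrap/app.php` and return a custom response. This is a minimal sketch; the view name and the 429 status code are assumptions, not something the package prescribes:

```php
use Illuminate\Foundation\Configuration\Exceptions;
use xqus\BadBot\Exceptions\RequestRateLimitedException;

->withExceptions(function (Exceptions $exceptions): void {
    // Render a custom page instead of the default HTTP error page.
    // 'errors.rate-limited' is a placeholder for a view of your own.
    $exceptions->render(function (RequestRateLimitedException $e) {
        return response()->view('errors.rate-limited', [], 429);
    });
})
```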
Run the test suite with:

```bash
composer test
```

Please see CHANGELOG for more information on what has changed recently.
Please see CONTRIBUTING for details.
The MIT License (MIT). Please see License File for more information.