Say goodbye to manually creating a robots.txt file
If you don't want a site to be indexed by search engines, you must place a robots.txt file that disallows crawling. Typically you don't want anything indexed except your production site.
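For reference, this is the classic hand-made approach the package replaces: a two-line robots.txt at the web root that tells every crawler to stay away (a generic example, not something generated by the package):

User-agent: *
Disallow: /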
Today Spatie released a new package called laravel-robots-middleware. It was coded up by my colleague Sebastian. Instead of you having to create a robots.txt file, this Laravel package will add an x-robots-tag header to every response. If you want to limit search bots to only crawling your production environment, you can extend the Spatie\RobotsMiddleware\RobotsMiddleware class like this:
// app/Http/Middleware/MyRobotsMiddleware.php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class MyRobotsMiddleware extends RobotsMiddleware
{
    /**
     * Only allow indexing on the production environment.
     *
     * @return string|bool
     */
    protected function shouldIndex(Request $request)
    {
        return app()->environment('production');
    }
}
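To make the middleware actually run you still need to register it. Here's a minimal sketch, assuming a standard Laravel HTTP kernel and that you want MyRobotsMiddleware applied to every request as global middleware:

// app/Http/Kernel.php

protected $middleware = [
    // ...other global middleware...
    \App\Http\Middleware\MyRobotsMiddleware::class,
];

With this in place every response gets the x-robots-tag header, and only your production environment will be indexed.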
To learn about all the options, read the documentation on GitHub. Like this package? Then maybe some of our other Laravel packages will be useful to you as well.