Robots.txt File
Hugo can generate a customized robots.txt in the same way as any other template.
To create your robots.txt as a template, first set enableRobotsTXT to true in your configuration file. By default, this option generates a robots.txt with the following content, which tells search engines that they are allowed to crawl everything:
User-agent: *
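Enabling the option is a single setting; for example, in a TOML configuration file it looks like this:
enableRobotsTXT = true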
Robots.txt Template Lookup Order
The lookup order for the robots.txt template is as follows:
- /layouts/robots.txt
- /themes/<THEME>/layouts/robots.txt
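In practice this means a robots.txt template in the project's own layouts directory overrides one supplied by a theme. As a sketch, assuming a hypothetical theme named my-theme, the two candidate files sit here and the first one found is used:
my-project/
  layouts/
    robots.txt              <-- checked first
  themes/
    my-theme/
      layouts/
        robots.txt          <-- fallback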
Robots.txt Template Example
The following is an example robots.txt layout:
layouts/robots.txt
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
This template disallows all the pages of the site by creating one Disallow entry for each page.
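For illustration, on a hypothetical site with pages at /about/ and /posts/first-post/, the generated robots.txt would look something like this (the exact paths and blank lines depend on the site's content and the template's whitespace handling):
User-agent: *

Disallow: /about/

Disallow: /posts/first-post/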