---
title: robots.txt
description: API Reference for robots.txt file.
---
Add or generate a `robots.txt` file that matches the [Robots Exclusion Standard](https://en.wikipedia.org/wiki/Robots.txt#Standard) in the **root** of the `app` directory to tell search engine crawlers which URLs they can access on your site.
## Static `robots.txt`
```txt filename="app/robots.txt"
User-Agent: *
Allow: /
Disallow: /private/
Sitemap: https://acme.com/sitemap.xml
```
## Generate a Robots file
Add a `robots.js` or `robots.ts` file that returns a [`Robots` object](#robots-object).
> **Good to know**: `robots.js` is a special Route Handler that is cached by default unless it uses a [Request-time API](/docs/app/glossary#request-time-apis) or [dynamic config](/docs/app/guides/caching-without-cache-components#dynamic) option.
```ts filename="app/robots.ts" switcher
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```
```js filename="app/robots.js" switcher
export default function robots() {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```
Output:
```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```
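Because the file is generated in code, the rules can also branch on build-time information. As a minimal sketch, assuming `NODE_ENV` identifies your production deployment (substitute whatever signal your hosting provider actually exposes), you could keep crawlers off non-production deployments:
```ts filename="app/robots.ts"
import type { MetadataRoute } from 'next'

// Assumption: NODE_ENV identifies the production build. Swap in the
// environment variable your hosting provider actually sets.
const isProduction = process.env.NODE_ENV === 'production'

export default function robots(): MetadataRoute.Robots {
  if (!isProduction) {
    // Block everything on preview and development deployments.
    return {
      rules: { userAgent: '*', disallow: '/' },
    }
  }

  return {
    rules: { userAgent: '*', allow: '/', disallow: '/private/' },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```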
### Customizing specific user agents
You can customize how individual search engine bots crawl your site by passing an array of user agents to the `rules` property. For example:
```ts filename="app/robots.ts" switcher
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: 'Googlebot',
        allow: ['/'],
        disallow: '/private/',
      },
      {
        userAgent: ['Applebot', 'Bingbot'],
        disallow: ['/'],
      },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```
```js filename="app/robots.js" switcher
export default function robots() {
  return {
    rules: [
      {
        userAgent: 'Googlebot',
        allow: ['/'],
        disallow: '/private/',
      },
      {
        userAgent: ['Applebot', 'Bingbot'],
        disallow: ['/'],
      },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```
Output:
```txt
User-Agent: Googlebot
Allow: /
Disallow: /private/

User-Agent: Applebot
Disallow: /

User-Agent: Bingbot
Disallow: /

Sitemap: https://acme.com/sitemap.xml
```
### Non-standard directives
Some search engines support directives that aren't part of the [Robots Exclusion Standard](https://en.wikipedia.org/wiki/Robots.txt#Standard), such as `Request-Rate` (Seznam) or `Clean-param` (Yandex). Pass these through the `other` field on a rule. Keys preserve their casing and array values emit one line per entry, scoped to the rule's `User-Agent` block.
```ts filename="app/robots.ts" switcher
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      { userAgent: '*', allow: '/' },
      {
        userAgent: 'SeznamBot',
        allow: '/',
        other: {
          'Request-Rate': '10/1m',
        },
      },
    ],
  }
}
```
```js filename="app/robots.js" switcher
export default function robots() {
  return {
    rules: [
      { userAgent: '*', allow: '/' },
      {
        userAgent: 'SeznamBot',
        allow: '/',
        other: {
          'Request-Rate': '10/1m',
        },
      },
    ],
  }
}
```
Output:
```txt
User-Agent: *
Allow: /

User-Agent: SeznamBot
Allow: /
Request-Rate: 10/1m
```
> **Good to know**: Values in `other` are passed through verbatim. Next.js does not validate directive names or values, so refer to the target search engine's documentation for the exact syntax.
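
For instance, array values fan out into one directive line per entry. A short sketch using Yandex's `Clean-param` (the parameter and path values here are illustrative; see Yandex's documentation for the exact syntax):
```ts filename="app/robots.ts"
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: 'Yandex',
      allow: '/',
      other: {
        // Each array entry becomes its own Clean-param line.
        'Clean-param': ['ref /products/', 'utm_source&utm_medium /'],
      },
    },
  }
}
```
Output:
```txt
User-Agent: Yandex
Allow: /
Clean-param: ref /products/
Clean-param: utm_source&utm_medium /
```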
### Robots object
```tsx
type Robots = {
  rules:
    | {
        userAgent?: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
        other?: Record<string, string | number | Array<string | number>>
      }
    | Array<{
        userAgent: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
        other?: Record<string, string | number | Array<string | number>>
      }>
  sitemap?: string | string[]
  host?: string
}
```
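The remaining fields map to their matching directives: `crawlDelay` is rendered as `Crawl-delay` and `host` as `Host`. A brief sketch (note that crawler support varies; Google ignores `Crawl-delay`, and `Host` is a non-standard directive historically honored by Yandex):
```ts filename="app/robots.ts"
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      crawlDelay: 10, // rendered as "Crawl-delay: 10"
    },
    host: 'https://acme.com', // rendered as "Host: https://acme.com"
  }
}
```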
## Version history
| Version | Changes |
| --------- | ---------------------------------------------------------- |
| `v16.3.0` | Added `other` field for non-standard per-agent directives. |
| `v13.3.0` | `robots` introduced. |