Commit 36264d4

[Feat Security] - Allow blocking web crawlers (#10420)
* security add robots.txt settings to security
* block web crawlers
* test_enterprise_routes.py
* docs proxy enterprise
1 parent 00605e3 commit 36264d4

File tree

8 files changed: +556 −421 lines changed


docs/my-website/docs/proxy/deploy.md

+22 lines

@@ -1049,3 +1049,25 @@ export DATABASE_SCHEMA="schema-name" # skip to use the default "public" schema
 litellm --config /path/to/config.yaml --iam_token_db_auth
 ```
 
+### ✨ Blocking web crawlers
+
+Note: This is an [enterprise only feature](https://docs.litellm.ai/docs/enterprise).
+
+To block web crawlers from indexing the proxy server endpoints, set the `block_robots` setting to `true` in your `litellm_config.yaml` file.
+
+```yaml showLineNumbers title="litellm_config.yaml"
+general_settings:
+  block_robots: true
+```
+
+#### How it works
+
+When this is enabled, the `/robots.txt` endpoint returns a 200 status code with the following content:
+
+```shell showLineNumbers title="robots.txt"
+User-agent: *
+Disallow: /
+```
