change for guardrails + openrouter + new integrations section #336

Merged · 8 commits · May 19, 2025
26 changes: 26 additions & 0 deletions api-reference/inference-api/authentication.mdx
@@ -14,6 +14,10 @@ To ensure secure access to Portkey's APIs, authentication is required for all re

Based on your access level, you might see the relevant permissions on the API key modal - tick the ones you'd like, name your API key, and save it.

<Card title="JWT-based Authentication" href="#jwt-based-authentication">
You can also authenticate with Portkey using JWT tokens. Learn more here.
</Card>

## Authentication with SDKs

### Portkey SDKs
@@ -130,3 +134,25 @@ response = openai_client.chat.completions.create(
</Tabs>

Read more [here](/integrations/llms/openai).


## JWT-based Authentication

Portkey supports JWT-based authentication as a secure alternative to API Key authentication. With JWT authentication, clients can authenticate API requests using a JWT token that is validated against a configured JWKS (JSON Web Key Set).

This enterprise-grade authentication method is available as an add-on to any Portkey plan. JWT authentication provides enhanced security through:

- Temporary, expiring tokens
- Fine-grained permission scopes
- User identity tracking
- Centralized authentication management

<Card title="JWT Token Authentication" href="/product/enterprise-offering/org-management/jwt">
Learn how to implement JWT-based authentication with Portkey
</Card>
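
A minimal sketch of what a JWT-authenticated request might look like with the Python SDK, assuming your organization's setup accepts the signed JWT wherever the Portkey API key would normally go (the token value below is a placeholder; confirm the exact mechanics in the guide linked above):

```python
from portkey_ai import Portkey

# Hypothetical sketch: supply an org-issued JWT in place of the Portkey API key.
# The JWT is signed by your identity provider and validated against the configured JWKS.
jwt_token = "eyJhbGciOiJSUzI1NiIs..."  # placeholder token

portkey = Portkey(
    api_key=jwt_token,          # assumption: JWT accepted where the API key normally goes
    virtual_key="VIRTUAL_KEY"   # provider virtual key, same as with API-key auth
)

response = portkey.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```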

<Note>
<b>Interested in adding JWT authentication to your Portkey plan?</b>

[Contact our sales team](https://portkey.sh/jwt) to discuss pricing and implementation details.
</Note>
80 changes: 66 additions & 14 deletions docs.json
@@ -101,20 +101,9 @@
"pages": [
"product/guardrails",
"product/guardrails/list-of-guardrail-checks",
"product/guardrails/azure-guardrails",
"product/guardrails/pii-redaction",
"product/guardrails/patronus-ai",
"product/guardrails/bedrock-guardrails",
"product/guardrails/lasso",
"product/guardrails/aporia",
"product/guardrails/pillar",
"product/guardrails/prompt-security",
"product/guardrails/pangea",
"product/guardrails/acuvity",
"product/guardrails/mistral",
"product/guardrails/embedding-guardrails",
"product/guardrails/bring-your-own-guardrails",
"product/guardrails/creating-raw-guardrails-in-json"
"product/guardrails/creating-raw-guardrails-in-json",
"product/guardrails/pii-redaction"
]
},
"product/mcp",
@@ -333,6 +322,22 @@
"group": "Cloud Platforms",
"pages": ["integrations/cloud/azure"]
},
{
"group": "Guardrails",
"pages": [
"integrations/guardrails/aporia",
"integrations/guardrails/acuvity",
"integrations/guardrails/azure-guardrails",
"integrations/guardrails/bedrock-guardrails",
"integrations/guardrails/lasso",
"integrations/guardrails/mistral",
"integrations/guardrails/pangea",
"integrations/guardrails/patronus-ai",
"integrations/guardrails/pillar",
"integrations/guardrails/prompt-security",
"integrations/guardrails/bring-your-own-guardrails"
]
},
{
"group": "Plugins",
"pages": ["integrations/plugins/exa"]
@@ -869,7 +874,10 @@
},
{
"group": "SDK Releases",
"pages": ["changelog/node-sdk-changelog", "changelog/python-sdk-changelog"]
"pages": [
"changelog/node-sdk-changelog",
"changelog/python-sdk-changelog"
]
}
]
}
@@ -2241,6 +2249,50 @@
{
"source": "/api-reference/inference-api/sdks/c-sharp",
"destination": "/api-reference/sdk/c-sharp"
},
{
"source": "/product/guardrails/aporia",
"destination": "/integrations/guardrails/aporia"
},
{
"source": "/product/guardrails/acuvity",
"destination": "/integrations/guardrails/acuvity"
},
{
"source": "/product/guardrails/azure-guardrails",
"destination": "/integrations/guardrails/azure-guardrails"
},
{
"source": "/product/guardrails/bedrock-guardrails",
"destination": "/integrations/guardrails/bedrock-guardrails"
},
{
"source": "/product/guardrails/lasso",
"destination": "/integrations/guardrails/lasso"
},
{
"source": "/product/guardrails/mistral",
"destination": "/integrations/guardrails/mistral"
},
{
"source": "/product/guardrails/pangea",
"destination": "/integrations/guardrails/pangea"
},
{
"source": "/product/guardrails/patronus-ai",
"destination": "/integrations/guardrails/patronus-ai"
},
{
"source": "/product/guardrails/pillar",
"destination": "/integrations/guardrails/pillar"
},
{
"source": "/product/guardrails/prompt-security",
"destination": "/integrations/guardrails/prompt-security"
},
{
"source": "/product/guardrails/bring-your-own-guardrails",
"destination": "/integrations/guardrails/bring-your-own-guardrails"
}
],
"seo": {
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
103 changes: 103 additions & 0 deletions integrations/llms/mistral-ai.mdx
@@ -163,6 +163,109 @@ You can manage all prompts to Mistral AI in the [Prompt Library](/product/prompt

Once you're ready with your prompt, you can use the `portkey.prompts.completions.create` interface to use the prompt in your application.


### Mistral Tool Calling
The tool calling feature lets models trigger external tools based on conversation context: you define the available functions, the model decides when to call them, and your application executes them and returns the results.

Portkey supports Mistral tool calling and makes it interoperable across multiple providers. With Portkey Prompts, you can also templatize your prompts and tool schemas.


<Tabs>
<Tab title="Node.js">
```javascript Get Weather Tool
let tools = [{
type: "function",
function: {
name: "getWeather",
description: "Get the current weather",
parameters: {
type: "object",
properties: {
location: { type: "string", description: "City and state" },
unit: { type: "string", enum: ["celsius", "fahrenheit"] }
},
required: ["location"]
}
}
}];

let response = await portkey.chat.completions.create({
model: "your_mistral_model_name",
messages: [
{ role: "system", content: "You are a helpful assistant." },
{ role: "user", content: "What's the weather like in Delhi - respond in JSON" }
],
tools,
tool_choice: "auto",
});

console.log(response.choices[0].finish_reason);
```
</Tab>
<Tab title="Python">
```python Get Weather Tool
tools = [{
"type": "function",
"function": {
"name": "getWeather",
"description": "Get the current weather",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string", "description": "City and state"},
"unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
},
"required": ["location"]
}
}
}]

response = portkey.chat.completions.create(
model="your_mistral_model_name",
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
],
tools=tools,
tool_choice="auto"
)

print(response.choices[0].finish_reason)
```
</Tab>
<Tab title="cURL">
```curl Get Weather Tool
curl -X POST "https://api.portkey.ai/v1/chat/completions" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_PORTKEY_API_KEY" \
-d '{
"model": "your_mistral_model_name",
"messages": [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What'\''s the weather like in Delhi - respond in JSON"}
],
"tools": [{
"type": "function",
"function": {
"name": "getWeather",
"description": "Get the current weather",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string", "description": "City and state"},
"unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
},
"required": ["location"]
}
}
}],
"tool_choice": "auto"
}'
```
</Tab>
</Tabs>
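
If the model decides to call the tool, `finish_reason` will be `tool_calls`; your application then runs the function and sends the result back so the model can produce a final answer. A short Python sketch continuing the example above (the `get_weather` stub and its return value are illustrative):

```python
import json

# Inspect the tool call the model requested (continuing the Python example above).
tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

def get_weather(location, unit="celsius"):
    # Illustrative stub; call a real weather service here.
    return {"location": location, "temperature": 30, "unit": unit}

# Send the tool result back in a follow-up turn.
follow_up = portkey.chat.completions.create(
    model="your_mistral_model_name",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"},
        response.choices[0].message,  # assistant turn containing the tool call
        {"role": "tool", "tool_call_id": tool_call.id, "content": json.dumps(get_weather(**args))}
    ],
    tools=tools
)
print(follow_up.choices[0].message.content)
```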


## Next Steps

The complete list of features supported in the SDK is available at the link below.
110 changes: 109 additions & 1 deletion integrations/llms/openrouter.mdx
@@ -52,7 +52,7 @@ To use OpenRouter with Portkey, [get your API key from here](https://openrouter.

portkey = Portkey(
api_key="PORTKEY_API_KEY", # Replace with your Portkey API key
virtual_key="VIRTUAL_KEY" # Replace with your virtual key for Groq
    virtual_key="VIRTUAL_KEY" # Replace with your virtual key for OpenRouter
)
```
</Tab>
@@ -90,6 +90,114 @@ console.log(chatCompletion.choices);
</Tabs>



### OpenRouter Tool Calling
The tool calling feature lets models trigger external tools based on conversation context: you define the available functions, the model decides when to call them, and your application executes them and returns the results.

Portkey supports OpenRouter tool calling and makes it interoperable across multiple providers. With Portkey Prompts, you can also templatize your prompts and tool schemas.

<Card title="Open Router Tool Calling" href="https://openrouter.ai/docs/features/tool-calling">

</Card>

<Tabs>
<Tab title="Node.js">
```javascript Get Weather Tool
let tools = [{
type: "function",
function: {
name: "getWeather",
description: "Get the current weather",
parameters: {
type: "object",
properties: {
location: { type: "string", description: "City and state" },
unit: { type: "string", enum: ["celsius", "fahrenheit"] }
},
required: ["location"]
}
}
}];

let response = await portkey.chat.completions.create({
model: "openai/gpt-4o",
messages: [
{ role: "system", content: "You are a helpful assistant." },
{ role: "user", content: "What's the weather like in Delhi - respond in JSON" }
],
tools,
tool_choice: "auto",
});

console.log(response.choices[0].finish_reason);
```
</Tab>
<Tab title="Python">
```python Get Weather Tool
tools = [{
"type": "function",
"function": {
"name": "getWeather",
"description": "Get the current weather",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string", "description": "City and state"},
"unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
},
"required": ["location"]
}
}
}]

response = portkey.chat.completions.create(
model="openai/gpt-4o",
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
],
tools=tools,
tool_choice="auto"
)

print(response.choices[0].finish_reason)
```
</Tab>
<Tab title="cURL">
```curl Get Weather Tool
curl -X POST "https://api.portkey.ai/v1/chat/completions" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_PORTKEY_API_KEY" \
-d '{
"model": "openai/gpt-4o",
"messages": [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What'\''s the weather like in Delhi - respond in JSON"}
],
"tools": [{
"type": "function",
"function": {
"name": "getWeather",
"description": "Get the current weather",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string", "description": "City and state"},
"unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
},
"required": ["location"]
}
}
}],
"tool_choice": "auto"
}'
```
</Tab>
</Tabs>
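
Since the request shape stays OpenAI-compatible, the same tool schema and calling code can be pointed at another provider routed through Portkey; a sketch of the idea, assuming you have a virtual key for that provider (the virtual key and model names below are placeholders):

```python
from portkey_ai import Portkey

# Same getWeather schema and request as above; only the provider routing changes.
portkey_other = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OTHER_PROVIDER_VIRTUAL_KEY"  # placeholder for another provider's virtual key
)

response = portkey_other.chat.completions.create(
    model="other_provider_model_name",  # placeholder model served by that provider
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
    ],
    tools=tools,        # unchanged tool schema from the example above
    tool_choice="auto"
)
print(response.choices[0].finish_reason)
```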




The complete list of features supported in the SDK is available at the link below.

<Card title="SDK" href="/api-reference/portkey-sdk-client">