Add attachments to GenericOpenAI prompt (#2831)
* added attachments to genericopenai prompt

* add devnote

---------

Co-authored-by: timothycarambat <[email protected]>
wolfganghuse and timothycarambat authored Dec 16, 2024
1 parent ff02428 commit d145602
Showing 1 changed file with 49 additions and 1 deletion.
server/utils/AiProviders/genericOpenAi/index.js (49 additions, 1 deletion)
@@ -77,17 +77,65 @@ class GenericOpenAiLLM {
     return true;
   }
 
+  /**
+   * Generates appropriate content array for a message + attachments.
+   *
+   * ## Developer Note
+   * This function assumes the generic OpenAI provider is _actually_ OpenAI compatible.
+   * For example, Ollama is "OpenAI compatible" but does not support images as a content array.
+   * The contentString also is the base64 string WITH `data:image/xxx;base64,` prefix, which may not be the case for all providers.
+   * If your provider does not work exactly this way, then attachments will not function or potentially break vision requests.
+   * If you encounter this issue, you are welcome to open an issue asking for your specific provider to be supported.
+   *
+   * This function will **not** be updated for providers that **do not** support images as a content array like OpenAI does.
+   * Do not open issues to update this function due to your specific provider not being compatible. Open an issue to request support for your specific provider.
+   * @param {Object} props
+   * @param {string} props.userPrompt - the user prompt to be sent to the model
+   * @param {import("../../helpers").Attachment[]} props.attachments - the array of attachments to be sent to the model
+   * @returns {string|object[]}
+   */
+  #generateContent({ userPrompt, attachments = [] }) {
+    if (!attachments.length) {
+      return userPrompt;
+    }
+
+    const content = [{ type: "text", text: userPrompt }];
+    for (let attachment of attachments) {
+      content.push({
+        type: "image_url",
+        image_url: {
+          url: attachment.contentString,
+          detail: "high",
+        },
+      });
+    }
+    return content.flat();
+  }
+
+  /**
+   * Construct the user prompt for this model.
+   * @param {{attachments: import("../../helpers").Attachment[]}} param0
+   * @returns
+   */
   constructPrompt({
     systemPrompt = "",
     contextTexts = [],
     chatHistory = [],
     userPrompt = "",
+    attachments = [],
   }) {
     const prompt = {
       role: "system",
       content: `${systemPrompt}${this.#appendContext(contextTexts)}`,
     };
-    return [prompt, ...chatHistory, { role: "user", content: userPrompt }];
+    return [
+      prompt,
+      ...chatHistory,
+      {
+        role: "user",
+        content: this.#generateContent({ userPrompt, attachments }),
+      },
+    ];
   }
 
   async getChatCompletion(messages = null, { temperature = 0.7 }) {
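For illustration, a minimal usage sketch of the new attachment path (not part of the commit): `llm` is assumed to be an already-configured GenericOpenAiLLM instance, the attachment object and its values are hypothetical, and `contentString` must already carry the full `data:image/...;base64,` prefix as the developer note requires.

// Minimal sketch, assuming `llm` is a configured GenericOpenAiLLM instance.
// The attachment fields below are hypothetical example values.
const messages = llm.constructPrompt({
  systemPrompt: "You are a helpful assistant.",
  contextTexts: [],
  chatHistory: [],
  userPrompt: "What is shown in this image?",
  attachments: [
    {
      // contentString must already include the data URI prefix, per the dev note.
      contentString: "data:image/png;base64,iVBORw0KGgoAAAANS...",
    },
  ],
});

// With at least one attachment, the final user message's `content` becomes an
// OpenAI-style content array instead of a plain string, e.g.:
// {
//   role: "user",
//   content: [
//     { type: "text", text: "What is shown in this image?" },
//     {
//       type: "image_url",
//       image_url: { url: "data:image/png;base64,iVBORw0KGgoAAAANS...", detail: "high" },
//     },
//   ],
// }
// With no attachments, `content` stays the plain userPrompt string.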
