Conversation

@ccope ccope commented Jan 11, 2026

The generate_prompts function appended instructions multiple times when called repeatedly with the same opts object, causing duplicate content in both the instruction string and messages array.

Resolves #2907

Changes

  • Instruction file loading (lines 265-273): Added an opts._instructions_loaded guard flag to prevent re-reading and re-appending the file content. The flag is set once the file has been processed, regardless of its content, so repeated calls avoid redundant I/O.

  • Message array insertion (lines 444-447): Added an opts._instructions_added_to_messages guard flag to prevent duplicate instruction messages in the messages array.

Example

local opts = { instructions = "Initial" }

-- Without guards (old behavior):
generate_prompts(opts)  -- instructions: "Initial\nFile content"
generate_prompts(opts)  -- instructions: "Initial\nFile content\nFile content" ❌

-- With guards (new behavior):
generate_prompts(opts)  -- instructions: "Initial\nFile content"
generate_prompts(opts)  -- instructions: "Initial\nFile content" ✓

Tests

Added two test cases validating that no duplication occurs when generate_prompts is called multiple times with the same opts object.
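The guard behavior can be exercised outside Neovim with a stub (a simplified sketch, not the plugin's actual test code: `read_instruction_file` is a hypothetical stand-in for `Utils.read_file_from_buf_or_disk`, and only the instruction-append logic is modeled):

```lua
-- Hypothetical stub for the instruction-file read.
local function read_instruction_file()
  return "File content"
end

-- Simplified stand-in for M.generate_prompts: just the two guarded steps.
local function generate_prompts(opts)
  -- Guard 1: append file content to opts.instructions only once.
  if not opts._instructions_loaded then
    local content = read_instruction_file()
    if content then
      opts.instructions = (opts.instructions or "") .. "\n" .. content
      opts._instructions_loaded = true
    end
  end

  -- Guard 2: add the instructions to the messages array only once.
  local messages = {}
  if opts.instructions ~= nil and opts.instructions ~= ""
      and not opts._instructions_added_to_messages then
    table.insert(messages, { role = "user", content = opts.instructions })
    opts._instructions_added_to_messages = true
  end
  return messages
end

local opts = { instructions = "Initial" }
generate_prompts(opts)
generate_prompts(opts)  -- second call leaves opts unchanged
assert(opts.instructions == "Initial\nFile content")
```

Without the guards, the second call would append "\nFile content" again; with them, opts is idempotent across calls.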

Copilot PR from my fork: ccope#1

Original prompt

Problem

The generate_prompts function in lua/avante/llm.lua has two issues where instructions can be appended multiple times if the function is called repeatedly with the same opts object:

  1. Lines 265-270: Instructions from the project instruction file are appended to opts.instructions without checking if they've already been added
  2. Lines 441-443: The instructions are added to the messages array without checking if they've already been added

This causes duplicate content when generate_prompts is called multiple times with the same options.

Solution

Add guard flags to prevent duplicate appending:

  1. Add opts._instructions_loaded flag to track if the instruction file content has been loaded
  2. Add opts._instructions_added_to_messages flag to track if instructions have been added to the messages array

Changes Required

In lua/avante/llm.lua, modify the M.generate_prompts function:

Around line 265-270, change:

if instruction_file_path:exists() then
  local lines = Utils.read_file_from_buf_or_disk(instruction_file_path:absolute())
  local instruction_content = lines and table.concat(lines, "\n") or ""

  if instruction_content then opts.instructions = (opts.instructions or "") .. "\n" .. instruction_content end
end

To:

if instruction_file_path:exists() and not opts._instructions_loaded then
  local lines = Utils.read_file_from_buf_or_disk(instruction_file_path:absolute())
  local instruction_content = lines and table.concat(lines, "\n") or ""

  if instruction_content then
    opts.instructions = (opts.instructions or "") .. "\n" .. instruction_content
    opts._instructions_loaded = true
  end
end

Around line 441-443, change:

if opts.instructions ~= nil and opts.instructions ~= "" then
  messages = vim.list_extend(messages, { { role = "user", content = opts.instructions } })
end

To:

if opts.instructions ~= nil and opts.instructions ~= "" and not opts._instructions_added_to_messages then
  messages = vim.list_extend(messages, { { role = "user", content = opts.instructions } })
  opts._instructions_added_to_messages = true
end

Expected Outcome

After these changes, the generate_prompts function can be safely called multiple times with the same opts object without duplicating instruction content.

This pull request was created from Copilot chat.

ccope commented Jan 11, 2026

I've been using this change locally for the past couple of days and I've stopped getting warnings about duplicated instructions. I'm not super familiar with this plugin's code though, so I may have missed edge cases in the LLM's output.

ccope commented Jan 13, 2026

Ah this looks like a duplicate of #2884

@github-actions

This PR is stale because it has been open 14 days with no activity. Remove stale label or comment or this will be closed in 10 days.

@github-actions github-actions bot added the Stale label Jan 28, 2026
guijun added a commit to guijun/avante.nvim that referenced this pull request Jan 28, 2026
 Prevent duplicate instruction appending in generate_prompts yetone#2916
ccope commented Jan 28, 2026

Bumping this because it looks like the other PR got closed as stale.

@github-actions github-actions bot removed the Stale label Jan 29, 2026

Development

Successfully merging this pull request may close these issues.

bug: Instructions from avante.md are appended twice to user input for agents
