Agent instructions are not present in system prompt #192

Open
@Vatavuk

Description

Agent instructions are not passed to the LLM as a system prompt when using Gemini as the BaseLlm. The instructions are visible in LlmRequest.getSystemInstructions, but they are never converted to a Content and included in the request sent to the model.
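To illustrate the expected behavior, here is a minimal, self-contained sketch of the missing step: the instruction text should be turned into a Content and placed ahead of the conversation before the request goes out. The types below (Content, LlmRequestSketch) are plain stand-ins for illustration only, not the actual ADK classes:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

// Stand-in for the real content type (the ADK uses com.google.genai.types.Content).
record Content(String role, String text) {}

// Stand-in for LlmRequest, holding the system instructions and the conversation.
class LlmRequestSketch {
    private final Optional<String> systemInstructions;
    private final List<Content> contents;

    LlmRequestSketch(Optional<String> systemInstructions, List<Content> contents) {
        this.systemInstructions = systemInstructions;
        this.contents = contents;
    }

    // Expected behavior: fold the instruction text into the outgoing contents
    // as a leading system entry, so the model actually sees it.
    List<Content> toModelContents() {
        List<Content> result = new ArrayList<>();
        systemInstructions.ifPresent(s -> result.add(new Content("system", s)));
        result.addAll(contents);
        return result;
    }
}
```

With this conversion in place, a request carrying instructions and the user turn "Hi" would reach the model as two entries, the system instruction first; in the current 0.1.0 behavior only the user turn is sent.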

To Reproduce
Steps to reproduce the behavior:

LlmAgent agent = LlmAgent.builder()
      .name("My agent")
      .description("some description")
      .model("gemini-2.0-flash")
      .instruction("You are a helpful agent for solving crimes. Always refer to me as Vatavuk")
      .build();

Send "Hi" to the agent. The result will be something like "Hello! How can I help you today?" — the instruction is ignored.

Expected behavior
The agent should respond with something like: "Hello, Vatavuk! How can I assist you with your investigation or crime-solving needs today?"

Desktop:

  • OS: macOS
  • Java version: openjdk-24
  • ADK version: 0.1.0
