Description
The example at https://github.com/vorburger/vorburger.ch-Notes/blob/develop/ml/adk-ollama.md (which, just stating this for future reproducibility, originally used the code as-is at this revision) is intentionally "trivial-ized" and does not actually use any Tools just yet, because I initially ran into an exception with Ollama.
It seems to be an Ollama-specific issue (I have not yet tested it with any other providers), and possibly something in LangChain4j (as well, or even exclusively), but I thought I would open an issue here now to start tracking it. (I've lost the actual stack trace, but can reproduce it and post it again later.)
In the larger scheme of things, I don't think this is a "tomorrow" priority; personally (and as discussed), I would suggest that we focus on getting the pending PRs merged and a first version of the LangChain4j LLM Adapter merged and made available "as-is", and then look into this again a little bit later.
PS: My assumption here is that Gemma CAN support tools; https://ai.google.dev/gemma/docs/core#function-calling certainly suggests that it should. So I suspect, without having done any real debugging yet, that we're just missing some... "piping".