
Conversation

@Artur-Fedjukevits
Contributor

Added support for creating HABot cards via the new enhanced reply method and the new create_intent function.
Added a requestTimeout field to the Thing configuration; previously it was hardcoded.
Function executions (items_control, create_intent) are now accompanied by a short natural language response from ChatGPT. Try it together with this HABot PR: openhab/openhab-webui#3347
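
For illustration, a minimal sketch of the configuration side (only the requestTimeout field name comes from this change; the surrounding class shape and the default value are assumptions):

```java
// Sketch of the binding's Thing configuration class; the actual class in this
// PR may declare more fields, and the default value shown here is an assumption.
public class ChatGPTConfiguration {
    // Request timeout in seconds; previously a hardcoded constant.
    public int requestTimeout = 30;
}
```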

Signed-off-by: Artur-Fedjukevits <[email protected]>
@lsiepel
Contributor

lsiepel commented Aug 30, 2025

The gson build issue is not related to this PR. Some dependency trouble up the chain. I'll trigger all builds once we've got this sorted.

Artur-Fedjukevits and others added 4 commits August 30, 2025 21:31
Co-authored-by: lsiepel <[email protected]>
Signed-off-by: Artur-Fedjukevits <[email protected]>
@Artur-Fedjukevits
Contributor Author

Here is the compiled JAR for testing.

@lsiepel
Contributor

lsiepel commented Aug 31, 2025

Please fix the remaining build issues.

@TAKeanice

I'd want to clean this PR up a little: it contains some pure whitespace changes, the changes from #19077, and some new hardcoded system message parts for HABot. Let me take a look at doing this.

@Artur-Fedjukevits
Contributor Author

> I'd want to clean this PR up a little: it contains some pure whitespace changes, the changes from #19077, and some new hardcoded system message parts for HABot. Let me take a look at doing this.

The HABot system message is hardcoded for now to ensure correct function usage, but I’m fine if you want to refactor it and make it configurable.

Signed-off-by: Artur-Fedjukevits <[email protected]>
@wborn requested a review from Copilot September 2, 2025 05:10
Copilot AI left a comment

Pull Request Overview

This PR enhances the ChatGPT binding to support HABot integration by adding a new enhanced reply method for creating HABot cards and enabling natural language responses alongside function executions. The main changes include adding a new create_intent function for HABot card creation, updating the existing items_control function to support natural language answers, and making the request timeout configurable instead of hardcoded.

  • Added create_intent function to create HABot cards with entities and matched items
  • Enhanced function executions to include natural language responses from ChatGPT
  • Made request timeout configurable through Thing configuration

Reviewed Changes

Copilot reviewed 12 out of 12 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| tools.json | Added new create_intent function definition and an answer field to items_control |
| ChatGPTHLIService.java | Added HABot integration with the new reply method, card creation logic, and enhanced function execution |
| ItemsControl.java | Added answer field to support natural language responses |
| CreateIntent.java | New DTO class for the create_intent function parameters (see the sketch below) |
| Parameters.java | Enhanced to support nested object properties for complex function parameters |
| ChatRequestBody.java | Updated to use max_completion_tokens instead of the deprecated max_tokens |
| ChatGPTHandler.java | Made the request timeout configurable and removed the hardcoded constant |
| ChatGPTConfiguration.java | Added requestTimeout configuration field |
| ChatGPTChannelConfiguration.java | Changed the default temperature from 0.5 to 1.0 |
| feature.xml | Added HABot feature dependency |
| pom.xml | Added HABot and Gson dependencies |
| README.md | Updated documentation with new configuration options and usage examples |
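
For orientation, a hypothetical sketch of what the CreateIntent DTO could look like, based only on the description above; the actual class in this PR may use different field names:

```java
import java.util.Map;

// Hypothetical sketch of the create_intent parameters DTO: an intent name plus
// entity key/value pairs that HABot can use to match items and build a card.
public class CreateIntent {
    // Name of the intent HABot should build a card for.
    public String name;
    // Entity key/value pairs extracted by the model.
    public Map<String, String> entities;
}
```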


Artur-Fedjukevits and others added 3 commits September 2, 2025 11:09
…binding/chatgpt/internal/hli/ChatGPTHLIService.java

Co-authored-by: Copilot <[email protected]>
Signed-off-by: Artur-Fedjukevits <[email protected]>
Co-authored-by: Copilot <[email protected]>
Signed-off-by: Artur-Fedjukevits <[email protected]>

<feature name="openhab-binding-chatgpt" description="ChatGPT Binding" version="${project.version}">
<feature>openhab-runtime-base</feature>
<feature>openhab-ui-habot</feature>
Contributor

I don't think this is allowed for add-ons (and BTW core also isn't allowed to depend on ui/add-on features).
See my comment openhab/openhab-webui#2995 (comment).

Contributor Author

I couldn’t find any rule in the official docs that explicitly forbids add-ons from depending on UI add-ons. In my case, the solution works reliably. HABot is installed by default for most users anyway. Why exactly should a binding not depend on a UI add-on? What kind of issues could this cause? And if it’s really discouraged, what alternatives or workarounds would you recommend? @kaikreuzer, what’s your view on this?

Yes, developing a completely new chatbot integrated into core sounds much more attractive in the long run, but it seems this would take quite some time — in the meantime, why not reuse what we already have?

Contributor

It is more an unwritten rule than something written down, but IMO it is pretty clear from an architecture POV.
In general, the following rule applies to all add-ons: their installation should be independent of all other add-ons. Since HABot is, IIRC, not installed by default and not always installed, this would introduce a dependency ChatGPT add-on -> HABot add-on.
Whether or not a solution works reliably in a single case doesn't really matter; it has to work in all valid use cases. IIRC, about a year ago we had a feature dependency of core on addons.xml, which normally shouldn't cause problems since addons.xml should always be in place, but from an architecture POV it was very bad style and in fact caused trouble.

Adding something to core might take time, but it could be done in steps.
Add a new bundle org.openhab.core.hli (IMO having the current HLI interface in org.openhab.core.voice:text isn't the perfect place for it) and define a new interface LLMHLIInterpreter which extends HLIInterpreter with a reply function that is able to provide a reply and a card.
In the beginning, the code could mainly be based on the HABot code. ChatGPT could then implement LLMHLIInterpreter and HABot be adjusted to support that as well.
We don't need session management, chat history etc. in core at the beginning.
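
A rough sketch of what such an interface could look like, purely to illustrate the proposal (LLMHLIInterpreter, reply() and ChatReply do not exist in core yet; HumanLanguageInterpreter and InterpretationException are the existing HLI types in org.openhab.core.voice.text):

```java
import java.util.Locale;

import org.openhab.core.voice.text.HumanLanguageInterpreter;
import org.openhab.core.voice.text.InterpretationException;

// Sketch only: ChatReply would be the card-capable reply model moved from
// HABot into the proposed org.openhab.core.hli bundle.
public interface LLMHLIInterpreter extends HumanLanguageInterpreter {

    /**
     * Interprets the given text and returns a reply that can carry both a
     * natural language answer and a card.
     */
    ChatReply reply(Locale locale, String text) throws InterpretationException;
}
```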

Contributor Author

Okay, got it. I can give it a try and work on a first version of this refactoring. If I understood correctly, I’ll need to move the ChatReply and Card classes out of HABot into a new core bundle as plain model classes, update HABot so that it checks instanceof LLMHLIInterpreter and calls reply(), otherwise falling back to interpret(), and make the ChatGPT HLI implement this new interface. Does that match what you had in mind?
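
Something along these lines, as a sketch of the dispatch described above (all type names follow the proposal; the ChatReply constructor and setter are assumptions):

```java
// Hypothetical dispatch inside HABot: use the richer reply() when the
// interpreter supports it, otherwise fall back to plain interpret().
ChatReply getReply(HumanLanguageInterpreter interpreter, Locale locale, String query)
        throws InterpretationException {
    if (interpreter instanceof LLMHLIInterpreter llmInterpreter) {
        // Enhanced interpreters can return a card alongside the answer.
        return llmInterpreter.reply(locale, query);
    }
    // Fallback: wrap the plain-text answer in a ChatReply without a card
    // (the ChatReply constructor and setAnswer() used here are assumptions).
    ChatReply reply = new ChatReply(locale);
    reply.setAnswer(interpreter.interpret(locale, query));
    return reply;
}
```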

Contributor

Yeah, as a first step I think moving the ChatReply and Card classes from HABot to core and defining the LLMHLIInterpreter / EnhancedHLIInterpreter (this might be a more generic name, also matching the OpenNLP HLI, which is not an LLM but still more powerful than simple pattern matching) with the reply method should be enough, at least for getting your PR here finished.
I would suggest adding a new bundle for this, e.g. one named org.openhab.core.hli; see openhab/openhab-core#4865 for an example of adding a new bundle.
In a next step, the Card logic could be looked at and possibly extended or refactored.

Contributor Author

@florian-h05 I have created this new core bundle and tested it. It is working fine on my side. It would be great if you could give it a try as well: openhab/openhab-core#5016

Contributor

> In general, the following rule applies to all add-ons: their installation should be independent of all other add-ons

My understanding of the MQTT split into several bindings is that all MQTT add-on things depend on and install the MQTT Broker Binding add-on.

I think it is better to have HLI as an add-on, installed by other add-ons, instead of in core, because people who do not use it do not have to wait for the bundle to be put into the OSGi cache after a clean-cache, do not have to wait for the bundle to load on each start of openHAB, and a default, standard installation of openHAB will use less RAM.

For me it would be logical to create an add-on that is installed implicitly whenever HABot or ChatGPT is installed.

Contributor

@dilyanpalauzov HLI stuff is in core, and the enhanced HLI stuff will go into core as well, as it is needed by several parts of openHAB: bindings providing HLI implementations, HABot consuming them, and Main UI needing REST APIs (which can only be provided by core).
And not to forget the voice "pipeline" consisting of STT - HLI - TTS, where all three implementations are provided through add-ons and core handles their interaction.

The MQTT Binding and its sub-bindings are something different, as MQTT is already a (big) binding that is connected to openHAB core only through the Thing API. MQTT is way less baked into openHAB than HLI.
And BTW: the MQTT binding uses the openHAB Core MQTT transport: https://github.com/openhab/openhab-core/tree/main/bundles/org.openhab.core.io.transport.mqtt.

@Artur-Fedjukevits
Will take a look once I find the time 👍

@lolodomo added the enhancement and work in progress labels Sep 5, 2025
@lolodomo
Contributor

lolodomo commented Sep 6, 2025

@Artur-Fedjukevits: I added the "work in progress" label after reading the comments. Feel free to remove it when you consider this ready for review again.
