Releases · posit-dev/chatlas
chatlas 0.9.1
chatlas 0.9.0
New features
- `Chat` gains a handful of new methods (see the sketch after this list):
  - `.register_mcp_tools_http_stream_async()` and `.register_mcp_tools_stdio_async()`: for registering tools from an MCP server. (#39)
  - `.get_tools()` and `.set_tools()`: for fine-grained control over registered tools. (#39)
  - `.set_model_params()`: for setting common LLM parameters in a model-agnostic fashion. (#127)
  - `.get_cost()`: to get the estimated cost of the chat. Only popular models are supported, but you can also supply your own token prices. (#106)
  - `.add_turn()`: to add `Turn`(s) to the current chat history. (#126)
- Tool functions passed to `.register_tool()` can now `yield` numerous results. (#39)
- A `ContentToolResultImage` content class was added for returning images from tools. It currently only works with `ChatAnthropic`. (#39)
- A `Tool` can now be constructed from a pre-existing tool schema (via a new `__init__` method). (#39)
- The `Chat.app()` method gains a `host` parameter. (#122)
- `ChatGithub()` now supports the more standard `GITHUB_TOKEN` environment variable for storing the API key. (#123)
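A minimal sketch of a couple of these additions, using `ChatOpenAI()`. The parameter name passed to `.set_model_params()` is an assumption; consult the reference docs for the full signature:

```python
from chatlas import ChatOpenAI

chat = ChatOpenAI()

# Set common LLM parameters without reaching for provider-specific kwargs.
# (The parameter shown here is illustrative; see the docs for the full set.)
chat.set_model_params(temperature=0.2)

chat.chat("What's the tallest mountain in the world?")

# Estimated cost of the chat so far. Only popular models have built-in
# prices; otherwise, supply your own token prices.
print(chat.get_cost())
```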
Changes
Breaking Changes
- `Chat` constructors (`ChatOpenAI()`, `ChatAnthropic()`, etc.) no longer have a `turns` keyword parameter. Use the `.set_turns()` method instead to set the (initial) chat history, as sketched below. (#126)
- `Chat`'s `.tokens()` methods have been removed in favor of `.get_tokens()`, which returns both cumulative tokens in the turn and discrete tokens. (#106)
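A rough migration sketch for both changes; the `Turn(role=..., contents=...)` construction is an assumption about that class's signature:

```python
from chatlas import ChatOpenAI, Turn

# Before 0.9.0: ChatOpenAI(turns=[...])
# Now: construct the chat, then set the initial history explicitly.
chat = ChatOpenAI()
chat.set_turns([
    Turn(role="user", contents="Hello!"),
    Turn(role="assistant", contents="Hi there! How can I help?"),
])

# .tokens() is gone; .get_tokens() reports cumulative and discrete counts.
print(chat.get_tokens())
```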
Other Changes
- `Tool`'s constructor no longer takes a function as input. Use the new `.from_func()` method instead to create a `Tool` from a function. (#39)
- `.register_tool()` now throws an exception when the tool has the same name as an already registered tool. Set the new `force` parameter to `True` to force the registration (see the sketch below). (#39)
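A small sketch of both changes, using a hypothetical `get_current_time` tool:

```python
from datetime import datetime

from chatlas import ChatOpenAI, Tool

def get_current_time() -> str:
    """Hypothetical tool: return the current time as an ISO-8601 string."""
    return datetime.now().isoformat()

# Tool() no longer accepts a function directly; use .from_func() instead.
tool = Tool.from_func(get_current_time)

chat = ChatOpenAI()
chat.register_tool(get_current_time)

# Registering a tool with the same name now raises unless force=True.
chat.register_tool(get_current_time, force=True)
```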
Improvements
- `ChatGoogle()` and `ChatVertex()` now default to Gemini 2.5 (instead of 2.0). (#125)
- `ChatOpenAI()` and `ChatGithub()` now default to GPT 4.1 (instead of 4o). (#115)
- `ChatAnthropic()` now supports `content_image_url()` (see the example below). (#112)
- HTML styling improvements for `ContentToolResult` and `ContentToolRequest`. (#39)
- `Chat`'s representation now includes cost information if it can be calculated. (#106)
- `token_usage()` includes cost if it can be calculated. (#106)
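For example, passing a remote image to Claude (the URL below is a placeholder):

```python
from chatlas import ChatAnthropic, content_image_url

chat = ChatAnthropic()
chat.chat(
    "Describe what you see in this image.",
    # Remote image URLs are now supported by the Anthropic provider.
    content_image_url("https://example.com/photo.png"),
)
```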
Bug fixes
- Fixed an issue where `httpx` client customization (e.g., `ChatOpenAI(kwargs={"http_client": httpx.Client()})`) wasn't working as expected (see below). (#108)
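A sketch of the pattern that now works again; the proxy URL is just an illustration:

```python
import httpx

from chatlas import ChatOpenAI

# Route the underlying openai SDK through a custom httpx client
# (e.g., to set a proxy or custom timeouts).
chat = ChatOpenAI(
    kwargs={"http_client": httpx.Client(proxy="http://localhost:8080")},
)
chat.chat("Hello!")
```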
Developer APIs
- The base `Provider` class now includes a `name` and `model` property. In order for them to work properly, provider implementations should pass a `name` and `model` along to the `__init__()` method. (#106)
- `Provider` implementations must implement two new abstract methods: `translate_model_params()` and `supported_model_params()`.
chatlas 0.8.1
- Fixed `@overload` definitions for `.stream()` and `.stream_async()`.
chatlas 0.8.0
New features
- New `.on_tool_request()` and `.on_tool_result()` methods register callbacks that fire when a tool is requested or produces a result. These callbacks can be used to implement custom logging or other actions when tools are called, without modifying the tool function (see the sketch after this list). (#101)
- A new `ToolRejectError` exception can be thrown from tool request/result callbacks, or from within a tool function itself, to prevent the tool from executing. Moreover, this exception provides some context for the LLM to know that the tool didn't produce a result because it was rejected. (#101)
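A minimal sketch of both features. The callback's attribute access (`request.name`, `request.arguments`) and the message argument to `ToolRejectError` are assumptions about the API shape:

```python
import os

from chatlas import ChatOpenAI, ToolRejectError

def list_files(path: str) -> list[str]:
    """Hypothetical tool: list the files under a directory."""
    return os.listdir(path)

chat = ChatOpenAI()
chat.register_tool(list_files)

def log_request(request):
    # Log every tool request, and reject anything touching a sensitive path.
    print(f"Tool requested: {request.name}({request.arguments})")
    if "secrets" in str(request.arguments):
        raise ToolRejectError("Access to that path is not allowed.")

chat.on_tool_request(log_request)
chat.on_tool_result(lambda result: print("Tool call finished."))

chat.chat("What files are in the current directory?")
```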
Improvements
- The `CHATLAS_LOG` environment variable now enables logs for the relevant model provider. It now also supports a level of `debug` in addition to `info`. (#97)
- `ChatSnowflake()` now supports tool calling. (#98)
- `Chat` instances can now be deep copied, which is useful for forking the chat session (see the sketch below). (#96)
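For instance, forking a session with the standard-library `copy` module:

```python
import copy

from chatlas import ChatOpenAI

chat = ChatOpenAI()
chat.chat("Help me plan a week-long trip to Norway.")

# Fork the session: the copy starts from the same history but
# evolves independently of the original.
fork = copy.deepcopy(chat)
fork.chat("Actually, let's make it Sweden instead.")
```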
Changes
- `ChatDatabricks()`'s `model` now defaults to `databricks-claude-3-7-sonnet` instead of `databricks-dbrx-instruct`. (#95)
- `ChatSnowflake()`'s `model` now defaults to `claude-3-7-sonnet` instead of `llama3.1-70b`. (#98)
Bug fixes
- Fixed an issue where `ChatDatabricks()` with an Anthropic `model` wasn't handling empty-string responses gracefully. (#95)
chatlas 0.7.1
- Added `openai` as a hard dependency, making installation easier for a wide range of use cases. (#91)
chatlas 0.7.0
New features
- Added `ChatDatabricks()`, for chatting with Databricks' foundation models. (#82)
- `.stream()` and `.stream_async()` gain a `content` argument. Set this to `"all"` to include `ContentToolResult`/`ContentToolRequest` objects in the stream (see the sketch after this list). (#75)
- `ContentToolResult`/`ContentToolRequest` are now exported to the `chatlas` namespace. (#75)
- `ContentToolResult`/`ContentToolRequest` gain a `.tagify()` method so they render sensibly in a Shiny app. (#75)
- A tool can now return a `ContentToolResult`. This is useful for customizing how the result is formatted for the model and how it is rendered in the display.
- `Chat` gains a new `.current_display` property. When a `.chat()` or `.stream()` is currently active, this property returns an object with an `.echo()` method (to echo new content to the display). This is primarily useful for displaying custom content during a tool call. (#79)
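A sketch of streaming with `content="all"`, using a hypothetical `get_weather` tool:

```python
from chatlas import ChatOpenAI, ContentToolRequest, ContentToolResult

def get_weather(city: str) -> str:
    """Hypothetical tool: return a canned weather report for a city."""
    return f"It's sunny in {city}."

chat = ChatOpenAI()
chat.register_tool(get_weather)

# With content="all", tool requests/results show up in the stream
# alongside the usual text chunks.
for chunk in chat.stream("What's the weather in Oslo?", content="all"):
    if isinstance(chunk, (ContentToolRequest, ContentToolResult)):
        print(f"\n[tool event] {chunk!r}")
    else:
        print(chunk, end="")
```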
Improvements
- When a tool call ends in failure, a warning is now raised and the stacktrace is printed. (#79)
- Several improvements to `ChatSnowflake()`.
- `ChatAnthropic()` no longer chokes after receiving an output that consists only of whitespace. (#86)
- `orjson` is now used for JSON loading and dumping. (#87)
Changes
- The `echo` argument of the `.chat()` method defaults to a new value of `"output"`. As a result, tool requests and results are now echoed by default. To revert to the previous behavior, set `echo="text"`. (#78)
- Tool results are now dumped to JSON by default before being sent to the model. To revert to the previous behavior, have the tool return a `ContentToolResult` with `model_format="str"` (see the sketch below). (#87)
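A sketch of opting back into the old behaviors; the `value=` keyword on `ContentToolResult` is an assumption about its constructor:

```python
from chatlas import ChatOpenAI, ContentToolResult

chat = ChatOpenAI()

# Text-only echoing, as in pre-0.7.0 releases.
chat.chat("Hello!", echo="text")

def lookup_user(user_id: int) -> ContentToolResult:
    """Hypothetical tool that sends its result to the model as a plain string."""
    return ContentToolResult(
        value=f"User {user_id}: Ada Lovelace",
        model_format="str",
    )

chat.register_tool(lookup_user)
```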
Breaking changes
- The `.export()` method's `include` argument has been renamed to `content` (to match `.stream()`). (#75)
chatlas 0.6.1
Bug fixes
- Fixed a missing dependency on the `requests` package.
chatlas 0.6.0
chatlas 0.5.0
New features
- Added a `ChatSnowflake()` class to interact with Snowflake Cortex LLM. (#54)
- Added a `ChatAuto()` class, allowing for configuration of chat providers and models via environment variables (see the sketch below). (#38, thanks @mconflitti-pbc)
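A sketch of the `ChatAuto()` pattern; the environment variable names shown here are assumptions, so check the `ChatAuto()` documentation for the exact ones:

```python
import os

from chatlas import ChatAuto

# Pick the provider/model from the environment rather than hard-coding it.
# (Variable names are illustrative; see the ChatAuto() documentation.)
os.environ["CHATLAS_CHAT_PROVIDER"] = "anthropic"
os.environ["CHATLAS_CHAT_ARGS"] = '{"model": "claude-3-7-sonnet-latest"}'

chat = ChatAuto()
chat.chat("Hello!")
```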
Improvements
- Updated `ChatAnthropic()`'s `model` default to `"claude-3-7-sonnet-latest"`. (#62)
- The version is now accessible as `chatlas.__version__`. (#64)
- All provider-specific `Chat` subclasses now have an associated extra in chatlas. For example, `ChatOpenAI` has `chatlas[openai]`, `ChatPerplexity` has `chatlas[perplexity]`, `ChatBedrockAnthropic` has `chatlas[bedrock-anthropic]`, and so forth for the other `Chat` classes. (#66)
Bug fixes
chatlas 0.4.0
New features
- Added a `ChatVertex()` class to interact with Google Cloud's Vertex AI. (#50)
- Added `.app(*, echo=)` support. This allows chatlas to change the echo behavior when running the Shiny app (see the sketch below). (#31)
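For example (a minimal sketch; the `echo` value shown is illustrative, so see `.app()`'s docs for the accepted values):

```python
from chatlas import ChatOpenAI

chat = ChatOpenAI()

# Launch the built-in Shiny app, overriding the echo behavior.
chat.app(echo="text")
```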
Improvements
- Migrated `ChatGoogle()`'s underlying Python SDK from `google-generativeai` to `google-genai`. As a result, streaming tools are now working properly. (#50)
Bug fixes
- Fixed a bug where synchronous chat tools would not work properly when used in a `_async()` context. (#56)
- Fixed `Chat`'s broken Shiny app when `.app(*, stream=True)` by using async chat tools. (#31)
- Updated formatting of exported markdown to use `repr()` instead of `str()` when exporting tool call results. (#30)