Releases: monocle2ai/monocle
Version 0.4.2
This is a patch release on top of the previous 0.4.1 release, covering the change listed below.
- Add Gemini instrumentation (#220) (see the sketch below)
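Release 0.4.2's only change adds Gemini instrumentation. These notes don't state which Gemini client library #220 targets, so the sketch below is an assumption-laden illustration: it uses the google-genai client and a one-time setup_monocle_telemetry() call (whose import path is also assumed) purely to show the kind of call that would be traced.

```python
# Sketch only: tracing a Gemini call after enabling Monocle.
# Assumptions: the Monocle import path below, and the choice of the
# google-genai client; #220 may target a different Gemini SDK surface.
from monocle_apptrace import setup_monocle_telemetry
from google import genai

setup_monocle_telemetry(workflow_name="gemini_demo")

client = genai.Client()  # reads GEMINI_API_KEY / GOOGLE_API_KEY from the environment
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Say hello in one sentence.",
)
print(response.text)
```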
Version 0.4.1
This is a patch release on top of the previous 0.4.0 release, covering the changes listed below.
- Add exception status code for Boto3 (#211)
- Add exception status code for Anthropic and OpenAI (#210) (see the error-handling sketch after this list)
- Add prompt template info in TeamsAI (#209)
- TeamsAI: added system prompt (#208)
- Add prompt template info in ActionPlanner for TeamsAI (#207)
- Add Teams channel id as scope in MS Teams instrumentation (#206)
- Azure Function wrapper to generate HTTP span (#205)
- Azure AI Inference SDK (#204)
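Items #211 and #210 above attach an exception status code to the span when a provider call fails. The sketch below shows the application-side view using the Anthropic client: the failing call raises as usual, and per these notes the corresponding inference span records the error. The Monocle import path is an assumption; the Anthropic client usage is standard.

```python
# Sketch: a failing provider call still raises normally; per #210/#211 the
# span emitted for it carries the exception status code.
# Assumption: the Monocle import path below.
from monocle_apptrace import setup_monocle_telemetry
import anthropic

setup_monocle_telemetry(workflow_name="error_status_demo")

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
try:
    client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=128,
        messages=[{"role": "user", "content": "Hello"}],
    )
except anthropic.APIStatusError as err:
    # e.g. auth or rate-limit failures; the status code also lands on the span
    print(f"provider call failed with HTTP {err.status_code}")
```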
Version 0.4.0
- Update teams scopes (#200)
- Record input and errors for inference.modelapi in case of error (#193)
- Removed special handling for streaming in wrapper (#192)
- Add Span error handling (#186)
- Add teams ai enhancements (#184)
- Added conversation id in scope for teams ai bot (#180)
- Update inference entity type of TeamsAI SDK (#178)
- Added stream and async for openai (#177) (see the streaming sketch after this list)
- Update inference span of TeamsAI (#176)
- Remove Preset span name and Bugfix for Event (#175)
- Add haystack anthropic sample (#174)
- aiohttp auto instrumentation (#173)
- Add source path to spans and fix json syntax in file exporter (#172)
- Added changes for openai streaming (#171)
- Add llama index anthropic sample (#170)
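Several 0.4.0 items (#177, #171, #192) concern streaming and async OpenAI calls. The sketch below shows the call shape those changes cover: an async, streamed chat completion issued after setup_monocle_telemetry() has been called. The Monocle import path is an assumption; the OpenAI client usage is standard.

```python
import asyncio

# Assumption: the Monocle import path below.
from monocle_apptrace import setup_monocle_telemetry
from openai import AsyncOpenAI

setup_monocle_telemetry(workflow_name="streaming_demo")

async def main() -> None:
    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
    # Async + streaming path: per #177/#171 these calls are traced, and #192
    # removed the wrapper's special-casing of streams.
    stream = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Stream a short greeting"}],
        stream=True,
    )
    async for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

asyncio.run(main())
```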
Version 0.4.0b3
Version 0.4.0b2
Version 0.4.0b1
- Added conversation id in scope for teams ai bot (#180)
- Update inference entity type of TeamsAI SDK (#178)
- Added stream and async for openai (#177)
- Update inference span of TeamsAI (#176)
- Remove Preset span name and Bugfix for Event (#175)
- Add haystack anthropic sample (#174)
- aiohttp auto instrumentation (#173)
- Add source path to spans and fix json syntax in file exporter (#172)
- Added changes for openai streaming (#171)
- Add llama index anthropic sample (#170)
Version 0.3.1
This is a patch release on top of the previous 0.3.0 release, covering the changes listed below.
- Add MetaModel for Anthropic SDK (#159)
- Add OpenAI response for OpenAI and AzureOpenAI (#158)
- Update retrieval span for Boto Client (#157)
- Resolve token threshold error (#156)
- Update Inference Span (#155)
- Refactor workflow and spans (#160)
- Support monocle exporter list as parameter to setup_monocle_telemetry() (#161) (see the exporter sketch after this list)
- Add langchain anthropic sample (#165)
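Item #161 above lets the list of Monocle exporters be passed directly to setup_monocle_telemetry() rather than only via environment configuration. The sketch below illustrates the idea; the parameter name and the exporter identifiers are assumptions taken from the item's description, so check PR #161 for the exact signature and accepted values.

```python
# Illustrative only: the parameter name and exporter identifiers below are
# assumptions based on the #161 description, not a confirmed signature.
from monocle_apptrace import setup_monocle_telemetry

setup_monocle_telemetry(
    workflow_name="exporter_demo",
    # Assumed parameter; it may instead expect a Python list or differently
    # named exporters (e.g. "console", "file", "s3").
    monocle_exporters_list="console,s3",
)
```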
Version 0.3.1b1
Version 0.3.0
- Fixed issue with passing context in async case (#150)
- Added lambda processor (#148)
- Setup package level init scripts to make the monocle import simpler (#147)
- Boto attributes and test cleanup (#146)
- Openai workflow (#142)
- Add input/output for openai embedding (#141)
- Async method and scope fix (#140)
- Bug fix for helper langchain and langgraph (#137)
- Package main script to run any app with monocle instrumentation (#132)
- Add openai api metamodel (#131)
- Support notion of scopes to group traces/spans into logical constructs (#130) (see the scope sketch after this list)
- Add Llamaindex ReAct agent (#127)
- Langchain input fix and S3 exporter prefix support (#126)
- Use standard AWS credential envs (#123)
- Check additional attributes for Azure OpenAI model and consolidate common method in utils (#122)
- Bug fix for accessor (#121)
- Bug fix for empty response (#120)
- Bug fix for inference endpoint (#119)
- Opendal exporter for S3 and Blob (#117)
- Handle specific ModuleNotFoundError exceptions gracefully (#115)
- Adding support for console and memory exporter to list of monocle exporters (#113)
- Add trace id propagation for constant trace id and from request (#111)
- Restructure of monocle code for easy extensibility (#109)
- S3 update filename prefix (#98)
- Update inference span for botocore sagemaker (#93)
- Capturing inference output and token metadata for bedrock (#82)
- Add dev dependency for Mistral AI integration (#81)
- Add VectorStore deployment URL capture support (#80)
- Clean up cloud exporter implementation (#79)
- Capture inference span input/output events attributes (#77)
- Add release automation workflows (#76)
- Fix gaps in Monocle SDK implementation (#72)
- Add kwargs and return value handling in Accessor (#71)
- Update workflow name formatting (#69)
- Implement Haystack metamodel support (#68)
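Item #130 in the 0.3.0 notes introduces scopes for grouping related traces/spans into logical constructs such as a conversation or a request. The sketch below illustrates the concept only: the start_scope/stop_scope names, their signatures, and the import location are assumptions based on that description, so consult PR #130 and the Monocle documentation for the API that actually shipped.

```python
# Conceptual sketch of scopes (#130). Assumptions: the helper names,
# signatures, and import path below stand in for the real API.
from monocle_apptrace import setup_monocle_telemetry, start_scope, stop_scope
from openai import OpenAI

setup_monocle_telemetry(workflow_name="scope_demo")
client = OpenAI()

# Group every span produced in this block under one logical scope,
# e.g. a single user conversation.
token = start_scope("conversation", "conv-42")  # assumed (scope_name, scope_value)
try:
    client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "First turn"}],
    )
finally:
    stop_scope(token)  # assumed to detach the scope started above
```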
Version 0.3.0b7
- Add dev dependency for Mistral AI integration (#81)
- Add VectorStore deployment URL capture support (#80)
- Clean up cloud exporter implementation (#79)
- Capture inference span input/output events attributes (#77)
- Add release automation workflows (#76)
- Fix gaps in Monocle SDK implementation (#72)
- Add kwargs and return value handling in Accessor (#71)
- Update workflow name formatting (#69)
- Implement Haystack metamodel support (#68)