v0.2.0 (2024-06-10)
Full Changelog: v0.1.45...v0.2.0
Features
- add cost estimates and the ability to log arbitrary columns (2137e80)
- add support for the Assistants API (abc667c)
- enable using Openlayer in JS with alternative providers (5a635a1)
- move project and inference pipeline setup to the startMonitoring method (26db616); see the sketch after this list
- add support for the new development workflow (c4034ce)
- various codegen changes (4ec0b86)
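
The startMonitoring entry above consolidates project and inference pipeline setup into a single call. The sketch below shows roughly what that could look like; only the startMonitoring name comes from this changelog, while the import path, option names, and the wrapped OpenAI client are assumptions for illustration, not the library's confirmed API.

```typescript
// Minimal sketch, not the confirmed API: only the startMonitoring name
// is taken from the changelog; everything else here is an assumption.
import OpenAI from "openai";
import { startMonitoring } from "openlayer"; // assumed entry point

startMonitoring({
  // Project and inference pipeline setup now happen here instead of in
  // separate setup calls (assumed option names).
  openAiClient: new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
  openlayerApiKey: process.env.OPENLAYER_API_KEY,
  openlayerProjectName: "my-llm-project",       // hypothetical project name
  openlayerInferencePipelineName: "production", // hypothetical pipeline name
});
```

Under that assumption, requests made through the monitored client would then carry the per-request data described in the other feature entries, such as the cost estimate and any arbitrary columns logged alongside each completion.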
Bug Fixes
- calculate tokens in streaming completions (b7343ca)
- entry point does not resolve with submodules (184f877)
- extraneous warning logs when the inference pipeline cannot be located while monitoring assistant runs (8b81e70)
- incorrect symbol in LangChain example (865abf8)
- ReferenceError when using fetch (017df18)
- rename token count column (66a0b46)
- remove version from data-stream queries (8f418ec)
- round timestamp seconds to the nearest integer (01d442e)
- update config to adopt new prompt tracking schema (5a813c3)
- update examples to fix a race condition when processing multiple requests back-to-back (6e9f9f8); see the sketch after this list
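
One plausible shape of the examples fix above, shown for illustration only: requests are awaited one at a time rather than fired concurrently, so handling of one completion finishes before the next begins. The sendRequest helper below is a placeholder, not part of the library.

```typescript
// Placeholder for a monitored completion call; the helper name and body
// are hypothetical. Only the back-to-back awaiting pattern matters here.
async function sendRequest(prompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

const prompts = ["first question", "second question", "third question"];

// Awaiting each request before starting the next avoids the kind of
// back-to-back race the changelog entry describes (assumed reading).
for (const prompt of prompts) {
  console.log(await sendRequest(prompt));
}
```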
Chores
- add an example module for using Openlayer with LangChain (3860579); see the sketch after this list
- add pricing for the new turbo models (f6ef841)
- add README (cd94cee)
- adopt new schema for managing prompts (160b3a5)
- downgrade node-fetch (b73f237)
- increment npm version (62d890d)
- link to docs in README for example usage (4cff313)
- update doc comments for throwable methods (db16328)
- update logo image in README (89873ec)
- update package version (7b8d549)
- update package version (c50091c)
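
A sketch of how the LangChain example module mentioned above might be used. The OpenlayerHandler name, its import path, and its options are hypothetical; only the existence of an example module for using Openlayer with LangChain is confirmed by the entry.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { OpenlayerHandler } from "openlayer/langchain"; // assumed path

// Hypothetical callback handler that forwards LangChain runs to Openlayer.
const handler = new OpenlayerHandler({
  apiKey: process.env.OPENLAYER_API_KEY, // assumed option name
});

// Registering the handler as a callback would trace chat model calls.
const chat = new ChatOpenAI({ callbacks: [handler] });
const reply = await chat.invoke("What is the capital of France?");
console.log(reply.content);
```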