Conversation

@keligggg

Description

Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.

Fixes # (issue)

New Package?

Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?

  • Yes
  • No

Version Bump?

Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)

  • Yes
  • No

Type of Change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.

  • Added new unit/integration tests
  • Added new notebook (that tests end-to-end)
  • I stared at the code and made sure it makes sense

Suggested Checklist:

  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have added Google Colab support for the newly added notebooks.
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I ran make format; make lint to appease the lint gods

nfiacco and others added 30 commits March 31, 2024 10:04
…un-llama#12456)

All of these have now been tested using local notebooks against live data sources

- Convert JSON string to dict and accept dict as param for GCS service account key
- Accept additional file/folder path params for Google Drive and Sharepoint readers during construction
- Correctly add private attributes to SharePointReader
Was using service_account_key_json as a file path instead of the direct
string representation of the service account key.
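
As a rough illustration of the service-account-key handling described above, here is a minimal sketch. The `load_gcs_credentials` helper is hypothetical and not the reader's actual code; it only shows accepting the key as either a JSON string or a dict, rather than treating the string as a path to a key file.

```python
import json
from typing import Union

from google.oauth2 import service_account


def load_gcs_credentials(
    service_account_key: Union[str, dict],
) -> service_account.Credentials:
    """Build GCS credentials from a key passed as a JSON string or a dict.

    The string is the JSON content of the key itself, not a file path.
    """
    if isinstance(service_account_key, str):
        # Normalize the JSON string into a dict before building credentials.
        service_account_key = json.loads(service_account_key)
    return service_account.Credentials.from_service_account_info(service_account_key)
```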
* add span_id to Event

* remove raise err in NullHandler

* wip

* modify root dispatcher event enclosing span

* remove *args as we have bound_args now

* add LLMChatInProgressEvent

* add LLMStructuredPredict Events

* store span_id before await executions

* add SpanDropEvent with err_str payload

* add event to _achat; flush current_span_id when open_spans is empty

* llm callbacks use root span_id

* add unit tests

* remove print statements

* provide a context manager returning a dispatch event partial with the correct span id (see the sketch after this list)

* move to context manager usage

* fix invocation of cm

* define and use get_dispatch_event method

* remove aim tests
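
For orientation only, here is a minimal sketch of the context-manager pattern these commits describe: capture a span id once and hand back a dispatch partial bound to it, so events fired after an await still attach to the right span. The `Dispatcher` class below is a stand-in, not the library's actual implementation; only the `get_dispatch_event` name comes from the commit messages.

```python
import uuid
from contextlib import contextmanager
from functools import partial
from typing import Any, Callable, Dict, Iterator


class Dispatcher:
    """Stand-in dispatcher that routes events to a span by id (illustrative only)."""

    def __init__(self) -> None:
        self.open_spans: Dict[str, str] = {}

    def event(self, event: Any, span_id: str) -> None:
        # A real dispatcher would fan out to registered event handlers.
        print(f"[span {span_id}] {event}")

    @contextmanager
    def get_dispatch_event(self) -> Iterator[Callable[[Any], None]]:
        # Store the span id up front so events dispatched after awaits
        # still belong to the span that was current at call time.
        span_id = str(uuid.uuid4())
        self.open_spans[span_id] = span_id
        try:
            yield partial(self.event, span_id=span_id)
        finally:
            self.open_spans.pop(span_id, None)


dispatcher = Dispatcher()
with dispatcher.get_dispatch_event() as dispatch_event:
    dispatch_event("LLMChatInProgressEvent")
```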
…ne supports it (run-llama#12503)

* delegating retriever api if the query engine supports it (see the sketch after this list)

* add comments

* address lint errors
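
A hedged sketch of the delegation idea in this commit: expose the underlying retriever only when the wrapped query engine actually has one. The attribute names checked here are assumptions for illustration, not the actual query engine API.

```python
from typing import Any, Optional


def get_delegated_retriever(query_engine: Any) -> Optional[Any]:
    """Return the query engine's retriever if it exposes one, else None."""
    for attr in ("retriever", "_retriever"):  # assumed attribute names
        retriever = getattr(query_engine, attr, None)
        if retriever is not None:
            return retriever
    return None


# Usage sketch: prefer the retriever API, fall back to a full query.
# retriever = get_delegated_retriever(engine)
# result = retriever.retrieve("question") if retriever else engine.query("question")
```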
schkovich and others added 29 commits April 18, 2024 21:26
run-llama#12970)

Add new Llama 2 and Mixtral 8x22b models into Llama Index for Fireworks
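
A usage sketch for the new models, assuming the llama-index-llms-fireworks package is installed and FIREWORKS_API_KEY is set in the environment; the model identifier below is illustrative and may differ from the exact ids added in this PR.

```python
from llama_index.llms.fireworks import Fireworks

# Illustrative model id; check the Fireworks catalog for the exact name.
llm = Fireworks(model="accounts/fireworks/models/mixtral-8x22b-instruct")
print(llm.complete("Say hello in one short sentence."))
```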
* cr

* cr

* cr

* cr

* cr

* cr
* cr

* cr

* cr

---------

Co-authored-by: Logan <logan.markewich@live.com>
@keligggg merged commit 5b6ead1 into prod-scriptit Apr 22, 2024