Exceptions #22

Open
Philomath88 opened this issue Apr 24, 2023 · 1 comment

Comments
@Philomath88

Running auto_llama with the prompt below crashes with an OutputParserException:

I want you to write a summary of how I can store the SP-API documentation using LlamaIndex

Thinking...
Traceback (most recent call last):
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/site-packages/langchain/output_parsers/pydantic.py", line 25, in parse
    json_object = json.loads(json_str)
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/main.py", line 52, in <module>
    main()
  File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/main.py", line 31, in main
    response = agent.get_response()
  File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/agent.py", line 54, in get_response
    response_obj = parser.parse(output.content)
  File "/Users/maximiliandoelle/miniconda3/lib/python3.10/site-packages/langchain/output_parsers/pydantic.py", line 31, in parse
    raise OutputParserException(msg)
langchain.schema.OutputParserException: Failed to parse Response from completion To store the SP-API documentation, you can use the following steps:

  1. Search for the SP-API documentation on the web using the "search" command with the search terms "SP-API documentation".

  2. Download the contents of the web page using the "download" command with the URL of the documentation page and a name for the downloaded document.

  3. Query the downloaded document to extract the relevant information using the "query" command with the name of the downloaded document and a query string that specifies the information you want to extract.

  4. Write the extracted information to a file using the "write" command with a file name and the extracted data.

By following these steps, you can store the SP-API documentation in a file for future refe
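
The completion above is plain prose rather than JSON, so langchain's PydanticOutputParser has nothing to json.loads() and raises. The failure can be reproduced outside the agent loop; this is only a sketch, and the Response fields below are made up for illustration (auto_llama's real pydantic model may differ):

```python
from langchain.output_parsers import PydanticOutputParser
from langchain.schema import OutputParserException
from pydantic import BaseModel


class Response(BaseModel):
    # Hypothetical fields for illustration only.
    thoughts: str
    command: str


parser = PydanticOutputParser(pydantic_object=Response)

# The model replied with prose instead of JSON, so json.loads() fails
# and langchain wraps it in an OutputParserException.
completion = "To store the SP-API documentation, you can use the following steps: ..."

try:
    parser.parse(completion)
except OutputParserException as e:
    print(e)  # Failed to parse Response from completion ...
```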

@logan-markewich
Contributor

Yeah, this is an issue with the LLM not predicting properly formatted output, and langchain barfing on that. We could definitely have better error handling for this; would love a PR (could be for auto_llama or llama-agi!)
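
One possible shape for that error handling, sketched under assumptions (the retry loop, MAX_RETRIES, and the plain llm callable are illustrative, not auto_llama's actual interface): catch OutputParserException, feed the parse error back to the model, and retry a couple of times before giving up.

```python
from langchain.schema import OutputParserException

MAX_RETRIES = 2


def get_parsed_response(llm, prompt, parser):
    """Retry the LLM call when its output can't be parsed, instead of
    letting OutputParserException bubble up and kill the run."""
    last_error = None
    for _ in range(MAX_RETRIES + 1):
        output = llm(prompt)
        try:
            return parser.parse(output)
        except OutputParserException as e:
            last_error = e
            # Feed the parse error back so the model can fix its formatting.
            prompt = (
                prompt
                + "\n\nYour previous reply could not be parsed ("
                + str(e)
                + "). Reply with valid JSON only."
            )
    raise last_error
```

If the installed langchain version has it, OutputFixingParser.from_llm(parser=..., llm=...) wraps a parser and an LLM to do a similar retry automatically, which might be a lighter-weight fix.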
