Conversation

openvino-dev-samples (Collaborator)

[image attachment]

ReviewNB: Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter Notebooks.

eaidova (Contributor) commented Nov 20, 2024

@openvino-dev-samples please update test patching:

  File "/tmp/scripts/patch_notebooks.py", line 204, in <module>
    patch_notebooks(args.notebooks_dir, args.test_device, args.skip_ov_install)
  File "/tmp/scripts/patch_notebooks.py", line 181, in patch_notebooks
    raise ValueError(f"Processing {notebookfile} failed: {source_value} does not exist in cell")
ValueError: Processing /tmp/notebooks/llm-agent-react/llm-agent-react-langchain.ipynb failed: mistralai/Mistral-7B-Instruct-v0.3 does not exist in cell
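
For context, the precommit patching step rewrites model IDs inside notebook code cells and raises exactly this error when the expected string is missing. Below is a minimal illustrative sketch of that kind of substitution, assuming a plain string-replacement approach; the function name and replacement map are placeholders, not the actual patch_notebooks.py implementation.

```python
# Illustrative sketch only, not the real patch_notebooks.py logic.
import json
from pathlib import Path

# Assumed replacement map: swap the large model for a small one in CI.
REPLACEMENTS = {
    "mistralai/Mistral-7B-Instruct-v0.3": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
}


def patch_notebook(notebook_path: Path) -> None:
    # Notebooks are JSON; code cells keep their source as a list of lines.
    nb = json.loads(notebook_path.read_text(encoding="utf-8"))
    for old, new in REPLACEMENTS.items():
        found = False
        for cell in nb["cells"]:
            if cell["cell_type"] != "code":
                continue
            if any(old in line for line in cell["source"]):
                cell["source"] = [line.replace(old, new) for line in cell["source"]]
                found = True
        if not found:
            # Mirrors the CI failure above: the value the patcher expects to
            # replace is no longer present in any cell of the notebook.
            raise ValueError(f"Processing {notebook_path} failed: {old} does not exist in cell")
    notebook_path.write_text(json.dumps(nb, indent=1), encoding="utf-8")
```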

Commit: fix ci issues
openvino-dev-samples (Collaborator, Author)

I think a small model may lead to parsing errors in the agent's output.

eaidova (Contributor) commented Nov 20, 2024

@openvino-dev-samples ok, then I think it would be better to remove the patching and disable the notebook for precommit, as we cannot run 7B models due to GitHub limitations (it will still be validated in internal infra with more powerful hardware).

openvino-dev-samples (Collaborator, Author)

I don't know if "TinyLlama/TinyLlama-1.1B-Chat-v1.0" can pass the CI; let me try.

eaidova (Contributor) commented Nov 20, 2024

What about Qwen2.5-1.5B-Instruct? Do you think it is less accurate than TinyLlama?

openvino-dev-samples (Collaborator, Author)

It depends on how well the LLM can follow the prompt's instructions. Let me try TinyLlama first.
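
For reference, switching the notebook to a smaller model is mostly a matter of changing the model ID used for the OpenVINO export. A minimal sketch with the optimum-intel Python API, using the TinyLlama candidate discussed above (the prompt and generation settings are illustrative, not the notebook's actual agent setup); whether such a small model follows the ReAct prompt format reliably enough for the agent is exactly the concern raised earlier.

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

# Candidate small model discussed above; Qwen/Qwen2.5-1.5B-Instruct could be
# tried the same way by changing only this ID.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

# export=True converts the checkpoint to OpenVINO IR on the fly.
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Quick smoke test of instruction following with a short generation.
prompt = "List the tools you can call and when to use them."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```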

openvino-dev-samples force-pushed the agent branch 4 times, most recently from 6346c74 to 683a22b on November 21, 2024 09:22
Commits: replace llm test case (×4), replace optimum-cli method
eaidova merged commit 5a32b8c into openvinotoolkit:latest on Nov 22, 2024. 16 checks passed.