
Commit

Update flow diagram with Llama 2
kennethleungty committed Jul 19, 2023
1 parent cf92689 commit 586d3c0
Showing 3 changed files with 1 addition and 1 deletion.
README.md: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ ___
- When we host open-source LLMs locally on-premise or in the cloud, the dedicated compute capacity becomes a key issue. While GPU instances may seem the obvious choice, the costs can easily skyrocket beyond budget.
- In this project, we will discover how to run quantized versions of open-source LLMs on local CPU inference for document question-and-answer (Q&A).
<br><br>
-![Alt text](assets/document_qa_flowchart.png)
+![Alt text](assets/diagram_flow.png)
___

## Quickstart
Binary file added assets/diagram_flow.png
Binary file removed assets/document_qa_flowchart.png
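
The README excerpt in the diff above refers to running quantized open-source LLMs with CPU-only inference. A minimal sketch of what that can look like, assuming a GGML-quantized Llama 2 model file loaded through the ctransformers library (the model filename and generation settings here are illustrative assumptions, not part of this commit):

```python
# Minimal sketch: CPU-only inference with a GGML-quantized Llama 2 model.
# The model path and generation settings below are illustrative assumptions.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "models/llama-2-7b-chat.ggmlv3.q8_0.bin",  # assumed local GGML model file
    model_type="llama",
    max_new_tokens=256,
    temperature=0.01,
)

# Run a single prompt on the CPU and print the generated text
print(llm("What is this document about?"))
```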

0 comments on commit 586d3c0
