Improvements #35

Merged
merged 30 commits on Nov 18, 2023

Commits
d1c7518  chat with docs (shreyaskarnik, Nov 7, 2023)
1e1f474  Merge branch 'main' of github.com:shreyaskarnik/DistiLlama into talk-… (shreyaskarnik, Nov 7, 2023)
a4c25a4  fix deps (shreyaskarnik, Nov 7, 2023)
85c1322  sort imports (shreyaskarnik, Nov 7, 2023)
303ae10  pnpm (shreyaskarnik, Nov 7, 2023)
1273800  better code organization (shreyaskarnik, Nov 8, 2023)
291db81  better code organization (shreyaskarnik, Nov 8, 2023)
bc1e7a3  chat function (shreyaskarnik, Nov 9, 2023)
c71f46f  simplification (shreyaskarnik, Nov 9, 2023)
a08fd20  merge (shreyaskarnik, Nov 14, 2023)
69de73b  Merge branch 'main' of github.com:shreyaskarnik/DistiLlama into talk-… (shreyaskarnik, Nov 14, 2023)
7c73b4f  more enhancements (shreyaskarnik, Nov 14, 2023)
0f35a62  more enhancements (shreyaskarnik, Nov 15, 2023)
a3acb8a  more docs (shreyaskarnik, Nov 15, 2023)
29bef00  more docs (shreyaskarnik, Nov 15, 2023)
a51bc2a  more docs (shreyaskarnik, Nov 15, 2023)
f5c14b5  stashing (shreyaskarnik, Nov 16, 2023)
6e88d70  stashing (shreyaskarnik, Nov 16, 2023)
3e6dd25  more UI enhancements (shreyaskarnik, Nov 16, 2023)
28bc130  more UI enhancements (shreyaskarnik, Nov 16, 2023)
33fa20c  more UI enhancements (shreyaskarnik, Nov 17, 2023)
7bd6fcc  more UI enhancements (shreyaskarnik, Nov 17, 2023)
ef4059a  more UI enhancements (shreyaskarnik, Nov 17, 2023)
2ba953b  streaming output when possible (shreyaskarnik, Nov 17, 2023)
5cd8515  Merge branch 'main' of github.com:shreyaskarnik/DistiLlama into talk-… (shreyaskarnik, Nov 17, 2023)
bf989fb  more changes (shreyaskarnik, Nov 17, 2023)
35d0da6  more changes (shreyaskarnik, Nov 17, 2023)
44c29fb  more changes (shreyaskarnik, Nov 17, 2023)
98a6c01  pass in metadata (shreyaskarnik, Nov 18, 2023)
49655d1  Merge branch 'main' into talk-to-doc (shreyaskarnik, Nov 18, 2023)

5 changes: 4 additions & 1 deletion src/pages/sidePanel/PageSummary.tsx
@@ -14,7 +14,10 @@ export default function PageSummary({ loading, summary, taskType }) {
</div>
) : summary ? (
<div>
<div className="content-box">{summary.text}</div>
<div className="content-box">
<h2 className="summary-title">{summary.title}</h2>
<div className="summary-body">{summary.text}</div>
</div>
<div className="form-container">
<PageMetadata metadata={summary} taskType={taskType} />
</div>
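
For readers skimming the diff: the content box previously rendered summary.text on its own; it now places a title heading above the summary body. A minimal standalone sketch of that render branch, with the loading branch and PageMetadata omitted (the component and type names here are illustrative; only the class names and summary fields come from the diff):

```tsx
import React from 'react';

// Sketch only: a stripped-down stand-in for PageSummary's summary branch.
// `summary.title` is the newly threaded page title; `summary.text` is the
// generated summary, now wrapped in .summary-body instead of sitting
// directly inside .content-box.
type Summary = { title?: string; text: string };

export function SummaryView({ summary }: { summary: Summary }) {
  return (
    <div className="content-box">
      <h2 className="summary-title">{summary.title}</h2>
      <div className="summary-body">{summary.text}</div>
    </div>
  );
}
```
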
39 changes: 34 additions & 5 deletions src/pages/sidePanel/QandA.ts
@@ -27,7 +27,7 @@ export type ConversationalRetrievalQAChainInput = {
async function setupVectorstore(selectedModel) {
console.log('Setting up vectorstore', selectedModel);
const embeddings = new HuggingFaceTransformersEmbeddings({
modelName: 'Supabase/gte-small',
modelName: 'Xenova/jina-embeddings-v2-small-en',
});
const voyClient = new VoyClient();
return new VoyVectorStore(voyClient, embeddings);
@@ -49,6 +49,16 @@ export async function embedDocs(selectedModel, localFile): Promise<EmbedDocsOutp
documents.push(
new Document({
pageContent: pageContent.textContent,
metadata: {
pageURL: pageContent.pageURL,
title: pageContent.title,
length: pageContent.length,
excerpt: pageContent.excerpt,
byline: pageContent.byline,
dir: pageContent.dir,
siteName: pageContent.siteName,
lang: pageContent.lang,
},
}),
);
} else {
@@ -77,8 +87,6 @@ export async function* talkToDocument(selectedModel, vectorStore, input: Convers
console.log('chat_history', input.chat_history);
console.log('vectorStore', vectorStore);
const retriever = vectorStore.asRetriever();
const context = retriever.pipe(formatDocumentsAsString);
console.log('context', context);
const condenseQuestionTemplate = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.

Chat History:
@@ -91,8 +99,15 @@
Do not use any other sources of information.
Do not provide any answer that is not based on the context.
If there is no answer, type "Not sure based on the context".
Additionally you will be given metadata like
title,content,length,excerpt,byline,dir,siteName,lang
in the metadata field. Use this information to help you answer the question.

{context}

Metadata:
{metadata}

Question: {question}
Answer:
`);
@@ -109,6 +124,7 @@ export async function* talkToDocument(selectedModel, vectorStore, input: Convers
{
context: retriever.pipe(formatDocumentsAsString),
question: new RunnablePassthrough(),
metadata: retriever.pipe(documents => getMetadataString(documents[0].metadata)),
},
prompt,
llm,
@@ -122,6 +138,20 @@ export async function* talkToDocument(selectedModel, vectorStore, input: Convers
}
}

function getMetadataString(metadata) {
const result = [];

for (const key in metadata) {
// Check if the property is not an object and not an array
if (Object.prototype.hasOwnProperty.call(metadata, key) && typeof metadata[key] !== 'object') {
result.push(`${key}: ${metadata[key]}`);
}
}
console.log('result', result);

return result.join(' ');
}

export const formatChatHistory = (chatHistory: { question: string; answer: string }[]) => {
console.log('chatHistory', chatHistory);
const formattedDialogueTurns = chatHistory.map(
@@ -174,7 +204,7 @@ export async function* chatWithLLM(selectedModel, input: ConversationalRetrieval
const llm = new ChatOllama({
baseUrl: OLLAMA_BASE_URL,
model: selectedModel,
temperature: 0,
temperature: 0.3,
});
const chatPrompt = ChatPromptTemplate.fromMessages([
[
@@ -207,7 +237,6 @@ export async function* chatWithLLM(selectedModel, input: ConversationalRetrieval
});

for await (const chunk of stream) {
console.log('chunk', chunk);
yield chunk.response;
}
}
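
Taken together, the QandA.ts changes do two things: embedDocs now attaches the Readability-derived page fields (title, byline, siteName, and so on) as Document metadata, and talkToDocument pipes the top retrieved document through getMetadataString so the prompt receives a flattened {metadata} string next to {context} and {question}. A minimal, dependency-free sketch of that flattening step, assuming a metadata object shaped like the one built in embedDocs (everything other than the helper's logic is illustrative):

```ts
// Sketch only: a typed variant of the getMetadataString helper added in this
// PR, showing roughly what reaches the {metadata} slot of the prompt template.
type PageMetadata = {
  pageURL: string;
  title: string;
  length: number;
  excerpt: string;
  byline: string | null;
  dir: string | null;
  siteName: string | null;
  lang: string;
};

function getMetadataString(metadata: PageMetadata): string {
  const parts: string[] = [];
  for (const [key, value] of Object.entries(metadata)) {
    // Only flat, non-null values make it into the prompt; the PR's helper
    // skips anything whose typeof is 'object', which also covers null.
    if (value !== null && typeof value !== 'object') {
      parts.push(`${key}: ${value}`);
    }
  }
  return parts.join(' ');
}

// Illustrative input: the same fields embedDocs stores on each Document.
const metadata: PageMetadata = {
  pageURL: 'https://example.com/article',
  title: 'An example article',
  length: 4231,
  excerpt: 'A short excerpt of the article…',
  byline: 'Jane Doe',
  dir: 'ltr',
  siteName: 'Example',
  lang: 'en',
};

// In the chain this is wired as, roughly:
//   metadata: retriever.pipe(documents => getMetadataString(documents[0].metadata))
// so only the top-ranked document's metadata is injected into the prompt.
console.log(getMetadataString(metadata));
// "pageURL: https://example.com/article title: An example article length: 4231 ... lang: en"
```
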
11 changes: 11 additions & 0 deletions src/pages/sidePanel/SidePanel.css
@@ -544,3 +544,14 @@
.spin {
animation: spin 2s linear infinite;
}

.summary-title {
color: #61dafb;
/* Or any color that fits your design */
margin: 20px 0;
/* Adjust margin as needed */
font-size: calc(10px + 2vmin);
/* Adjust font size as needed */
text-align: center;
/* If you want to center the title */
}
2 changes: 2 additions & 0 deletions src/pages/sidePanel/Summarize.ts
@@ -5,6 +5,7 @@ import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';
import { OLLAMA_BASE_URL } from '@src/pages/sidePanel/QandA';

export type SummarizationResponse = {
title?: string;
text: string;
pageURL: string;
tabID?: number;
@@ -33,6 +34,7 @@ async function summarizeCurrentPage(selectedModel) {
input_documents: docs,
});
return {
title: pageContent.title,
text: response.text,
pageURL: pageContent.pageURL,
tabID: pageContent.tabID,
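
End to end, this last change is what feeds the new title heading in PageSummary.tsx: the extracted page title is carried on the summarization result instead of being dropped after extraction. A small sketch of the updated shape and how the return value is assembled (the pageContent parameter stands in for the extraction result used in Summarize.ts; its exact type and the helper name are assumptions, not part of the diff):

```ts
// Sketch only: the summarization result now includes the page title.
export type SummarizationResponse = {
  title?: string;  // newly added; rendered as the .summary-title heading
  text: string;    // the LLM-generated summary
  pageURL: string;
  tabID?: number;
};

// Illustrative assembly, mirroring the return statement in summarizeCurrentPage.
function buildSummarizationResponse(
  pageContent: { title: string; pageURL: string; tabID?: number },
  summaryText: string,
): SummarizationResponse {
  return {
    title: pageContent.title,
    text: summaryText,
    pageURL: pageContent.pageURL,
    tabID: pageContent.tabID,
  };
}
```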