1 parent 35ef8d7 · commit 6881b5c
Showing 4 changed files with 180 additions and 0 deletions.
@@ -0,0 +1,2 @@
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=fake-key
README.md
@@ -0,0 +1,51 @@
## 📰 Multi-agent AI news assistant

This Streamlit application implements a news processing pipeline built from multiple specialized AI agents that search, synthesize, and summarize news articles. It leverages the Llama 3.2 model via Ollama and DuckDuckGo search to provide comprehensive news analysis.

### Features

- Multi-agent architecture with specialized roles:
  - News Searcher: Finds recent news articles
  - News Synthesizer: Analyzes and combines information
  - News Summarizer: Creates concise, professional summaries

- Real-time news search using DuckDuckGo
- AP/Reuters-style summary generation
- User-friendly Streamlit interface

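The three agents hand their output to each other in sequence: the searcher's results feed the synthesizer, whose synthesis feeds the summarizer. A minimal sketch of that hand-off, following the same Swarm pattern used in `news_agent.py` from this commit (agent instructions are shortened here purely for illustration):

```python
from swarm import Swarm, Agent

client = Swarm()  # talks to the OpenAI-compatible endpoint set via OPENAI_BASE_URL (Ollama)

searcher = Agent(name="News Searcher",
                 instructions="Find recent news on the topic.",
                 model="llama3.2:latest")
synthesizer = Agent(name="News Synthesizer",
                    instructions="Combine the articles into a short brief.",
                    model="llama3.2:latest")

# Stage 1: search
search = client.run(agent=searcher,
                    messages=[{"role": "user", "content": "Find recent news about AI"}])

# Stage 2: pass the last message from stage 1 into the next agent
brief = client.run(agent=synthesizer,
                   messages=[{"role": "user",
                              "content": f"Synthesize:\n{search.messages[-1]['content']}"}])
print(brief.messages[-1]["content"])
```
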
### How to get started?

1. Clone the GitHub repository
```bash
git clone https://github.com/your-username/ai-news-processor.git
cd local_news_agent_openai_swarm
```

2. Install the required dependencies:

```bash
pip install -r requirements.txt
```

3. Pull and Run Llama 3.2 using Ollama:

```bash
# Pull the model
ollama pull llama3.2

# Verify installation
ollama list

# Run the model (optional test)
ollama run llama3.2
```

4. Create a .env file with your configurations:
```bash
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=fake-key
```
5. Run the Streamlit app
```bash
streamlit run news_agent.py
```
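
Before launching the app, you can optionally check that the values in `.env` actually reach the local Ollama server. A minimal sanity check, assuming the `openai` package (pulled in as a Swarm dependency) and that Ollama is serving `llama3.2`:

```python
# sanity_check.py (hypothetical helper, not part of this commit)
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()      # reads OPENAI_BASE_URL and OPENAI_API_KEY from .env
client = OpenAI()  # the client picks both values up from the environment

# A tiny chat completion against Ollama's OpenAI-compatible endpoint
reply = client.chat.completions.create(
    model="llama3.2:latest",
    messages=[{"role": "user", "content": "Reply with the single word: ready"}],
)
print(reply.choices[0].message.content)
```

If this prints a response, the Streamlit app should be able to reach the model as well.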
news_agent.py
@@ -0,0 +1,124 @@
import streamlit as st
from duckduckgo_search import DDGS
from swarm import Swarm, Agent
from datetime import datetime
from dotenv import load_dotenv

load_dotenv()
MODEL = "llama3.2:latest"
client = Swarm()

st.set_page_config(page_title="AI News Processor", page_icon="📰")
st.title("📰 News Inshorts Agent")

def search_news(topic):
    """Search for news articles using DuckDuckGo"""
    with DDGS() as ddg:
        results = ddg.text(f"{topic} news {datetime.now().strftime('%Y-%m')}", max_results=3)
        if results:
            news_results = "\n\n".join([
                f"Title: {result['title']}\nURL: {result['href']}\nSummary: {result['body']}"
                for result in results
            ])
            return news_results
        return f"No news found for {topic}."

# Create specialized agents
search_agent = Agent(
    name="News Searcher",
    instructions="""
    You are a news search specialist. Your task is to:
    1. Search for the most relevant and recent news on the given topic
    2. Ensure the results are from reputable sources
    3. Return the raw search results in a structured format
    """,
    functions=[search_news],
    model=MODEL
)

synthesis_agent = Agent(
    name="News Synthesizer",
    instructions="""
    You are a news synthesis expert. Your task is to:
    1. Analyze the raw news articles provided
    2. Identify the key themes and important information
    3. Combine information from multiple sources
    4. Create a comprehensive but concise synthesis
    5. Focus on facts and maintain journalistic objectivity
    6. Write in a clear, professional style
    Provide a 2-3 paragraph synthesis of the main points.
    """,
    model=MODEL
)

summary_agent = Agent(
    name="News Summarizer",
    instructions="""
    You are an expert news summarizer combining AP and Reuters style clarity with digital-age brevity.
    Your task:
    1. Core Information:
       - Lead with the most newsworthy development
       - Include key stakeholders and their actions
       - Add critical numbers/data if relevant
       - Explain why this matters now
       - Mention immediate implications
    2. Style Guidelines:
       - Use strong, active verbs
       - Be specific, not general
       - Maintain journalistic objectivity
       - Make every word count
       - Explain technical terms if necessary
    Format: Create a single paragraph of 250-400 words that informs and engages.
    Pattern: [Major News] + [Key Details/Data] + [Why It Matters/What's Next]
    Focus on answering: What happened? Why is it significant? What's the impact?
    IMPORTANT: Provide ONLY the summary paragraph. Do not include any introductory phrases,
    labels, or meta-text like "Here's a summary" or "In AP/Reuters style."
    Start directly with the news content.
    """,
    model=MODEL
)

def process_news(topic):
    """Run the news processing workflow"""
    with st.status("Processing news...", expanded=True) as status:
        # Search
        status.write("🔍 Searching for news...")
        search_response = client.run(
            agent=search_agent,
            messages=[{"role": "user", "content": f"Find recent news about {topic}"}]
        )
        raw_news = search_response.messages[-1]["content"]

        # Synthesize
        status.write("🔄 Synthesizing information...")
        synthesis_response = client.run(
            agent=synthesis_agent,
            messages=[{"role": "user", "content": f"Synthesize these news articles:\n{raw_news}"}]
        )
        synthesized_news = synthesis_response.messages[-1]["content"]

        # Summarize
        status.write("📝 Creating summary...")
        summary_response = client.run(
            agent=summary_agent,
            messages=[{"role": "user", "content": f"Summarize this synthesis:\n{synthesized_news}"}]
        )
        return raw_news, synthesized_news, summary_response.messages[-1]["content"]

# User Interface
topic = st.text_input("Enter news topic:", value="artificial intelligence")
if st.button("Process News", type="primary"):
    if topic:
        try:
            raw_news, synthesized_news, final_summary = process_news(topic)
            st.header(f"📝 News Summary: {topic}")
            st.markdown(final_summary)
        except Exception as e:
            st.error(f"An error occurred: {str(e)}")
    else:
        st.error("Please enter a topic!")
requirements.txt
@@ -0,0 +1,3 @@
git+https://github.com/openai/swarm.git
streamlit
duckduckgo-search
python-dotenv  # imported by news_agent.py (load_dotenv); not pulled in by the packages above