Commit c4d259a

Add motia content creation workflow
1 parent d7228ba commit c4d259a

File tree: 13 files changed (+780, −0)
motia-content-creation/.env.example

Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
OPENAI_API_KEY=your_openai_api_key
FIRECRAWL_API_KEY=your_firecrawl_api_key
TYPEFULLY_API_KEY=your_typefully_api_key
MOTIA_PORT=3000

motia-content-creation/.gitignore

Lines changed: 109 additions & 0 deletions
@@ -0,0 +1,109 @@
# Dependencies
.motia
node_modules/
python_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Python dependencies and environments
__pycache__/
*.py[cod]
*.egg-info/
.venv/
venv/
Pipfile.lock
.mypy_cache/
.pytest_cache/

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# Build outputs
dist/
build/
*.tsbuildinfo

# Logs
logs
*.log

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Coverage directory used by tools like istanbul
coverage/
*.lcov

# nyc test coverage
.nyc_output

# Dependency directories
jspm_packages/

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache

# Next.js build output
.next

# Nuxt.js build / generate output
.nuxt
dist

# Gatsby files
.cache/
public

# Storybook build outputs
.out
.storybook-out

# Temporary folders
tmp/
temp/

# Editor directories and files
.vscode/
.idea/
*.swp
*.swo
*~

# OS generated files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db

motia-content-creation/README.md

Lines changed: 121 additions & 0 deletions
@@ -0,0 +1,121 @@
# Social Media Automation workflow using Motia

A streamlined content generation agent built with [Motia](https://motia.dev) that transforms articles into engaging Twitter threads and LinkedIn posts using AI.

We use the following tech stack:
- Motia as the unified backend framework
- Firecrawl to scrape web content
- OpenAI for agentic capabilities

## 🎯 Overview

**Workflow**

Our workflow consists of four main steps:

```
API → Scrape → Generate → Schedule
```

1. **API**: Receives the article URL via a POST request
2. **Scrape**: Extracts the content in markdown format using Firecrawl
3. **Generate**: Creates Twitter and LinkedIn content using OpenAI
4. **Schedule**: Saves the content as drafts in Typefully for review

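The four steps above can be sketched as a plain async pipeline. This is only an illustration of the data flow: the function bodies are stubs, and in the real project Motia wires the steps together through its own step/event mechanism rather than direct calls.

```javascript
// Illustrative end-to-end flow; each stub stands in for one workflow step.
async function scrape(url) {
  // Firecrawl in the real workflow: returns the article as markdown.
  return { title: 'stub title', markdown: 'stub content' };
}

async function generate(article) {
  // OpenAI in the real workflow: one draft per platform.
  return { twitter: 'thread draft', linkedin: 'post draft' };
}

async function schedule(drafts) {
  // Typefully in the real workflow: saves both drafts for review.
  return { status: 'drafted' };
}

async function handleRequest(url) {
  const article = await scrape(url);
  const drafts = await generate(article);
  return schedule(drafts);
}
```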
## 🛠️ Setup

### Prerequisites

- Node.js 18+
- Python 3.x
- API keys for:
  - OpenAI
  - Firecrawl
  - Typefully

### Installation

1. **Install dependencies:**
   ```bash
   npm install   # or: pnpm install
   ```

2. **Configure environment:**
   ```bash
   cp .env.example .env
   # Edit .env with your API keys
   ```
   Alternatively, create a `.env` file in the root directory with the following variables:
   ```bash
   OPENAI_API_KEY=your_openai_api_key
   FIRECRAWL_API_KEY=your_firecrawl_api_key
   TYPEFULLY_API_KEY=your_typefully_api_key
   ```

3. **Start the development server:**
   ```bash
   npm run dev
   ```

## 🚀 Usage

### Generate Content

Send a POST request to trigger content generation:

```bash
curl -X POST http://localhost:3000/generate-content \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com/article"}'
```

**Response:**
```json
{
  "message": "Content generation started",
  "requestId": "req_123456",
  "url": "https://example.com/article",
  "status": "processing"
}
```

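The same request can be made from Node 18+ (which ships a built-in `fetch`) instead of curl. The endpoint and payload mirror the curl example; `buildRequest` and `generateContent` are illustrative helper names, not part of the repo, and the call only succeeds while the dev server is running.

```javascript
// Sketch: trigger the workflow from Node instead of curl.
const ENDPOINT = 'http://localhost:3000/generate-content';

// Build the fetch options for a given article URL.
function buildRequest(url) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url }),
  };
}

// Send the request and return the parsed JSON response.
async function generateContent(articleUrl) {
  const res = await fetch(ENDPOINT, buildRequest(articleUrl));
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json(); // e.g. { message, requestId, url, status }
}

// generateContent('https://example.com/article').then(console.log);
```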
### View Results

After processing completes:
1. Visit [Typefully](https://typefully.com/drafts)
2. Review your generated Twitter thread and LinkedIn post
3. Edit if needed and publish!

## 📁 Project Structure

```
social-media-automation/
├── steps/
│   ├── api.step.ts          # API endpoint handler
│   ├── scrape.step.ts       # Firecrawl integration
│   ├── generate.step.ts     # Parallel OpenAI calls
│   └── schedule.step.ts     # Typefully scheduling
├── prompts/
│   ├── twitter-prompt.txt   # Twitter generation prompt
│   └── linkedin-prompt.txt  # LinkedIn generation prompt
├── config/
│   └── index.js             # Configuration management
├── package.json
├── tsconfig.json
└── README.md
```

## 🔍 Monitoring

The Motia workbench provides an interactive UI where you can easily debug and monitor your flows as interactive diagrams. It runs automatically with the development server.

## 📬 Stay Updated with Our Newsletter!
**Get a FREE Data Science eBook** 📖 with 150+ essential lessons in Data Science when you subscribe to our newsletter! Stay in the loop with the latest tutorials, insights, and exclusive resources. [Subscribe now!](https://join.dailydoseofds.com)

[![Daily Dose of Data Science Newsletter](https://github.com/patchy631/ai-engineering/blob/main/resources/join_ddods.png)](https://join.dailydoseofds.com)

---

## Contribution

Contributions are welcome! Please fork the repository and submit a pull request with your improvements.
motia-content-creation/config/index.js

Lines changed: 33 additions & 0 deletions

@@ -0,0 +1,33 @@
require('dotenv').config();

const config = {
  openai: {
    apiKey: process.env.OPENAI_API_KEY,
    model: 'gpt-4o',
  },

  firecrawl: {
    apiKey: process.env.FIRECRAWL_API_KEY,
  },

  typefully: {
    apiKey: process.env.TYPEFULLY_API_KEY,
  },

  motia: {
    port: parseInt(process.env.MOTIA_PORT, 10) || 3000,
  },

  validate() {
    const required = ['OPENAI_API_KEY', 'FIRECRAWL_API_KEY', 'TYPEFULLY_API_KEY'];
    const missing = required.filter(key => !process.env[key]);

    if (missing.length > 0) {
      throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
    }
  }
};

config.validate();

module.exports = config;
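Because `validate()` runs at require time, any module importing this config fails fast on startup rather than at the first API call. The pattern can be exercised in isolation; this standalone sketch reproduces the check without `dotenv` (the `validateEnv` name is illustrative, not from the repo):

```javascript
// Standalone sketch of the fail-fast env validation used in config/index.js.
function validateEnv(env, required) {
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}

// Passes silently when all required keys are present.
validateEnv({ OPENAI_API_KEY: 'sk-...' }, ['OPENAI_API_KEY']);
```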
motia-content-creation/package.json

Lines changed: 34 additions & 0 deletions

@@ -0,0 +1,34 @@
{
  "name": "social-media-automation",
  "description": "AI-powered social media automation workflow that scrapes articles and generates viral social media posts using Motia",
  "scripts": {
    "prepare": "python3 -m venv python_modules && source python_modules/bin/activate && pip install -r requirements.txt",
    "dev": "source python_modules/bin/activate && motia dev",
    "dev:debug": "source python_modules/bin/activate && motia dev --debug",
    "build": "source python_modules/bin/activate && motia build",
    "clean": "rm -rf dist .motia .mermaid node_modules python_modules",
    "generate:config": "motia get-config --output ./"
  },
  "keywords": [
    "motia",
    "ai",
    "content-writing",
    "social-media",
    "automation",
    "web-scraping"
  ],
  "dependencies": {
    "motia": "^0.4.0-beta.90",
    "@mendable/firecrawl-js": "^1.0.0",
    "openai": "^4.90.0",
    "dotenv": "^16.5.0",
    "zod": "^3.25.67",
    "axios": "^1.10.0"
  },
  "devDependencies": {
    "@types/node": "^20.17.28",
    "@types/react": "^18.3.23",
    "ts-node": "^10.9.2",
    "typescript": "^5.8.3"
  }
}
motia-content-creation/prompts/linkedin-prompt.txt

Lines changed: 64 additions & 0 deletions

@@ -0,0 +1,64 @@
As a technical writing assistant, your task is to understand the writing style, paraphrasing, and spacing of the example post provided below.

Create a post based on the article content that follows this example; ensure your response sounds genuine and aims to convey valuable information.

Be precise, so that it's easier for readers to grasp the essence of the post in minimal time.

You should sound genuine and informative.

Keep the points (if any) succinct and precise so that the post reads well for all users, whether on mobile or on the web.

###############################

Example LinkedIn Post:

Python is all you need.

No other language will give you a better bang for your buck. No other language will let you do more.

Today, you can build AI and data applications using Python alone!

Take a look at Taipy, an open-source Python library to build end-to-end production applications:

https://github.com/Avaiga/taipy

Star the repo!

You can use Taipy to build production-ready applications. Some of its perks:

• Taipy scales as more users hit your application
• Taipy can work with huge datasets
• Taipy is multi-user and can manage different user profiles

Taipy has a library of pre-built components for interacting with data pipelines, including visualization and management tools. It also supports tools for versioning and pipeline orchestration.

It's open-source and comes with a Visual Studio Code extension that allows you to start without writing any code.

For enterprise customers, Taipy Designer is a drag-and-drop layer that helps build applications using a visual interface.

Taipy's goal is not to replace web developers but to provide an alternative to those who need to build applications without web experience. If you are a data scientist or someone dealing with data, Taipy will simplify your life considerably.

###############################

Based on the example above, analyze the following article content and create a LinkedIn post that follows the same style, tone, and structure. Focus on:

1. **Strong Opening**: Start with a compelling statement or hook
2. **Value Proposition**: Clearly communicate what the reader will gain
3. **Practical Information**: Include actionable insights or tools
4. **Professional Tone**: Maintain a balance between casual and professional
5. **Bullet Points**: Use them for easy scanning and readability
6. **Call-to-Action**: End with engagement or next steps
7. **Genuine Voice**: Sound authentic and knowledgeable

**Article Title**: {{title}}

**Article Content**:
{{content}}

Generate a LinkedIn post based on this content. Make it informative, engaging, and valuable for a professional audience interested in technology, AI, and career development.

Return your response in the following JSON format:
{
  "post": "Your complete LinkedIn post content here",
  "characterCount": 1250
}
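The `{{title}}` and `{{content}}` placeholders are presumably filled in before the prompt is sent to OpenAI. A minimal sketch of that interpolation (the `fillPrompt` helper is hypothetical, not part of the repo; unknown placeholders are deliberately left intact):

```javascript
// Hypothetical helper: substitute {{name}}-style placeholders in a
// prompt template with the scraped article's fields.
function fillPrompt(template, vars) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? String(vars[key]) : match // leave unknown placeholders as-is
  );
}

const prompt = fillPrompt('**Article Title**: {{title}}', { title: 'Motia 101' });
```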
