
Commit 5f68e85

Merge pull request #45 from BrainDriveAI/feature/plugin-services-runtime
# Add Plugin Service Runtime Management + JWT UTC Fix

## Summary

This PR introduces **Plugin Service Runtime Management** – a framework for discovering, installing, starting, health-checking, and stopping plugin-scoped backend services (e.g., docker-compose stacks or Python microservices). It also updates JWT token issuance to use **UTC-aware timestamps** to prevent “signature expired” issues on machines with timezone drift.

## Why

* Plugins increasingly depend on **companion services** (retrievers, workers, vector stores, etc.). We need a first-class way to declare and manage those.
* JWTs created with `datetime.utcnow()` can cause **intermittent auth failures** under clock skew or timezone quirks. Moving to `datetime.now(UTC)` makes tokens robust and standards-aligned.

---

## What’s Changed

### Ignored paths
* `.gitignore`
  * `backend/services_runtime/` – local service checkouts, venvs, compose files
  * `backend/persistent/` – optional local data/state

### Security / Auth
* `backend/app/core/security.py`
  * Replace `datetime.utcnow()` with `datetime.now(UTC)` for `exp` and `iat`.

### Data Model
* `backend/app/models/plugin.py`
  * Add `required_services_runtime` (JSON string) to `Plugin`
  * Add `Plugin.service_runtimes` relationship
  * New model **`PluginServiceRuntime`** with fields:
    * ids, `plugin_id`, `plugin_slug`, `name`, `source_url`, `type`
    * `install_command`, `start_command`, `healthcheck_url`
    * `required_env_vars` (JSON), `status`
    * `created_at`, `updated_at` (UTC), `user_id`
* **Migration**: `backend/migrations/versions/5d073fe444c9_*.py`
  * Creates `plugin_service_runtime` table + indices
  * Adds `plugin.required_services_runtime` column

### DTOs
* `backend/app/dto/plugin.py`
  * Add `PluginServiceRuntimeDTO` (Pydantic) and conversion helpers

### Plugin Installer & Repository
* `backend/app/plugins/remote_installer.py`
  * Extract `required_services_runtime` from plugin sources
  * After install, **automatically install & start required services**
* `backend/app/plugins/repository.py`
  * Add `get_all_service_runtimes()` returning DTOs (for startup orchestration)

### Runtime Orchestration
* New package: `backend/app/plugins/service_installler/` (sic)
  * `service_runtime_extractor.py` – robust parser for `required_services_runtime` in plugin code
  * `plugin_service_manager.py` – download/extract sources, env prechecks, type-based runners
  * `docker_manager.py` – docker checks, `docker compose` start/stop, health waits
  * `python_manager.py` – venv create, install, start, health wait
  * `service_health_checker.py` – async health polling
  * `prerequisites.py` – .env loading and required vars validation
  * `service_debugger.py` – developer utility for parsing verification
  * `start_stop_plugin_services.py` – **startup**: start all persisted runtimes; **shutdown**: stop all

### App lifecycle hooks
* `backend/app/main.py` and `backend/main.py`
  * On **startup**: `start_plugin_services_on_startup()`
  * On **shutdown**: `stop_all_plugin_services_on_shutdown()`

---

## Database Migration

* **Alembic Revision:** `5d073fe444c9`
* **Upgrade:**
  ```bash
  cd backend
  alembic upgrade head
  ```
* **Downgrade:**
  ```bash
  cd backend
  alembic downgrade -1
  ```
* **Backfill:** none required.
* **Compatibility:** preserves existing plugin rows; new column is nullable; new table is additive.

---

## How It Works (High-Level)

1. **Plugin declares services** in its lifecycle manager as `required_services_runtime = [...]` (an illustrative sketch follows this description).
2. **Remote installer** extracts and validates the list during install.
3. **Manager** downloads sources (ZIP/TAR from Git host), prepares environment (.env projection), and starts services:
   * `type=="docker-compose"` → `docker compose up --build -d` + health check
   * `type=="python"` → venv create → install → start + health check
4. **On app startup**, any persisted `plugin_service_runtime` with status in `pending/stopped/running` is resumed.
5. **On shutdown**, all services are stopped gracefully.

---

## Operational Notes

### Requirements
* Docker & Docker Compose available and daemon running (for dockerized services).
* Root backend `.env` must define **all required env vars** referenced by services. The installer writes a **service-local .env** with those keys.

### New Local Folders
* `backend/services_runtime/` – service code, venvs, compose files (ignored)
* `backend/persistent/` – optional runtime data (ignored)

---

## Security & Compliance

* **JWT**: Use UTC (`datetime.now(UTC)`) for `exp`/`iat` → reduces clock-skew issues.
* **Process execution**: service runners use explicit commands from plugin metadata; health checks guard readiness.
* **Env handling**: only copies whitelisted keys from root `.env` into the service scope.
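For illustration, the declaration in step 1 might look like the sketch below inside a plugin's lifecycle manager. This is not taken from the PR: the field names mirror the `PluginServiceRuntime` model added here, but the exact shape expected by `service_runtime_extractor.py` is not shown in this diff, and the service name, URL, commands, and env var are hypothetical.

```python
# Illustrative only – field names follow the PluginServiceRuntime model in this PR;
# all values are hypothetical and the parser's exact expectations may differ.
required_services_runtime = [
    {
        "name": "example-retriever",
        "source_url": "https://github.com/example-org/example-retriever/archive/refs/heads/main.zip",
        "type": "docker-compose",                     # or "python"
        "install_command": "docker compose build",
        "start_command": "docker compose up --build -d",
        "healthcheck_url": "http://localhost:8600/health",
        "required_env_vars": ["RETRIEVER_API_KEY"],   # must exist in the root backend .env
    }
]
```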
2 parents aab8df8 + d8cf4e7 commit 5f68e85

19 files changed (+1571 −15 lines)

.gitignore

Lines changed: 3 additions & 0 deletions
```diff
@@ -47,6 +47,9 @@ backend/alembic/versions/
 *.log
 backend/logs/
 
+backend/services_runtime/
+backend/persistent/
+
 # Environment variables
 .env
 .env.local
```

ROADMAP.md

Lines changed: 7 additions & 5 deletions
```diff
@@ -59,11 +59,12 @@ Our roadmap is broken into clearly defined versions, each building toward a stab
 ## Version 0.6.0 – Open Beta
 > Goal: AI System with core functionality for developers
 
-- [ ] All plugins moved to the Life Cycle Manager
-- [ ] Ollama plugin updated to include server manager
-- [ ] User initializer - Plugin install from remote
-- [ ] User initializer - Restructure navigation
-- [ ] Prompt Library
+- [x] All plugins moved to the Life Cycle Manager
+- [x] Ollama plugin updated to include server manager
+- [x] User initializer - Plugin install from remote
+- [x] User initializer - Restructure navigation
+- [ ] Improve Registration
+- [ ] Ollama AI Provider
 - [ ] One-Click Installer - (Windows first)
 
 ---
@@ -73,6 +74,7 @@ Our roadmap is broken into clearly defined versions, each building toward a stab
 
 - [ ] Unified Dynamic Page Renderer - Bounce
 - [ ] Unified Dynamic Page Renderer - Finetune
+- [ ] Prompt Library
 
 ---
```

backend/app/core/security.py

Lines changed: 4 additions & 4 deletions
```diff
@@ -1,5 +1,5 @@
 from sqlalchemy.ext.asyncio import AsyncSession
-from datetime import datetime, timedelta as datetime_timedelta
+from datetime import datetime, timedelta as datetime_timedelta, UTC
 from typing import Optional
 from jose import jwt, JWTError
 import logging
@@ -41,16 +41,16 @@ def create_access_token(data: dict, expires_delta: Optional[datetime_timedelta]
     """Create a new access token."""
     to_encode = data.copy()
     if expires_delta:
-        expire = datetime.utcnow() + expires_delta
+        expire = datetime.now(UTC) + expires_delta
     else:
-        expire = datetime.utcnow() + datetime_timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
+        expire = datetime.now(UTC) + datetime_timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
 
     # Convert datetime to Unix timestamp for JWT
     to_encode.update({"exp": expire.timestamp()})
 
     # Let JWT library handle iat automatically if not provided
     if "iat" not in to_encode:
-        to_encode.update({"iat": datetime.utcnow().timestamp()})
+        to_encode.update({"iat": datetime.now(UTC).timestamp()})
 
     encoded_jwt = jwt.encode(to_encode, settings.SECRET_KEY, algorithm=settings.ALGORITHM)
     return encoded_jwt
```
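Context for the change above: `datetime.utcnow()` returns a naive datetime, and `.timestamp()` interprets naive values as local time, so on a machine whose clock/timezone is not UTC the encoded `exp`/`iat` shift by the local UTC offset. A minimal sketch of the difference (illustrative, not part of the diff; the `UTC` constant in `datetime` requires Python 3.11+):

```python
from datetime import datetime, timedelta, UTC

# Naive "UTC" value: .timestamp() treats it as local time, so on a UTC+2 machine
# the resulting epoch seconds land 7200s too early.
naive_exp = (datetime.utcnow() + timedelta(minutes=30)).timestamp()

# Timezone-aware UTC value: .timestamp() is unambiguous on any machine.
aware_exp = (datetime.now(UTC) + timedelta(minutes=30)).timestamp()

# Prints ~0 on a UTC machine; otherwise roughly the local UTC offset in seconds.
print(aware_exp - naive_exp)
```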

backend/app/dto/__init__.py

Whitespace-only changes.

backend/app/dto/plugin.py

Lines changed: 65 additions & 0 deletions
```diff
@@ -0,0 +1,65 @@
+from pydantic import BaseModel
+from typing import List, Optional, Union
+from datetime import datetime
+import uuid
+
+# This schema is used for returning data from the repository.
+# It ensures that JSON fields are correctly converted to Python types.
+class PluginServiceRuntimeDTO(BaseModel):
+    """
+    A Pydantic model to represent a PluginServiceRuntime object,
+    with required_env_vars as a list of strings.
+    """
+    id: str
+    plugin_id: str
+    plugin_slug: str
+    name: str
+    source_url: Optional[str] = None
+    type: Optional[str] = None
+    install_command: Optional[str] = None
+    start_command: Optional[str] = None
+    healthcheck_url: Optional[str] = None
+    required_env_vars: List[str] = []
+    status: Optional[str] = None
+    user_id: str
+    created_at: Optional[datetime] = None
+    updated_at: Optional[datetime] = None
+
+    @classmethod
+    def from_github_data(cls, service_dict: dict, plugin_id: str, plugin_slug: str, user_id: str) -> 'PluginServiceRuntimeDTO':
+        """
+        Create a PluginServiceRuntimeDTO from raw GitHub service data (dict).
+        This handles first-time installation where database fields don't exist yet.
+        """
+        return cls(
+            id=str(uuid.uuid4()),  # Generate new UUID for first install
+            plugin_id=plugin_id,
+            plugin_slug=plugin_slug,
+            user_id=user_id,
+            name=service_dict.get('name'),
+            source_url=service_dict.get('source_url'),
+            type=service_dict.get('type', 'python'),
+            install_command=service_dict.get('install_command'),
+            start_command=service_dict.get('start_command'),
+            healthcheck_url=service_dict.get('healthcheck_url'),
+            required_env_vars=service_dict.get('required_env_vars', []),
+            status='installing',
+            created_at=datetime.now(),
+            updated_at=datetime.now()
+        )
+
+    @classmethod
+    def from_dict_or_dto(cls, data: Union[dict, 'PluginServiceRuntimeDTO'], plugin_id: str = None, plugin_slug: str = None, user_id: str = None) -> 'PluginServiceRuntimeDTO':
+        """
+        Flexible factory method that handles both dict (GitHub) and DTO (database) inputs.
+        """
+        if isinstance(data, cls):
+            return data  # Already a DTO, return as-is
+        elif isinstance(data, dict):
+            # Dict from GitHub, convert using factory method
+            if not all([plugin_id, plugin_slug, user_id]):
+                raise ValueError("plugin_id, plugin_slug, and user_id are required when converting from dict")
+            return cls.from_github_data(data, plugin_id, plugin_slug, user_id)
+        else:
+            raise TypeError(f"Expected dict or {cls.__name__}, got {type(data)}")
+
```
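A quick usage sketch of the two factory methods above (illustrative; the dict contents, IDs, and slug are hypothetical):

```python
from app.dto.plugin import PluginServiceRuntimeDTO

# Raw service dict as parsed from a plugin's source on first install (hypothetical values).
service_dict = {
    "name": "example-retriever",
    "type": "python",
    "install_command": "pip install -r requirements.txt",
    "start_command": "python -m retriever",
    "healthcheck_url": "http://localhost:8600/health",
    "required_env_vars": ["RETRIEVER_API_KEY"],
}

# Dict input requires plugin_id / plugin_slug / user_id and yields a fresh DTO.
dto = PluginServiceRuntimeDTO.from_dict_or_dto(
    service_dict, plugin_id="plugin-123", plugin_slug="example-plugin", user_id="user-abc"
)
assert dto.status == "installing"

# DTOs already loaded from the database pass through unchanged.
assert PluginServiceRuntimeDTO.from_dict_or_dto(dto) is dto
```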

backend/app/main.py

Lines changed: 14 additions & 2 deletions
```diff
@@ -5,6 +5,7 @@
 from app.api.v1.api import api_router
 from app.core.config import settings
 from app.routers.plugins import plugin_manager
+from app.plugins.service_installler.start_stop_plugin_services import start_plugin_services_on_startup, stop_all_plugin_services_on_shutdown
 import logging
 import time
 import structlog
@@ -29,8 +30,21 @@ async def startup_event():
     logger.info("Initializing application settings...")
     from app.init_settings import init_ollama_settings
     await init_ollama_settings()
+    # Start plugin services
+    await start_plugin_services_on_startup()
     logger.info("Settings initialization completed")
 
+
+# Add shutdown event to gracefully stop services
+@app.on_event("shutdown")
+async def shutdown_event():
+    """Gracefully stop all plugin services on application shutdown."""
+    logger.info("Shutting down application and stopping plugin services...")
+    # Stop all plugin services gracefully
+    await stop_all_plugin_services_on_shutdown()
+    logger.info("Application shutdown completed.")
+
+
 # Add middleware to log all requests
 logger = structlog.get_logger()
 
@@ -94,5 +108,3 @@ async def validation_exception_handler(request: Request, exc: RequestValidationE
 
 # Include API routers
 app.include_router(api_router)
-
-
```

backend/app/models/plugin.py

Lines changed: 82 additions & 1 deletion
```diff
@@ -1,10 +1,13 @@
-from sqlalchemy import Column, String, Integer, Boolean, ForeignKey, Text, JSON, UniqueConstraint, TIMESTAMP
+from sqlalchemy import Column, String, Integer, Boolean, ForeignKey, Text, JSON, UniqueConstraint, TIMESTAMP, DateTime
 import sqlalchemy
 from sqlalchemy.orm import relationship
 from sqlalchemy.sql import func
+from datetime import datetime, UTC
+import json
 
 from app.models.base import Base
 
+
 class Plugin(Base):
     """SQLAlchemy model for plugins."""
 
@@ -44,6 +47,7 @@ class Plugin(Base):
     config_fields = Column(Text) # Stored as JSON string
     messages = Column(Text) # Stored as JSON string
     dependencies = Column(Text) # Stored as JSON string
+    required_services_runtime = Column(Text, nullable=True)
 
     # Timestamps
     created_at = Column(String, default=func.now())
@@ -60,6 +64,7 @@ class Plugin(Base):
 
     # Relationships
     modules = relationship("Module", back_populates="plugin", cascade="all, delete-orphan")
+    service_runtimes = relationship("PluginServiceRuntime", back_populates="plugin", cascade="all, delete-orphan")
 
     def to_dict(self):
         """Convert model to dictionary."""
@@ -118,6 +123,11 @@ def to_dict(self):
         else:
             result["permissions"] = []
 
+        if self.required_services_runtime:
+            result["requiredServicesRuntime"] = json.loads(self.required_services_runtime)
+        else:
+            result["requiredServicesRuntime"] = []
+
         return result
 
     @classmethod
@@ -162,10 +172,81 @@ def from_dict(cls, data):
         # Remove modules from data as they are handled separately
         if "modules" in db_data:
             db_data.pop("modules")
+
+        # Handle service runtimes (only store names in plugin table)
+        if "requiredServicesRuntime" in db_data and db_data["requiredServicesRuntime"] is not None:
+            db_data["required_services_runtime"] = json.dumps([
+                r["name"] for r in db_data["requiredServicesRuntime"]
+            ])
+            db_data.pop("requiredServicesRuntime")
 
         return cls(**db_data)
 
 
+class PluginServiceRuntime(Base):
+    """SQLAlchemy model for plugin service runtimes."""
+
+    __tablename__ = "plugin_service_runtime"
+
+    id = Column(String, primary_key=True, index=True)
+    plugin_id = Column(String, ForeignKey("plugin.id"), nullable=False, index=True)
+    plugin_slug = Column(String, nullable=False, index=True)
+
+    name = Column(String, nullable=False)
+    source_url = Column(String)
+    type = Column(String)
+    install_command = Column(Text)
+    start_command = Column(Text)
+    healthcheck_url = Column(String)
+    required_env_vars = Column(Text)  # store as JSON string
+    status = Column(String, default="pending")
+
+    created_at = Column(DateTime, default=datetime.now(UTC))
+    updated_at = Column(DateTime, default=datetime.now(UTC), onupdate=datetime.now(UTC))
+
+    user_id = Column(String(32), ForeignKey("users.id", name="fk_plugin_service_runtime_user_id"), nullable=False)
+    user = relationship("User")
+
+    # Relationship back to plugin
+    plugin = relationship("Plugin", back_populates="service_runtimes")
+
+    def to_dict(self):
+        """
+        Convert the model instance to a dictionary, handling JSON fields.
+        """
+        return {
+            "id": self.id,
+            "plugin_id": self.plugin_id,
+            "plugin_slug": self.plugin_slug,
+            "name": self.name,
+            "source_url": self.source_url,
+            "type": self.type,
+            "install_command": self.install_command,
+            "start_command": self.start_command,
+            "healthcheck_url": self.healthcheck_url,
+            "required_env_vars": json.loads(self.required_env_vars) if self.required_env_vars else [],
+            "status": self.status,
+            "user_id": self.user_id,
+            "created_at": self.created_at.isoformat() if self.created_at else None,
+            "updated_at": self.updated_at.isoformat() if self.updated_at else None,
+        }
+
+    @classmethod
+    def from_dict(cls, data: dict):
+        """
+        Create a new instance from a dictionary, serializing JSON fields.
+        """
+        db_data = data.copy()
+
+        if "required_env_vars" in db_data and db_data["required_env_vars"] is not None:
+            db_data["required_env_vars"] = json.dumps(db_data["required_env_vars"])
+
+        # Handle conversion from camelCase to snake_case if necessary
+        # For simplicity, we are assuming keys in the incoming dict match model attributes
+
+        return cls(**db_data)
+
+
 class Module(Base):
     """SQLAlchemy model for plugin modules."""
 
```
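The JSON-string convention in `PluginServiceRuntime.from_dict()` / `to_dict()` above round-trips `required_env_vars` between a Python list and the `Text` column. A small sketch (illustrative; the IDs and values are hypothetical):

```python
from app.models.plugin import PluginServiceRuntime

# from_dict() serializes the list into the Text column as a JSON string...
runtime = PluginServiceRuntime.from_dict({
    "id": "rt-1",                      # hypothetical IDs/values
    "plugin_id": "plugin-123",
    "plugin_slug": "example-plugin",
    "name": "example-retriever",
    "user_id": "user-abc",
    "required_env_vars": ["RETRIEVER_API_KEY"],
})
assert runtime.required_env_vars == '["RETRIEVER_API_KEY"]'

# ...and to_dict() parses it back into a list for API/DTO consumers.
assert runtime.to_dict()["required_env_vars"] == ["RETRIEVER_API_KEY"]
```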

backend/app/plugins/remote_installer.py

Lines changed: 22 additions & 1 deletion
```diff
@@ -20,6 +20,8 @@
 from typing import Dict, Any, Optional, List, Tuple
 from urllib.parse import urlparse
 import structlog
+from .service_installler.plugin_service_manager import install_and_run_required_services
+from .service_installler.service_runtime_extractor import extract_required_services_runtime
 
 logger = structlog.get_logger()
 
@@ -237,6 +239,17 @@ async def install_from_url(self, repo_url: str, user_id: str, version: str = "la
             if install_result['success']:
                 logger.info(f"Plugin installation successful: {install_result}")
 
+                service_runtime: list = validation_result.get("service_runtime", [])
+                logger.info(f"\n\n>>>>>>>>SERVICE RUNTIME\n\n: {service_runtime}\n\n>>>>>>>>>>")
+                if service_runtime:
+                    plugin_slug = validation_result["plugin_info"].get("plugin_slug")
+                    await install_and_run_required_services(
+                        service_runtime,
+                        plugin_slug,
+                        install_result['plugin_id'],
+                        user_id
+                    )
+
                 # Store installation metadata
                 try:
                     await self._store_installation_metadata(
@@ -718,6 +731,7 @@ async def _validate_plugin_structure(self, plugin_dir: Path) -> Dict[str, Any]:
 
             # Try to load plugin metadata
             plugin_info = {}
+            service_runtime = []
 
             # Check package.json
             package_json_path = plugin_dir / 'package.json'
@@ -828,6 +842,12 @@ async def _validate_plugin_structure(self, plugin_dir: Path) -> Dict[str, Any]:
                     extracted_slug = slug_match.group(1)
                     plugin_info['plugin_slug'] = extracted_slug
                     logger.info(f"Extracted plugin_slug from source: {extracted_slug}")
+
+                    # Extract services using the dedicated function
+                    services = extract_required_services_runtime(content, plugin_info.get('plugin_slug'))
+                    if services:
+                        plugin_info['required_services_runtime'] = services
+                        service_runtime.extend(services)
             except Exception as extract_error:
                 logger.warning(f"Could not extract plugin_slug from source: {extract_error}")
 
@@ -839,7 +859,8 @@ async def _validate_plugin_structure(self, plugin_dir: Path) -> Dict[str, Any]:
 
             return {
                 'valid': True,
-                'plugin_info': plugin_info
+                'plugin_info': plugin_info,
+                'service_runtime': service_runtime
             }
 
         except Exception as e:
```

backend/app/plugins/repository.py

Lines changed: 22 additions & 2 deletions
```diff
@@ -6,7 +6,8 @@
 from sqlalchemy.ext.asyncio import AsyncSession
 import structlog
 
-from app.models.plugin import Plugin, Module
+from app.models.plugin import Plugin, Module, PluginServiceRuntime
+from app.dto.plugin import PluginServiceRuntimeDTO
 
 logger = structlog.get_logger()
 
@@ -85,7 +86,7 @@ async def get_all_plugins_with_modules(self, user_id: str = None) -> List[Dict[s
             plugin_dicts = []
             for plugin in plugins:
                 plugin_dict = plugin.to_dict()
-                
+
                 # Get modules for this plugin
                 modules_query = select(Module).where(Module.plugin_id == plugin.id)
 
@@ -104,6 +105,25 @@ async def get_all_plugins_with_modules(self, user_id: str = None) -> List[Dict[s
         except Exception as e:
             logger.error("Error getting plugins with modules", error=str(e))
             raise
+
+    async def get_all_service_runtimes(self) -> List[PluginServiceRuntimeDTO]:
+        """
+        Get all plugin service runtimes for startup and return them as DTOs.
+        """
+        try:
+            query = select(PluginServiceRuntime).where(
+                PluginServiceRuntime.status.in_(["pending", "stopped", "running"])
+            )
+
+            result = await self.db.execute(query)
+            services = result.scalars().all()
+
+            # Convert SQLAlchemy models to Pydantic DTOs for a typed return
+            return [PluginServiceRuntimeDTO(**service.to_dict()) for service in services]
+
+        except Exception as e:
+            logger.error("Error getting service runtimes", error=str(e))
+            raise
 
     async def get_plugin(self, plugin_id: str) -> Optional[Dict[str, Any]]:
         """Get a specific plugin by ID."""
```
