5 changes: 5 additions & 0 deletions .github/workflows/rust.yml
@@ -67,11 +67,16 @@ jobs:

```yaml
      - name: Build console (release with features)
        run: cargo build --release --target ${{ matrix.target }} --bin console --features explain --verbose

      - name: Build stacker-cli (release)
        run: cargo build --release --target ${{ matrix.target }} --bin stacker-cli --verbose

      - name: Prepare binaries
        run: |
          mkdir -p artifacts
          cp target/${{ matrix.target }}/release/server artifacts/server
          cp target/${{ matrix.target }}/release/console artifacts/console
          cp target/${{ matrix.target }}/release/stacker-cli artifacts/stacker-cli
          tar -czf ${{ matrix.artifact_name }}.tar.gz -C artifacts .

      - name: Upload binaries
        uses: actions/upload-artifact@v4
```

91 changes: 91 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,97 @@

All notable changes to this project will be documented in this file.

## [0.2.3] — 2026-02-23

### Changed — `stacker init` now generates `.stacker/` directory

- `stacker init` now creates `.stacker/Dockerfile` and `.stacker/docker-compose.yml` alongside `stacker.yml`, so the project is ready to deploy immediately without running `deploy --dry-run` first
- Dockerfile generation is skipped when `app.image` or `app.dockerfile` is set in the config
- Compose generation is skipped when `deploy.compose_file` is set
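
As an illustrative sketch (field names taken from the bullets above; the image reference is hypothetical), a config like this would cause `stacker init` to skip Dockerfile generation while still writing `.stacker/docker-compose.yml`:

```yaml
# Hypothetical stacker.yml fragment: because app.image is set,
# stacker init would not generate .stacker/Dockerfile.
app:
  image: ghcr.io/example/myapp:latest  # pre-built image, so no Dockerfile needed
```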

### Changed — `stacker deploy` reuses existing `.stacker/` artifacts

- `deploy` no longer errors when `.stacker/Dockerfile` or `.stacker/docker-compose.yml` already exist (e.g. from `stacker init`)
- Existing artifacts are reused; pass `--force-rebuild` to regenerate them

### Added — `--ai-provider`, `--ai-model`, `--ai-api-key` flags on `stacker-cli init`

- The `stacker-cli` binary (`console/main.rs`) now supports all AI-related flags that the standalone `stacker` binary already had:
- `--ai-provider <PROVIDER>` — openai, anthropic, ollama, custom
- `--ai-model <MODEL>` — e.g. `qwen2.5-coder`, `deepseek-r1`, `gpt-4o`
- `--ai-api-key <KEY>` — API key for cloud AI providers

### Added — AI troubleshooting suggestions on deploy failures

- On `stacker deploy` failures (`DeployFailed`), the CLI now attempts AI-assisted troubleshooting automatically
- It sends the deploy error plus snippets of the generated `.stacker/Dockerfile` and `.stacker/docker-compose.yml` to the configured AI provider
- If AI is unavailable or not configured, the CLI prints deterministic fallback hints for common issues (for example `npm ci` failures, the obsolete compose `version` key, missing files, permission errors, and network timeouts)

### Fixed

- `stacker-cli init --with-ai --ai-model qwen2.5-coder` no longer fails with an unrecognised flag error
- `stacker deploy` after `stacker init` no longer fails with `DockerfileExists` error

## 2026-02-23

### Added - Configurable AI Request Timeout

- New `timeout` field in `ai` config section of `stacker.yml` (default: 300 seconds)
- `STACKER_AI_TIMEOUT` environment variable overrides the config value
- Timeout applies to all AI providers (OpenAI, Anthropic, Ollama, Custom)
- Useful for large models on slower hardware: `STACKER_AI_TIMEOUT=900 stacker init --with-ai`
- Example stacker.yml:
```yaml
ai:
  enabled: true
  provider: ollama
  model: deepseek-r1
  timeout: 600  # 10 minutes
```
- 9 new tests for timeout resolution

### Added - Stacker CLI: AI-Powered Project Initialization

#### AI Scanner Module (`src/cli/ai_scanner.rs`)
- New `scan_project()` function performs deep project scanning, reading key config files (`package.json`, `requirements.txt`, `Cargo.toml`, `Dockerfile`, `docker-compose.yml`, `.env`, etc.) to build rich context for AI generation
- `build_generation_prompt()` constructs detailed prompts including detected app type, file contents, existing infrastructure, and env var keys (values redacted for security)
- `generate_config_with_ai()` sends project context to the configured AI provider and returns a tailored `stacker.yml`
- `strip_code_fences()` strips markdown code fences from AI responses
- System prompt encodes the full `stacker.yml` schema so the AI generates valid, deployable configs
- 16 unit tests
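
The fence-stripping step can be sketched as follows. This is a minimal illustrative implementation, not the actual `strip_code_fences()` from `ai_scanner.rs`; it assumes the AI response wraps its YAML in a single Markdown fence:

```rust
// Hypothetical sketch: remove a leading ``` (or ```yaml) line and a
// trailing ``` line from an AI response, returning the inner text.
fn strip_code_fences(response: &str) -> String {
    let mut lines: Vec<&str> = response.trim().lines().collect();
    if lines.first().map_or(false, |l| l.trim_start().starts_with("```")) {
        lines.remove(0); // drop opening fence, e.g. ```yaml
    }
    if lines.last().map_or(false, |l| l.trim() == "```") {
        lines.pop(); // drop closing fence
    }
    lines.join("\n")
}

fn main() {
    let raw = "```yaml\nservices: []\n```";
    println!("{}", strip_code_fences(raw)); // inner YAML only
}
```

A response with no fences passes through unchanged, so the same code path handles both fenced and plain replies.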

#### AI-Powered `stacker init --with-ai` (`src/console/commands/cli/init.rs`)
- `stacker init --with-ai` now scans the project and calls the AI to generate a tailored `stacker.yml` with appropriate services, proxy, monitoring, and hooks
- New CLI flags on `stacker init`:
- `--ai-provider <PROVIDER>` — AI provider: `openai`, `anthropic`, `ollama`, `custom` (default: `ollama`)
- `--ai-model <MODEL>` — Model name (e.g. `gpt-4o`, `claude-sonnet-4-20250514`, `llama3`)
- `--ai-api-key <KEY>` — API key (or use environment variables)
- `resolve_ai_config()` resolves AI configuration with priority: CLI flag → environment variable → defaults
- Environment variable support: `STACKER_AI_PROVIDER`, `STACKER_AI_MODEL`, `STACKER_AI_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`
- Graceful fallback: if AI generation fails (provider unreachable, invalid YAML), the CLI automatically falls back to template-based generation
- AI-generated configs include a review header noting the provider and model used
- If AI output fails validation, the raw draft is saved to `stacker.yml.ai-draft` for manual review
- 8 new unit tests (18 total in init.rs), 3 new integration tests (11 total in cli_init.rs)
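
The flag → environment variable → default priority can be sketched as below; the helper name and signature are hypothetical, not the real `resolve_ai_config()`:

```rust
// Hypothetical sketch of the precedence used when resolving each AI
// setting: CLI flag first, then environment variable, then default.
fn resolve_setting(cli_flag: Option<&str>, env_value: Option<&str>, default: &str) -> String {
    cli_flag
        .or(env_value)      // fall back to the environment variable
        .unwrap_or(default) // finally, the built-in default
        .to_string()
}

fn main() {
    // No flag given, env var set: the environment wins over the default.
    let provider = resolve_setting(None, Some("openai"), "ollama");
    println!("{provider}"); // prints "openai"
}
```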

#### Usage Examples
```bash
# AI-powered init with local Ollama
stacker init --with-ai

# AI-powered init with OpenAI
stacker init --with-ai --ai-provider openai --ai-api-key sk-...

# AI-powered init with Anthropic (key from env)
export ANTHROPIC_API_KEY=sk-ant-...
stacker init --with-ai --ai-provider anthropic

# Falls back to template if AI fails
stacker init --with-ai # no Ollama running → template fallback
```

### Test Results
- **467 tests** (426 unit + 41 integration), 0 failures

## 2026-02-18

### Fixed