# skene-growth
A CLI toolkit for analyzing codebases through the lens of Product-Led Growth (PLG): detecting growth features, spotting revenue leakage, and generating actionable growth plans.
## What skene-growth does
- Analyzes your codebase to detect tech stack, growth features, and revenue leakage patterns
- Generates a growth manifest — structured JSON output documenting your product's growth surface area
- Creates growth plans — a Council of Growth Engineers produces 3-5 high-impact growth loops
- Builds implementation prompts that you can send directly to Cursor or Claude, or display in your terminal
- Provides an MCP server exposing 12 tools for AI assistants
- Supports multiple LLM providers: OpenAI, Gemini, Anthropic, LM Studio, Ollama, and any OpenAI-compatible endpoint
## Core workflow
```shell
# 1. Create a config file
uvx skene-growth config --init

# 2. Set up your LLM provider and API key interactively
uvx skene-growth config

# 3. Analyze your codebase
uvx skene-growth analyze .

# 4. Generate a growth plan
uvx skene-growth plan

# 5. Build an implementation prompt
uvx skene-growth build
```
## Key concepts
**Growth manifest** (`growth-manifest.json`) — The primary output of the `analyze` command. A structured JSON file containing your project's tech stack, existing growth features, growth opportunities, and revenue leakage issues.
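As a rough illustration of the kind of structure the manifest captures (the field names below are illustrative assumptions, not the actual schema — see the manifest schema reference for the real one):

```json
{
  "tech_stack": ["python", "fastapi", "postgres"],
  "growth_features": [
    { "name": "referral_invite", "status": "implemented" }
  ],
  "growth_opportunities": [
    { "name": "usage_based_upsell", "impact": "high" }
  ],
  "revenue_leakage": [
    { "issue": "no_paywall_on_export", "severity": "medium" }
  ]
}
```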
**Growth template** (`growth-template.json`) — A custom PLG template generated alongside the manifest, with lifecycle stages and metrics tailored to your business type.
**Growth plan** (`growth-plan.md`) — A markdown document produced by the `plan` command. Contains 3-5 selected high-impact growth loops with implementation roadmaps, metrics, and week-by-week timelines.
**Growth loops** — Individual loop definitions (JSON) generated by the `build` command. Each loop includes file/function requirements, integration points, telemetry specs, verification commands, and success metrics.
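To make the loop definition concrete, a sketch of what one might contain (all keys and values here are hypothetical, for illustration only):

```json
{
  "loop": "referral_invite",
  "file_requirements": ["app/referrals.py"],
  "integration_points": ["signup_flow"],
  "telemetry": ["invite_sent", "invite_accepted"],
  "verification_commands": ["pytest tests/test_referrals.py"],
  "success_metrics": ["invites_per_active_user"]
}
```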
## Documentation

### Getting started
- Installation — Install via `uvx`, `pip`, or from source
- Quickstart — End-to-end walkthrough in 5 commands
### Guides
- Analyze — The analyze command in depth
- Plan — Generating growth plans
- Build — Building implementation prompts
- Chat — Interactive terminal chat
- LLM providers — Configuring OpenAI, Gemini, Claude, local LLMs
- Configuration — Config files, env vars, and priority
### Integrations
- MCP server — Using skene-growth with AI assistants
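Most MCP clients are wired up with a small JSON entry that tells them how to launch the server. Assuming the server is started via an `mcp` subcommand (an assumption — check the CLI reference for the actual invocation), a client entry in the common `mcpServers` format might look like:

```json
{
  "mcpServers": {
    "skene-growth": {
      "command": "uvx",
      "args": ["skene-growth", "mcp"]
    }
  }
}
```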
### Reference
- CLI reference — All commands and flags
- Python API — `CodebaseExplorer`, analyzers, schemas
- Manifest schema — JSON schema for v1.0 and v2.0 manifests
### Help
- Troubleshooting — LM Studio, Ollama, common errors