
Per-user LLM configuration and project LLM settings

  • Save and use your own OpenAI/Anthropic API keys per user or per project. Users can now supply their own OpenAI or Anthropic API keys (BYOK). Keys are stored securely in Vault, returned to the UI only in masked form, and can be selected per project alongside provider and model choices to customize changelog generation.

New Features

  • Added backend endpoints to manage per-user LLM configurations (GET and POST) for the OpenAI and Anthropic providers. (backend)
  • Added project-level LLM settings and UI controls so teams can choose a provider, pick a model, and optionally supply a per-project API key. (frontend)
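
The per-user configuration payloads might look like the following sketch. The field names and types here are illustrative assumptions, not the actual API contract:

```typescript
// Hypothetical request/response shapes for the per-user LLM configuration
// endpoints; field names are illustrative, not the shipped API contract.
type LlmProvider = "openai" | "anthropic";

interface LlmConfigInput {
  provider: LlmProvider;
  model: string;
  apiKey: string; // sent once on POST, never echoed back in full
}

interface LlmConfigView {
  provider: LlmProvider;
  model: string;
  maskedKey: string; // GET returns only a masked fragment
}

// Narrowing helper so the UI can reject unsupported providers early.
function isLlmProvider(value: string): value is LlmProvider {
  return value === "openai" || value === "anthropic";
}
```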

API

  • Introduced /v1/user/llm-config (GET/POST) and a Next.js proxy route that forwards the authenticated Clerk user ID when storing or retrieving keys. (backend)
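
A minimal sketch of the proxy idea is below. The header name, backend URL, and Clerk `auth()` usage are assumptions for illustration, not the actual implementation:

```typescript
// Sketch: forward the authenticated Clerk user ID to the backend when
// reading or writing LLM config. Header name and backend URL are assumed.
const BACKEND_URL = "https://backend.internal"; // placeholder, not the real host

// Pure helper so the forwarding logic is easy to test in isolation.
function buildProxyInit(
  userId: string,
  method: "GET" | "POST",
  body?: unknown
): RequestInit {
  return {
    method,
    headers: {
      "content-type": "application/json",
      "x-clerk-user-id": userId, // assumed header name
    },
    ...(body !== undefined ? { body: JSON.stringify(body) } : {}),
  };
}

// In an actual route handler (e.g. app/api/llm-config/route.ts) this might
// be used like:
//   const { userId } = await auth();   // Clerk server-side helper
//   const init = buildProxyInit(userId!, "POST", await req.json());
//   return fetch(`${BACKEND_URL}/v1/user/llm-config`, init);
```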

Improvements

  • Updated the project settings PATCH endpoint to accept llm_provider, llm_model, and llm_api_key, and to persist provider/model values for each project. (backend)
  • Updated ProjectSettingsForm to show provider and model selectors, display masked existing keys, and let users submit or replace an optional BYOK API key. (frontend)
  • Added a small client-side model list library for OpenAI and Anthropic so model options and defaults stay consistent in the UI. (frontend)
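
A client-side model list library along these lines could back the selectors. The model IDs below are illustrative examples; the list actually shipped may differ:

```typescript
// Sketch of a client-side model list keyed by provider.
// Model IDs are illustrative, not necessarily the shipped list.
type LlmProvider = "openai" | "anthropic";

const MODELS: Record<LlmProvider, string[]> = {
  openai: ["gpt-4o", "gpt-4o-mini"],
  anthropic: ["claude-3-5-sonnet-latest", "claude-3-5-haiku-latest"],
};

// The first entry doubles as the default, so selectors and form defaults
// stay consistent everywhere the list is used.
function defaultModel(provider: LlmProvider): string {
  return MODELS[provider][0];
}
```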

Security

  • Stored BYOK API keys in Vault and never saved plain-text keys in the database; GET responses return only masked key fragments to the UI. (backend)
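
The exact masked format is not specified here; one plausible masking scheme looks like this sketch:

```typescript
// Hedged sketch of key masking: reveal only a short prefix and the last
// four characters. The real masked format may differ from this scheme.
function maskApiKey(key: string): string {
  if (key.length <= 8) return "…"; // too short to reveal anything safely
  return `${key.slice(0, 3)}…${key.slice(-4)}`;
}
```

The full key never leaves the server; only the output of a function like this is included in GET responses.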