We’ve shipped Plane v2.4.0, and this release brings one of the most requested capabilities to self-hosted deployments:
Plane AI is now available for self-hosted instances.
This means teams running Plane on their own infrastructure can now use AI inside their workspace — with support for OpenAI, Claude, or even local models depending on how you configure your deployment.
Alongside Plane AI, this release also introduces several improvements across work management and migrations.
Highlights
Plane AI for self-hosted
- Run AI inside your own Plane instance
- Use OpenAI, Claude, or compatible/local LLMs
- Bring AI workflows directly into project management
Workspace-level Kanban & Calendar views
- Visualize work across projects in new layouts
- Switch between list, board, and calendar views for broader planning
Better epic visibility
- Group work items by Epics in list and board views
- Understand how tasks roll up into larger deliverables
Rich initiative filters
- Filter initiatives by assignee, status, priority, and more
- Quickly slice large projects into meaningful views
Improved intake workflow
- Intake submissions now have a full detail view for easier triage
Faster work creation
- Create work items instantly via a dedicated URL
Smarter Jira imports
- Apply JQL filters during Jira import to migrate exactly the items you want
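If you haven't used JQL (Jira Query Language) before, a filter like the one below would restrict an import to a single project's unresolved bugs updated in the last 90 days. The project key `MYPROJ` is just a placeholder; substitute your own key and fields:

```
project = MYPROJ AND issuetype = Bug AND resolution = Unresolved AND updated >= -90d ORDER BY priority DESC
```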
Full release notes
You can read the full changelog here:
https://plane.so/changelog/release-v-2-4-0-plane-ai-now-available
We’d love to hear from the community:
- Are you planning to run Plane AI in a self-hosted environment?
- Which LLMs are you connecting (OpenAI, Claude, local models)?
- What workflows would you like AI to automate inside Plane?
Share your thoughts, questions, or feedback below!