
Curated shared instructions for software teams

Last updated: Apr 15, 2026

Apr 2026
Adopt

As teams mature in their use of AI, relying on individual developers to write prompts from scratch is emerging as an anti-pattern. We advocate for curated shared instructions for software teams, treating AI guidance as a collaborative engineering asset rather than a personal workflow.

Initially, this practice focused on maintaining general-purpose prompt libraries for common tasks. We’re now seeing a more effective evolution specifically for coding environments: anchoring these instructions directly into service templates. By placing instruction files such as CLAUDE.md, AGENTS.md or .cursorrules into the baseline repository used to scaffold new services, the template becomes a powerful distribution mechanism for AI guidance.
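To make the distribution mechanism concrete, here is a minimal Python sketch of a scaffolding step that copies a baseline template, instruction files included, into a new service repository. The file names match those mentioned above; the function and directory names are hypothetical, not part of any specific tool.

```python
import shutil
from pathlib import Path

# Instruction files a baseline template might carry (from the practice above).
INSTRUCTION_FILES = ["CLAUDE.md", "AGENTS.md", ".cursorrules"]

def scaffold_service(template_dir: Path, new_service_dir: Path) -> list[str]:
    """Create a new service repo from the baseline template.

    Because the template contains the AI instruction files, every
    scaffolded repository inherits them by default. Returns the list of
    instruction files that were carried over, for visibility in tooling.
    """
    shutil.copytree(template_dir, new_service_dir)
    return [f for f in INSTRUCTION_FILES if (new_service_dir / f).exists()]
```

Updating the instruction files in the template then propagates the latest guidance to every service scaffolded afterwards, with no per-developer setup.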

During our Radar discussions, we also explored a related practice: anchoring coding agents to a reference application. Here, a live, compilable codebase serves as the source of truth. As architecture and coding standards evolve, both the reference application and embedded instructions can be updated. New repositories then inherit the latest agent workflows and rules by default. This approach ensures consistent, high-quality AI assistance is built into every project from day one, while separating general prompt libraries from repository-specific AI configuration.

Nov 2025
Adopt

For teams actively using AI in software delivery, the next step is to move beyond individual prompts toward curated shared instructions for software teams. By sharing validated, high-quality instructions, this practice lets you apply AI effectively across all delivery tasks, not just coding. The most direct implementation is to commit instruction files such as AGENTS.md straight into the project repository. Most AI coding tools, including Cursor, Windsurf and Claude Code, support sharing instructions through custom slash commands or workflows. For non-coding tasks, you can build an organization-level library of ready-to-use prompts. This systematic approach enables continuous improvement: whenever a prompt is refined, everyone benefits, ensuring the whole team always works with the best available AI instructions.

Published: Nov 05, 2025
