Pattern Summary
The Multi-Team Review is a combined review of a product, service, or solution by multiple teams to obtain feedback on the product and identify ways to improve it. The Multi-Team Review fosters transparency, alignment, and continuous improvement at the end of an iteration or increment as teams showcase their completed work and demonstrate product features rather than deliver presentations. This review is crucial for validating system-wide integration, gathering stakeholder feedback, and ensuring development stays aligned with business goals and customer needs.
Related Patterns
Multi-Team Planning
Symptom Categories
Lack of Alignment Across Teams, Poor System Integration & Quality Issues, Limited Transparency & Feedback Loops, Ineffective Course Correction & Risk Management, Reduced Team Collaboration & Knowledge Sharing, Problems Determining Progress & Status
Symptoms Addressed
Detailed Description
In this pattern, multiple teams working on different components or features of a system come together at the end of an iteration or increment to showcase their completed work. The primary objectives of the review are to:
Validate system-wide integration of the work completed by all teams.
Gather stakeholder feedback on the product and ways to improve it.
Ensure that development stays aligned with business goals and customer needs.
The Multi-Team Review is typically time-boxed and structured to focus on functionality rather than presentations, emphasizing working product over slides. Stakeholder and team feedback gathered during the review provides valuable insights into user needs, technical challenges, and business priorities. The results of the review inform future iterations, backlog refinement, and necessary course corrections to maintain alignment with overall business goals.
This pattern plays a critical role in scaled agile environments by enhancing transparency, accelerating learning cycles, and ensuring that the developed system meets business and customer expectations through continuous feedback and adaptation.
In Frameworks
Different Agile frameworks provide structured approaches to Multi-Team Review, each aligning with specific review methods:
Scaled Agile Framework (SAFe):
In SAFe, the Multi-Team Review is implemented as the System Demo, which occurs at the end of every iteration (Sprint) and Planning Interval (PI) to validate system-wide integration. Each team within the Agile Release Train (ART) demonstrates their completed features in a real environment, ensuring that multiple components work together as intended. Unlike individual team reviews, the System Demo in SAFe is a cross-team, enterprise-level event focused on end-to-end functionality rather than isolated team outputs.
Scrum@Scale:
In Scrum@Scale, the event analogous to a Multi-Team Review is referred to as the Scaled Sprint Review, which is an optional event depending on the organization’s needs. The Scaled Sprint Review focuses on the integration of features from various Scrum teams, aiming to provide transparency, promote cross-team collaboration, and ensure alignment with business priorities. When used, it typically occurs at the end of each Sprint and serves as a mechanism for inspecting the progress of the entire product increment, rather than individual team outputs.
Nexus:
The Multi-Team Review in Nexus takes place at the end of each Sprint and serves as an opportunity for all Scrum teams within the Nexus to present their incremental contributions. Unlike individual team demos, the Multi-Team Review brings together all teams’ work to ensure the full product increment is functioning as expected and aligns with the product goals. It is a live demonstration of the integrated work, showcasing how the different teams’ efforts fit together into the complete system.
Extreme Programming (XP):
XP’s approach to the Multi-Team Review is built around its core principles of continuous feedback, communication, and simplicity. This event typically takes place at the end of each iteration, with teams showcasing the fully integrated system to stakeholders. Rather than focusing on isolated features, XP’s Multi-Team Review emphasizes the seamless integration of working code, enabling teams to demonstrate the value delivered in the most efficient, functional form.
Product Operating Model (POM):
In the Product Operating Model, the Multi-Team Review is not a fixed event but is integrated into cross-functional collaboration workflows. In the Squad and Tribe Model, reviews happen at the tribe level, where squads align on progress and integration. Feature-based teams conduct cross-team reviews to ensure seamless product integration, while the Triad Model (Product Manager, Designer, Engineering Lead) enables continuous strategic and technical alignment through regular check-ins. POM adapts review mechanisms to fit organizational needs, focusing on continuous feedback, decentralized decision-making, and alignment across autonomous teams.
Timing & Frequency
The Multi-Team Review takes place at the end of an iteration or increment (e.g., sprint, Planning Interval).
It is a time-boxed event, ensuring focused discussions and efficient execution.
Participants
Development Teams: All teams working on different components of the system.
Stakeholders: Product owners, business representatives, end-users, and leadership.
Facilitator (optional): A Scrum Master, Release Train Engineer, or Agile Coach may help structure the session.
Format & Structure
Do Not Use This Pattern When
While the Multi-Team Review is highly effective in ensuring integration and alignment across multiple teams, there are scenarios where it may not be the best approach.
These considerations help ensure that the chosen review method aligns with the team’s size, integration needs, development stage, and overall organizational context.
References
Scaled Agile Framework (2024). System Demo. Retrieved from https://framework.scaledagile.com/system-demo
Scrum@Scale (2024). Scrum@Scale Guide. Retrieved from https://www.scrumatscale.com/scrum-at-scale-guide/
Scrum.org (2015). Sprint Review with Multiple Teams Developing One Product. Retrieved from https://www.scrum.org/forum/scrum-forum/5354/sprint-review-multiple-teams-developing-one-product
Scrum.org (2024). Nexus Guide. Retrieved from https://www.scrum.org/resources/nexus-guide
Beck, K., & Andres, C. (2004). Extreme Programming Explained: Embrace Change (2nd ed.). Addison-Wesley.
Extreme Programming.org (n.d.). Extreme Programming Practices. Retrieved from https://www.extremeprogramming.org
Agile Alliance (n.d.). XP - Extreme Programming. Retrieved from https://www.agilealliance.org/glossary/xp
ThoughtWorks (2022). Building a Modern Digital Product Operating Model. Retrieved from https://www.thoughtworks.com/insights/articles/building-modern-digital-product-operating-model
Fowler, M. (2017). Product Mode. Retrieved from https://martinfowler.com/articles/product-mode.html
PandaDoc (n.d.). How to Run a Successful Multi-Team Sprint Review. Retrieved from https://bamboo.pandadoc.com/issue002/how-to-run-a-successful-multi-team-sprint-review
Labunskiy, E. (2017). Multi-Team Sprint Review – How Do We Do It? Retrieved from https://www.linkedin.com/pulse/multi-teams-sprint-review-how-do-we-evgeniy-labunskiy/
Agile Manifesto (2001). Manifesto for Agile Software Development. Retrieved from https://agilemanifesto.org
Continuous Delivery (n.d.). Continuous Delivery and Integration. Retrieved from https://continuousdelivery.com
Product School (n.d.). Product Operating Models: How Top Companies Work. Retrieved from https://productschool.com/blog/product-strategy/product-operating-model