The Problem
Customers experience one product. Organizations build many.
A customer who manages their domain, configures email, edits their website, and contacts support doesn't think of those as four products built by four teams. They experience one thing. But organizational structure means those surfaces are designed, tested, and iterated on independently. The patterns diverge. The interaction models diverge. The language diverges. Customers feel the seams.
Conway's Law describes exactly this: organizations produce systems that mirror their communication structures. When teams don't talk, products don't cohere. This isn't theoretical — it's observable in most multi-product organizations. And it's not just a visual consistency issue. It's a learning loss issue.
The Tax Shows Up in Three Dimensions
Pattern divergence. Teams build custom UI for already-solved problems because they don't check what exists or can't access it across systems. The same task — editing settings, managing a list, completing a form — works differently across surfaces.
Learning in silos. Team A runs an experiment and discovers what converts. Team B runs the same experiment months later because the insight never traveled. The organization pays twice for the same learning.
Experience fragmentation. No shared accountability for cross-surface coherence. Each team owns their surface. Nobody owns the journey. The product feels assembled, not designed. Trust erodes at every handoff.
Why the Design System Alone Can't Fix This
The design system is one mechanism for addressing pattern divergence, and the infrastructure investment there is meaningful. But even a perfect design system won't solve learning silos or experience fragmentation. Those require a shared norm across the design organization — a cultural agreement that teams operate as a connected system rather than a collection of independent units.
And the problem is accelerating. AI-assisted development and vibecoding have increased the rate at which teams ship independently. The stakes haven't changed. The speed of divergence has.
Key Decisions and Tradeoffs
We prioritized learning over polish
We focused first on interaction patterns and feedback loops. Branding and visual identity came later, once we validated that the experience worked.
We optimized for speed, not completeness
Roughly six months of roadmap was executed in six weeks, with design completed in about four. This required daily communication, frequent demos, and comfort with changing scope.
We designed for trust, not magic
Early testing showed customers were afraid of losing work when interacting with AI. We introduced non-destructive patterns so exploration felt safe and reversible.
We planned for imperfect output
AI doesn’t always produce good results. We designed clear recovery paths so customers weren’t stuck or confused when the output missed the mark.
This pace and mode of working was effective for discovery, but not sustainable long term.
A Mandatory Minimum: Three Questions Before You Ship
This isn't a new process, a review board, or a governance committee. It's a norm — the lowest-friction intervention that creates organizational accountability for coherence.
Every team, before shipping new work, documents answers to three questions:
1. Did you check what exists?
Have you looked at shared patterns, prior experiment results, production patterns, or research from other teams before building from scratch? If something relevant exists, are you using it, extending it, or intentionally diverging — and why?
A good answer: "We found an existing settings pattern from another team. We're extending it with a multi-select variant and contributing that back."
A red flag: "We didn't have time to look." / "We assumed nothing existed."
2. Will your work travel?
Is what you're building or learning something another team could benefit from? This covers both patterns (could this component be reused?) and learnings (did your experiment produce insights other teams should know about?).
A good answer: "Our new onboarding flow tested 15% better on completion. We're sharing results in the next design review and filed a brief with the systems team for the stepper component."
A red flag: "That's not really our responsibility."
3. Will a customer feel the seam?
Have you considered how this experience sits next to adjacent surfaces in the customer journey? If a customer moves from your surface to another, will the interaction patterns, language, and mental models be consistent?
A good answer: "We mapped the handoff from domain setup to the website builder and aligned the confirmation pattern so the transition feels seamless."
A red flag: "We only own our surface."
Document these answers in whatever format works — a Slack canvas, a line in a design spec, a section in a sprint ticket. The format doesn't matter; the thinking does. Answering should take about five minutes if you've done the work. If it takes longer, that's a signal.
Addressing the Objections
"This will slow teams down." Five minutes of documentation is not a velocity bottleneck. If it is, the problem isn't this norm — it's that teams are optimizing for the appearance of speed over sustainable throughput.
"We're in a 0→1 phase." The three questions don't constrain exploration. They ask you to explore in context. Use shared foundations. Share what you learn. Exploration and accountability aren't mutually exclusive.
"Our surface is too different." Customers don't think in surfaces. They think in tasks. If your settings page works fundamentally differently from another team's settings page, that might be the right call — but the question needs to be asked and the answer documented.
"The design system doesn't have what we need." Then extend it. "It doesn't exist yet" isn't permission to build in a silo. It's an invitation to build together so the next team doesn't start from zero.
"We tried the shared pattern and it didn't work." Great — that's exactly the pressure testing shared solutions need. Document it. Share what you learned. Silently walking away deprives the entire organization of that learning and guarantees the next team hits the same wall.
The Underlying Principle
The Divergence Tax compounds quietly. No single instance of local optimization is catastrophic. But across dozens of teams shipping independently over months and years, the cumulative cost is enormous — in duplicated effort, in missed learning, in customer trust lost at every seam.
The fix isn't centralized control. It's shared awareness. Three questions. Five minutes. A norm that says: we build as a connected organization, not a collection of independent teams.