Every integration has a scorecard. And nearly every integration scorecard is measuring the wrong things. It tracks activities completed (email domains merged, Slack workspaces consolidated, org charts updated) rather than outcomes achieved (teams actually collaborating, decisions being made jointly, knowledge flowing across legacy boundaries).
The result is a phenomenon familiar to every PE operating partner: the integration scorecard shows green across every workstream, the PMO reports the integration is "on track," and six months later the acquired company's best people have left, customers are churning, and the two organizations are operating as parallel entities that happen to share an email domain.
The gap between what integration scorecards measure and what integration success actually looks like is the gap between activity and behavior change.
The Vanity Metrics of Integration
These are the metrics that appear on every integration scorecard and mean almost nothing:
- Systems consolidated — "We merged to a single CRM / ERP / Slack workspace / email domain." This is an IT project, not integration. Two teams using the same Slack workspace but never messaging each other have achieved a technology milestone, not a cultural one.
- Org chart finalized — "Reporting lines are established and all roles are filled." An org chart describes where people sit on a diagram. It says nothing about whether they collaborate, whether information flows across the new reporting lines, or whether the new structure actually functions.
- Synergy targets identified — "We have identified $15M in cost synergies." Identifying synergies is PowerPoint. Capturing synergies requires operational execution that depends on cross-organizational collaboration — which the scorecard does not measure.
- Town halls completed — "We held 12 integration town halls across all offices." Communication sent is not communication received. A town hall where the CEO presents slides and takes three pre-screened questions is not meaningful two-way communication. It is theater.
- Policies harmonized — "PTO policies, expense policies, and performance review processes are now unified." Policy alignment is necessary but trivially insufficient. Two companies can have identical policies and completely incompatible operating cultures.
The Real Metrics of Integration Success
Integration succeeds when the behavioral patterns of the combined organization reflect genuine collaboration, shared decision-making, and operational cohesion. These behaviors are measurable.
Metric 1: Cross-Legacy Communication Density
- What it measures — The volume and frequency of communication between employees from the acquiring company and employees from the acquired company, normalized by team size and proximity of function.
- What success looks like — Within 90 days of close, cross-legacy communication should reach at least 40% of the density of within-legacy communication for teams that are expected to collaborate. By month six, the distinction between "legacy acquirer" and "legacy target" should be fading from the communication data — people communicate based on function and project, not based on which company they came from.
- What failure looks like — Cross-legacy communication remains below 15% of within-legacy density after 90 days. Teams from the two legacy organizations operate as separate entities within the same Slack workspace. Engineering from Company A collaborates with engineering from Company A. Engineering from Company B collaborates with engineering from Company B. The merger exists on paper only.
- Why it matters — Every synergy, every operational improvement, every cross-selling initiative depends on people from the two organizations actually working together. Cross-legacy communication density is the leading indicator for whether that will happen.
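The density metric above is simple arithmetic: messages per possible pair, compared across the legacy boundary. A minimal sketch, assuming you can export communication logs as sender/recipient pairs and label each person by legacy entity (the names and data here are hypothetical):

```python
from collections import Counter

# Hypothetical data: which legacy company each person came from,
# and (sender, recipient) pairs pulled from chat/email logs.
legacy = {"ana": "A", "ben": "A", "cam": "B", "dev": "B", "eli": "B"}
messages = [("ana", "ben"), ("ana", "cam"), ("ben", "ana"),
            ("cam", "dev"), ("dev", "eli"), ("eli", "ana")]

def comm_density_ratio(messages, legacy):
    """Cross-legacy density as a fraction of within-legacy density.

    Density = messages per possible pair, which normalizes for the
    fact that there are usually far more cross-legacy pairs than
    within-legacy pairs (or vice versa)."""
    counts = Counter("cross" if legacy[s] != legacy[r] else "within"
                     for s, r in messages)
    people = list(legacy)
    cross_pairs = within_pairs = 0
    for i, p in enumerate(people):
        for q in people[i + 1:]:
            if legacy[p] != legacy[q]:
                cross_pairs += 1
            else:
                within_pairs += 1
    within = counts["within"] / within_pairs
    cross = counts["cross"] / cross_pairs
    return cross / within  # 0.40+ by day 90 is the success bar above

print(f"cross/within density ratio: {comm_density_ratio(messages, legacy):.2f}")
```

A real implementation would also normalize by proximity of function, as the metric definition notes; this sketch shows only the pair-count normalization.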
Metric 2: Decision Convergence Rate
- What it measures — The speed at which the combined organization develops shared decision-making processes, measured by the reduction in duplicate or conflicting decisions across legacy entities.
- What success looks like — Within 60 days, operational decisions (vendor selection, process changes, tool choices) are being made jointly by cross-legacy teams. By month four, strategic decisions reference shared data and involve stakeholders from both legacy organizations. Decision authority is clearly mapped and respected.
- What failure looks like — Legacy entities continue making parallel decisions. Company A's engineering team selects a monitoring tool. Company B's engineering team selects a different one. Neither consulted the other. Six months in, the combined company is running duplicate systems because nobody established a unified decision process.
- Why it matters — Decision convergence is the behavioral proof that integration authority is real. Without it, the integration PMO can check every box on the scorecard while the organizations operate independently.
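Convergence can be approximated from a decision log: a category decided once, jointly, counts as converged; the same category decided separately by each legacy entity counts as parallel. A hedged sketch with hypothetical data:

```python
from collections import defaultdict

# Hypothetical decision log: (category, set of legacy entities involved).
decisions = [("monitoring tool", {"A"}), ("monitoring tool", {"B"}),
             ("CRM vendor", {"A", "B"}), ("expense policy", {"A", "B"})]

def convergence_rate(decisions):
    """Fraction of decision categories settled once, by a cross-legacy
    group, rather than decided in parallel by each legacy entity."""
    by_category = defaultdict(list)
    for category, who in decisions:
        by_category[category].append(who)
    converged = sum(1 for groups in by_category.values()
                    if len(groups) == 1 and len(groups[0]) > 1)
    return converged / len(by_category)

rate = convergence_rate(decisions)
print(f"decision convergence: {rate:.0%}")
```

In this sample the duplicate monitoring-tool selections (the failure mode described above) drag the rate down; a unified decision process would drive it toward 100%.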
Metric 3: Knowledge Flow Velocity
- What it measures — The speed at which institutional knowledge transfers across legacy boundaries, measured by the rate at which employees from one legacy entity begin accessing, contributing to, and referencing knowledge assets created by the other.
- What success looks like — Within 90 days, shared documentation repositories show contributions from both legacy entities. Questions posted by employees from one legacy entity are answered by employees from the other. Knowledge is flowing bidirectionally, not just from acquirer to target.
- What failure looks like — Documentation remains siloed. Company A's wiki and Company B's wiki coexist without cross-pollination. New employees are told to "ask someone from the other side" for information that should be accessible to everyone. Institutional knowledge remains locked in legacy silos.
- Why it matters — Knowledge flow is the mechanism by which operational synergies become real. A combined sales team that cannot access the other legacy entity's customer playbooks, competitive intelligence, or product documentation is not actually a combined sales team.
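One way to quantify bidirectional flow is the share of contributions to each legacy entity's knowledge base made by people from the other entity. A minimal sketch, assuming edit logs can be labeled by repository and contributor legacy (repo names and ownership mapping are hypothetical):

```python
from collections import defaultdict

# Hypothetical edit log: (repository, contributor's legacy entity).
edits = [("wiki-A", "A"), ("wiki-A", "A"), ("wiki-A", "B"),
         ("wiki-B", "B"), ("wiki-B", "B"), ("wiki-B", "B")]

def cross_contribution_share(edits, home={"wiki-A": "A", "wiki-B": "B"}):
    """Per repository, the fraction of edits made by the *other*
    legacy entity. 0.0 means a fully siloed repo; a healthy
    integration pushes both repos well above zero."""
    totals, cross = defaultdict(int), defaultdict(int)
    for repo, who in edits:
        totals[repo] += 1
        if who != home[repo]:
            cross[repo] += 1
    return {repo: cross[repo] / totals[repo] for repo in totals}

print(cross_contribution_share(edits))  # wiki-B shows zero cross-legacy flow
```

In this sample, knowledge flows from acquirer to target but not back, which is exactly the one-directional pattern the metric is designed to catch.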
Metric 4: Meeting Integration Ratio
- What it measures — The proportion of meetings that include participants from both legacy entities versus meetings that are legacy-homogeneous.
- What success looks like — By month three, at least 50% of cross-functional meetings include participants from both legacy organizations. By month six, meeting composition reflects functional alignment rather than legacy affiliation. The concept of a "Company A meeting" or "Company B meeting" becomes meaningless.
- What failure looks like — Meetings remain legacy-segregated. The Monday product meeting includes only legacy acquirer product managers. The Tuesday product meeting includes only legacy target product managers. Both exist because neither side trusts the other's process enough to combine them.
- Why it matters — Meeting composition is a proxy for trust. People invite colleagues they trust and collaborate with to their meetings. Legacy-segregated meetings three months post-close mean trust has not been established across the integration boundary.
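The ratio itself is a one-line computation over calendar data: a meeting is "integrated" if its attendees span both legacy entities. A hedged sketch with hypothetical attendees:

```python
# Hypothetical data: attendee sets from calendar exports,
# plus each attendee's legacy entity.
legacy = {"ana": "A", "ben": "A", "cam": "B", "dev": "B"}
meetings = [{"ana", "ben"}, {"ana", "cam"},
            {"cam", "dev"}, {"ben", "cam", "dev"}]

def meeting_integration_ratio(meetings, legacy):
    """Share of meetings whose attendees span both legacy entities."""
    mixed = sum(1 for m in meetings
                if len({legacy[p] for p in m}) > 1)
    return mixed / len(meetings)

ratio = meeting_integration_ratio(meetings, legacy)
print(f"meeting integration ratio: {ratio:.0%}")  # → 50%
```

This sample sits exactly at the month-three 50% bar; the month-six test is whether the remaining single-legacy meetings are legacy-segregated by habit or simply functionally narrow.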
Metric 5: Retention Parity
- What it measures — Voluntary attrition rates compared between legacy acquirer employees and legacy target employees, segmented by seniority and function.
- What success looks like — Voluntary attrition rates for the acquired company's employees are within 1.5x of the acquiring company's baseline rate. High performers from the acquired company are retained at rates comparable to high performers from the acquirer.
- What failure looks like — Acquired company attrition exceeds acquirer attrition by 2x or more. The departures are concentrated in senior roles and high-performance tiers. Exit interviews cite "culture," "loss of autonomy," and "unclear direction" — all symptoms of integration failure.
- Why it matters — Attrition parity is the ultimate integration outcome metric. If the people from the acquired company are leaving at significantly higher rates, the integration has failed at the most fundamental level: the combined organization is not a place where acquired employees want to work.
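The parity check reduces to comparing two rates. A minimal sketch with hypothetical headcount and exit figures, applying the 1.5x/2x thresholds described above:

```python
# Hypothetical quarterly figures by legacy entity.
headcount = {"acquirer": 400, "target": 120}
voluntary_exits = {"acquirer": 12, "target": 9}

def attrition_multiple(exits, heads):
    """Target attrition rate as a multiple of the acquirer baseline.
    <= 1.5x is the parity bar above; >= 2x signals integration failure."""
    rate = {k: exits[k] / heads[k] for k in heads}
    return rate["target"] / rate["acquirer"]

m = attrition_multiple(voluntary_exits, headcount)
print(f"target attrition = {m:.1f}x acquirer baseline")  # → 2.5x: failure zone
```

A production version would segment by seniority and performance tier, since aggregate parity can mask concentrated senior-level departures.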
Sequencing: What to Measure When
Integration measurement should follow a timeline that matches the natural progression of organizational behavior change.
- Days 1-30 — Measure cross-legacy communication initiation. Are people reaching out? Are introductions happening? Are the first cross-legacy meetings being scheduled? This is the "handshake" phase.
- Days 30-90 — Measure cross-legacy communication density and decision convergence. Are the handshakes turning into working relationships? Are decisions being made jointly?
- Days 90-180 — Measure knowledge flow velocity and meeting integration ratio. Is institutional knowledge transferring? Are teams genuinely combining their working processes?
- Days 180-365 — Measure retention parity and long-term collaboration patterns. Has the integration produced a genuinely unified organization, or a federation of legacy entities under one brand?
The integration scorecard should shift from activity tracking to behavioral measurement by day 30. Any scorecard that is still reporting "systems migrated" and "town halls completed" after the first month is measuring inputs while the outputs deteriorate unobserved.
The deals that capture their synergy targets are the ones that instrument behavioral change from day one — and adjust their integration plans based on what the data shows rather than what the PMO reports.