Data-Driven Performance Calibration: Beyond Gut Feelings
Calibration is where the "real" performance ratings happen. You sit in a room (or a Zoom) with other managers and try to agree on what "Exceeds Expectations" actually looks like.
Without data, calibration becomes a contest of who can shout the loudest or who is better at storytelling. With data, it becomes a professional discussion about impact.
The Subjectivity Trap
"I think Mark did a great job" is subjective. "Mark led the migration of our auth system, which reduced our error rate by 40% and was completed on schedule despite losing two team members mid-quarter," is objective.
Building Your Calibration Case
You should be preparing for calibration all year, not just the week before.
1. Cumulative Evidence
Collect "artifacts" of impact throughout the quarter:
- Completed Kanban tasks.
- Links to significant PRs or RFCs.
- Feedback from other teams.
- Performance metrics (uptime, latency, velocity) that the engineer directly influenced.
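One lightweight way to build this habit is a structured log you append to all quarter long. A minimal sketch in Python; every class name, field, and example entry here is hypothetical (this is not how any particular tool, including Ledger, models it):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Artifact:
    """One piece of calibration evidence (illustrative fields)."""
    when: date
    kind: str      # e.g. "pr", "rfc", "peer-feedback", "metric", "incident"
    summary: str
    link: str = ""

@dataclass
class EvidenceLog:
    """A running log of impact artifacts for one engineer."""
    engineer: str
    artifacts: list[Artifact] = field(default_factory=list)

    def add(self, artifact: Artifact) -> None:
        self.artifacts.append(artifact)

    def by_kind(self, kind: str) -> list[Artifact]:
        """Pull all evidence of one type when preparing a calibration case."""
        return [a for a in self.artifacts if a.kind == kind]

# Hypothetical entries collected during the quarter.
log = EvidenceLog("Sarah")
log.add(Artifact(date(2024, 2, 14), "incident",
                 "Led SEV-1 response using senior-level protocols"))
log.add(Artifact(date(2024, 3, 2), "pr",
                 "Auth migration that cut error rate by 40%"))
```

The point is not the data model; it is that each entry is dated and typed, so you can answer "show me the evidence" in the room without relying on memory.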
2. The "Growth Delta"
Don't just show where they are—show where they were. Calibration should account for trajectory. An engineer who moved from needing constant supervision to leading features independently has a higher "Exceeds" potential than someone who has been coasting at a high level.
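The growth delta can be made concrete by diffing two snapshots of the same competency rubric. A toy sketch, assuming a simple numeric scale and hypothetical competency names:

```python
def growth_delta(start: dict[str, int], end: dict[str, int]) -> dict[str, int]:
    """Per-competency change between two review snapshots (assumed 1-5 scale)."""
    return {c: end.get(c, 0) - start.get(c, 0) for c in end}

# Hypothetical Q1 vs Q4 snapshots for one engineer.
q1 = {"autonomy": 2, "technical depth": 3, "mentoring": 1}
q4 = {"autonomy": 4, "technical depth": 3, "mentoring": 3}

print(growth_delta(q1, q4))
# {'autonomy': 2, 'technical depth': 0, 'mentoring': 2}
```

A flat delta at a high absolute level and a steep delta from a lower start are different stories, and the calibration room should hear both.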
3. Comparison to the Level
Compare the data against your company's career ladder. Does the evidence show behaviors expected at the next level? That is the strongest argument for a top rating or a promotion.
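At its simplest, comparing against the ladder is an overlap check between behaviors you have evidence for and the behaviors the next level expects. A deliberately naive sketch with made-up ladder entries:

```python
# Hypothetical next-level expectations from a career ladder.
NEXT_LEVEL = {
    "leads cross-team projects",
    "mentors other engineers",
    "sets technical direction",
}

def readiness(observed: set[str]) -> float:
    """Fraction of next-level behaviors that have supporting evidence."""
    return len(observed & NEXT_LEVEL) / len(NEXT_LEVEL)

evidence = {"mentors other engineers", "leads cross-team projects"}
print(f"{readiness(evidence):.0%}")  # 67%
```

Real ladders are not checklists, but framing your case as "here is evidence for each next-level behavior" is far harder to argue with than a rating alone.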
Handling the Group Dynamic
In the calibration room, you will face pushback. Other managers might have different standards.
When challenged, refer back to your notes:
- "Actually, I have a record here from February where Sarah handled that incident exactly using the senior-level protocols we've defined."
- "While his velocity was lower this month, my notes show he spent 30% of his time mentoring the three new hires we just brought on."
The Fairness Factor
Data-driven calibration is the only way to ensure fairness. It protects quiet high-performers and ensures that "glue work" (the work that makes the team better) is valued as highly as "shipping code."
Post-Calibration: Closing the Loop
After calibration, go back to your team. Use the feedback from other managers to refine their growth plans. If someone didn't get the rating they expected, use your data to show them exactly what the gap is.
Calibration isn't about defending a rating—it's about aligning on the truth of impact.
About the Author
Carlos Corrêa da Silva is an Engineering Manager and the builder of Ledger, a tool designed to help engineering managers maintain context on their teams. He focuses on making people management more systematic and less reliant on memory.