Always good to remember "what gets measured gets managed". I.e., every single one of these metrics can easily be gamed in a way that would be hard for a non-technical manager or "bean counter" to detect.
Ex: oh, you're measuring number of commits, well, I'll stop squashing and start breaking commits down to single lines. Boom, I'm a 100x software engineer.
Does anyone have a decent list of individual contributor KPIs that are actually useful or insightful? So far (~20 years of experience), I've never run into individual KPIs that couldn't be gamed and were actually meaningful.
In this sense I do consider many KPIs to be good metrics for finding out whether the company is on the right path, or an early warning system for problems the company might face in the future.
But as soon as you use KPIs to assess managers, they will be gamed, and thus lose a lot of their value.
I think that KPIs often make sense at the organization level of a business, such as "We want to boost revenue from this product line by X%".
But when you try to translate that to the individual level, it's almost impossible, because it's hard or even impossible to properly attribute value to individual actions.
I think this is the key distinction. After two decades in the industry I can easily game any variable. And things change constantly. A person who seems unproductive today may surprise you tomorrow. You may consider yourself an excellent problem solver and then bump into something that is very difficult for you - but turns out easy for someone else. And so on and so forth.
...and attitudes like these are why most Software Engineering theory is BS. We refuse to measure things in a reproducible way, and would rather stick with subjective pre-scientific concepts such as "clean code" and hit-or-miss rules of thumb for project estimation. You need an objective way of measuring and testing whether your hypothesis (be it that clean code improves maintainability, or that strategy X is reducing dev churn) is true, and good KPIs are that.
We use the number of closed tasks/user stories, but we came across people gaming those too: lower the scope and cut corners (shoddy design, superficial meaningless unit tests, ...). The point, as I understand it, is to stay away from people who game things and to have tech-savvy management too. I worked at a company where the CTO was tech illiterate and everyone gamed everything; I worked somewhere else where the CTO was the smartest and most tech-savvy person I have ever met, and gaming anything there became obvious and people were shown the door out of nowhere (yet for a good reason).
Measuring commits, LOC, tickets, and story points individually might be gameable, but seeing all of them consistently higher or lower seems like an okay indicator.
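For what it's worth, that "all signals pointing the same way" idea is easy to sketch: normalize each metric against the team and check whether one person's signals agree in direction. Everything below (names, numbers, and the choice of z-scores) is made up purely for illustration, not a recommendation:

```python
# Rough sketch: z-score each metric against the team, then flag people whose
# signals are consistently above or below the team average.
from statistics import mean, stdev

team = {  # hypothetical numbers
    "alice": {"commits": 40, "loc": 3000, "tickets": 12, "points": 21},
    "bob":   {"commits": 10, "loc": 600,  "tickets": 4,  "points": 5},
    "carol": {"commits": 25, "loc": 1800, "tickets": 9,  "points": 13},
}

metrics = next(iter(team.values())).keys()
zscores = {person: {} for person in team}
for m in metrics:
    values = [team[p][m] for p in team]
    mu = mean(values)
    sigma = stdev(values) or 1.0  # guard against an all-equal column
    for p in team:
        zscores[p][m] = (team[p][m] - mu) / sigma

for person, zs in zscores.items():
    # "Consistently high/low" here means every signal shares the same sign.
    consistent = all(z > 0 for z in zs.values()) or all(z < 0 for z in zs.values())
    print(person, {m: round(z, 2) for m, z in zs.items()},
          "consistent" if consistent else "mixed")
```

Of course, the same caveat from the rest of the thread applies: once people know these numbers are being watched together, all of them can drift upward at once.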
Aren't most of these actually anti-patterns? The best example is lines of code written. If we measure individual performance on that, it's trivial for people to game the system. Also, number of lines of code doesn't correlate to business outcomes. I much prefer to solve the same business problem with half the lines of code if possible!
If I had to choose KPIs to measure the performance of teams (yes, teams), I would choose the DORA metrics, or something of the sort.
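For anyone unfamiliar: DORA tracks four team-level metrics (deployment frequency, lead time for changes, change failure rate, and time to restore service). As a rough illustration, three of them could be computed from a deployment log along these lines; the record shape and the numbers are my own assumptions, not anything DORA prescribes:

```python
# Hedged sketch: deployment frequency, lead time for changes, and change
# failure rate from a hypothetical team-level deployment log.
from datetime import datetime
from statistics import median

deployments = [  # made-up data for a one-week window
    {"commit_time": datetime(2024, 5, 1, 9),  "deploy_time": datetime(2024, 5, 1, 15), "failed": False},
    {"commit_time": datetime(2024, 5, 2, 11), "deploy_time": datetime(2024, 5, 3, 10), "failed": True},
    {"commit_time": datetime(2024, 5, 6, 14), "deploy_time": datetime(2024, 5, 6, 16), "failed": False},
]

window_days = 7
deploy_frequency = len(deployments) / window_days                      # deploys per day
lead_times = [d["deploy_time"] - d["commit_time"] for d in deployments]
median_lead_time = median(lead_times)                                  # commit -> production
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

print(f"Deployment frequency: {deploy_frequency:.2f}/day")
print(f"Median lead time for changes: {median_lead_time}")
print(f"Change failure rate: {change_failure_rate:.0%}")
```

Time to restore service would be computed the same way from incident open/close timestamps. The key point is that these describe the delivery pipeline of a team, not the output of an individual.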
I think the author wanted to improve on the more flawed idea of counting the number of hours worked.
DORA metrics seem to be an even better improvement on the author's propositions. Thanks for sharing, I did not know about them and they are interesting.
If I can share my experience: I know that, at the root, any measure is flawed and incomplete. However, it is always better to measure than not to measure if something is important to you. I know time spent is not a good measure, but I also know I will make more progress on a task if I allocate and track time spent on it. And engineering output is notoriously hard to measure.
Same. Most technical debt is caused by folks with "impressive velocity" in my experience. If any prospective employer tells you that they are measuring your lines of code written, run: they're too lazy and cost-driven to understand.
Oh god no. None of these are good metrics. All are gameable. If my workplace enacted this then I would quit.
And KPIs are the truncheon that managers use to smash others' heads in. There should be no metrics, for anyone, only RESULTS. "What gets measured gets managed"... why don't they teach this in management class?
They also teach you about leading and lagging KPIs. Results are lagging KPIs: nice to know if you are on track to achieving your goals, but useless if you want to know how to go faster/do better/etc.
Counting lines of code from a developer is like doing a word count on a poet's work.
Every single KPI listed above can be manipulated; they mean nothing and show nothing, except that many will try to game the system by:
- Needlessly adding lines of code.
- Committing excessively.
- Closing too many PRs too early or opening too many PRs too soon.
- Meetings attended: people will needlessly attend or create meetings.
- Prod deployments: people will needlessly push to prod.
your manager likely has some rubric for how they'll measure your success.
it may not even be explicitly formulated in their mind, but they have some sort of subjective/qualitative set of expectations. instead of going through the work of precisely defining those expectations and communicating them to the employee, they "empower" the employee to "set their own definition of success".
however, this turns into a game of the IC guessing what the manager wants to hear, which results in two problems:
- if you guess correctly, the manager is happy because they didn't have to go through the work of defining and communicating their expectations and they get to pretend that they are generous and letting employees "drive your own development"
- if you guess incorrectly, you better hope that you accidentally meet the secret rubric your manager has in their head because you know that that's what is used to evaluate performance and determine raises/promotions regardless of what's written down in your quarterly KPIs
All good, but without being actionable they're just vanity metrics that only get discussed during performance reviews. They don't provide any meaningful insights on how to improve actual performance in day-to-day work.
Fluff article that's mostly rehashing things that people kept trying to implement as metrics in software teams ever since the 1970s -- and failed every single time, as evidenced by the fact that people are still searching for reliable ways to gauge programmers.
- Counting stuff does not work. People will stop making well-scoped commits and start pushing 1-3 lines of code or config all the time. Or they will start breaking issues into much smaller ones so as to seem like they are closing many of them.
- Verifying means trying to engage more of a resource that's already precious and stretched thin i.e. engineer attention and time. You will end up squeezing your already strained people even more unless you hire extra.
- Valuable, okay, but short-term or long-term? I've met plenty of "value-driven engineers with a can-do attitude" whose code I had to fix months later because product development had ground to a halt. Who's more valuable, the person who makes possible a customer demo that brings in a $1M one-off investment after the demo, or the person who makes sure the customers with really deep pockets get their features and start paying $500k a year? (Both are valuable and this was obviously a trick / trap question. The point is, you can't only worship the former group and claim the latter is less valuable.)
- Individually attributable is a solved problem: just count commits + lines of code and judge by that, if that tickles your fancy (a rough sketch of that tally is below). Everything else requires the attention of other engineers, which is not an efficient expenditure of time and energy.
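If anyone actually wants that tally, here is a minimal sketch; the repo path, date range, and parsing details are my own illustrative choices, and it assumes 40-character SHA-1 hashes:

```python
# Count commits and added/removed lines per author email via `git log --numstat`.
import subprocess
from collections import defaultdict

def per_author_stats(repo=".", since="3 months ago"):
    out = subprocess.run(
        ["git", "-C", repo, "log", f"--since={since}", "--numstat",
         "--format=%H\t%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    stats = defaultdict(lambda: {"commits": 0, "added": 0, "removed": 0})
    author = None
    for line in out.splitlines():
        if not line:
            continue
        fields = line.split("\t")
        if len(fields) == 2 and len(fields[0]) == 40:   # "<sha>\t<author email>"
            author = fields[1]
            stats[author]["commits"] += 1
        elif len(fields) >= 3 and author:               # "<added>\t<removed>\t<path>"
            added, removed = fields[0], fields[1]
            if added.isdigit() and removed.isdigit():   # "-" means a binary file
                stats[author]["added"] += int(added)
                stats[author]["removed"] += int(removed)
    return dict(stats)

if __name__ == "__main__":
    for author, s in sorted(per_author_stats().items(),
                            key=lambda kv: -kv[1]["commits"]):
        print(f"{author}: {s['commits']} commits, +{s['added']}/-{s['removed']} lines")
```

Which is exactly why it's worthless as a performance measure: the moment people know this count matters, commit and line totals stop meaning anything.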
---
I'll give you one good KPI: make your programmers happy. Recognize that not all work is glorious (they recognize that as well), BUT periodically give them space and time to go off on small adventures: optimize a hot path that's proven slower than it should be, automate a process, write a small library that can be reused between teams, or help others who use their code and are unsure how to do so; things like that.
Programming is a very mentally straining job. Help these people not hate their job. There are many well-documented ways to keep employees happy.
Trying to measure everyone will never work until it gets fully automated, at which point it would be unproductive and illogical to do so -- because if you have such smart automation, why don't you use it to get your actual work done instead of measuring your inefficient flesh-bag employees?
In a nutshell: by the time you can gauge programmers in a fully automated manner, you would likely just "hire" a bunch of robots and fire the humans.