(Note: This was posted mistakenly while it was still a draft. My apologies.)
Sapience, a start-up based in Pune, is reported to be
developing a product to measure software productivity.
Before we start measuring it, how is software productivity to be defined? And over what period is it to be measured - daily, weekly, or over the duration of the project?
The user, whether external or internal, is interested in executable code that performs the expected functionality - nothing more, nothing less.
So productivity, as I see it, is the ratio of functionality delivered to the user - measured in function points or executable lines of code - to the staff-hours expended. And it is the productivity of the team, not of individuals.
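To make the ratio concrete, here is a trivial illustration in Python; the function-point and staff-hour figures are invented, not taken from any real project.

```python
# Illustrative only: the numbers are made up, not from any real project.
function_points_delivered = 120   # functionality accepted by the user
team_staff_hours = 1600           # total hours across the whole team

# Productivity of the team, not of any individual member.
productivity = function_points_delivered / team_staff_hours
print(f"Productivity: {productivity:.3f} function points per staff-hour")
```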
How many tests were written? How many reviews were held? How many lines of source code documentation were written? All these are meaningless to the user.
But these do affect quality. As such, they should be measured.
Here is my partial list of metrics that should be captured (requirements-collection and design-phase metrics are not included):
1. Lines of code (as accepted by the team/organization - e.g. the STXLN metric of QA C) added/changed and checked in.
2. Lines of test code added/changed and checked in.
3. Lines of source code documentation added/changed and checked in. The detailed design lives in the source files as Doxygen annotations.
4. Lines of code reviewed.
5. Number of defects reported/week.
6. Defect turn-around time (from the time it is reported till the time it is cleared).
7. Defect fix time (from the time it is assigned till the time it is cleared).
#1 through #4 can be implemented with scripts and a version control system. At Acme Technologies we had this system, but without automated metric collection.
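As an example, a minimal sketch of #1 and #2 might look like the following, assuming a Git repository and a team convention that test code lives under a "tests/" directory. #3 would additionally need to parse Doxygen comments, and #4 needs data from whatever review process is in place; those are left out here.

```python
# A minimal sketch for metrics #1 and #2, assuming a Git repository and a
# team convention that test code lives under "tests/". The counts are raw
# added/deleted lines from "git log --numstat", not a tool-specific measure
# such as QA C's STXLN.
import subprocess
from collections import defaultdict

def lines_checked_in(since="1.week.ago", repo="."):
    out = subprocess.run(
        ["git", "-C", repo, "log", "--numstat",
         "--pretty=format:", f"--since={since}"],
        capture_output=True, text=True, check=True,
    ).stdout
    totals = defaultdict(int)
    for line in out.splitlines():
        parts = line.split("\t")
        if len(parts) != 3 or parts[0] == "-":   # skip blanks and binary files
            continue
        added, deleted, path = int(parts[0]), int(parts[1]), parts[2]
        bucket = "test" if path.startswith("tests/") else "source"
        totals[bucket] += added + deleted
    return dict(totals)

if __name__ == "__main__":
    print(lines_checked_in())
```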
#5 through #7 can be obtained from a MySQL database of defects and a few PHP scripts. This was implemented at Acme Technologies.
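The queries themselves are simple. The original setup used MySQL and PHP; the sketch below uses Python with the built-in sqlite3 module so that it is self-contained, and the "defects" table and its column names (reported_at, assigned_at, cleared_at) are assumptions for illustration, not the actual Acme schema.

```python
# A minimal sketch for metrics #5-#7 against an assumed "defects" table with
# ISO-format timestamp columns reported_at, assigned_at, and cleared_at.
import sqlite3

QUERIES = {
    # #5: defects reported per week
    "defects_per_week": """
        SELECT strftime('%Y-%W', reported_at) AS week, COUNT(*) AS reported
        FROM defects GROUP BY week ORDER BY week
    """,
    # #6: turn-around time, reported -> cleared (days)
    "avg_turnaround_days": """
        SELECT AVG(julianday(cleared_at) - julianday(reported_at))
        FROM defects WHERE cleared_at IS NOT NULL
    """,
    # #7: fix time, assigned -> cleared (days)
    "avg_fix_days": """
        SELECT AVG(julianday(cleared_at) - julianday(assigned_at))
        FROM defects WHERE cleared_at IS NOT NULL
    """,
}

def defect_metrics(db_path="defects.db"):
    with sqlite3.connect(db_path) as conn:
        return {name: conn.execute(sql).fetchall()
                for name, sql in QUERIES.items()}
```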
But do these metrics really tell the full story? What about lines of code that will never be executed? What about code that should go into a function but is "in-lined"? What about lines of documentation that are incorrect, verbose, confusing, or simply repeated? What about tests whose failures all mean the same thing? To catch these, code, documentation, tests, and other artefacts must be reviewed diligently. Causal analysis must be done for defects found during product/integration testing. Do the hansei (critical self-reflection).
To get the story behind the numbers, a proper software development culture is required. Tools cannot build that culture; only people can. As Collins says in Good to Great, getting the right people "on the bus" is what matters. Tools, if properly used, can help. The real danger of tools that supposedly measure productivity, however, is that they can easily be misused by management.
Individual metrics are not as valuable as their trend over the duration of the project, and across different projects. The balance between the numbers is also important. No lines of code being changed could mean that reviews, or tests, are inadequate. Many lines of code but few lines of test code could point to a defective product down the road; it would also mean that test-driven development (TDD) is not being followed (why not?).
How much is "lots"? How much is "few"? That calls for judgement - judgement based on past data and its variance.
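As a rough sketch of what such data-backed judgement could look like - the historical ratios here are invented - one might compare the current project's test-to-source line ratio against the mean and spread of past projects:

```python
# A sketch of judgement backed by past data: compare this project's
# test-to-source line ratio with the mean and standard deviation of past
# projects. The historical ratios below are invented for illustration.
from statistics import mean, stdev

past_ratios = [0.8, 1.1, 0.9, 1.0, 0.7]   # test LOC / source LOC, past projects
current_ratio = 0.3

mu, sigma = mean(past_ratios), stdev(past_ratios)
if abs(current_ratio - mu) > 2 * sigma:
    print(f"Ratio {current_ratio} is an outlier "
          f"(mean {mu:.2f}, sd {sigma:.2f}) - go and see why.")
else:
    print("Within the range suggested by past projects.")
```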
Take a car factory.
The only productivity figure that matters is the number of defect-free cars produced in a given time. But what if those cars cannot be sold in that time? Is that productivity, or waste? Just-in-time, or lean, manufacturing considers it waste. And are cars sold a valid productivity figure if most of them have to be recalled (witness Toyota)?
Let us go into the factory. What if the stamping machine produces only left doors, in record numbers, and there are no right doors? Are the workers manning the stamping machine being highly productive? Lean manufacturing says no. They are building inventory, and inventory is "muda" - waste.
Metrics are important. But do not put blind faith in them. Go and check what is actually being practised on the ground. To quote Taiichi Ohno: "Data is of course important in manufacturing, but I place the greatest emphasis on facts."
It is the complete team that delivers business results; components of the team do not. Focus on the productivity of the team.