Chapter 9
Agile Metrics
The purpose and character of agile metrics resonate with the Agile Manifesto.
Individuals and interactions over processes and tools.
Agile Manifesto
The proper design and implementation of agile metrics can add value and enable the agile way of developing software.
Box 9.1 Analogy: Decibels

The human ear transforms sound waves into audio signals. When the sound wave intensity increases tenfold, the response of the human ear goes up 1 point. When the sound intensity increases a hundred times, the human ear registers a strength of 2 points. The response is logarithmic. Sound level is measured in decibels, which are essentially logarithms of sound wave intensity.
Sound Intensity     Relative Response of Ear
10                  1
100                 2
1000                3
10,000              4
100,000             5

The ear can process and respond to a remarkably wide range of sound intensities with ease.

Earthquake intensities are also expressed on a logarithmic scale, the Richter scale. When an earthquake goes up one point on the Richter scale, the measured ground motion goes up 10 times.

Story points are similar to these logarithmic scales. The story point can accommodate a wide range of practical software sizes and present them in single-digit numbers.
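As a quick check (a minimal sketch, not part of the original box), the relative response column in the table above is simply the base-10 logarithm of the relative sound intensity:

# The "relative response of ear" column is log10 of the relative intensity column.
import math

for intensity in (10, 100, 1000, 10_000, 100_000):
    print(intensity, "->", math.log10(intensity))  # prints 1.0 through 5.0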
Classic Metrics: Unpopular Science
Classic software metrics have become an unpopular science, at least in part, because of the evolution of software engineering. A few rigorous metrics have rendered truth less accessible, and such ill-designed metrics turned out to be masks: instead of revealing the truth, they hid it. Ambiguous metrics have made things worse, being subject to multiple interpretations. The majority of classic metrics are not direct.
Effort variance, for example, is seldom based on true effort spent but on permissible numbers. An engineer may have spent 14 hours in a day but is asked to enter only 8 hours because that is the billable amount and the client is not paying for overtime. In this case, metrics breed hypocrisy, the very evil they seek to fight. Users need to "understand" the hidden context and guess the hidden meaning.
Schedule variance in a business governed by a service-level agreement has no meaning but is still mentioned in the metric plan. Numbers are entered to fill reports; no one in the project takes such metric reports seriously.
Productivity metrics and corporate goals on productivity are a source of perennial conflict. A common goal is set for all categories of tasks, unmindful of the differences in the underlying engineering principles and process capabilities. In such a case, no one believes productivity data.
Programmers have strong reasons for not believing the very definition of certain metrics. Managers have reasons for not trusting metric data; they fear fudging. Life with classic metrics goes on amid deep-running distrust. Pressure for CMMI certification
has pushed many organizations into this self-defeating situation. Because auditors insist on well-behaved data as evidence of good process control, people remove "outliers," and with them the truth. Errors in metric data are eliminated, destroying opportunities for learning.
Pressure is on metrics to present a perfect picture of life, in a totally unscientific manner.
Metrics, when misused, establish illusions of perfection and scientific superstitions.
There have been exceptions. There are genuine metric users, but they are far too few to turn the tide of opinion on metrics.
I believe in metrics, not politics.
Narayana Murthy
Chairman, Infosys
Two Sides of Classic Metrics
Troubles notwithstanding, classic metrics have made a point. People have learned to use numbers, at least when they want to, first by imitating best practices and later through acquired capability. Classic metrics have always had great potential, though largely untapped. Organizations that followed Personal Software Process (PSP) metrics did benefit: they reduced defects. Humphrey's vision of discipline from data worked on one side; organizations showed a return on investment (ROI). On the other side, PSP metrics were not popular and were resisted by teams. There were two sides to PSP metrics, the good and the bad. People wanted simpler solutions and pushed PSP metrics aside.

Box 9.2 Medical Analogy: MRI Scan and Pulse Reading

A doctor asks for an MRI scan of the head to treat a headache. He wants to eliminate a possible cause: clots in the brain. The MRI scan is a costly and elaborate procedure. The doctor is going through a causal analysis. The scan is normal, and the doctor pursues further causes of the headache, with the cost of the trial and error paid by the patient. In the end, the patient is not cured of his headache.

The patient decides to take naturopathy treatment. The skilled doctor reads his pulse, understands the problem, and suggests physiotherapy exercises and yoga for a month. The patient is cured.

Agile metrics are similar to reading the pulse. Data are cheap, simple, and quick; the solution is self-healing. The pulse reading reveals more to a trained doctor than the MRI scan.
Automated metrics, which depend on tools, survived; people were asked only to use the data, not to collect them. That by itself made a huge difference. The drawback is that nobody is motivated to use data collected by tools. A seeming gain became a loss. Such data did not connect well with people except in the form of reports and business intelligence, whose intended recipients were senior managers and customers. Designers and testers did not care.
Simple core metrics were good, and they worked. Fancy special metrics did not. Metrics planned and metrics used were worlds apart. At least metrics helped in goal setting. Process control loops using metrics were too circuitous and operated with such time lags that, by the time reports arrived, projects were finished. Even the core metrics did not reach teams in time. The feedback loop disappeared on the way. Care was taken to install a metric, but no one thought about human communication.
Without communication links with people, metrics fail.
Metrics for Agile: Humanization
Agile projects have put people before processes and humanized metrics. Agile metrics instantly reach people through displayed charts and daily meetings. The metrics themselves are kept light, user-friendly, and direct. Code complexity is judged as high, medium, or low instead of being expressed as the sophisticated McCabe complexity number. Software sizes come in story points; one does not have to count function points using the time-consuming rules of the International Function Point Users Group. More than that, metrics are readily used. Where possible, graphical elements have become the signifiers instead of numerical indicators. The burn-down chart has replaced several metrics at once. Agile metrics influence day-to-day decision making.
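To make the burn-down idea concrete, here is a minimal sketch, not taken from the book, of the data behind a burn-down chart; the sprint length and the story points completed each day are invented purely for illustration.

# Burn-down sketch: remaining story points at the end of each day of a sprint.
# Sprint length and daily completions below are assumed sample values.
sprint_days = 10
committed_points = 40
completed_per_day = [0, 5, 3, 0, 8, 5, 2, 6, 5, 6]  # hypothetical daily progress

remaining = committed_points
for day, done in enumerate(completed_per_day, start=1):
    remaining -= done
    ideal = committed_points * (1 - day / sprint_days)  # straight-line "ideal" burn-down
    print(f"Day {day:2d}: remaining = {remaining:2d}, ideal = {ideal:4.1f}")

Plotting the remaining column against the days gives the familiar burn-down curve; the gap between it and the ideal line is what the team reads at a glance.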
The Price of Humanization
Agile metric data are mostly on the ordinal scale (as in high, medium, or low judgments). Ordinal data can be collected quickly and without hassle. Ordinal data also carry a degree of calculated approximation. The contrast is between precise but little-used classic metric data and approximate but humanized agile data. Working on a lower scale of measurement entails information loss, a price we pay for humanization. However, that loss does not make agile metrics less scientific than classic metrics, provided users are aware of the degree of approximation.
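As a small illustration of that information loss (a sketch, not from the book; the cut-off values are assumptions chosen only for the example), binning a numeric complexity measure into an ordinal scale makes different numbers indistinguishable:

# Information loss when numeric data are reduced to an ordinal scale.
# The thresholds below are arbitrary assumptions for illustration.
def complexity_category(mccabe_number: int) -> str:
    """Map a cyclomatic complexity number onto an ordinal low/medium/high scale."""
    if mccabe_number <= 10:
        return "low"
    if mccabe_number <= 20:
        return "medium"
    return "high"

# 11 and 19 differ on the numeric scale but land in the same ordinal bin.
for value in (4, 11, 19, 35):
    print(value, "->", complexity_category(value))

As long as users remember that "medium" hides this spread, the ordinal judgment remains perfectly usable.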
The aim of science is not to open the door to infinite wisdom,
but to seta limit to infinite error.
Bertolt Brecht
Life of Galileo
Sometimes a clearly defined error is the only way to discover
the truth.
Benjamin Wiker
The Mystery of the Periodic Table
Common Agile Metrics
Velocity
Agile velocity is story points delivered per sprint. This is the rate at which the team delivers tested features. To get the best out of this metric, sprints (or iterations) must be of consistent length.
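As a minimal sketch (the numbers are invented for illustration), velocity is simply the story points accepted in a sprint, and teams often average it over a few recent sprints of equal length to forecast capacity:

# Velocity sketch: story points of accepted stories per sprint (assumed sample data).
completed_points_per_sprint = [21, 18, 24, 20]  # recent sprints of equal length

velocity_latest = completed_points_per_sprint[-1]
velocity_average = sum(completed_points_per_sprint) / len(completed_points_per_sprint)

print("Latest sprint velocity:", velocity_latest)        # 20 story points
print("Average velocity:", round(velocity_average, 1))   # 20.8 story points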
Story Point
This is an agile metric of software size, the agile counterpart of the classic function point. The story point is a numerical expression of textual size categories: very small, small, medium, large, very large, and so on. Because human judgment works on a nonlinear scale, we prefer a nonlinear order of numbers: 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, ..., the Fibonacci series. On the Fibonacci scale, the following conversion table can be used to express software size.
Software Size        Story Point
Very low             1
Low                  2
Medium               3
Large                5
Very large           8
Extremely large      13
Next level           21
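A minimal code sketch of this mapping, mirroring the table above (the backlog items are invented examples; nothing beyond the table is implied):

# Story point scale: ordinal size categories mapped to Fibonacci numbers (from the table above).
STORY_POINTS = {
    "very low": 1,
    "low": 2,
    "medium": 3,
    "large": 5,
    "very large": 8,
    "extremely large": 13,
    "next level": 21,
}

backlog = ["medium", "large", "very low", "extremely large"]  # hypothetical stories
print("Total size:", sum(STORY_POINTS[size] for size in backlog), "story points")  # 22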