How Evidence Turns Up in Software Engineering

In crime dramas, the most interesting moments are often when a new piece of evidence appears: someone presents a fact that was not previously known, and the clever detective revises his theory of the crime. Sometimes the new evidence adds an important dimension to our world view (“Oh, so he had intended all these things to happen!”), sometimes it just adds a little more conviction to a belief we already had (“She’s really not as careful as she claims”), and sometimes it overturns something we were sure was reliable (“Golly! And I always thought he had arrived there an hour after her!”). The drama hinges on the critical thinking ability of the detective. Superior fictional detectives take into account every single bit of evidence, they work until they can integrate it all into a coherent whole, and they continually test their theory of the crime and revise it in the light of new evidence, whereas weak detectives get snagged on confirmatory bias, holding fast to theories even when they can’t account for “loose ends” of evidence.

Software development is similar: evidence emerges over time, and the quality of the engineering hinges on the critical faculties of the engineer. If we succumb to confirmatory bias, we pay most attention to evidence that confirms our views. If we’re more critically vigilant, we sometimes find that new information suggests we have overlooked some aspect that has to be added to the picture, or we even find that some dearly held belief is contradicted by reality.

Where does such relevant new information come from? It comes from a variety of sources:

Experience

Software engineers use technology and methods, participate in projects, and learn things along the way.

Other people

Software engineers talk to people they know, and they listen to recommendations from people they respect.

Reflection

Effective software engineers think hard about what they do, about what works, what doesn’t, and why.

Reading

Written materials, both informal (such as high-quality blog posts) and formal (such as scientific articles), convey insights from other parties.

Scientific (or quasi-scientific) exploration

Software engineers conduct software experiments, undertake user studies, and make systematic comparisons of alternatives.

Not everyone is satisfied that software engineering has an adequate mastery of evidence. There have been a number of calls for evidence-based software engineering (e.g., [Finkelstein 2003], [Kitchenham et al. 2004]), that is, for providing skills, habits, and infrastructure for integrating empirical research results in order to support decision making in software engineering. The comparison is usually made to evidence-based medicine and its reliance on rigorous clinical evidence. The point is not so much that software engineers ignore evidence (although arguably opinion- and superstition-based software engineering exist as well), but that the discipline lacks an organizational, cultural, and technical infrastructure to support the accumulation and aggregation of knowledge from a number of sources. Barbara Kitchenham presents this movement in Chapter 3.

It’s also clear that there are a number of fine introspective and reflective accounts of engineering that inform software engineering practice. Writers such as Fred Brooks and Walter Vincenti bring practice into focus by distilling experience and by using illustrations from practice and experience to consider the nature of engineering activity. We would hardly reject their accounts just because they based their essays on reflective experience rather than on scientific experiment, although we may think hard about their interpretations, whether we agree with them, whether they fit our own contexts, and whether they match our own experience—that is, whether they are consistent with other evidence with which we’re familiar.

But regardless of whether we resort to scientific studies (on which we will focus in the rest of this chapter) or rely on reports of enlightened practitioners, we argue that, as software engineers, we need a sufficiently sophisticated understanding of evidence to inform our decisions and to assess our evidence needs in terms of the questions we want to answer. It’s not that we need more metrics per se, or a widespread application of scientific method, or a rigidly defined methodology and signed-off lab books. Imposing processes or methods does not by itself produce quality and certainly does not guarantee it. Rather, such proposals are usually an attempt to approximate the real goal: systematic, well-informed, evidence-based critical thinking.

The meat of this discussion is the business of “thinking hard” about evidence: how we evaluate evidence in terms of its credibility and fitness for purpose.
