Tip 1: Beat Up Your Code
White Belt: As soon as you write production code, you need to prove it can take a beating.

You might think that writing solid code is an obvious job requirement. It’s not like the job post said “Wanted: programmer with good attitude, team player, foosball skills. Optional: writes solid code.” Yet so many programs have bugs. What gives?

Before we get into detailed discussions of day-to-day practices for assuring code quality, let’s discuss what it means to write solid code. It’s not just a list of practices; it’s a mind-set. You must beat up your code, and the product as a whole, before it goes out to customers.

The customer, after all, will beat up your product. They’ll use it in ways you don’t anticipate. They’ll use it for extended periods of time. They’ll use it in environments you didn’t test in. The question you must consider is this: how many bugs do you want your customer to find?

The more you beat up your code right now, before it gets into customers’ hands, the more bugs you’ll flush out, and the fewer you’ll leave for the customer.

Forms of Quality Assurance

Although much of this chapter focuses on code-level quality and unit testing, assuring product quality is a much larger topic. Let’s consider what your product will need to endure.

Code Review

The first obvious, simple way to assure code quality is to have another programmer read it. It doesn’t need to be a fancy review, either—even pair programming is a form of real-time code review. Teams use code reviews to catch bugs, enforce coding style and standards, and spread knowledge among team members. We’ll discuss code reviews in Tip 8, Review Code Early and Often.

Unit Tests

As you’re building the business logic of your application, class by class and method by method, there’s no better way to verify your code than with unit tests. These innards-level tests are designed to verify bits of logic in isolation. We’ll discuss them in Tip 2, Insist on Correctness and Tip 3, Design with Tests.
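As a sketch of what an innards-level test looks like, here’s a hypothetical bit of business logic with unit tests (the function, class, and values are made up for illustration):

```python
import unittest

# Hypothetical bit of business logic under test.
def apply_discount(price, percent):
    """Return price reduced by percent, rejecting nonsense inputs."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Each test verifies one bit of logic in isolation,
    # away from the rest of the application.
    def test_normal_discount(self):
        self.assertEqual(apply_discount(100.00, 25), 75.00)

    def test_zero_discount_leaves_price_alone(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

    def test_rejects_impossible_discount(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)
```

Run with `python -m unittest <file>`; each test either passes quietly or pinpoints exactly which bit of logic broke.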

Acceptance Tests

Where unit tests view the product from the inside out, acceptance tests are designed to simulate real-world users as they interact with the system. Ideally, they are automated and written as a narrative of sorts. For example, an automated bank teller application could have an acceptance story like this: given that I have $0 in my checking account, when I go to the ATM and select “Withdrawal” from “Checking Account,” then I should see “Sorry, you’re eating Ramen for dinner tonight.”

Shakespeare it is not, but these tests exercise the whole system from the user interface down to business logic. Whether they’re automated or performed by people, your company needs to know—before any customers play with it—that all system components are cooperating like they should.
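The withdrawal story could be automated roughly like this (a Python sketch; `AtmApp` and its methods are hypothetical stand-ins—a real acceptance test would drive the actual user interface rather than an in-memory object):

```python
# Hypothetical in-memory stand-in for the ATM system.
class AtmApp:
    def __init__(self, checking_balance):
        self.checking_balance = checking_balance

    def withdraw_from_checking(self, amount):
        if amount > self.checking_balance:
            return "Sorry, you're eating Ramen for dinner tonight."
        self.checking_balance -= amount
        return f"Dispensing ${amount}"

def test_overdraw_shows_ramen_warning():
    # Given I have $0 in my checking account,
    atm = AtmApp(checking_balance=0)
    # when I select "Withdrawal" from "Checking Account,"
    message = atm.withdraw_from_checking(20)
    # then I should see the sorry-no-money message.
    assert message == "Sorry, you're eating Ramen for dinner tonight."
```

Notice how the test body reads as the same given/when/then narrative as the story itself.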

Load Testing

Load tests put the product under realistic stress and measure its responsiveness. A website, for example, may need to render a given page in 100 milliseconds when there are a million records in the database. These tests will uncover correct-but-bad behavior, such as code that scales exponentially when it needs to scale linearly.
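To make the scaling point concrete, here’s a sketch (hypothetical Python, counting operations rather than wall-clock time) of a correct duplicate check whose work grows quadratically with input size, next to a linear version:

```python
# Correct but badly scaling: compares every pair of items,
# so work grows quadratically with input size.
def has_duplicates_quadratic(items):
    comparisons = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            comparisons += 1
            if items[i] == items[j]:
                return True, comparisons
    return False, comparisons

# Same answer, linear work: one set lookup per item.
def has_duplicates_linear(items):
    seen = set()
    checks = 0
    for item in items:
        checks += 1
        if item in seen:
            return True, checks
        seen.add(item)
    return False, checks
```

With a thousand unique records, the pairwise version does 499,500 comparisons to the linear version’s 1,000. Both return the right answer, which is exactly why only a load test—not a functional test—will flag the problem before the database holds a million records.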

Directed Exploratory Testing

Acceptance tests cover all of the product’s behavior that was specified, perhaps via a product requirements document or meetings. Yet programmers can usually think of ways to break it—there are always dark corners that the specification overlooks. Directed exploratory testing ferrets out those corner cases.

This testing is often performed by humans, perhaps the programmers themselves, to explore and discover problems. Past the initial exploration, however, any useful tests are added to the acceptance test suite.

There are specialized variations on this theme, such as a security audit. In those cases, a specialized tester uses their domain expertise (and perhaps code review) to direct their testing.

Agency Testing

Hardware products need various agency certifications: the FCC measures electromagnetic emissions to ensure the product doesn’t create radio interference; Underwriters Laboratories (UL) looks at what happens when you set the product on fire or lick its battery terminals. These tests are run before a new product is launched and any time a hardware change could affect the certification.

Environmental Testing

Hardware products also need to be pushed to extremes in operating temperature and humidity. These are tested with an environmental chamber that controls both factors; it goes to each of the four extremes while the product is operating inside.

Compatibility Testing

When products need to interoperate with other products—for example, a word processing program needs to exchange documents with other word processors—these compatibility claims need to be verified on a regular basis. They may run against a corpus of saved documents or in real time with your product connected to other products.

Longevity Testing

You’ll notice that most of the tests mentioned here are run as often and as quickly as possible. Some bugs, however, show up only after extended use. Our 49.7-day bug is a good example—that comes from a 32-bit counter that increments every millisecond, and after 49.7 days it rolls over from its maximum value back to zero.[2] You won’t be able to find a bug like that unless you run tests for extended durations.
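The arithmetic behind that rollover is easy to sketch (a minimal Python illustration, not the actual counter code):

```python
# A 32-bit unsigned counter that increments once per millisecond
# wraps back to zero after 2**32 milliseconds.
MS_PER_DAY = 1000 * 60 * 60 * 24

def tick(counter_ms):
    """Advance a 32-bit millisecond counter by one, wrapping at 2**32."""
    return (counter_ms + 1) % 2**32

# How long until the counter rolls over?
print(f"{2**32 / MS_PER_DAY:.1f} days")  # 49.7 days

# Elapsed-time math that naively subtracts raw counter values
# breaks at the wrap: "now - then" goes hugely negative.
then = 2**32 - 1   # just before rollover
now = tick(then)   # counter wraps to 0
print(now - then)  # -4294967295, not the expected 1
```

Run your tests for a few minutes and this code looks flawless; only a longevity test (or a very unlucky customer) hits the wrap.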

Beta Test

Here’s where the product goes out to real customers—but they’re customers who know what they’re getting into, and they’ve agreed to submit reports if they find problems. The purpose of a beta test is exactly what we discussed at the beginning of this tip: the beta tester will use the product in ways you don’t anticipate, test it for extended periods of time, and test it in environments you didn’t test in.

Ongoing Testing

Your company may continue to test after a product ships. For hardware products in particular, it’s useful to pull a unit off the manufacturing line once in a while and verify that it works. These ongoing tests are designed to capture problems due to variations in parts or assembly process.

Practices vs. Mind-Set

Your team may have practices like “all code must have unit tests” or “all code must be reviewed before checking in.” But none of these practices will guarantee rock-solid code. Think about what you’d do if there were zero quality practices at your company—how would you beat up your code to make sure it’s solid?

This is the mind-set you need to establish before going further. Commit to solid code. The quality practices are just a means to an end—the ultimate judge will be the product’s reliability in the hands of your customers. Do you want to have your name associated with a product that hit the market as a buggy piece of junk? No, of course not.

Actions

  • Which of the forms of testing mentioned earlier does your company use? Find the unit tests in the source code, ask the test department for the acceptance test plan, and ask how beta tests are done and where that feedback goes. Also ask a senior engineer’s opinion: is this enough to ensure a smooth experience for the customer?

  • Spend some time doing directed exploratory testing, even if your “direction” is somewhat vague. Really use the product to see whether you can break it. If you can, file bug reports accordingly.
