Inside Microsoft: Managing Quality for the Microsoft.com Web Analytics Platform

Microsoft.com is a large, public-facing Web site managed by Microsoft. In addition to serving as the company's corporate presence, it showcases Microsoft's many products, ranging from Xbox and Zune to Windows and Office. As of September 2008, comScore reported that the sites under the Microsoft.com umbrella received around 124 million page views that month, an indication of the site's popularity.

Because the sites within the Microsoft.com umbrella are largely product information pages targeted toward consumers of Microsoft products, measuring the success and reach of those pages is critical to understanding the site's effectiveness. This requires an investment in a business intelligence framework and a data analysis toolset that can collect and aggregate data about the usage of individual pages within the Microsoft.com sites. This is primarily the role of one of the Microsoft.com platform feature teams, which builds the logic, infrastructure, and tools for collecting, analyzing, and reporting Web site business analytics data. In addition to the Microsoft.com product-focused sites, the team is also responsible for collecting and aggregating data for such popular sites as MSDN, CodePlex, and Windows Update.

The Importance of Code Quality

For most platform teams at Microsoft, numerous other teams depend on the infrastructure, code, or services that the platform team delivers. The quality of those deliverables is expected to be high, and partner teams also expect platform components to cover the great majority of their usage needs. It is therefore incumbent upon platform teams like the Microsoft.com Customer Intelligence team to ensure that their partners receive features and services that both address their needs and are of the highest quality. Given the scale and importance of the Microsoft.com Customer Intelligence partner teams, this can be a challenging goal to achieve. Fortunately, the team has invested heavily in building strong, automated testing processes that help it produce higher-quality deliverables.

The Test Investment

As an avid user and proponent of the Visual Studio Team Foundation (VSTF) and Visual Studio Team Suite (VSTS) products, the Microsoft.com Web Analytics team has built many of its processes, workflows, and quality metrics from the templates provided by VSTF. The team has been using VSTF since the early beta versions and believes strongly in the structure and value the tool provides by integrating workflow, project tracking, coding, testing, and reporting into one suite of tools. Where the tool did not address an immediate need, the team used the built-in extensibility model to extend the functionality. Overall, the team has invested a great deal in managing its end-to-end project tracking through VSTF, especially for its test work. Let's review some of the tactics the team has applied.

  • Automate as much as possible. The team generally believes that investments in test automation, wherever possible, pay long-term dividends. Although TFS does not provide automated processes specific to the team's needs out of the box, the team has extended its capabilities using the extensibility tools that are currently available and has increased its productivity as a result. The team recommends that other teams explore the extensibility options, including the Visual Studio Team Foundation Power Tools, and automate their processes wherever possible. (A sketch of driving TFS through its client object model appears after this list.)

  • Use static code analysis. Static code analysis is a great way to quickly evaluate code, before it gets checked in, for compliance with predefined best practices or custom rules. The Microsoft.com Customer Intelligence team currently leverages both predefined and custom rules in its static code analysis practices, and it has invested in developing the custom rules to ensure that certain design principles are applied consistently across the code base. The team recommends that other teams incorporate static code analysis rules into their processes, especially as part of check-in procedures. (A sketch of a custom analysis rule appears after this list.)

  • Write automated tests and build test suites. In addition to using static code analysis, the team invests in building out a corpus of automated unit and functional tests. Developers on the team deliver unit tests as part of their feature work; these tests focus narrowly on the functionality of the specific feature code. Testers, by contrast, author functional tests that often cover broader scenarios. As automated tests are created, VSTF is used to create test lists, or suites of test cases, that can be applied at different phases of application testing. (A simple unit test sketch appears after this list.)

  • Establish check-in policies. Creating check-in policies allows application development teams to enforce a certain level of quality in the code being checked into source control. This keeps certain classes of bugs out of the source repository, reducing the total volume of bugs likely to be found at the end of the development cycle. The team firmly believes that establishing check-in policies and enforcing them through VSTF gives it increased control over the quality of code being checked in. Through these processes, the team gains confidence in each check-in because it can verify that certain test cases have been executed, static code analysis has been run, and code reviews have been conducted. (A sketch of a custom check-in policy appears after this list.)

  • Automate build and test processes. The team automated its application builds and Build Verification Testing (BVT) processes using Visual Studio Team Foundation Build. This allows the team to ensure that each automated build benefits from a BVT pass. These processes retrieve application code from source control, copy it to the appropriate lab server, execute the build, run the appropriate test cases, and file bugs when builds fail. Additionally, the team extended the existing functionality of TFS so that it can correlate test case results to each specific build. (A sketch of queueing a build through the Team Foundation Build API appears after this list.)
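
One way to automate around TFS is through its client object model. The following sketch, written against the TFS 2008-era client assemblies, pulls the list of active bugs for a team project; the server URL and project name are hypothetical placeholders.

    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    // A minimal sketch: query active bugs through the TFS client object
    // model. The server URL and team project name are hypothetical.
    class ActiveBugReport
    {
        static void Main()
        {
            TeamFoundationServer tfs =
                TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
            WorkItemStore store =
                (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

            // WIQL query for active bugs in the (hypothetical) project.
            WorkItemCollection bugs = store.Query(
                "SELECT [System.Id], [System.Title] FROM WorkItems " +
                "WHERE [System.TeamProject] = 'WebAnalytics' " +
                "AND [System.WorkItemType] = 'Bug' " +
                "AND [System.State] = 'Active'");

            foreach (WorkItem bug in bugs)
                Console.WriteLine("{0}: {1}", bug.Id, bug.Title);
        }
    }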
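
Custom static analysis rules of the kind described above are typically built on the FxCop SDK. The sketch below flags public fields; the rule and resource names are hypothetical, and a real rule also needs its metadata embedded as an XML resource in the rule assembly.

    using Microsoft.FxCop.Sdk;

    // A sketch of a custom FxCop rule. "MyRules.DesignRules" is the
    // (hypothetical) embedded resource that holds the rule metadata.
    public class AvoidPublicFields : BaseIntrospectionRule
    {
        public AvoidPublicFields()
            : base("AvoidPublicFields", "MyRules.DesignRules",
                   typeof(AvoidPublicFields).Assembly)
        {
        }

        public override ProblemCollection Check(Member member)
        {
            Field field = member as Field;
            if (field != null && field.IsPublic)
            {
                // Flag public fields; callers should use properties instead.
                Problems.Add(new Problem(GetResolution(field.FullName)));
            }
            return Problems;
        }
    }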
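
As a concrete illustration of a developer-authored unit test, the sketch below exercises a hypothetical page-view aggregation class using the Visual Studio unit testing framework; tests like this can be grouped into VSTF test lists and run at check-in or during BVTs.

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical component under test: sums page views per URL.
    public class PageViewAggregator
    {
        private readonly Dictionary<string, int> views =
            new Dictionary<string, int>();

        public void Record(string url, int count)
        {
            int current;
            views.TryGetValue(url, out current);
            views[url] = current + count;
        }

        public int GetViews(string url)
        {
            int current;
            views.TryGetValue(url, out current);
            return current;
        }
    }

    [TestClass]
    public class PageViewAggregatorTests
    {
        [TestMethod]
        public void Record_SumsViewsAcrossEntries()
        {
            PageViewAggregator aggregator = new PageViewAggregator();
            aggregator.Record("/windows", 3);
            aggregator.Record("/windows", 2);

            Assert.AreEqual(5, aggregator.GetViews("/windows"));
        }
    }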
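
Custom check-in policies plug into Team Foundation version control by deriving from PolicyBase. The sketch below, modeled on the common require-a-comment example, rejects check-ins that have no comment; a real policy must also be registered on each client machine so that Team Explorer can load it.

    using System;
    using Microsoft.TeamFoundation.VersionControl.Client;

    // A sketch of a custom check-in policy that requires a comment.
    [Serializable]
    public class RequireCommentPolicy : PolicyBase
    {
        public override string Description
        {
            get { return "Requires a comment on every check-in."; }
        }

        public override string Type
        {
            get { return "Require Check-in Comment"; }
        }

        public override string TypeDescription
        {
            get { return "Blocks check-ins that have no comment."; }
        }

        // This simple policy has no configuration UI.
        public override bool Edit(IPolicyEditArgs policyEditArgs)
        {
            return true;
        }

        public override PolicyFailure[] Evaluate()
        {
            string comment = PendingCheckin.PendingChanges.Comment;
            if (String.IsNullOrEmpty(comment))
            {
                return new PolicyFailure[]
                {
                    new PolicyFailure("Please provide a check-in comment.", this)
                };
            }
            return new PolicyFailure[0];
        }
    }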
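
Finally, builds can themselves be driven programmatically. The sketch below queues a build definition through the Team Foundation Build client API; the server URL, team project, and definition name are hypothetical, and the BVT test list itself would be configured in the build definition rather than in this code.

    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.Build.Client;

    // A sketch: queue a (hypothetical) build definition so that the
    // associated BVT pass runs as part of the automated build.
    class QueueNightlyBuild
    {
        static void Main()
        {
            TeamFoundationServer tfs =
                TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
            IBuildServer buildServer =
                (IBuildServer)tfs.GetService(typeof(IBuildServer));

            IBuildDefinition definition =
                buildServer.GetBuildDefinition("WebAnalytics", "Nightly");
            IQueuedBuild queued = buildServer.QueueBuild(definition);

            Console.WriteLine("Queued build " + queued.Id);
        }
    }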

Managing Quality

In addition to the process and automation investments that the Microsoft.com Customer Intelligence team has incorporated within its engineering procedures, the team has also focused on integrating the appropriate quality metrics. The team primarily utilizes these metrics to understand test progress and manage the overall effectiveness of its efforts. Many of the metrics and reports it currently incorporates are available within the reporting capabilities of VSTF. Others have been custom built using tools like Microsoft Excel to query the data warehouse that VSTF makes available through its data analysis tier. Let’s review some examples of the metrics that the team finds most valuable.

  • Code coverage metrics. The team uses the code coverage metrics within VSTF to understand the relative effectiveness of its testing efforts. It studies the combination of the code coverage results for blocks and cyclomatic complexity to better understand how much of the code is being covered by the automated testing. Achieving 100 percent coverage is generally unrealistic, so the team strives to achieve between 70 and 80 percent coverage and assumes that the remaining code will either be covered by manual testing or represent edge case scenarios that are lower priority.

  • Code churn metrics. To understand the quality of the code being checked in each day and appearing in the daily builds, the team relies on code churn metrics. This allows the team to understand the rate of change being introduced into the code each day and therefore be more predictive about the volatility or stability of the build. In an agile development process, code churn metrics help the team balance test priorities each day and manage its test efforts more effectively.

  • Build a test scorecard. To understand the overall quality of the application code, the team constructed a composite scorecard built from multiple metrics. The team created Excel spreadsheets that connect to the TFS data warehouse and execute custom queries, and it uses the resulting scorecards to quickly evaluate overall progress for the current release cycle by running one composite report rather than several. (A sketch of the kind of warehouse query behind such a scorecard follows this list.)
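
The same warehouse that backs the Excel scorecards can also be queried directly. The sketch below pulls daily code-churn totals over ADO.NET; the connection string, view name, and column names are assumptions for illustration, since the actual warehouse schema varies by TFS version.

    using System;
    using System.Data.SqlClient;

    // A sketch: query daily code churn from the TFS relational warehouse.
    // The view and column names below are hypothetical.
    class ChurnScorecard
    {
        static void Main()
        {
            using (SqlConnection conn = new SqlConnection(
                "Data Source=tfsserver;Initial Catalog=TfsWarehouse;" +
                "Integrated Security=SSPI"))
            {
                conn.Open();
                SqlCommand cmd = new SqlCommand(
                    "SELECT [Date], SUM(LinesAdded + LinesDeleted) AS Churn " +
                    "FROM CodeChurnView GROUP BY [Date] ORDER BY [Date]",
                    conn);
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0:d}: {1}",
                            reader.GetDateTime(0), reader[1]);
                }
            }
        }
    }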
