The Tools at Our Disposal

Clearly, there are many metrics and KPIs to gather in order to understand and improve your online presence. Fortunately, there’s a wide range of tools available for watching yourself online. The trick is to use the right technologies to collect the metrics that matter the most to your business.

At the broadest level, there are three categories of monitoring technology you can use to understand web activity. You can collect data on what users do at various points along the connection between their browsers and your servers; you can use search engines that crawl and index the web and can alert you when keywords appear or pages change; and you can run scripts that test your site directly.

Collection Tools

There are many ways to collect visitor information, depending on how much access you have to your servers, the features of those visitors’ browsers, and the kind of data you wish to collect.

Collection can happen on your own machines, on intermediate devices that collect a copy of traffic, through a browser toolbar, or on the visitor’s browser. It may also happen with third-party services like FeedBurner (for RSS feeds) or Mashery (for APIs and web services) that proxy your content or manage your APIs.

The volume of data collected in these ways grows proportionally with traffic. Collecting data may slow down your servers, and may also pose privacy risks. You also need to consider what various collection methods can see, since they all have different perspectives. An inline sniffer can’t see client-side page load time, for example; similarly, client-side JavaScript can’t see server errors.
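As a concrete illustration of server-side collection, here is a minimal sketch in Python that tallies page requests and status classes from a web server access log. The filename access.log and the combined log format are assumptions rather than any particular product's output, and, as noted above, this collection point cannot see client-side timings.

import re
from collections import Counter

# A typical combined-format log line looks like:
# 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0 ..."
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def summarize(log_path="access.log"):
    """Count requests per page and per status class from a server log."""
    pages, statuses = Counter(), Counter()
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if not match:
                continue  # skip lines that don't fit the expected format
            pages[match.group("path")] += 1
            statuses[match.group("status")[0] + "xx"] += 1  # 2xx, 4xx, 5xx

    return pages, statuses

if __name__ == "__main__":
    pages, statuses = summarize()
    print("Top pages:", pages.most_common(5))
    print("Status classes:", dict(statuses))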

Search Systems

Another way to monitor your online presence is through search engines, which run automated scripts called crawlers that visit web pages, follow the links they find, and collect and index the data on those pages.

There are hundreds of these crawlers on the Web. Some of them feed search giants like Google, Yahoo!, and MSN. Others are more specialized, looking for security problems, copyright violations, plagiarism, or contact information, or archiving content for later reference.
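To make the mechanics concrete, here is a minimal crawler sketch that uses only Python's standard library. The starting URL is a placeholder; a production crawler would also need politeness delays, per-host limits, and robots.txt handling (a robots.txt check is sketched below).

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Visit pages breadth-first, following links, and build a crude index."""
    queue, seen, index = [start_url], set(), {}
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page; skip it
        index[url] = len(html.split())  # crude "index": word count per page
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(urljoin(url, link) for link in parser.links)
    return index

# Example (placeholder URL):
# print(crawl("https://example.com"))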

Crawlers can’t index the entire Web. Many websites are closed to them because they require a login, because their content is generated dynamically, or because they explicitly block crawlers from indexing some of their content, which is common for news sites and blog comment threads. As a result, you need to pair site-specific internal search tools with global search engines to get complete coverage of web activity.
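One common way sites block crawlers is a robots.txt file, which well-behaved crawlers consult before fetching a page. The sketch below uses Python's built-in robotparser; the URL and user-agent string are illustrative placeholders.

from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser

def allowed_to_crawl(page_url, user_agent="MyMonitoringBot"):
    """Return True if the site's robots.txt permits fetching this page."""
    parts = urlsplit(page_url)
    robots = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()  # fetch and parse the site's robots.txt
    return robots.can_fetch(user_agent, page_url)

# Example (placeholder URL):
# print(allowed_to_crawl("https://example.com/private/report.html"))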

While we can run searches by hand to see what’s happening online, a more practical way to manage many queries is to set up alerts that fire when certain keywords appear or when specific pages change.
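A simple alert of this kind can be built by periodically fetching a page, hashing its contents, and flagging any change or the appearance of a watched keyword. The sketch below uses only the Python standard library; the URL, keyword, and polling interval are illustrative assumptions.

import hashlib
import time
from urllib.request import urlopen

def watch(url, keyword, interval_seconds=3600):
    """Poll a page; report when its content changes or a keyword appears."""
    last_digest = None
    while True:
        page = urlopen(url, timeout=10).read()
        digest = hashlib.sha256(page).hexdigest()
        if last_digest is not None and digest != last_digest:
            print(f"ALERT: {url} has changed")
        if keyword.lower() in page.decode("utf-8", "replace").lower():
            print(f"ALERT: keyword '{keyword}' found on {url}")
        last_digest = digest
        time.sleep(interval_seconds)

# Example (placeholder URL and keyword):
# watch("https://example.com/pressroom", "acquisition")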

Testing Services

In addition to collecting visitor data and setting up searches and alerts, you can run tests against websites to measure their health or check their content. These tests can simulate specific browsers or run from several geographic regions to pinpoint problems with a specific technology or location.
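As a minimal illustration of a roll-your-own test, the sketch below fetches a page, times the response, and checks the status code and an expected string. The URL, expected text, and user-agent are assumptions; a commercial testing network would run checks like this from many locations and browser types.

import time
from urllib.request import Request, urlopen

def health_check(url, expected_text=None, timeout=10):
    """Fetch a page, timing the request and verifying status and content."""
    request = Request(url, headers={"User-Agent": "SyntheticMonitor/0.1"})
    start = time.monotonic()
    try:
        response = urlopen(request, timeout=timeout)
        body = response.read().decode("utf-8", "replace")
        elapsed = time.monotonic() - start
        ok = response.status == 200 and (expected_text is None or expected_text in body)
        return {"url": url, "ok": ok, "status": response.status, "seconds": round(elapsed, 3)}
    except OSError as error:
        return {"url": url, "ok": False, "error": str(error)}

# Example (placeholder URL and expected content):
# print(health_check("https://example.com/checkout", expected_text="Add to cart"))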

Testing services can also watch your competitors or monitor third-party services, such as payment or mapping providers, on which your main application depends. There’s a wide range of such services available, from open source, roll-your-own scripts to global testing networks that can verify a site’s health from almost anywhere and run complex, multistep transactions.

Ultimately, the many metrics we’ve looked at above, gathered through the collection, search, and testing approaches outlined here, add up to complete web visibility. That visibility boils down to four big questions, which we’ll look at next.
