Chapter 1. Introduction to pandas and Data Analysis

In this chapter, we address the following:

  • Motivation for data analysis
  • How Python and pandas can be used for data analysis
  • Description of the pandas library
  • Benefits of using pandas

Motivation for data analysis

In this section, we will discuss the trends that are making data analysis an increasingly important field of endeavor in today's fast-moving technological landscape.

We live in a big data world

The term big data has become one of the hottest technology buzzwords in recent years. We now hear about big data constantly in various media outlets, and big data startups are increasingly attracting venture capital. A good example in the area of retail would be Target Corporation, which has invested substantially in big data and is now able to identify potential customers by analyzing people's shopping habits; refer to a related article at http://nyti.ms/19LT8ic.

Loosely speaking, big data refers to the phenomenon wherein the amount of data exceeds the capability of the recipients of the data to process it. Here is a Wikipedia entry on big data that sums it up nicely: http://en.wikipedia.org/wiki/Big_data.

4 V's of big data

A good way to start thinking about the complexities of big data is along what are called the 4 dimensions, or 4 V's, of big data. This model was first introduced as the 3 V's by Gartner analyst Doug Laney in 2001. The 3 V's stood for Volume, Velocity, and Variety, and the 4th V, Veracity, was added later by IBM. Gartner's official definition is as follows:

 

"Big data is high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization."

 
 --Laney, Douglas. "The Importance of 'Big Data': A Definition", Gartner

Volume of big data

The volume of data in the big data age is simply mind-boggling. According to IBM, the total amount of data on the planet is projected to balloon to 40 zettabytes by 2020. You heard that right: 40 zettabytes is roughly 43 trillion gigabytes, or about 4 × 10^22 bytes. For more information on this, refer to the Wikipedia page on the zettabyte at http://en.wikipedia.org/wiki/Zettabyte.
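As a quick, purely illustrative sanity check of these numbers, the few lines of Python below convert the 40 zettabyte projection into bytes and gigabytes using decimal (SI) prefixes; published figures vary slightly depending on whether decimal or binary prefixes are assumed:

    # Purely illustrative conversion of IBM's projected 40 zettabytes,
    # using decimal (SI) prefixes: 1 GB = 10**9 bytes, 1 ZB = 10**21 bytes.
    ZETTABYTE_IN_BYTES = 10**21
    GIGABYTE_IN_BYTES = 10**9

    projected_zb = 40
    projected_bytes = projected_zb * ZETTABYTE_IN_BYTES
    projected_gb = projected_bytes / GIGABYTE_IN_BYTES

    print(f"{projected_bytes:.1e} bytes")      # 4.0e+22 bytes
    print(f"{projected_gb:.1e} gigabytes")     # 4.0e+13 GB, that is, tens of trillions of gigabytes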

To get a handle on how much data this is, consider an EMC press release published in 2010, which described what 1 zettabyte is approximately equal to:

 

"The digital information created by every man, woman and child on Earth 'Tweeting' continuously for 100 years " or "75 billion fully-loaded 16 GB Apple iPads, which would fill the entire area of Wembley Stadium to the brim 41 times, the Mont Blanc Tunnel 84 times, CERN's Large Hadron Collider tunnel 151 times, Beijing National Stadium 15.5 times or the Taipei 101 Tower 23 times..."

 
 --EMC study projects 45× data growth by 2020

The growth rate of data has been fuelled largely by a few factors, such as the following:

  • The rapid growth of the Internet.
  • The conversion from analog to digital media, coupled with an increased capability to capture and store data, made possible by cheaper and more capable storage technology. There has been a proliferation of digital data input devices such as cameras and wearables, and the cost of large-scale data storage has fallen rapidly. Amazon Web Services is a prime example of the trend toward much cheaper storage.

The Internetification of devices, or the Internet of Things, is the phenomenon wherein common household devices, such as our refrigerators and cars, are connected to the Internet. This phenomenon will only accelerate the above trend.

Velocity of big data

From a purely technological point of view, velocity refers to the throughput of big data, or how fast the data is coming in and being processed. This has ramifications on how fast the recipient of the data needs to process it to keep up. Real-time analytics is one attempt to handle this characteristic. Tools that can help enable this include Amazon Web Services' Elastic MapReduce.

At a more macro level, the velocity of data can also be regarded as the fact that data and information can now be transferred and processed faster, and at greater distances, than ever before.

The proliferation of high-speed data and communication networks, coupled with the advent of cell phones, tablets, and other connected devices, is a primary factor driving information velocity. Some measures of velocity include the number of tweets per second and the number of emails per minute.

Variety of big data

The variety of big data comes from having a multiplicity of data sources that generate the data, and the different formats of the data that are produced.

This results in a technological challenge for the recipients of the data, who have to process it. Digital cameras, sensors, the web, cell phones, and so on are some of the data generators that produce data in differing formats, and the challenge comes in being able to handle all these formats and extract meaningful information from the data. The ever-changing nature of data formats with the dawn of the big data era has led to a revolution in the database technology industry, with the rise of NoSQL databases to handle what is known as unstructured data, or rather data whose format is fungible or constantly changing. For more information, refer to Couchbase's Why NoSQL? at http://bit.ly/1c3iVEc.

Veracity of big data

The 4th characteristic of big data, veracity (which was added later), refers to the need to validate or confirm the correctness of the data, or the fact that the data represents the truth. The sources of data must be verified and the errors kept to a minimum. According to an estimate by IBM, poor data quality costs the US economy about $3.1 trillion a year. For example, medical errors cost the United States $19.5 billion in 2008; for more information, you can refer to a related article at http://bit.ly/1CTah5r. Here is an infographic by IBM that summarizes the 4 V's of big data:

IBM on the 4 V's of big data

So much data, so little time for analysis

Data analytics has been described by Eric Schmidt, the former CEO of Google, as the Future of Everything. For reference, you can check out a YouTube video called Why Data Analytics is the Future of Everything at http://bit.ly/1KmqGCP.

The volume and velocity of data will continue to increase in the big data age. Companies that can efficiently collect, filter, and analyze data, turning it into information that allows them to meet the needs of their customers much more quickly, will gain a significant competitive advantage over their competitors. For example, data analytics (Culture of Metrics) plays a key role in the business strategy of http://www.amazon.com/. For more information, refer to Amazon.com Case Study, Smart Insights at http://bit.ly/1glnA1u.

The move towards real-time analytics

As technologies and tools have evolved to meet the ever-increasing demands of business, there has been a move towards what is known as real-time analytics. For more information, refer to Insight Everywhere, Intel, available at http://intel.ly/1899xqo.

In the big data Internet era, here are some examples:

  • Online businesses demand instantaneous insights into how the new products/features they have introduced are doing in their online market and how they can adjust their product mix accordingly. Amazon is a prime example of this, with its Customers Who Viewed This Item Also Viewed feature.
  • In finance, risk management and trading systems demand almost instantaneous analysis in order to make effective decisions based on data-driven insights.