Editor’s Note: Cutting costs, improving government efficiency, boosting effectiveness and slashing red tape are all possible using a new wave of data analytics. In this post, the IBM Center’s John Kamensky discusses a new research project that examines “Analytics 3.0” and the benefits it can bring to both the public and private sectors.
Dr. Thomas Davenport, in a recent Harvard Business Review article, says “Some of us now perceive another shift, fundamental and far-reaching enough that we can fairly call it Analytics 3.0.” What does this mean for leaders of large organizations?
The Three Phases of Analytics
Davenport writes that the field of “analytics” has evolved over the past 60 years in three phases:
Analytics 1.0 was born in the mid-1950s and was often referred to as “business intelligence.” He said it gave managers “the fact-based comprehension to go beyond intuition when making decisions.” It involved examining data from production processes and customer interactions. He said that new computing technologies “were key” and were often custom-built.
Analytics 2.0, he says, emerged in the mid-2000s when internet and social media companies—Amazon, Google, eBay, etc.—“began to amass and analyze new kinds of information.” This was often referred to as “big data.” Davenport says big data differs from small data because it comes from sources outside the company and is not generated solely by the organization’s own internal systems: it comes from sensors and social media as well as video and audio recordings. It often cannot be stored on a single server; much of it has to be stored in the cloud.
Analytics 3.0, Davenport claims, involves collecting data on every activity associated with your products, services, and customers, because “every device, shipment and consumer leaves a trail.” This is not done just by information companies but by every organization. He says organizations “have the ability to embed analytics and optimization into every business decision made at the front lines of your operations.”
Private Sector Examples
Davenport offers several examples of large companies that he believes have made the leap to Analytics 3.0:
General Electric not only builds engines and medical devices, it embeds sensors in them so the users can optimize their use. Davenport says: “With sensors streaming data from turbines, locomotives, jet engines, and medical-imaging devices, GE can determine the most efficient and effective service intervals for those machines.”
UPS has installed telematics sensors in its fleet of 46,000 delivery trucks that track speed, direction, braking, and drivetrain performance. Using these data, UPS has redesigned drivers’ routes to cut 85 million miles, saving both time and fuel.
Schneider Electric, a French energy management firm, handles energy distribution for utility companies. Its devices, according to Davenport, allow utilities to “integrate millions of data points on network performance” and let engineers “use visual analytics to understand the state of the network.”
Public Sector Examples
As with most trends, government often lags behind. While Davenport’s article focuses on the impact of Analytics 3.0 in the private sector, he has written previously about the strategic use of analytics in government, and it is not hard to translate Analytics 3.0 to the public sector. There are already numerous examples of local governments and federal agencies putting the idea into practice.
City and regional transportation departments are creating “intelligent transportation systems” that use remote cameras and cell phone data to determine traffic conditions in order to adjust the timing of traffic lights and pre-position emergency equipment.
Public hospitals and the Veterans Health Administration are using mobile devices to share health data in real time between doctors and patients, so treatment decisions can be made in minutes instead of days. They are also embedding sensors in patient protocols and hospital supplies to track their use and replacement.
The Department of Homeland Security is planning its next steps to help states and localities employ real-time, continuous monitoring of computer networks to detect and deter cybersecurity events.
Several federal agencies are pioneering the concept of “citizens as sensors,” as proposed by Davenport. For example, the Centers for Disease Control and Prevention is using social media data to predict the spread of flu more quickly than was possible with traditional methods. And the US Geological Survey is using Twitter and other social media to detect earthquakes as citizens report them, sometimes faster and more accurately than its scientific instruments.
Non-profit groups are also using the “citizen sensor” concept. For example, one group uses citizen self-reporting to monitor pollution, and another, OpenStreetMap, uses it to capture detailed mapping data.
In his HBR article, Dr. Davenport offers a series of steps managers should take to refocus their organizations to take advantage of Analytics 3.0, such as designating chief analytics officers and developing new ways of deciding and managing that reflect the use of data. Translated to the public sector, these steps could include:
- encouraging agencies to collaborate and share in the use of existing common data sources – both programmatic and administrative,
- supporting expanded use of social media interactions (the CDC is a pioneer), and
- designating chief innovation officers to sponsor rapid experimentation and champion the piloting of different approaches to collecting and using data.
Davenport concludes by noting that the push for “big data” has been a huge step forward, but that in the new data economy, organizations must “once again fundamentally rethink how the analysis of data can create value for themselves and their customers.”