SmartChimps -- March 19, 2013 -- By Anukool Lakhina, founder and CEO, Guavus
Most operators think they know how promising their data is. The truth is they don’t realise just how much value is hidden in the massive amounts of data they sit on, even as more rolls in. And because of this, the best insights – the ones that can be harnessed for transformative change – are at risk of being buried in today’s data avalanche.
Back when I was working on my PhD, I worked in the lab of a major telecommunications operator. My job was to run algorithms over sensor-generated data to identify valuable trends and clues to network performance.
The best part of the day was when the FedEx truck arrived and I could get my hands on boxes of network data storage drives with mountains of months-old data generated by those sensors just waiting to be analysed. Talk about timely insights being dead on arrival!
My employer had no idea what insights lay buried inside those drives. And yet, collecting, storing and sending the data out for analysis was the only option it had at the time. At that point I realised the model for data analysis had to change on a fundamental level, especially if data was going to continue its exponential growth curve. Operators needed to analyse data as the avalanche roared in, and it was going to take some sturdy tools to do it.
The smartphones and tablets we rely on today contain a wealth of information on us: our preferences, our habits and our behaviour. And this is just one kind of machine interface. There are also cars, for example, which now come equipped with an array of sensors to gauge everything from driving styles to road conditions and wear-and-tear, all in the interest of making driving safer and more enjoyable. Meanwhile, cities are deploying wireless sensors in stoplights for improved traffic surveillance. In disaster-prone regions, bridges and buildings can even evaluate their own stress points.
This phenomenon gives us an extraordinary opportunity — one that no civilisation has had before — to know the now. If operators act fast enough, they can distil that knowledge into timely, intelligent, data-driven insights for more agile operational and business processes.
So, now that the ability to gather such immediate data from a variety of devices and places exists in our world, it’s imperative we put it to work to our advantage. How can operators parse data in a timely manner to identify trends, glean new insights into customer behaviour, and respond immediately to changing market dynamics or customer habits? How can we best take divergent sources of data and dynamically fuse them together so people, machines and processes make optimal responses at any given moment in time?
To ensure that operators can make the most of their investment, they need a business problem to solve. The introduction of 4G provides a clear example of this. Operators could use the historical data from 3G to answer crucial questions: which customers are most likely to move to 4G; which are the biggest data consumers; which applications consume the most bandwidth; and where these customers are geographically concentrated. This would help to identify new revenue opportunities by device type and so much more.
To ensure a return on investment, operators need this information to accurately forecast capacity requirements and put in place profitable pricing tiers linked to customer consumption. Essentially, if you have the questions, data can be mined to give you the answers.
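The kind of question-driven mining described above can be sketched in a few lines. This is a minimal, illustrative example only: the record fields, usage threshold and sample values are assumptions, not real operator data, and a production system would run such aggregations over far larger, continuously arriving datasets.

```python
from collections import defaultdict

# Hypothetical 3G usage records: (customer_id, app, megabytes, region).
# Names, values and the threshold below are illustrative assumptions.
records = [
    ("alice", "video", 900, "north"),
    ("alice", "web",   100, "north"),
    ("bob",   "video", 300, "south"),
    ("carol", "music", 700, "north"),
]

usage_by_customer = defaultdict(int)
usage_by_app = defaultdict(int)
region_by_customer = {}

for customer, app, mb, region in records:
    usage_by_customer[customer] += mb
    usage_by_app[app] += mb
    region_by_customer[customer] = region

# Heavy 3G consumers are plausible 4G upgrade candidates.
HEAVY_USE_MB = 500  # assumed threshold
candidates = {c for c, mb in usage_by_customer.items() if mb >= HEAVY_USE_MB}

print(sorted(candidates))                           # biggest data consumers
print(max(usage_by_app, key=usage_by_app.get))      # most bandwidth-hungry app
print({region_by_customer[c] for c in candidates})  # where they cluster
```

The same pattern — aggregate by customer, by application and by geography, then filter against a business-defined threshold — answers each of the questions posed above.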
In order to save this data from a premature death and catapult it into a driving force for a data-driven global economy, both the operator and the analytics architecture must rise to the occasion. Operators need a new approach to analytics in which contextually-aware applications are built around specific use cases, like 4G, on a new data processing stack and backed by a new economic model.
As it stands today, big data analytics technology comprises many disparate toolsets and technologies. What’s missing is a foundational architecture to support all these individual tools and technologies; a complete, holistic stack that can help operators get from data ingestion to data decisions in one fell swoop.
This new architecture must recognise that a sensor-rich world creates data continuously, and that to enable immediate action, the analysis must also be done continuously, rather than after the fact once the data is stored away. This new architecture must also combine a variety of data sources instead of keeping them in silos. And it must elastically scale to the petabytes of structured and unstructured data that are now generated on a nonstop basis.
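The contrast between continuous and after-the-fact analysis can be illustrated with a sliding-window aggregate that updates as each measurement arrives, rather than waiting for a batch of stored data. This is a toy sketch under assumed inputs — real streaming stacks distribute this work across many machines — but the core idea is the same: the answer is always current.

```python
from collections import deque

class SlidingWindowTotal:
    """Keep a running in-window byte total as measurements stream in,
    instead of batch-analysing stored data after the fact.
    Window size and record shape are illustrative assumptions."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.samples = deque()  # (timestamp, nbytes), oldest first
        self.total = 0

    def observe(self, timestamp, nbytes):
        self.samples.append((timestamp, nbytes))
        self.total += nbytes
        # Evict samples that have fallen out of the window.
        while self.samples and self.samples[0][0] <= timestamp - self.window:
            self.total -= self.samples.popleft()[1]
        return self.total  # current in-window byte count

# Each observation immediately yields an up-to-date answer.
rate = SlidingWindowTotal(window_seconds=60)
rate.observe(0, 100)
rate.observe(30, 200)
print(rate.observe(70, 50))  # sample from t=0 has aged out → 250
```

Because eviction happens on every observation, the insight is available the moment the data arrives — the opposite of the FedEx-box model described earlier.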
Equally important is the need for a new economic model for data processing. Today, operator customers spend tens of millions of dollars upfront on data projects, most of which goes toward capturing and storing the data. They must then wait a year or more to start seeing value from their data assets.
Our data-rich world therefore needs a new paradigm in which operators put analytics, rather than storage, front of mind, taking an agile, interactive approach that demonstrates the value of a particular idea in the first days and weeks of deployment.
Once proven, this use case is swiftly rolled out as an application that any business manager can use to make decisions. This business value-led approach to big data can then be scaled across other functional areas of the business and power data-driven decision-making across the enterprise.
Once operators embrace this new approach, big data’s vast potential will no longer be crushed by its own weight. If data is at risk of being lost in the avalanche, analytics platforms should serve as first responders to the emergency.
Guavus, provider of big data analytics solutions, has been built from the ground up to unlock the value of operational, sensor and network-generated big data to reduce the economic and technology risk associated with deploying a traditional business intelligence solution.