Big data used to be, with hindsight, pedestrian. Conferences on the subject (in those days ‘data warehousing’) used to be about companies showing off. One speaker would boast that his data warehouse could hold 8 petabytes of data, the next would up the stakes to 10, and a third would trump them both with 12.
Big data seemed to have gone away, or at least gone very quiet. But a short interview with Anthony Behan of Cloudera shows that the complete opposite is true.
Behan referenced two service provider customers, one of which is already processing 30 million transactions a minute and another that is gearing up to process a billion transactions every 40 seconds.
A billion transactions every 40 seconds. Now that is vast. Away with your puny petabytes!
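To put those two figures on the same footing, a quick back-of-envelope comparison (a sketch using only the numbers Behan quotes):

```python
# Compare the two throughput figures quoted above on a per-second basis.
per_minute = 30_000_000        # 30 million transactions per minute
rate_a = per_minute / 60       # first customer, in transactions per second

rate_b = 1_000_000_000 / 40    # a billion transactions every 40 seconds

print(f"{rate_a:,.0f} tx/s vs {rate_b:,.0f} tx/s ({rate_b / rate_a:.0f}x)")
# → 500,000 tx/s vs 25,000,000 tx/s (50x)
```

In other words, the second customer is preparing for roughly fifty times the throughput of the first.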
That scale of data (vast, huge, overwhelming, exciting) certainly concentrates the mind. How do you begin to manage that level of information, let alone make sense of it to the point of using it to improve your customers’ experience?
The short answer is analytics and clever use of the cloud, at extraordinary scale.
The long answer is, well, a lot longer.
The various pressures on telcos are increasing. Regulations, for instance, dictate where and how customer records can be handled: some of that work must stay on your own premises, while some can be managed in a public cloud. Managing big data under those constraints is a serious challenge.
As Behan says, “data has never been more important. Network, customer, finance, partner, and operational data all contribute to a comprehensive view of business performance and service delivery.”
Get it right, says Behan, and the result will be an organisation intelligently powered by that wealth of information and knowledge, able to continually improve all of the processes he mentions.
Get it wrong and, well, you know the old adage that data is the new oil. The resulting oil spill could be very messy, very expensive and take years to clean up.