Big Data and Big Process: Old wine, new bottles?

Connie Moore of Forrester Research discusses some hot ideas about Big Data, Big Process and Customers in this nine-minute video. Radical new approaches, or yet more hype?

Big Data

Big Data refers to the availability of huge amounts of data about customer behaviours, markets, sales, etc. These huge datasets are both a problem and an opportunity.

Let’s look at the opportunity first. McKinsey has called Big Data “The next frontier for innovation, competition, and productivity” and identifies five areas where the management of big datasets will become increasingly important:

1 – unlock value by making information transparent and usable at much higher frequency (i.e. exploit opportunity more quickly)

2 – expose variability and boost performance through more accurate and detailed information on everything from product inventories to sick days (i.e. improve management and performance of our businesses)

3 – tailor products or services precisely through ever-narrower segmentation of customers (i.e. sell more)

4 – improve decision-making with more sophisticated analytics (i.e. do the right things at the right time to benefit business)

5 – improve the development of the next generation of products and services (i.e. produce services and products that pre-empt demand and delight customers)

The first problem is that we do not currently have the technologies to manage and exploit the very largest datasets effectively. My sense is that this is a purely technological challenge that will be solved over the next few years, à la Moore’s Law.

Big Process

The much more significant problem is, how can we handle vast amounts of information in a way which improves our businesses and the customer experience, while keeping data secure and customer privacy intact? This is where ‘Big Process’ comes in.

Big Process is said to be the practice of designing and transforming overall business processes – such as the sales process – so that big data drives decision making and ‘best next steps’ across the whole enterprise. It is the opposite of the silo approach to modular process improvement.

For example, a customer’s online behaviour or past transactions can be matched with new products and services. Or, to take an example that might interest IT Service Providers, outstanding service incidents and queued requests can be monitored and managed in real time across the whole enterprise, making the best use of scarce resources and serving customers more quickly.
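
To make the second example concrete, here is a minimal sketch in Python of how live queue data from across an enterprise might drive a ‘best next step’ decision for scarce support resources. The incident fields, scoring rule and team names are hypothetical assumptions for illustration, not a description of any particular product.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Incident:
    id: str
    team: str                 # hypothetical owning team, e.g. "network" or "desktop"
    sla_deadline: datetime    # when the service-level agreement would be breached
    customer_impact: int      # assumed scale: 1 (low) to 5 (high)

def urgency(incident: Incident, now: datetime) -> float:
    """Score an incident: higher impact and less time to SLA breach both raise it."""
    seconds_left = (incident.sla_deadline - now).total_seconds()
    return incident.customer_impact / max(seconds_left, 1.0)

def next_best_steps(incidents: List[Incident], free_engineers: int, now: datetime) -> List[Incident]:
    """Choose what to work on next across the whole enterprise queue, not per team."""
    ranked = sorted(incidents, key=lambda i: urgency(i, now), reverse=True)
    return ranked[:free_engineers]

if __name__ == "__main__":
    now = datetime(2012, 6, 1, 9, 0)
    queue = [
        Incident("INC-001", "network", now + timedelta(hours=1), 5),
        Incident("INC-002", "desktop", now + timedelta(hours=8), 2),
        Incident("INC-003", "apps", now + timedelta(minutes=30), 4),
    ]
    for inc in next_best_steps(queue, free_engineers=2, now=now):
        print(f"Assign next: {inc.id} ({inc.team}), impact {inc.customer_impact}")

The point is simply that prioritisation happens across the whole enterprise queue rather than within each team’s silo, which is the shift Big Process claims to describe.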

But isn’t this what process has always been about – organising resources, activities and people to achieve a desired outcome? Big Process is a new name, but the truth is that all well-designed processes should be scalable and adaptable to increases in the volume of data they handle. What distinguishes successful processes is the quality of the decisions they embed, not the quantity of data they consume.

It is also true that we have repeatedly seen miracle applications, systems and technologies fail to deliver expected benefits because organisations did not redesign their processes and re-train their people – this is nothing new. In the world of Big Data, process design (small, medium or Big) will define whether we drown in a sea of data or use it to understand how to be more successful.

The Bottom Line

Big Data is a fact of life and needs to be addressed. Big Process is old wine in new bottles. Tightly couple processes and data to stay competitive.

Verax can help you reduce IT costs and achieve breakthrough efficiency results. Find out more.
